In early December, thousands of machine learning researchers from around the world gathered in Long Beach for the 2017 Neural Information Processing Systems (NIPS) conference. At the NIPS 2017 Workshop on Bayesian Optimization, the AItrics research team, in collaboration with Pohang University of Science and Technology (POSTECH), presented a transfer learning methodology that navigates the hyper-parameters of machine learning algorithms quickly and reliably through Bayesian optimization.

The study was motivated by the need for robust learning techniques in environments with only a small, limited amount of data. Its goal was to select initial points for Bayesian hyper-parameter optimization using a neural network that learns meta-features from datasets, and ultimately to optimize the hyper-parameters of deep residual networks for image classification. To this end, we presented a meta-learning framework that finds the best hyper-parameters for a classifier by directly learning meta-features with a Siamese network, where each branch is a convolutional bidirectional long short-term memory (LSTM) network. We trained this Siamese convolutional bidirectional LSTM by minimizing the difference between the meta-feature distance and the ground-truth target distance.
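To make the training objective concrete, here is a minimal, hypothetical sketch of the Siamese distance-matching loss. The real model embeds datasets with a convolutional bidirectional LSTM; the toy linear `embed` below is a stand-in for that tower, and all function names and values are illustrative assumptions, not the paper's implementation:

```python
import math

def embed(meta_features, weights):
    # Stand-in for one Siamese tower (in the actual work, a
    # convolutional bidirectional LSTM over a dataset's meta-features);
    # here, a toy linear map for illustration only.
    return [sum(w * x for w, x in zip(row, meta_features)) for row in weights]

def euclidean(a, b):
    # Distance between the two towers' embeddings.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def siamese_loss(feats_i, feats_j, target_dist, weights):
    # Both datasets pass through the SAME tower (shared weights).
    # Training minimizes the squared gap between the learned
    # meta-feature distance and the ground-truth target distance.
    d = euclidean(embed(feats_i, weights), embed(feats_j, weights))
    return (d - target_dist) ** 2

# Illustrative check: with identity weights, the embedding distance
# between [1, 2] and [4, 6] is 5, so a target distance of 5 gives loss 0.
W = [[1.0, 0.0], [0.0, 1.0]]
loss = siamese_loss([1.0, 2.0], [4.0, 6.0], 5.0, W)
```

In training, the loss would be summed over pairs of datasets and the shared tower's weights updated by gradient descent, so that distances in the learned meta-feature space mirror the ground-truth distances used to choose good initial points for Bayesian optimization.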