Now that I have installed Caffe, I will apply it to my own task: binary classification of images. I will start from the pretrained bvlc_reference_caffenet model and then fine-tune it for my application.
The overall steps are as follows:
[TOC]
Data Preparation
Prepare original data
You need to prepare four items:
- a train folder containing the training images
- a val folder containing the testing images
- a train.txt file listing the training images and their labels
- a val.txt file listing the testing images and their labels
Note that the image paths listed in each *.txt file must correspond to the images actually in the train and val folders.
The train.txt looks like:
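A sketch of the expected contents (the file names are illustrative; the format, a path relative to the train folder, then a space, then an integer class label, is what Caffe's convert_imageset tool expects):

```
cat/cat0001.jpg 0
cat/cat0002.jpg 0
dog/dog0001.jpg 1
dog/dog0002.jpg 1
```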
That means the train folder includes two subfolders, cat and dog, each containing the cat or dog images respectively.
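Writing these label files by hand is tedious; here is a minimal sketch that generates train.txt from the folder layout above. The placeholder file names and the cat=0 / dog=1 label assignment are illustrative assumptions; point the loops at your real train folder instead.

```shell
# Create a tiny demo layout (placeholders for real images; skip this
# step and use your own train/ folder in practice).
mkdir -p train/cat train/dog
touch train/cat/cat001.jpg train/dog/dog001.jpg

# Emit "relative/path.jpg label" lines, one class per loop.
rm -f train.txt
for f in train/cat/*.jpg; do echo "${f#train/} 0" >> train.txt; done
for f in train/dog/*.jpg; do echo "${f#train/} 1" >> train.txt; done

cat train.txt
# cat/cat001.jpg 0
# dog/dog001.jpg 1
```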
Go to the caffe-master/data folder, create a new folder named myself, and put the four items above into it.
Now the myself folder contains four items: train, val, train.txt, val.txt.
Transform the data format
Next we use Caffe's tools to convert the image files into the LMDB format that the ImageNet model consumes.
- Copy all *.sh files in caffe-master/examples/imagenet to myself folder.
- Change the file paths in create_imagenet.sh
- Run it; it will generate myself_train_lmdb and myself_val_lmdb in myself
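As a sketch, the variables near the top of create_imagenet.sh would be edited roughly as follows. The paths assume you run the script from the caffe-master root; you also need to rename the output lmdb paths further down in the script from ilsvrc12_* to myself_*:

```shell
EXAMPLE=data/myself            # where the lmdb folders will be written
DATA=data/myself               # where train.txt and val.txt live
TOOLS=build/tools              # location of the convert_imageset binary
TRAIN_DATA_ROOT=data/myself/train/
VAL_DATA_ROOT=data/myself/val/
RESIZE=true                    # resize all images to 256x256 for CaffeNet
```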
Compute the mean value
- Change the file paths in make_imagenet_mean.sh
- Run it; it will generate myself_mean.binaryproto in myself
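As a sketch, the relevant lines of make_imagenet_mean.sh would become something like this; the paths are assumptions based on the layout above:

```shell
EXAMPLE=data/myself
DATA=data/myself
TOOLS=build/tools

# average all training images into a single mean image
$TOOLS/compute_image_mean $EXAMPLE/myself_train_lmdb \
  $DATA/myself_mean.binaryproto
```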
At this point you have prepared all the data the ImageNet model needs.
Under myself folder, you can see:
- myself_train_lmdb folder: the training data in LMDB format
- myself_val_lmdb folder: the testing data in LMDB format
- myself_mean.binaryproto: the mean image
Fine-tune the trained model
First, we need to download the pretrained bvlc_reference_caffenet model.
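Assuming you are in the caffe-master root, the weights can be fetched with Caffe's download helper; the script path follows the standard Caffe repository layout:

```shell
# downloads bvlc_reference_caffenet.caffemodel into models/bvlc_reference_caffenet/
./scripts/download_model_binary.py models/bvlc_reference_caffenet
```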
After that, we fine-tune it for our task.
- change the train_val.prototxt
(1) Change the input data paths (source and mean_file) to the myself files prepared above.
(2) Rename the last layer from "fc8" to "fc8_myself". Because the pretrained weights contain no layer named "fc8_myself", this layer is initialized with random weights and trained from scratch.
(3) Change the parameters of the last layer.
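A sketch of the renamed last layer in prototxt form. The num_output of 2 matches the binary cat/dog task, and raising lr_mult makes the new layer learn faster than the pretrained layers; the exact multipliers are illustrative. Remember to also update the bottom of the loss and accuracy layers from fc8 to fc8_myself:

```
layer {
  name: "fc8_myself"
  type: "InnerProduct"
  bottom: "fc7"
  top: "fc8_myself"
  param { lr_mult: 10 decay_mult: 1 }   # weights learn 10x faster
  param { lr_mult: 20 decay_mult: 0 }   # biases learn 20x faster
  inner_product_param {
    num_output: 2   # two classes: cat and dog
  }
}
```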
- set up the solver.prototxt
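A sketch of a fine-tuning solver.prototxt; the values are typical fine-tuning settings rather than definitive ones, and the paths are assumptions following the layout above:

```
net: "models/myself/train_val.prototxt"
test_iter: 100              # number of test batches per evaluation
test_interval: 500
base_lr: 0.001              # low starting rate, since we fine-tune
lr_policy: "step"
gamma: 0.1
stepsize: 2000
display: 20
max_iter: 10000
momentum: 0.9
weight_decay: 0.0005
snapshot: 5000
snapshot_prefix: "models/myself/caffenet_myself"
solver_mode: GPU
```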
Train the model
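Training can then be launched with the caffe binary, passing the pretrained weights so fine-tuning starts from them; the paths are assumptions following the layout above:

```shell
./build/tools/caffe train \
    --solver=models/myself/solver.prototxt \
    --weights=models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel
```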
The results will be shown as follows: