Generates a training library, trains a network, and writes the result into a file named with the specified number. If that file already exists, the function loads the network from it and continues training.
parameters  - library and network specification structure (see the header of deer_lib_gen.m)

trsets      - sizes of the independent, randomly generated training libraries to train against sequentially; [160k 160k 160k 160k 160k 160k] works well on a Tesla V100 card

file_number - the network object will be saved into a file with this number as the name; this file also serves as a restart checkpoint
Two more fields are required in the parameters structure:
parameters.lastlayer   - activation function to use in the output layer ('tansig', 'logsig', or 'purelin')

parameters.layer_sizes - number of neurons per layer, a row vector whose number of elements is the number of hidden layers desired
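As a minimal sketch, the two extra fields could be set as follows; the specific activation function and layer sizes chosen here are illustrative assumptions, and the remaining fields of the parameters structure come from the header of deer_lib_gen.m:

```matlab
% Illustrative values only - see deer_lib_gen.m for the other fields
parameters.lastlayer=ize'logsig';        % output layer activation function
parameters.layer_sizes=[256 256];        % two hidden layers, 256 neurons each
```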
This function writes a MAT file with the network object and the parameters data structure.
The example below will train a single network using the parameters given in the netset_params.m file for the network ensemble optimized for any peak width.
% Load training set parameters
run('net_set_any_peaks/netset_params.m');

% Specify the sizes of training databases to train against
trsets=[160e4 160e4 160e4 160e4];

% Run the network training for a single network
train_one_net(parameters,trsets,111);
The function will save the network into the file 111.mat.
- A CUDA-capable NVIDIA GPU is required.
- If the file already exists, the network in it will be loaded and used as the initial guess.
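The checkpoint file written by the function can also be inspected manually; the variable names inside the MAT file are assumptions here, since the document only states that it contains the network object and the parameters structure:

```matlab
% Sketch: inspect the checkpoint written for file number 111
% (variable names inside the MAT file are assumed, not documented)
saved=load('111.mat');
disp(fieldnames(saved));   % list the variables stored in the checkpoint
```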