A callback is a set of functions to be applied at given stages of the training procedure. You can use callbacks to get a view on internal states and statistics of the model during training. You can pass a list of callbacks (as the keyword argument callbacks) to the .fit() method of the Sequential model. The relevant methods of the callbacks will then be called at each stage of training.
keras.callbacks.Callback()

The abstract base class. A callback has access to its associated model through the class property self.model (a keras.models.Model reference of the model being trained). Subclasses can override the following methods:

- on_train_begin(logs={}) / on_train_end(logs={}): called at the beginning/end of training.
- on_epoch_begin(epoch, logs={}) / on_epoch_end(epoch, logs={}): called at the beginning/end of every epoch.
- on_batch_begin(batch, logs={}) / on_batch_end(batch, logs={}): called right before/after processing each batch.

The logs dictionary will contain keys for quantities relevant to the current batch or epoch. Currently, the .fit() method of the Sequential model class will include the following quantities in the logs that it passes to its callbacks:

- on_epoch_end: val_loss (if validation is enabled in fit), and val_accuracy (if validation and accuracy monitoring are enabled).
- on_batch_begin: size, the number of samples in the current batch.
- on_batch_end: loss, and optionally accuracy (if accuracy monitoring is enabled).

You can create a custom callback by extending the base class keras.callbacks.Callback.
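The call sequence can be illustrated without Keras. The sketch below is not Keras code: it defines a minimal stand-in Callback base class and a toy training loop (`toy_fit`, a hypothetical helper with fabricated loss values) that invokes the hooks in the same order .fit() would.

```python
class Callback(object):
    """Minimal stand-in for keras.callbacks.Callback: every hook is a no-op."""
    def on_train_begin(self, logs={}): pass
    def on_train_end(self, logs={}): pass
    def on_epoch_begin(self, epoch, logs={}): pass
    def on_epoch_end(self, epoch, logs={}): pass
    def on_batch_begin(self, batch, logs={}): pass
    def on_batch_end(self, batch, logs={}): pass

class Tracer(Callback):
    """Records every hook invocation so the call order is visible."""
    def __init__(self):
        self.calls = []
    def on_epoch_begin(self, epoch, logs={}):
        self.calls.append(('epoch_begin', epoch))
    def on_batch_end(self, batch, logs={}):
        self.calls.append(('batch_end', batch, logs.get('loss')))
    def on_epoch_end(self, epoch, logs={}):
        self.calls.append(('epoch_end', epoch, logs.get('val_loss')))

def toy_fit(callbacks, nb_epoch=2, nb_batch=3):
    # Stand-in for model.fit(): fake losses, but the real hook call order.
    for cb in callbacks: cb.on_train_begin()
    for epoch in range(nb_epoch):
        for cb in callbacks: cb.on_epoch_begin(epoch)
        for batch in range(nb_batch):
            for cb in callbacks: cb.on_batch_begin(batch, {'size': 128})
            loss = 1.0 / (1 + epoch * nb_batch + batch)  # fabricated, decreasing
            for cb in callbacks: cb.on_batch_end(batch, {'loss': loss})
        for cb in callbacks: cb.on_epoch_end(epoch, {'val_loss': loss})
    for cb in callbacks: cb.on_train_end()

tracer = Tracer()
toy_fit([tracer])
# tracer.calls now shows: epoch_begin 0, three batch_ends, epoch_end 0,
# epoch_begin 1, three batch_ends, epoch_end 1
```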
Here's a simple example saving a list of losses over each batch during training:
class LossHistory(keras.callbacks.Callback):
    def on_train_begin(self, logs={}):
        self.losses = []

    def on_batch_end(self, batch, logs={}):
        self.losses.append(logs.get('loss'))
model = Sequential()
model.add(Dense(784, 10, init='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
history = LossHistory()
model.fit(X_train, Y_train, batch_size=128, nb_epoch=20, verbose=0, callbacks=[history])
print(history.losses)
# outputs
'''
[0.66047596406559383, 0.3547245744908703, ..., 0.25953155204159617, 0.25901699725311789]
'''
from keras.callbacks import ModelCheckpoint
model = Sequential()
model.add(Dense(784, 10, init='uniform'))
model.add(Activation('softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
'''
saves the model weights after each epoch if the validation loss decreased
'''
checkpointer = ModelCheckpoint(filepath="/tmp/weights.hdf5", verbose=1, save_best_only=True)
model.fit(X_train, Y_train, batch_size=128, nb_epoch=20, verbose=0, validation_data=(X_test, Y_test), callbacks=[checkpointer])
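The save_best_only behavior can be sketched in plain Python. This is an illustration of the idea, not ModelCheckpoint's actual code: keep the best val_loss seen so far, and "save" only on epochs where it improves.

```python
class BestOnlySaver(object):
    """Illustration of save_best_only: act only when the monitored
    quantity (here val_loss) improves on the best value seen so far."""
    def __init__(self, filepath):
        self.filepath = filepath
        self.best = float('inf')
        self.saved_epochs = []   # epochs at which a save would happen

    def on_epoch_end(self, epoch, logs={}):
        current = logs.get('val_loss')
        if current < self.best:
            self.best = current
            # the real callback would call model.save_weights(self.filepath) here
            self.saved_epochs.append(epoch)

saver = BestOnlySaver('/tmp/weights.hdf5')
for epoch, val_loss in enumerate([0.9, 0.7, 0.8, 0.5]):
    saver.on_epoch_end(epoch, {'val_loss': val_loss})
# epochs 0, 1 and 3 improve on the best so far; epoch 2 (0.8 > 0.7) does not
```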