Activations can either be used through an Activation
layer, or through the activation
argument supported by all forward layers:
from keras.layers.core import Activation, Dense
model.add(Dense(64, 64, init='uniform'))
model.add(Activation('tanh'))
is equivalent to:
model.add(Dense(64, 64, init='uniform', activation='tanh'))
You can also pass an element-wise Theano function as an activation:
import theano.tensor

def tanh(x):
    return theano.tensor.tanh(x)

model.add(Dense(20, 64, init='uniform', activation=tanh))
model.add(Activation(tanh))
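Any other element-wise Theano expression works the same way. As a minimal sketch, here is a custom clipping activation (the hard_tanh name and the [-1, 1] range are purely illustrative, not part of Keras):

import theano.tensor as T

# Illustrative custom activation: clip the input element-wise to [-1, 1].
def hard_tanh(x):
    return T.clip(x, -1.0, 1.0)

model.add(Dense(20, 64, init='uniform', activation=hard_tanh))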
The softmax activation is applied across the input's last dimension, and expects input of shape (nb_samples, nb_dims) or (nb_samples, nb_timesteps, nb_dims).
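For example, a final classification layer whose output has the 2D shape (nb_samples, nb_dims) could look like the following sketch (the layer sizes here are arbitrary):

# Output layer of a classifier: 64 inputs, 10 classes.
# Softmax is applied over the last dimension, so the output has shape (nb_samples, 10).
model.add(Dense(64, 10, init='uniform', activation='softmax'))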
Activations that are more complex than a simple Theano function (e.g. learnable activations, configurable activations, etc.) are available as advanced activation layers, and can be found in the module keras.layers.advanced_activations. These include PReLU and LeakyReLU.
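Advanced activations are added as layers in their own right, in place of an Activation layer. A minimal sketch using LeakyReLU (the alpha value is only an example; PReLU is added the same way, but learns its slope during training):

from keras.layers.core import Dense
from keras.layers.advanced_activations import LeakyReLU

model.add(Dense(64, 64, init='uniform'))
# LeakyReLU keeps a small fixed slope (alpha) for negative inputs
# instead of zeroing them out like a plain ReLU.
model.add(LeakyReLU(alpha=0.3))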