Change Keras optimizer during training
I'm using Keras 2.1.4 with the TensorFlow backend. I'm thinking about changing the optimizer algorithm during training. Is that possible in Keras?
You need to change the way you train: instead of calling model.compile() and model.fit(), write a custom training loop. Here is an example based on the Keras documentation's custom-training-loop pattern, in which you can apply different optimizers during different stages of your training procedure.
import tensorflow as tf

# Instantiate the optimizers.
optimizer1 = tf.keras.optimizers.Adam()
optimizer2 = tf.keras.optimizers.SGD()

# Iterate over the batches of a dataset.
for x, y in dataset:
    # Open a GradientTape.
    with tf.GradientTape() as tape:
        # Forward pass.
        logits = model(x)
        # Loss value for this batch.
        loss_value = loss_fn(y, logits)
    # Get gradients of the loss w.r.t. the weights.
    gradients = tape.gradient(loss_value, model.trainable_weights)
    # Update the weights with whichever optimizer your condition selects.
    if condition1:
        optimizer1.apply_gradients(zip(gradients, model.trainable_weights))
    else:
        optimizer2.apply_gradients(zip(gradients, model.trainable_weights))
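To make this concrete, here is a minimal self-contained sketch of the same idea. The toy data, the small model, and the epoch-based switching rule (Adam for the first half of training, SGD afterwards) are my own illustrative assumptions, not part of the original snippet:

```python
import numpy as np
import tensorflow as tf

# Toy regression data (hypothetical stand-in for your real dataset).
x_train = np.random.rand(64, 4).astype("float32")
y_train = x_train.sum(axis=1, keepdims=True).astype("float32")
dataset = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(16)

# A small model and loss, again just for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])
loss_fn = tf.keras.losses.MeanSquaredError()

optimizer1 = tf.keras.optimizers.Adam()
optimizer2 = tf.keras.optimizers.SGD()

epochs = 4
for epoch in range(epochs):
    # Assumed switching condition: Adam for the first half, SGD after.
    optimizer = optimizer1 if epoch < epochs // 2 else optimizer2
    for x, y in dataset:
        with tf.GradientTape() as tape:
            logits = model(x, training=True)
            loss_value = loss_fn(y, logits)
        gradients = tape.gradient(loss_value, model.trainable_weights)
        optimizer.apply_gradients(zip(gradients, model.trainable_weights))
```

Because both optimizers are instantiated up front, each keeps its own state (e.g. Adam's moment estimates) across the switch; only the weight updates change.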