Dynamic learning rate in training

I'm using Keras 2.1.* and want to change the learning rate during training. I know about the LearningRateScheduler callback, but I don't use the fit function and I don't have callbacks; I use train_on_batch. Is this possible in Keras?

keras, learning-rate
1 Answer

If you use the .fit function, you can use the LearningRateScheduler callback, like this:

from keras import backend as K
from keras.callbacks import LearningRateScheduler

def scheduler(epoch):
    # drop the learning rate to 0.02 at epoch 5
    if epoch == 5:
        K.set_value(model.optimizer.lr, 0.02)
    return float(K.get_value(model.optimizer.lr))

change_lr = LearningRateScheduler(scheduler)

model.fit(x_embed, y, epochs=10, batch_size=batch_size,
          callbacks=[change_lr])

But if you use other functions like train_on_batch, you can change the learning rate this way:

from keras import backend as K

old_lr = K.get_value(model.optimizer.lr)
new_lr = old_lr * 0.1  # change however you want
K.set_value(model.optimizer.lr, new_lr)
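
For example, a minimal train_on_batch loop with a step-wise schedule could look like this (x_batches, y_batches and the epoch numbers are placeholders for your own data pipeline):

from keras import backend as K

for epoch in range(10):
    if epoch == 5:
        # cut the learning rate to 10% of its current value at epoch 5
        K.set_value(model.optimizer.lr,
                    K.get_value(model.optimizer.lr) * 0.1)
    for x_batch, y_batch in zip(x_batches, y_batches):
        loss = model.train_on_batch(x_batch, y_batch)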

By the way, if you use learning rate decay, remember that the decay is applied continuously during training, so old_lr will be the base learning rate you passed at the beginning, not the currently decayed value.
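
For SGD in Keras 2 the decayed rate follows lr / (1 + decay * iterations), so you can compute the rate actually in effect like this (a sketch assuming your optimizer was built with a decay argument, e.g. SGD(lr=0.01, decay=1e-4)):

from keras import backend as K

base_lr = K.get_value(model.optimizer.lr)             # the lr you passed in
decay = K.get_value(model.optimizer.decay)
iterations = K.get_value(model.optimizer.iterations)  # batches seen so far
effective_lr = base_lr / (1.0 + decay * iterations)   # lr actually in effect
print(effective_lr)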
