Dynamic learning rate in training

I'm using Keras 2.1.* and want to change the learning rate during training. I know about the LearningRateScheduler callback, but I don't use the fit function and I don't have callbacks; I use train_on_batch. Is this possible in Keras?

4 votes

1 Answer

If you use the .fit function, you can use a LearningRateScheduler callback like this:

from keras.callbacks import LearningRateScheduler
from keras import backend as K

def scheduler(epoch):
    # the callback must return the learning rate to use for this epoch;
    # here we halve it at epoch 5 and keep the current value otherwise
    lr = float(K.get_value(model.optimizer.lr))
    if epoch == 5:
        lr *= 0.5
    return lr

change_lr = LearningRateScheduler(scheduler)

model.fit(x_embed, y, epochs=10, batch_size=batch_size, callbacks=[change_lr])

But if you use other functions, such as train_on_batch, you can change the learning rate this way:

from keras import backend as K

old_lr = K.get_value(model.optimizer.lr)
new_lr = old_lr * 0.1  # change however you want
K.set_value(model.optimizer.lr, new_lr)
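When you drive training yourself with train_on_batch, you can compute the new rate from a schedule of your own and push it into the optimizer before each epoch. A minimal sketch of such a step-decay schedule (the helper name and its parameters are my own for illustration, not a Keras API):

```python
def step_decay(initial_lr, epoch, drop=0.1, epochs_per_drop=5):
    # hypothetical helper: multiply the rate by `drop`
    # every `epochs_per_drop` epochs
    return initial_lr * (drop ** (epoch // epochs_per_drop))
```

Before each epoch you would then call K.set_value(model.optimizer.lr, step_decay(base_lr, epoch)) and run your train_on_batch loop as usual.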

By the way: if you use decay for the learning rate, remember that the decay is applied continuously during training, and old_lr read this way will be the base learning rate you passed at the beginning, not the current effective rate.
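For reference, Keras' built-in time-based decay scales the rate per update as lr / (1 + decay * iterations), so you can reconstruct the effective rate at any point from the base rate you set. A sketch of that formula (the helper name is my own):

```python
def effective_lr(base_lr, decay, iterations):
    # time-based decay as applied by Keras optimizers such as SGD:
    # lr_t = base_lr / (1 + decay * iterations)
    return base_lr / (1.0 + decay * iterations)
```

With decay=0, the effective rate stays at base_lr; otherwise it shrinks with every batch processed.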
