PL not honoring the learning rate setting?

Changing the learning rate to anything other than the default value of 0.001 seems to have no effect on training, while changes to beta1 and beta2 do. Also, when returning to the Run screen, all parameters are retained from the previous run except the LR, which is reset to 0.001.

Hi @birdstream,
Thanks for reporting this!

We've found the issue: it turns out the learning rate keeps getting reset to whatever value you set the first time you ran the model. We'll be getting a fix out for this ASAP.
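
For anyone curious, here's a minimal hypothetical sketch of this class of bug (the `Trainer` class and `configure` method are made up for illustration and are not PerceptiLabs' actual code): an optimizer config cached on the first run is only partially refreshed on later runs, so beta1/beta2 changes take effect while a new learning rate silently does not.

```python
class Trainer:
    """Hypothetical sketch of a partial-caching bug -- not PerceptiLabs' code."""

    def __init__(self):
        self._optimizer = None  # cached across runs, never fully rebuilt

    def configure(self, learning_rate, beta1, beta2):
        if self._optimizer is None:
            # First run: every hyperparameter is captured...
            self._optimizer = {"lr": learning_rate,
                               "beta1": beta1,
                               "beta2": beta2}
        else:
            # ...but later runs only refresh beta1/beta2, so a new
            # learning rate never takes effect.
            self._optimizer["beta1"] = beta1
            self._optimizer["beta2"] = beta2
        return self._optimizer


trainer = Trainer()
print(trainer.configure(0.001, 0.9, 0.999))  # first run: lr = 0.001
print(trainer.configure(0.01, 0.8, 0.999))   # lr is still 0.001; betas updated
```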

Best,
Robert

@Birdstream (min 20 chars, it seems :wink: ) Good catch! :slight_smile: