This error is caused by the schedule entry whose step is 0 in the learning-rate configuration:

schedule {
  step: 0
  learning_rate: 0.00019...
}

The simplest fix is to replace the whole optimizer block with the optimizer block from a newer model's config. (This block lives in the pipeline config file.)
Before:
optimizer {
  momentum_optimizer {
    learning_rate {
      manual_step_learning_rate {
        initial_learning_rate: 0.000199999994948
        schedule {
          step: 0
          learning_rate: 0.000199999994948
        }
        schedule {
          step: 900000
          learning_rate: 1.99999994948e-05
        }
        schedule {
          step: 1200000
          learning_rate: 1.99999999495e-06
        }
      }
    }
    momentum_optimizer_value: 0.899999976158
  }
  use_moving_average: false
}
Modified (example):
optimizer {
  momentum_optimizer {
    learning_rate {
      cosine_decay_learning_rate {
        learning_rate_base: 0.0399999991059
        total_steps: 25000
        warmup_learning_rate: 0.0133330002427
        warmup_steps: 2000
      }
    }
    momentum_optimizer_value: 0.899999976158
  }
  use_moving_average: false
}
Alternatively, keep manual_step_learning_rate and change the first schedule from step: 0 to step: 100 (or any other non-zero value), as sketched below.
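For example, a minimal sketch of that alternative fix, reusing the original values and only bumping the first step to a non-zero value (100 here is arbitrary; any non-zero step should do):

optimizer {
  momentum_optimizer {
    learning_rate {
      manual_step_learning_rate {
        initial_learning_rate: 0.000199999994948
        schedule {
          step: 100  # changed from 0; a non-zero first step avoids the error
          learning_rate: 0.000199999994948
        }
        schedule {
          step: 900000
          learning_rate: 1.99999994948e-05
        }
        schedule {
          step: 1200000
          learning_rate: 1.99999999495e-06
        }
      }
    }
    momentum_optimizer_value: 0.899999976158
  }
  use_moving_average: false
}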
For more background, see this GitHub issue:
https://github.com/tensorflow/models/issues/3794