
question on the train parameter #35

Closed
JTaozhang opened this issue Apr 19, 2023 · 2 comments
@JTaozhang
To the developers,

When I use the Bi and graphene example systems, I notice some parameters (revert_then_decay, revert_decay_epoch, revert_decay_gamma) in the train.ini file that are not explained on the manual website. If we start a calculation on a new system, do these parameters need to be considered?

Best regards.

@mzjb (Owner) commented Apr 19, 2023

Hi there,

The parameters you mentioned (revert_then_decay, revert_decay_epoch, revert_decay_gamma) are used to stabilize the training process. When the loss increases sharply, they let the neural network revert to an earlier checkpoint and lower the learning rate before continuing training.
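To make the mechanism concrete, here is a minimal sketch of the revert-then-decay idea in plain Python. This is not DeepH-pack's actual implementation; the function name, the interpretation of revert_threshold as a spike factor relative to the best loss, and all other details are illustrative assumptions.

```python
import copy

def revert_then_decay_step(epoch, loss, best_loss, state, best_state, lr,
                           revert_threshold=60,
                           revert_decay_epoch=(800, 2000, 4000, 6000),
                           revert_decay_gamma=(0.4, 0.5, 0.5, 0.5)):
    """Hypothetical sketch of one scheduling step (not DeepH-pack's code).

    If the current loss has spiked far above the best loss seen so far,
    revert the model to the best checkpoint; at each milestone epoch,
    multiply the learning rate by the corresponding gamma.
    """
    # Revert when the loss blows up (the exact semantics of
    # revert_threshold here are an assumption for illustration).
    if loss > revert_threshold * best_loss:
        state = copy.deepcopy(best_state)
    # Decay the learning rate at the scheduled milestone epochs.
    if epoch in revert_decay_epoch:
        lr *= revert_decay_gamma[revert_decay_epoch.index(epoch)]
    return state, lr
```

Adding more milestone epochs to revert_decay_epoch (with matching entries in revert_decay_gamma) spreads the learning-rate decay over more steps, as in the example below.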

If you are starting a new training process, you can leave these parameters at their default values. However, if you find that the learning rate decreases too quickly and leads to a large final loss, you can increase both the revert_decay_epoch and epochs parameters. Alternatively, you can increase the number of steps at which the learning rate decreases, for example by setting:

[train]
epochs = 7000
revert_then_decay = True
revert_threshold = 60
revert_decay_epoch = [800, 2000, 4000, 6000]
revert_decay_gamma = [0.4, 0.5, 0.5, 0.5]

@JTaozhang (Author)

OK, thanks for your explanation. I will test them when I encounter this situation.

@mzjb mzjb closed this as completed Jun 6, 2023