
scaling down number of epoch to 1 #494

Open
clemsgrs opened this issue Jan 14, 2025 · 1 comment

Comments

@clemsgrs

Hi, I am using the DINOv2 codebase to train a ViT on a custom dataset of ~100M images.
I would like to do only one pass over this dataset, as it is much more redundant than natural images (hence there is no clear benefit to seeing the same images multiple times).
In the configuration file, there are a few epoch-defined parameters:

  • warmup_epochs
  • warmup_teacher_temp_epochs
  • freeze_last_layer_epochs

When epochs = 100 and warmup_epochs = 10, we warm up for 10% of the training iterations.
When epochs = 1, my idea is to linearly scale down the other epoch-defined parameters as well, effectively treating them as fractions of total training rather than fixed integers.

To give a concrete example, I would use warmup_epochs = 0.1, which would still warm up for 10% of the training iterations.
The question is: should these 3 epoch-defined parameters be linearly scaled down or not?
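To make the proposal concrete, here is a small sketch of the fractional scaling I have in mind (the baseline values other than epochs and warmup_epochs are illustrative, and interpreting each parameter as a fraction of total training is my assumption, not documented codebase behavior):

```python
# Sketch: scale epoch-defined hyperparameters down for single-pass training.
# Baseline: 100 epochs, warming up for 10 of them, i.e. 10% of training.
baseline = {
    "epochs": 100,
    "warmup_epochs": 10,
    "warmup_teacher_temp_epochs": 30,   # illustrative value
    "freeze_last_layer_epochs": 1,      # illustrative value
}

# Scale every epoch-defined parameter by the same factor, so each one
# keeps covering the same *fraction* of total training iterations.
scale = 1 / baseline["epochs"]
scaled = {name: value * scale for name, value in baseline.items()}

print(scaled)
# warmup_epochs becomes 0.1 -> still 10% of a 1-epoch run
```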

Thanks for the help!

@Multihuntr

The default meaning of an "epoch" in this codebase is a fixed block of 1250 iterations (OFFICIAL_EPOCH_LENGTH), independent of dataset size. See here and here

So, you just need to set your epochs and warmup_epochs appropriately for your batch size. For example, assuming a batch size of 1024, you'd use something like:

optim:
    epochs: 100000
    warmup_epochs: 10000
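Since an "epoch" here is a fixed 1250-iteration block, you can also derive the values for exactly one pass over the data from the dataset size and batch size. A sketch of that arithmetic (the one_pass_epochs helper is mine, and the 10% warmup fraction just mirrors the original 10/100 ratio):

```python
import math

OFFICIAL_EPOCH_LENGTH = 1250  # iterations per "epoch" in the DINOv2 configs

def one_pass_epochs(num_images, batch_size, warmup_fraction=0.1):
    """Return (epochs, warmup_epochs) covering one pass over the dataset."""
    iterations = math.ceil(num_images / batch_size)          # optimizer steps
    epochs = math.ceil(iterations / OFFICIAL_EPOCH_LENGTH)   # 1250-iter blocks
    warmup = max(1, round(epochs * warmup_fraction))         # keep ~10% warmup
    return epochs, warmup

# ~100M images at batch size 1024 -> ~97,657 iterations
print(one_pass_epochs(100_000_000, 1024))  # -> (79, 8)
```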

(I'm not affiliated with this codebase at all, I've just been looking at it recently)
