Configs for Hyena WikiText-103 experiments #28
Comments
Can you share the config? WikiText-103 is quite sensitive to a few hyperparameters. Flash attention will not affect the result for Hyena.
Thanks for your response. I've attached the config file.
You should set dropouts to 0.2 as a first step. After you get to sub-19 ppl you will be in tuning range.
Thank you. Shall I also set the order to 3 in the Hyena layer?
Could you please put the configs you used in configs/experiment/wt103? That would be super helpful!
Did you reproduce the 19 ppl result using dropout=0.2? I still get 22.
I set the dropout to 0.2 and the order to 3 and get about 20, but still cannot reach the reported result.
You can look at this config for an independent reproduction that gets to sub-19. Let me know if after this you still have issues with the loss being too high, and I'll rerun the experiments in the new codebase.
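For readers who cannot open the linked reproduction, here is a minimal sketch of where the two settings discussed above (dropout 0.2, order 3) might sit in a Hydra-style experiment config under configs/experiment/wt103. The key names and remaining values are illustrative assumptions, not copied from the repository or the linked config:

```yaml
# Minimal sketch of a WikiText-103 Hyena experiment config.
# All key names and values here are illustrative assumptions for a
# Hydra-style layout; they are not copied from the repository.
model:
  layer:
    _name_: hyena
    order: 3            # Hyena recurrence order discussed above
  embed_dropout: 0.2    # dropouts raised to 0.2 as suggested
  resid_dropout: 0.2
dataset:
  _name_: wt103         # WikiText-103 language modeling
trainer:
  max_epochs: 100       # placeholder; set to your compute budget
```

Per the discussion above, raising the dropouts to 0.2 and using order 3 seems to be what brings perplexity into tuning range; the remaining gap to sub-19 likely comes down to the other hyperparameters in the linked reproduction config.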
Question: Thanks for the helpful reference. However, I checked that repo and the released [log from S5](https://wandb.ai/jimmysmith1919/S5_ICL/reports/Hyena-red-and-Hyena-S5-blue-on-WikiText-103--Vmlldzo0MTkwODEx?accessToken=pk0zw5w75uo1s4zkn3kh7koum902t4q2yzbm28xk0olzzgxuskoq0g1iyauixlob), which shows Hyena with test perplexity 19.094. It would be very helpful if you could share the detailed configuration of Hyena on WikiText-103.
Your work is excellent! I am trying to follow it and running into some problems. Could you share the config for the WikiText-103 dataset with Hyena? I ran experiments with the 125-slim model, but the test perplexity is higher than the reported result (about 21 with Hyena). I am also wondering whether removing flash-attention will influence the result.