Big accuracy drop after simplify #12
Comments
@AndRayt Hi, thanks for your interest. Could you provide a code sample to reproduce the issue?
@AndreaBrg https://colab.research.google.com/drive/1uuUxNEuNv9yG46eVQdviSY3ud5ETtY7r?usp=sharing
Hi! Did you compare the accuracy of the pruned model before and after applying simplify?
Hi! Do you mean that this is a problem with the torch.prune.ln_structured method as one of the ways to zero out the weights of the model? As far as I know, structured pruning is performed in two stages: 1. zeroing the weights of the model, 2. trimming the previously zeroed weights.
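A minimal sketch of the two stages mentioned above, using PyTorch's built-in pruning utilities on a single Conv2d layer (the layer shape here is illustrative, not taken from the model in this issue):

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

conv = nn.Conv2d(16, 32, kernel_size=3, padding=1)

# Stage 1: zero out 80% of the output channels, ranked by L2 norm along dim=0.
# The weight tensor keeps its original shape; pruned channels are only masked to zero.
prune.ln_structured(conv, name="weight", amount=0.8, n=2, dim=0)
prune.remove(conv, "weight")  # bake the mask into the plain weight tensor

zeroed = (conv.weight.abs().sum(dim=(1, 2, 3)) == 0).sum().item()
print(f"{zeroed}/{conv.out_channels} channels zeroed, weight shape still {tuple(conv.weight.shape)}")

# Stage 2 ("trimming") is what Simplify is for: physically removing the zeroed
# channels so the layers actually shrink and inference becomes faster.
```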
Yes, the aim of Simplify is only to reduce inference time for a pruned model. Simplify should not alter accuracy whatsoever, so it is weird if that happens. Could you check whether the accuracy drop comes from torch.prune.ln_structured or from simplify? If the cause is ln_structured, you could maybe try a more advanced pruning scheme.
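One way to run this check is to compare the pruned model's raw outputs before and after simplification: if only already-zeroed channels are removed, the outputs should match up to floating-point tolerance. A rough sketch, assuming the `simplify(model, dummy_input)` entry point from the repository README (which modifies the model in place) and an untrained resnet18 as a stand-in for the actual checkpoint:

```python
import copy
import torch
import torch.nn.utils.prune as prune
from torchvision.models import resnet18
from simplify import simplify

model = resnet18(num_classes=100).eval()

# Same pruning scheme as in the issue: zero 80% of output channels in every conv layer.
for m in model.modules():
    if isinstance(m, torch.nn.Conv2d):
        prune.ln_structured(m, name="weight", amount=0.8, n=2, dim=0)
        prune.remove(m, "weight")

x = torch.randn(8, 3, 224, 224)
with torch.no_grad():
    out_pruned = model(x)  # weights zeroed, channels not yet removed

    trimmed = copy.deepcopy(model)
    simplify(trimmed, torch.zeros(1, 3, 224, 224))  # remove the zeroed channels
    out_trimmed = trimmed(x)

# True  -> the drop comes from ln_structured itself (zeroing 80% of the weights).
# False -> simplify changes the model output and the drop comes from there.
print(torch.allclose(out_pruned, out_trimmed, atol=1e-5))
```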
Hi! Yes, we have a significant accuracy drop after the ln_structured method, because a large fraction (0.8) of the weights is zeroed out, but we also have an accuracy drop after Simplify: removing the zeroed weights and keeping only part of the model's weights makes the accuracy drop to almost zero.
Sorry for the delay. It seems then that there might be a bug in the current Simplify implementation. Under no circumstances should the simplify procedure change the model output. We need to investigate this more deeply, thanks for the issue.
What version of torch, torchvision and simplify are you using?
Hi guys, congrats on the work, nice library.
I am trying to use Simplify with a ResNet on the CIFAR-100 dataset. I use the prune.ln_structured method from PyTorch with an amount of 0.8, then apply simplify and fine-tune the model. As a result, I get a twofold inference-time speed-up, but also a big accuracy drop: 75% accuracy before prune.ln_structured + simplify + fine-tune and 65% after. Is this the expected result? Have you checked accuracy before and after simplify in your cases?
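For reference, a rough reconstruction of that pipeline with top-1 accuracy measured after each stage, so the drop can be attributed either to ln_structured or to simplify. The evaluate helper, the plain ToTensor transform, and the untrained torchvision resnet18 (in place of the 75%-accuracy checkpoint from the linked notebook) are placeholders, and the `simplify(model, dummy_input)` call is assumed from the library's README:

```python
import torch
import torch.nn.utils.prune as prune
import torchvision
import torchvision.transforms as T
from torch.utils.data import DataLoader
from simplify import simplify

def evaluate(model, loader, device="cpu"):
    # Plain top-1 accuracy over a dataloader (placeholder helper).
    model.eval().to(device)
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total

testset = torchvision.datasets.CIFAR100(root="./data", train=False,
                                        download=True, transform=T.ToTensor())
loader = DataLoader(testset, batch_size=256)

model = torchvision.models.resnet18(num_classes=100)  # load the trained checkpoint here

acc_base = evaluate(model, loader)

# Stage 1: structured pruning, 80% of output channels in every conv layer.
for m in model.modules():
    if isinstance(m, torch.nn.Conv2d):
        prune.ln_structured(m, name="weight", amount=0.8, n=2, dim=0)
        prune.remove(m, "weight")
acc_pruned = evaluate(model, loader)

# Stage 2: simplify removes the zeroed channels; accuracy should not change at this step.
simplify(model, torch.zeros(1, 3, 32, 32))
acc_simplified = evaluate(model, loader)

print(f"baseline {acc_base:.3f} | after ln_structured {acc_pruned:.3f} | "
      f"after simplify {acc_simplified:.3f}")
# Fine-tuning would follow; a gap between acc_pruned and acc_simplified points at
# Simplify, while the gap between acc_base and acc_pruned is due to the pruning itself.
```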