Running the code fails with an error because it cannot import LlamaTokenizer:
File "/local/home/.../SparseGPT/datautils.py", line 6, in <module>
from transformers import AutoTokenizer, LlamaTokenizer
ImportError: cannot import name 'LlamaTokenizer' from 'transformers'
If you update to the latest version of transformers, you run into a different issue:
raise OSError(
OSError: Unable to load weights from pytorch checkpoint file for '/home/.../.cache/huggingface/hub/models--facebook--opt-125m/snapshots/27dcfa74d334bc871f3234de431e71c6eeba5dd6/pytorch_model.bin' at '/home/.../.cache/huggingface/hub/models--facebook--opt-125m/snapshots/27dcfa74d334bc871f3234de431e71c6eeba5dd6/pytorch_model.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
The best workaround is to use transformers v4.21.2 and remove the LlamaTokenizer import from line 6 of datautils.py.
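For reference, a minimal sketch of that change, assuming only OPT models are being pruned (so LlamaTokenizer is never actually used by the code path):

```python
# Pin the library first, e.g.:  pip install transformers==4.21.2
# Patched line 6 of datautils.py: LlamaTokenizer removed, since it does not
# exist in transformers v4.21.2 and is not needed for OPT models.
from transformers import AutoTokenizer

# Hypothetical sanity check that tokenizer loading still works for OPT:
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m", use_fast=False)
```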