Lightning-AI/lit-llama#280
The Alpaca version was trained with an extra token added to the embedding layer, which is why the conversion fails. In general, if a LLaMA model has a different configuration (weights, vocab size, etc.), converting it to our "7B" definition won't work, and we shouldn't expect it to.

You could try truncating the embedding layer by modifying the last part of the conversion script:

lit-llama/scripts/convert_hf_checkpoint.py
Line 89 in 5df20db

    if "model.layers" in name:

by adding a condition:

    if "embed_tokens" in name:
        sd[weight_map[name]] = param[:32000]

(pseudocode; a fuller sketch follows below)