Create a new folder named xxx_model in src/transformers/models/
Inside this folder, create a new Python file called modular_xxx.py with the following content:
import torch
import torch.nn as nn

try:
    import torch.nn.functional as F
except:
    pass

from ..llama.modeling_llama import (
    LlamaMLP,
)


class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)
Run the following command to execute the model converter:
python utils/modular_model_converter.py --files_to_parse src/transformers/models/xxx_model/modular_xxx.py
This will generate the modeling file at: src/transformers/models/xxx_model/modeling_xxx.py.
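To see the practical impact, a quick check like the sketch below can be used (not part of the original repro steps; it assumes the repository root as the working directory and the generated content shown under Expected behavior). Because the guarded import of torch.nn.functional is dropped, calling the generated Model raises a NameError.

import importlib.util

import torch

# Load the generated file directly by path, so it does not need to be
# registered anywhere inside transformers.
spec = importlib.util.spec_from_file_location(
    "modeling_xxx", "src/transformers/models/xxx_model/modeling_xxx.py"
)
modeling_xxx = importlib.util.module_from_spec(spec)
spec.loader.exec_module(modeling_xxx)

model = modeling_xxx.Model()
try:
    model(torch.randn(2, 4), torch.randn(3, 4))
    print("F is defined -- the guarded import survived")
except NameError as err:
    print("generated file is broken:", err)  # e.g. name 'F' is not defined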
Expected behavior
Expected vs. actual contents of src/transformers/models/xxx_model/modeling_xxx.py
The expected contents of src/transformers/models/xxx_model/modeling_xxx.py are:
# 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
# This file was automatically generated from src/transformers/models/xxx_model/modular_xxx.py.
# Do NOT edit this file manually as any edits will be overwritten by the generation of
# the file from the modular. If any change should be done, please apply the change to the
# modular_xxx.py file directly. One of our CI enforces this.
# 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
import torch.nn as nn

try:
    import torch.nn.functional as F
except:
    pass


class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)
However, the actual content generated in src/transformers/models/xxx_model/modeling_xxx.py is:
# 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
# This file was automatically generated from src/transformers/models/xxx_model/modular_xxx.py.
# Do NOT edit this file manually as any edits will be overwritten by the generation of
# the file from the modular. If any change should be done, please apply the change to the
# modular_xxx.py file directly. One of our CI enforces this.
# 🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨🚨
import torch.nn as nn


class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)
Issue
The try/except block guarding import torch.nn.functional as F is missing from the actual content, even though it exists in the original modular file.
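A possible workaround in the meantime (a sketch based only on the observation that the plain import torch.nn as nn above does survive the converter; not a documented guarantee) is to write the functional import unconditionally in modular_xxx.py:

# modular_xxx.py -- workaround sketch: no try/except guard around the import.
# Assumption: the converter keeps plain top-level imports that are used,
# as it does for "import torch.nn as nn" in the actual output above.
import torch
import torch.nn as nn
import torch.nn.functional as F

from ..llama.modeling_llama import (
    LlamaMLP,
)


class Model(nn.Module):
    def forward(self, x, w):
        return F.linear(x, w)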
System Info
transformers 4.48.0.dev0 (commit d8c1db2)
Who can help?
@ArthurZucker
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
How to reproduce?
git clone [email protected]:huggingface/transformers.git && cd transformers && git checkout d8c1db2f568d4bcc254bc046036acf0d6bba8373
Then follow the steps described at the top of this report: create the xxx_model folder, add modular_xxx.py, and run the converter.