
Exception: Could not find the transformer layer class to wrap in the model. #94

Open
Cloopen-ReLiNK opened this issue Mar 19, 2023 · 3 comments

Comments

@Cloopen-ReLiNK

Traceback (most recent call last):
  File "/root/train.py", line 231, in <module>
    train()
  File "/root/train.py", line 225, in train
    trainer.train()
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1628, in train
    return inner_training_loop(
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1715, in _inner_training_loop
    model = self._wrap_model(self.model_wrapped)
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1442, in _wrap_model
    raise Exception("Could not find the transformer layer class to wrap in the model.")
Exception: Could not find the transformer layer class to wrap in the model.

transformers was installed from huggingface/transformers#21955.
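For context on when this exception fires: Trainer's FSDP wrapping looks up the layer class named in its FSDP config among the model's submodule classes and raises this exception if no submodule's class has that exact name (so a casing mismatch like `LLaMADecoderLayer` vs. `LlamaDecoderLayer` is enough to trigger it). Below is a simplified, self-contained sketch of that lookup; `_Node` is a toy stand-in for `torch.nn.Module`, and the helper mirrors the idea of transformers' internal `get_module_class_from_name`, not its exact code.

```python
class _Node:
    """Toy stand-in for torch.nn.Module, exposing only children()."""
    def __init__(self, *children):
        self._children = list(children)

    def children(self):
        return self._children


def get_module_class_from_name(module, name):
    """Recursively search a module tree for a submodule class named `name`.

    Sketch of the lookup Trainer performs for the configured transformer
    layer class; Trainer raises the exception above when this returns None.
    """
    if module.__class__.__name__ == name:
        return module.__class__
    for child in module.children():
        found = get_module_class_from_name(child, name)
        if found is not None:
            return found
    return None


# Toy model tree using the mainline class name.
class LlamaDecoderLayer(_Node):
    pass


class LlamaModel(_Node):
    pass


model = LlamaModel(LlamaDecoderLayer())
print(get_module_class_from_name(model, "LlamaDecoderLayer"))   # found
print(get_module_class_from_name(model, "LLaMADecoderLayer"))   # None: wrong casing
```

The exact-name comparison is why the configured wrap class must match the installed model implementation's class name character for character.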

@Vivecccccc

same issue here

@Vivecccccc

I wonder whether you can execute these two lines in Python, as instructed by huggingface/transformers#21955:

tokenizer = transformers.LLaMATokenizer.from_pretrained("/output/path/tokenizer/")
model = transformers.LLaMAForCausalLM.from_pretrained("/output/path/llama-7b/")

I found that I couldn't, so I suspect the cause is the installation of that forked transformers...
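As a quick way to run that check without crashing on an ImportError, here is a small sketch that probes which LLaMA class spellings the installed transformers exposes. The fork from PR #21955 used the `LLaMA` casing, while mainline transformers later settled on `Llama`; `available_llama_classes` is a hypothetical helper for illustration, not part of either library.

```python
import importlib
import importlib.util


def available_llama_classes(module_name="transformers"):
    """Return which LLaMA class spellings the installed package exposes.

    Hypothetical diagnostic helper: returns an empty list if the package
    is not installed at all.
    """
    if importlib.util.find_spec(module_name) is None:
        return []
    mod = importlib.import_module(module_name)
    candidates = (
        "LLaMATokenizer", "LLaMAForCausalLM",   # casing used by the fork
        "LlamaTokenizer", "LlamaForCausalLM",   # casing used by mainline
    )
    return [name for name in candidates if hasattr(mod, name)]


print(available_llama_classes())
```

If the list is empty or only contains the other casing, the installed transformers does not match the class names the training script expects.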

@Vivecccccc

Found a possible solution here: #58 (comment)
