
Clash in requirements for finetuning Starcoder2 #12

Open
Exorust opened this issue Mar 10, 2024 · 3 comments

Comments


Exorust commented Mar 10, 2024

Facing the following error while trying to finetune Starcoder2 with the given script.

Description:

For transformers.AutoModelForCausalLM to recognize Starcoder2, transformers >= 4.39.0 is required.

But trl still pins transformers==4.38.2. Even if I build trl from source and use trl==0.7.12.dev0, I still get an error.

Here is the error when using transformers==4.38.2:

KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1127             try:
-> 1128                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1129             except KeyError:

KeyError: 'starcoder2'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1128                 config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1129             except KeyError:
-> 1130                 raise ValueError(
   1131                     f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1132                     "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `starcoder2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date. 
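As the report above says, the KeyError comes from transformers 4.38.2 having no "starcoder2" entry in its config registry, while 4.39.0 does. As a minimal, purely illustrative sketch of the version boundary at play (the naive comparison below is a hypothetical helper, not transformers code, and it ignores pre-release suffixes such as ".dev0"):

```python
def meets_minimum(installed: str, required: str) -> bool:
    """Naive version comparison: looks at the first three numeric
    components only, ignoring suffixes like '.dev0' or 'rc1'."""
    as_tuple = lambda v: tuple(
        int(part) for part in v.split(".")[:3] if part.isdigit()
    )
    return as_tuple(installed) >= as_tuple(required)

# transformers version pinned by trl at the time of this issue
print(meets_minimum("4.38.2", "4.39.0"))  # False: no starcoder2 support
# first release reported to recognize the starcoder2 model type
print(meets_minimum("4.39.0", "4.39.0"))  # True
```

In a real environment you would compare `transformers.__version__` (or use `packaging.version.parse`) rather than this naive helper.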

Here is the error when using transformers==4.39.0:

ImportError                               Traceback (most recent call last)
<ipython-input-2-3ef713ffd06d> in <cell line: 1>()
----> 1 from trl import SFTTrainer
      2 print("trl version:", trl.__version__)

/usr/local/lib/python3.10/dist-packages/trl/__init__.py in <module>
      3 __version__ = "0.7.12.dev0"
      4 
----> 5 from .core import set_seed
      6 from .environment import TextEnvironment, TextHistory
      7 from .extras import BestOfNSampler

/usr/local/lib/python3.10/dist-packages/trl/core.py in <module>
     23 import torch.nn.functional as F
     24 from torch.nn.utils.rnn import pad_sequence
---> 25 from transformers import top_k_top_p_filtering
     26 
     27 from .import_utils import is_npu_available, is_xpu_available

ImportError: cannot import name 'top_k_top_p_filtering' from 'transformers' (/usr/local/lib/python3.10/dist-packages/transformers/__init__.py)
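The second failure happens because trl 0.7.12.dev0 imports top_k_top_p_filtering from transformers, which newer transformers releases no longer export. For readers unfamiliar with what that helper did, here is a pure-Python sketch of the same idea; the real function operates on torch logit tensors, so this list-based version is only illustrative:

```python
import math

NEG_INF = float("-inf")

def top_k_top_p_filter(logits, top_k=0, top_p=1.0):
    """List-based sketch of top-k / nucleus (top-p) filtering.

    Keeps the top_k highest logits, then the smallest set of remaining
    logits whose softmax probabilities sum to at least top_p; every
    other position is masked to -inf so sampling ignores it."""
    logits = list(logits)
    if top_k > 0:
        # ties at the threshold are kept, as in typical implementations
        threshold = sorted(logits, reverse=True)[top_k - 1]
        logits = [x if x >= threshold else NEG_INF for x in logits]
    if top_p < 1.0:
        live = [i for i, x in enumerate(logits) if x != NEG_INF]
        live.sort(key=lambda i: logits[i], reverse=True)
        total = sum(math.exp(logits[i]) for i in live)
        kept, cumulative = set(), 0.0
        for i in live:
            kept.add(i)
            cumulative += math.exp(logits[i]) / total
            if cumulative >= top_p:
                break
        logits = [x if i in kept else NEG_INF for i, x in enumerate(logits)]
    return logits

print(top_k_top_p_filter([1.0, 3.0, 2.0, 0.5], top_k=2))
# -> [-inf, 3.0, 2.0, -inf]
```

This is only to explain what broke; the actual fix, as the later comments note, is to install a trl version that no longer imports the removed symbol.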

Exorust closed this as completed Mar 10, 2024

Exorust commented Mar 10, 2024

The issue is still open; it is also reported upstream at huggingface/trl#1409.

Exorust reopened this Mar 10, 2024

Exorust commented Apr 4, 2024

Any update? @loubnabnl, do you have any suggestions on why the default finetuning script doesn't work out of the box?

Exorust closed this as completed Apr 4, 2024
Exorust reopened this Apr 4, 2024
@mrmattwright-mt

If you change requirements.txt to reference the latest versions of transformers and trl, this will work fine:

git+https://github.com/huggingface/transformers.git
accelerate==0.27.1
datasets>=2.16.1
bitsandbytes==0.41.3
peft==0.8.2
git+https://github.com/huggingface/trl.git
wandb==0.16.3
huggingface_hub==0.20.3

That should do it. I'm using Rye to install dependencies, but that change worked for me. The Rye dependencies in pyproject.toml look like this:

dependencies = [
    "transformers @ git+https://github.com/huggingface/transformers.git",
    "accelerate==0.27.1",
    "datasets>=2.16.1",
    "bitsandbytes==0.41.3",
    "peft==0.8.2",
    "trl @ git+https://github.com/huggingface/trl.git",
    "wandb==0.16.3",
    "huggingface_hub==0.20.3",
]
