Fix local installation after Rust 1.79 and transformers 4.41.2 #2071

Closed

Conversation

rYoussefAli

I was trying to install text-generation-inference locally without Docker and ran into several problems along the way. I collected those problems and fixed them here so that other people do not run into the same tedious issues.

Fixes # (issue)

  1. The rust-toolchain.toml file was pinning the Rust version to 1.78.0, which made the installation fail because the inline const feature in Rust requires 1.79.0.
  2. Upgraded the transformers version.
  3. Fixed the error `"str" has no attribute "logits"` in rw.py when the model starts (see the sketch after this list).
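For context, here is a minimal illustrative sketch of the kind of defensive handling that avoids that class of error, assuming the failure comes from the model's forward() return value not exposing a `logits` attribute (e.g. a tuple or a plain dict). This is not the exact patch in this PR, and `extract_logits` is a hypothetical helper name; it normalizes the common return shapes and fails loudly otherwise:

```python
def extract_logits(outputs):
    """Return the logits regardless of how forward() packaged them."""
    # transformers ModelOutput (the usual case when return_dict=True)
    if hasattr(outputs, "logits"):
        return outputs.logits
    # plain dict returned by some model implementations
    if isinstance(outputs, dict):
        return outputs["logits"]
    # tuple/list return (return_dict=False): logits are conventionally first
    if isinstance(outputs, (tuple, list)):
        return outputs[0]
    # anything else (e.g. a bare string) is a real bug; surface it clearly
    raise TypeError(f"Unexpected model output type: {type(outputs)!r}")
```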

Before submitting

  • [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • [x] Did you read the contributor guideline, Pull Request section?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@OlivierDehaene
@Narsil

Narsil (Collaborator) commented Jun 24, 2024

Thanks a lot for the PR!

This PR attempts to fix three problems, and while that is super nice, I think we should have at least three PRs, given that they touch totally unrelated things.

The rust-toolchain is already fixed on main (and you don't seem to change it here anyway).

The rw.py change should be in its own PR (and with a reproducer; no need for a test, since we no longer really maintain non-flash, no-custom-kernel models that closely). I don't think return_dict is necessary at all (if anything, to the best of my knowledge it introduces a bug on the transformers side).

As for the transformers version, you are only modifying the benchmark calls, which are an optional dependency, and the lock file only points to an old version because some other package seems to depend on it.
Regardless, we never touch the lock file by hand, only pyproject.toml, so the solution would need to go there.

Cheers, happy to help if anything's unclear.

Narsil (Collaborator) commented Jul 1, 2024

Closing for inactivity, feel free to reopen/comment!
Cheers.

Narsil closed this on Jul 1, 2024