[Bugfix] Fix get_quant_config when using modelscope #24421
Conversation
Signed-off-by: wangli <[email protected]>
Code Review
This pull request addresses an issue with loading quantization configurations for models from ModelScope. The fix ensures the model is downloaded before its path is checked, which is a sound approach. However, it introduces a new function, maybe_download_from_modelscope, that is nearly identical to an existing function in another file. I've added a high-severity comment to highlight this code duplication and suggest a refactoring to improve maintainability.
Signed-off-by: wangli <[email protected]>
Looks reasonable to me!
Signed-off-by: wangli <[email protected]>
Purpose
When using models from ModelScope, quantized models fail to load: get_quant_config only fetches the quantization config from Hugging Face, so a model hosted on ModelScope is never resolved.
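The failure mode can be illustrated with a small sketch (all names here are hypothetical, not the PR's actual code): the old flow checked whether the model path existed locally before any ModelScope download had run, so a ModelScope model id never resolved and the loader fell back to Hugging Face. The fix is to perform the download step first, then check the now-local path.

```python
import os


def get_quant_config_path(model: str, downloader=None):
    """Sketch of the fixed flow: resolve (download) the model first, then
    look for a local quantization config. `downloader` stands in for the
    ModelScope download step; returning None means the caller falls back
    to fetching the config from the Hugging Face hub."""
    if downloader is not None:
        model = downloader(model)  # fix: resolve to a local dir *before* the check
    if os.path.isdir(model):
        candidate = os.path.join(model, "quantize_config.json")
        if os.path.isfile(candidate):
            return candidate
    return None
```

With this ordering, a ModelScope model id is first materialized on disk, and the existing local-path logic then finds its quantization config.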
Test Plan
Test Result
Essential Elements of an Effective PR Description Checklist
Update supported_models.md and examples for a new model.