[quantization] feat: support AOBaseConfig classes in TorchAOConfig
#12275
Conversation
HuggingFaceDocBuilderDev commented on Sep 3, 2025
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
jerryzh168 commented on Sep 8, 2025
I think we can deprecate this one since this is less scalable than AOBaseConfig
Yes, will do so after this PR. Meanwhile, if you could review the PR, it'd be helpful.
Commit: init (Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>)
What does this PR do?
The `AOBaseConfig` classes introduced in `torchao` (since 0.9.0) are more flexible. Similar to Transformers, this PR adds support for allowing them in Diffusers.

@stevhliu, would it be possible for you to propagate the relevant changes to our TorchAO docs from Transformers? Can happen in a later PR.
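
Below is a minimal sketch of how this could look once the change lands, assuming `TorchAoConfig` accepts an `AOBaseConfig` instance in place of the usual string alias; the `Int8WeightOnlyConfig` class from `torchao.quantization` and the Flux checkpoint are used purely for illustration and are not prescribed by this PR:

```python
import torch
from diffusers import FluxTransformer2DModel, TorchAoConfig
from torchao.quantization import Int8WeightOnlyConfig  # an AOBaseConfig subclass

# Hypothetical usage: pass an AOBaseConfig instance directly instead of a
# string alias such as "int8wo".
quant_config = TorchAoConfig(Int8WeightOnlyConfig())

# Load a model with the quantization config applied, as in the existing
# Diffusers TorchAO workflow.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    quantization_config=quant_config,
    torch_dtype=torch.bfloat16,
)
```

Passing the config object rather than a string lets callers tune the quantization parameters exposed by `torchao` itself, which is the flexibility argument made in the review discussion above.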