Add pruna integration for loading model through diffusers.from_pretrained / pipeline #11700

Closed
@davidberenstein1957

Description

Is your feature request related to a problem? Please describe.

Pruna is an open-source AI model optimisation framework.
As discussed with @SunMarc, we are looking into this for transformers, and we could do something similar for diffusers as well: expose something like the *Pipeline.from_pretrained interface as an alternative to the PrunaModel interface.

Currently, the code looks as follows.

from pruna import PrunaModel

loaded_model = PrunaModel.from_hub(
    "PrunaAI/FLUX.1-dev-smashed"
)

We could go for something like:

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "PrunaAI/FLUX.1-dev-smashed", torch_dtype=torch.bfloat16
)
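
To make the request concrete, below is a rough sketch of how such a dispatch could behave, written as a user-side wrapper rather than as the eventual diffusers implementation. The marker file name "smash_config.json" and the helper load_flux_pipeline are illustrative assumptions, not existing pruna or diffusers APIs; the only calls taken from real APIs are huggingface_hub.file_exists, PrunaModel.from_hub, and FluxPipeline.from_pretrained.

import torch
from huggingface_hub import file_exists
from diffusers import FluxPipeline
from pruna import PrunaModel

# Assumed marker file identifying a Pruna-smashed repo (illustrative only).
PRUNA_CONFIG_NAME = "smash_config.json"


def load_flux_pipeline(repo_id: str, **kwargs):
    """Load repo_id via Pruna when it looks like a smashed model, else via diffusers."""
    if file_exists(repo_id, PRUNA_CONFIG_NAME):
        # Smashed checkpoint: delegate to Pruna's loader.
        return PrunaModel.from_hub(repo_id)
    # Regular checkpoint: keep the standard diffusers loading path.
    return FluxPipeline.from_pretrained(repo_id, **kwargs)


pipe = load_flux_pipeline("PrunaAI/FLUX.1-dev-smashed", torch_dtype=torch.bfloat16)

The proposed integration would fold this kind of check into *Pipeline.from_pretrained itself, so users would not need a wrapper at all.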

Describe the solution you'd like.
Support loading Pruna-smashed models directly through the standard diffusers *Pipeline.from_pretrained / pipeline interface, so a separate PrunaModel entry point is not required.

Describe alternatives you've considered.
We could also add the library as an explicit tab selector within the Hub, similar to llama-cpp/unsloth and other frameworks.
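
For context, the library tab shown on a Hub model page is driven by the library_name field in the model card metadata. A minimal sketch of tagging a smashed repo this way is shown below; it assumes pruna is (or becomes) a registered library name on the Hub and that the caller has write access to the repository.

from huggingface_hub import metadata_update

# Tag the repo so the Hub can surface a pruna "Use this model" snippet.
metadata_update(
    repo_id="PrunaAI/FLUX.1-dev-smashed",
    metadata={"library_name": "pruna"},
    overwrite=True,
)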

Additional context.
NA
