
[Feature Request] Settability of Multiple Attention Types in Attention Dispatcher #12210

Closed
@tolgacangoz

Description

Really great release, especially the native support for other attention types!

Is your feature request related to a problem? Please describe.
AFAIU, the Attention Dispatcher currently cannot assign one attention backend to attn1 and a different one to attn2 at the transformer level. In its original implementation, SkyReels-V2 uses "_native_cudnn" for its self-attentions and "flash_varlen" or "_flash_varlen_3" for its cross-attentions.
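
For reference, the transformer-level API currently takes a single backend string that applies to every attention module in the model. A minimal illustration (available backend names depend on the installed diffusers version):

    # Current behavior: one backend for all attention modules in the model
    pipeline.transformer.set_attention_backend("flash")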

Describe the solution you'd like.
Maybe something like:

pipeline.transformer.set_attention_backend({
    # "attn": "flash",  # optionally, a default for all attention modules
    "attn1": "_native_cudnn",  # self-attentions
    "attn2": "flash_varlen",   # cross-attentions
})

Describe alternatives you've considered.

for block in pipeline.transformer.blocks:
    block.attn1.set_attention_backend("_native_cudnn")  # self-attention
    block.attn2.set_attention_backend("flash_varlen")   # cross-attention
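
If the per-module call above works, a small helper could hide the loop. A minimal sketch, assuming blocks, attn1, and attn2 are the attribute names used by the SkyReels-V2 transformer, and assuming (not confirmed API) that each attention module exposes set_attention_backend:

    # Hypothetical helper: apply an {attribute name -> backend} map to each block.
    # Assumes each attention module has a set_attention_backend method.
    def set_backends_per_attention(transformer, backend_map):
        for block in transformer.blocks:
            for attn_name, backend in backend_map.items():
                attn = getattr(block, attn_name, None)
                if attn is not None:
                    attn.set_attention_backend(backend)

    set_backends_per_attention(
        pipeline.transformer,
        {"attn1": "_native_cudnn", "attn2": "flash_varlen"},
    )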
