@tolgacangoz
Description
Really great release, especially the native support for other attention backends!
Is your feature request related to a problem? Please describe.
AFAIU, the Attention Dispatcher currently does not let us set one attention backend for `attn1` and a different one for `attn2` at the transformer level. In its original implementation, SkyReels-V2 uses `_native_cudnn` for its self-attentions and `flash_varlen` or `_flash_varlen_3` for its cross-attentions.
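For reference, the transformer-level call currently takes a single backend name (as far as I can tell), so the same backend ends up applied to every attention module:

```python
# Current usage (to my understanding): one backend for all attention modules
pipeline.transformer.set_attention_backend("_native_cudnn")
```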
Describe the solution you'd like.
Maybe something like:
```python
pipeline.transformer.set_attention_backend({
    # "attn": "flash",
    "self-attentions or attn1": "_native_cudnn",
    "cross-attentions or attn2": "flash_varlen",
})
```
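A rough sketch of how such a mapping could be resolved, written as a standalone helper for now (the name `set_attention_backend_per_type` and the `blocks`/`attn1`/`attn2` attributes are just assumptions based on the workaround below):

```python
def set_attention_backend_per_type(transformer, backend):
    """Apply a single backend everywhere, or one backend per attention type."""
    if isinstance(backend, str):
        # Plain string keeps today's behavior: same backend for attn1 and attn2.
        backend = {"attn1": backend, "attn2": backend}
    for block in transformer.blocks:
        block.attn1.set_attention_backend(backend["attn1"])
        block.attn2.set_attention_backend(backend["attn2"])


set_attention_backend_per_type(
    pipeline.transformer,
    {"attn1": "_native_cudnn", "attn2": "flash_varlen"},
)
```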
Describe alternatives you've considered.
```python
for block in pipeline.transformer.blocks:
    block.attn1.set_attention_backend("_native_cudnn")
    block.attn2.set_attention_backend("flash_varlen")
```
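If the attribute names differ across models, a slightly more generic form of the same loop could walk `named_modules()` instead (assuming every attention module exposes `set_attention_backend`):

```python
for name, module in pipeline.transformer.named_modules():
    if name.endswith("attn1"):    # self-attention
        module.set_attention_backend("_native_cudnn")
    elif name.endswith("attn2"):  # cross-attention
        module.set_attention_backend("flash_varlen")
```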