flash attention version #244

@bigfish77

Description

Hello, I installed with `pip3 install vllm==0.11.0` and found that the corresponding torch version is 2.8. However, on the flash-attention release page, flash-attn==2.7.4.post1 lists torch 2.7 as the highest supported version. Could you tell me which specific flash-attn version should be used?
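A first step in debugging this kind of version mismatch is to confirm which versions are actually installed in the environment. Below is a minimal sketch that queries installed distribution versions via `importlib.metadata` (without importing the heavy packages themselves) and parses the numeric part of a version string for comparison. The distribution name `flash-attn` and the helper names here are illustrative assumptions, not part of vllm or flash-attention's own tooling.

```python
# Sketch: report installed versions of torch / flash-attn / vllm without
# importing the packages themselves (importing torch can be slow or fail
# on a broken install).
from importlib import metadata
from typing import Optional, Tuple

def installed_version(pkg: str) -> Optional[str]:
    """Return the installed version string for a distribution, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

def version_tuple(v: str) -> Tuple[int, ...]:
    """Parse the leading numeric components of a version string,
    e.g. '2.7.4.post1' -> (2, 7, 4), so versions can be compared as tuples."""
    parts = []
    for piece in v.split("."):
        if piece.isdigit():
            parts.append(int(piece))
        else:
            break  # stop at non-numeric suffixes like 'post1'
    return tuple(parts)

if __name__ == "__main__":
    # 'flash-attn' is assumed here as the PyPI distribution name.
    for pkg in ("torch", "flash-attn", "vllm"):
        print(pkg, installed_version(pkg))
```

With such a check one can compare, for example, `version_tuple("2.7.4.post1")` against the torch version that `vllm==0.11.0` pulled in, and see whether the installed flash-attn predates torch 2.8 support.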
