
[Bug]: _triton_mixed_sparse_attention in pit_sparse_flash_attention_v2.py doesn't work with triton 3.2.0 #196

@zhengqigao

Description


Describe the bug

For my research project I am using the _triton_mixed_sparse_attention function in pit_sparse_flash_attention_v2.py. It works fine with triton==3.0.0, but with triton==3.2.0 it fails with an LLVM type error. Below is the raised error:

python: /source/llvm-project/llvm/include/llvm/Support/Casting.h:566: decltype(auto) llvm::cast(const From &) [To = mlir::ShapedType, From = mlir::Type]: Assertion `isa<To>(Val) && "cast<Ty>() argument of incompatible type!"' failed
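Since the report only pins down that triton==3.0.0 works and triton==3.2.0 fails, a stopgap until the kernel is fixed is to guard against the broken release at import time. A minimal sketch (the helper name `is_supported_triton` is hypothetical, not part of the library):

```python
def is_supported_triton(version: str) -> bool:
    """Return True if this Triton version is known to run the kernel.

    Known-good: 3.0.0; known-broken: 3.2.0 (LLVM cast assertion).
    The 3.1.x status is untested here, so we conservatively allow < 3.2.
    """
    major, minor, *_ = (int(part) for part in version.split("."))
    return (major, minor) < (3, 2)


# Usage: check the installed version before launching the kernel, e.g.
#   import triton
#   if not is_supported_triton(triton.__version__):
#       raise RuntimeError("pin triton==3.0.0; 3.2.0 hits an LLVM cast assertion")
```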

Steps to reproduce

No response

Expected Behavior

No response

Logs

No response

Additional Information

No response

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
