Describe the bug
I am working on a research project that uses the _triton_mixed_sparse_attention function from pit_sparse_flash_attention_v2.py. It works fine with triton==3.0.0, but with triton==3.2.0 the same code fails with an LLVM type error. Below is the raised error:
python: /source/llvm-project/llvm/include/llvm/Support/Casting.h:566: decltype(auto) llvm::cast(const From &) [To = mlir::ShapedType, From = mlir::Type]: Assertion `isa<To>(Val) && "cast<Ty>() argument of incompatible type!"' failed
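For reference, here is a minimal sketch of how I call the function. This is not my actual project code: the import path, argument order, and the way the sparse block/column index tensors are built are assumptions on my part and may not match the real signature in pit_sparse_flash_attention_v2.py exactly. The assertion appears to come from the Triton compiler itself (LLVM/MLIR), presumably while JIT-compiling the kernel.

```python
# Minimal repro sketch (assumptions, not verbatim project code): the import path,
# argument order, tensor shapes, and dtypes below are my guesses at the
# _triton_mixed_sparse_attention interface and may differ from the actual
# definition in pit_sparse_flash_attention_v2.py.
import torch
from minference.ops.pit_sparse_flash_attention_v2 import _triton_mixed_sparse_attention

B, H, N, D = 1, 32, 256, 128   # batch, heads, context length, head dim (illustrative)
block_m = 64                   # assumed query block size
num_blocks = N // block_m

q = torch.randn(B, H, N, D, dtype=torch.float16, device="cuda")
k = torch.randn(B, H, N, D, dtype=torch.float16, device="cuda")
v = torch.randn(B, H, N, D, dtype=torch.float16, device="cuda")
seqlens = torch.full((B,), N, dtype=torch.int32, device="cuda")

# Causal, dense-equivalent sparse index: query block i reads key blocks 0..i,
# with no extra single-column entries (purely illustrative values).
block_count = torch.arange(1, num_blocks + 1, dtype=torch.int32, device="cuda").expand(B, H, num_blocks).contiguous()
block_offset = (torch.arange(num_blocks, dtype=torch.int32, device="cuda") * block_m).expand(B, H, num_blocks, num_blocks).contiguous()
column_count = torch.zeros(B, H, num_blocks, dtype=torch.int32, device="cuda")
column_index = torch.zeros(B, H, num_blocks, 1, dtype=torch.int32, device="cuda")

out = _triton_mixed_sparse_attention(
    q, k, v, seqlens,
    block_count, block_offset, column_count, column_index,
    sm_scale=D ** -0.5,
)
print(out.shape)  # runs under triton==3.0.0; hits the LLVM cast assertion under triton==3.2.0
```

As a workaround, pinning triton back to 3.0.0 (pip install "triton==3.0.0") avoids the crash in my environment.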
Steps to reproduce
No response
Expected Behavior
No response
Logs
No response
Additional Information
No response