This folder contains example SparseTIR implementations of typical sparse operators in Deep Learning:
- SpMM: Sparse-Dense Matrix Multiplication
  - We introduce how to use composable formats/transformations to optimize SpMM in SparseTIR, and also demonstrate how to formulate TC-GNN in SparseTIR (see the reference sketch after this list).
- SDDMM: Sampled Dense-Dense Matrix Multiplication
  - We demonstrate how to use composable transformations to formulate PRedS in SparseTIR, and perform a parameter search for optimization (see the reference sketch after this list).
- Block Sparse
  - Sparse operators on block sparse formats.
- RGMS: Relational Gather-Matmul-Scatter
  - Notable examples of RGMS are Relational Graph Convolutional Networks (RGCN) and Sparse Convolution for point cloud processing.
  - We show how to fuse Gather, Matrix Multiplication, and Scatter into a single kernel and use SparseTIR's composable formats/transformations to optimize it (see the reference sketch after this list).
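
For orientation, here is a minimal NumPy/SciPy sketch of SpMM's semantics. It is not SparseTIR code and not the implementation in this folder; the function name `spmm_reference` is just an illustrative placeholder.

```python
# Reference semantics of SpMM: multiply a sparse matrix (CSR) by a dense
# matrix, as in GNN feature aggregation. Illustrative only, not SparseTIR code.
import numpy as np
import scipy.sparse as sp

def spmm_reference(A: sp.csr_matrix, B: np.ndarray) -> np.ndarray:
    """C[i, :] = sum over the nonzeros A[i, j] of A[i, j] * B[j, :]."""
    m, _ = A.shape
    n = B.shape[1]
    C = np.zeros((m, n), dtype=B.dtype)
    for i in range(m):
        # Walk the nonzeros of row i in CSR order.
        for ptr in range(A.indptr[i], A.indptr[i + 1]):
            j = A.indices[ptr]
            C[i] += A.data[ptr] * B[j]
    return C
```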
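
Similarly, SDDMM computes a dense product but samples it only at the nonzero positions of a sparse matrix. The sketch below shows these semantics in SciPy; `sddmm_reference` is an illustrative name, not part of the examples.

```python
# Reference semantics of SDDMM: out[i, j] = S[i, j] * <A[i, :], B[j, :]>,
# evaluated only at the nonzeros of S. Illustrative only, not SparseTIR code.
import numpy as np
import scipy.sparse as sp

def sddmm_reference(S: sp.csr_matrix, A: np.ndarray, B: np.ndarray) -> sp.csr_matrix:
    out_data = np.empty_like(S.data, dtype=A.dtype)
    for i in range(S.shape[0]):
        for ptr in range(S.indptr[i], S.indptr[i + 1]):
            j = S.indices[ptr]
            # Dot product of row i of A with row j of B, scaled by S[i, j].
            out_data[ptr] = S.data[ptr] * (A[i] @ B[j])
    return sp.csr_matrix((out_data, S.indices, S.indptr), shape=S.shape)
```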
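
Finally, RGMS gathers node features per relation, applies that relation's weight matrix, and scatter-adds the results into the destination nodes, as in RGCN. The sketch below is a hypothetical NumPy reference (the layout of `edges_per_rel` is an assumption), not the fused SparseTIR kernel.

```python
# Reference semantics of RGMS (Gather-Matmul-Scatter), as used by RGCN.
# Illustrative only, not the fused SparseTIR kernel in this folder.
import numpy as np

def rgms_reference(edges_per_rel, X, W):
    """edges_per_rel[r]: int array of shape (E_r, 2) holding (dst, src) pairs;
    X: (num_nodes, f_in) node features; W[r]: (f_in, f_out) weight per relation."""
    num_nodes = X.shape[0]
    f_out = W[0].shape[1]
    Y = np.zeros((num_nodes, f_out), dtype=X.dtype)
    for r, edges in enumerate(edges_per_rel):
        gathered = X[edges[:, 1]]               # gather source-node features
        transformed = gathered @ W[r]           # per-relation dense matmul
        np.add.at(Y, edges[:, 0], transformed)  # scatter-add into destinations
    return Y
```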
More examples are coming, including FusedMM+FlashAttention for sparse matrices.
We welcome contributions from the community: please create a pull request if you find a better schedule for any of the existing examples or have a SparseTIR implementation of a new sparse operator.