Description
ideas/discussion - "sparsity aware element feedback"
The use case would be learning rules for AI using sparse matrices, e.g. steps of backprop modifying a matrix of weights, or whatever else AI researchers may imagine (for backprop, 'vec_a' would be the previous layer's activations and 'vec_b' the error values). It also further bridges the gap between a "sparse matrix lib" and 'graph processing'.
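A minimal sketch of what I mean, using scipy.sparse (the function name `outer_feedback` and the `lr` parameter are illustrative, not an existing API): update only the *stored* elements of a sparse weight matrix, as in `W[i][j] += lr * a[i] * b[j]`.

```python
import numpy as np
import scipy.sparse as sp

def outer_feedback(W, a, b, lr=0.1):
    """In-place on a copy: data[k] += lr * a[row[k]] * b[col[k]] over stored entries."""
    C = W.tocoo(copy=True)
    C.data += lr * a[C.row] * b[C.col]   # only touches nnz elements
    return C.tocsr()

rng = np.random.default_rng(0)
W = sp.random(4, 3, density=0.5, random_state=rng, format="csr")
a = rng.standard_normal(4)   # previous-layer activations
b = rng.standard_normal(3)   # error values
W2 = outer_feedback(W, a, b)
assert W2.nnz == W.nnz       # sparsity pattern unchanged
```

Note this never materialises the dense outer product, so the cost is O(nnz) rather than O(m·n).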
- is there an existing interface in any existing matrix libraries that supports this functionality?
- are there other ways of expressing this (like "a hadamard product of a sparse matrix and (the tensor product of vec_a,vec_b)")?
- is this already possible?
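On the second question: the "Hadamard product with outer(vec_a, vec_b)" formulation is already expressible in scipy, since `S.multiply(X)` keeps only entries where `S` has stored values. The catch is that `np.outer` materialises a dense m×n array, so the cost is not sparsity-aware:

```python
import numpy as np
import scipy.sparse as sp

S = sp.csr_matrix(np.array([[1., 0.],
                            [0., 2.]]))
a = np.array([3., 4.])
b = np.array([5., 6.])

# Elementwise product of S with the (dense) outer product of a and b;
# the result keeps S's sparsity pattern.
F = S.multiply(np.outer(a, b))
# F.toarray() -> [[15., 0.], [0., 48.]]
```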
My preferred option would be an interface that takes a lambda function; people could then use it to apply whatever operations they liked (e.g. expressing a product operation). (Personally I also think it would be interesting to implement matrix multiply via a traversal taking generalised element combination & reduction functions as well...)
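The lambda-style interface could look something like this (a sketch over COO; `element_apply` is a hypothetical name, not an existing library function):

```python
import numpy as np
import scipy.sparse as sp

def element_apply(M, a, b, f):
    """Return a matrix with data[k] = f(data[k], a[row[k]], b[col[k]])."""
    C = M.tocoo(copy=True)
    C.data = np.array([f(v, a[i], b[j])
                       for v, i, j in zip(C.data, C.row, C.col)])
    return C.tocsr()

M = sp.csr_matrix(np.array([[1., 0.],
                            [2., 3.]]))
a = np.array([10., 20.])
b = np.array([1., 2.])

# The backprop-style update is then just one lambda:
out = element_apply(M, a, b, lambda m, ai, bj: m + ai * bj)
# e.g. entry (1, 1): 3 + 20*2 = 43
```

A Python-level lambda per element is slow, of course; in a compiled library the same interface could be a generic traversal taking a function object/closure.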
This is trivial for dense matrices, fairly trivial for COO with dense vectors, trickier for any compressed sparse format with sparse vectors, and where it would get extremely useful (and difficult) is in threaded implementations of these.
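For the compressed case, the feedback can be done directly on the CSR arrays (`indptr`/`indices`/`data`) without converting to COO, since the row index is implicit in `indptr`. A minimal single-threaded sketch (and a natural unit for parallelism: rows are independent, so a threaded version could split the outer loop):

```python
import numpy as np
import scipy.sparse as sp

def csr_feedback(W, a, b, lr=0.1):
    W = W.copy()
    for i in range(W.shape[0]):
        start, end = W.indptr[i], W.indptr[i + 1]
        # update the stored entries of row i using their column indices
        W.data[start:end] += lr * a[i] * b[W.indices[start:end]]
    return W

W = sp.csr_matrix(np.array([[0., 1.],
                            [2., 0.]]))
a = np.array([1., 1.])
b = np.array([10., 10.])
W2 = csr_feedback(W, a, b, lr=1.0)
# W2.toarray() -> [[0., 11.], [12., 0.]]
```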
One may also want to consider different permutations of what to do with empty elements (e.g. would we want to apply this only where 'a[i]', 'b[j]', and 'm[i][j]' are all occupied, or for any occupied 'm[i][j]' where either 'a[i]' or 'b[j]' is occupied?).
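To make those occupancy semantics concrete, here is a toy sketch with sparse vectors as dicts (index → value); the `mode` names are invented for illustration. "intersection" updates only where all three are stored; the other mode updates every stored `m[i][j]`, treating missing `a[i]`/`b[j]` as 0:

```python
def feedback(m, a, b, mode="intersection"):
    out = {}
    for (i, j), v in m.items():
        if mode == "intersection" and (i not in a or j not in b):
            out[(i, j)] = v                              # leave untouched
        else:
            out[(i, j)] = v + a.get(i, 0.0) * b.get(j, 0.0)
    return out

m = {(0, 0): 1.0, (1, 1): 2.0}
a = {0: 3.0}              # no entry for row 1
b = {0: 5.0, 1: 6.0}
res = feedback(m, a, b)   # (0,0): 1 + 3*5 = 16; (1,1) untouched
```

Note that with "treat missing as 0" semantics the two modes agree numerically for an additive update; they differ for updates where `f(m, 0, 0) != m` (e.g. decay terms), which is exactly why the choice matters.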
