An intelligent block matrix library for numpy, PyTorch, and beyond.
Crafted by Brandon Amos with significant
contributions by Eric Wong.
Why do we need an intelligent block matrix library?
Let's try to construct the KKT matrix from Mattingley and Boyd's
CVXGEN paper in numpy and PyTorch.
Without block, the appropriate sizes of the zero and identity
matrix blocks cannot be inferred automatically, and it is an
inconvenience to work out what size each of these matrices should be.
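To make that bookkeeping concrete, here is a sketch of building a CVXGEN-style KKT matrix by hand in plain numpy. The problem sizes n, m, p and the block layout are illustrative, not taken from the paper; the point is that every zero and identity block's shape must be spelled out explicitly:

```python
import numpy as np

npr = np.random.RandomState(0)
n, m, p = 3, 4, 2  # illustrative problem sizes

Q = npr.randn(n, n)
G = npr.randn(m, n)
A = npr.randn(p, n)
D = np.diag(npr.rand(m))

# Without block, the shape of every zero and identity block must be
# written out by hand -- easy to get wrong when the sizes change.
K = np.block([
    [Q,                np.zeros((n, m)), G.T,              A.T],
    [np.zeros((m, n)), D,                np.eye(m),        np.zeros((m, p))],
    [G,                np.eye(m),        np.zeros((m, m)), np.zeros((m, p))],
    [A,                np.zeros((p, m)), np.zeros((p, m)), np.zeros((p, p))],
])

assert K.shape == (n + 2 * m + p, n + 2 * m + p)
```

Changing any of n, m, or p means revisiting every placeholder shape above, which is exactly the tedium block removes.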
What does block do?
Block acts a lot like np.bmat and replaces:
- Any constant with an appropriately shaped block matrix
  filled with that constant.
- The string 'I' with an appropriately shaped identity matrix.
- The string '-I' with an appropriately shaped negated identity matrix.
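These substitution rules can be sketched in plain numpy. The bblock helper below is a hypothetical stand-in for block's size inference, not its actual implementation; it assumes every row and every column contains at least one concrete matrix to fix the sizes:

```python
import numpy as np

def bblock(rows):
    """Hypothetical mini-version of block's substitution rules:
    infer each row's height and column's width from the concrete
    matrices, then expand constants, 'I', and '-I' to match."""
    nrows, ncols = len(rows), len(rows[0])
    heights = [None] * nrows
    widths = [None] * ncols
    # Pass 1: infer sizes from the concrete ndarray blocks.
    for i, row in enumerate(rows):
        for j, b in enumerate(row):
            if isinstance(b, np.ndarray):
                heights[i], widths[j] = b.shape
    # Pass 2: expand the placeholders to the inferred shapes.
    out = []
    for i, row in enumerate(rows):
        expanded = []
        for j, b in enumerate(row):
            if isinstance(b, np.ndarray):
                expanded.append(b)
            elif isinstance(b, str):  # 'I' or '-I'
                assert heights[i] == widths[j], "identity blocks must be square"
                eye = np.eye(heights[i])
                expanded.append(-eye if b == '-I' else eye)
            else:  # a constant scalar
                expanded.append(np.full((heights[i], widths[j]), float(b)))
        out.append(expanded)
    return np.block(out)

A = np.random.randn(2, 3)
K = bblock([[A, 'I'],
            [0, A.T]])
assert K.shape == (5, 5)  # heights (2, 3) x widths (3, 2)
```

Here the identity block's size (2) and the zero block's shape (3, 3) are both inferred from A, which is the convenience the real library provides.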
Isn't constructing large block matrices with a lot of zeros inefficient?
Yes, block is meant to be a quick prototyping tool and
there's probably a more efficient way to solve your system
if it has a lot of zeros or identity elements.
How does block handle numpy and PyTorch with the same interface?
I wrote the matrix-sizing logic to be agnostic of the
matrix library being used; numpy and PyTorch are just backends.
More backends can easily be added for your favorite
Python matrix library.
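One way such a backend split could look is sketched below. The class and method names here are illustrative, not block's actual API; the idea is that the sizing logic only ever touches a handful of primitives, so supporting a new library means implementing those primitives:

```python
import numpy as np

class NumpyBackend:
    """Hypothetical backend interface: the shared sizing logic would
    call only these primitives, so a new matrix library plugs in by
    implementing the same small set of methods."""

    def is_matrix(self, x):
        return isinstance(x, np.ndarray)

    def shape(self, x):
        return x.shape

    def eye(self, n):
        return np.eye(n)

    def full(self, shape, value):
        return np.full(shape, float(value))

    def assemble(self, rows):
        return np.block(rows)

backend = NumpyBackend()
z = backend.full((2, 3), 0)  # a zero block whose shape was inferred elsewhere
assert backend.shape(z) == (2, 3)
```

A PyTorch backend could implement the same methods with torch.eye, torch.full, and torch.cat, leaving the size-inference code untouched.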