Implemented Cholesky decomposition to numpy and tensorflow backends #890
base: master
Conversation
…kends, fixing various pylint errors and adding test to each respective backend function
@alewis @mganahl Hello! I implemented the Cholesky decomposition in the NumPy and TensorFlow backends, following the idea of a pivot axis for the tensors, along with tests for each backend function in which I define a matrix that is guaranteed to be positive-definite and Hermitian.
I assumed we won't validate whether the matrices are Hermitian or positive-definite, since there could be some performance cost in doing so (at least for the positive-definite check); a Hermitian check would still be nice to have. A Pylint error has been occurring in the test runs, but it is somewhere in code I haven't changed. The file in question is:
def cholesky(
    tf: Any,
Rename tf to np (the user passes the numpy module here).
ya
@@ -52,6 +52,13 @@ def test_qr(self):
    q, r = decompositions.qr(np, random_matrix, 1, non_negative_diagonal)
    self.assertAllClose(q.dot(r), random_matrix)

  def test_cholesky(self):
    #Assured positive-definite hermitian matrixs
    random_matrix = [[1.0, 0], [0, 1.0]]
Using the identity matrix as the "random" matrix is not an ideal test case. You can generate a positive-definite matrix e.g. by
A = np.random.rand(D, D)
A = A @ A.T.conj()
btw. also test this for all supported dtypes
Do np.random.rand(D, D).astype(dtype) to get your favorite type. If it's complex you might need to initialize the real and imaginary parts separately. Put np.random.seed(10) (or any other fixed integer) before the rand so that the matrix is fixed.
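Putting those suggestions together, a minimal sketch of building a deterministic, Hermitian, positive-definite test matrix for every dtype (the helper name random_positive_definite is mine, not part of the code base):

```python
import numpy as np

def random_positive_definite(D, dtype, seed=10):
    # hypothetical helper combining the review suggestions above
    np.random.seed(seed)  # fixed seed so the test matrix is deterministic
    A = np.random.rand(D, D).astype(dtype)
    if np.issubdtype(dtype, np.complexfloating):
        # complex dtypes: initialize real and imaginary parts separately
        A = A + 1j * np.random.rand(D, D).astype(dtype)
    # A @ A^H is Hermitian and positive semi-definite; adding the identity
    # makes it strictly positive definite so Cholesky cannot fail
    return A @ A.T.conj() + np.eye(D, dtype=dtype)

for dtype in (np.float32, np.float64, np.complex64, np.complex128):
    L = np.linalg.cholesky(random_positive_definite(4, dtype))
```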
There is no dtype defined in any of the tests, should I define one myself?
                      [tf.reduce_prod(left_dims),
                       tf.reduce_prod(right_dims)])
  L = tf.linalg.cholesky(tensor)
  if non_negative_diagonal:
This is unnecessary. The cholesky decomposition should already return a lower-triangular matrix with a real, positive diagonal.
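For reference, a quick NumPy check of that claim (tf.linalg.cholesky behaves the same way); the matrix here is just an arbitrary positive-definite example:

```python
import numpy as np

np.random.seed(0)
A = np.random.rand(5, 5)
M = A @ A.T + 5 * np.eye(5)  # Hermitian and positive definite by construction

L = np.linalg.cholesky(M)
print(np.allclose(L, np.tril(L)))  # already lower triangular: True
print(np.all(np.diag(L) > 0))      # diagonal already real and positive: True
```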
    tf: Any,
    tensor: Tensor,
    pivot_axis: int,
    non_negative_diagonal: bool
remove non_negative argument
(it only makes sense for QR)
    torch: Any,
    tensor: Tensor,
    pivot_axis: int,
    non_negative_diagonal: bool = False
remove non_negative_diagonal (see above)
@@ -42,6 +42,13 @@ def test_expected_shapes_rq():
  assert r.shape == (2, 3, 6)
  assert q.shape == (6, 4, 5)

def test_cholesky():
  #Assured positive-definite hermitian matrix
  random_matrix = np.array([[1.0, 0], [0, 1.0]])
use better test matrix (see above)
  #Assured positive-definite hermitian matrix
  random_matrix = np.array([[1.0, 0], [0, 1.0]])
  random_matrix = torch.from_numpy(random_matrix)
  for non_negative_diagonal in [True, False]:
see above
    tf: Any,
    tensor: Tensor,
    pivot_axis: int,
    non_negative_diagonal: bool
see above
                      [tf.reduce_prod(left_dims),
                       tf.reduce_prod(right_dims)])
  L = tf.linalg.cholesky(tensor)
  if non_negative_diagonal:
see above
@@ -54,6 +54,13 @@ def test_qr(self):
    q, r = decompositions.qr(tf, random_matrix, 1, non_negative_diagonal)
    self.assertAllClose(tf.tensordot(q, r, ([1], [0])), random_matrix)

  def test_cholesky(self):
    #Assured positive-definite hermitian matrix
    random_matrix = [[1.0, 0], [0, 1.0]]
modify test, see above
Thanks for pointing this out. Likely this is due to a new tensorflow release. I'll fix it.
Codecov Report
@@ Coverage Diff @@
## master #890 +/- ##
=======================================
Coverage 98.00% 98.00%
=======================================
Files 129 129
Lines 22625 22665 +40
=======================================
+ Hits 22173 22213 +40
Misses 452 452
Continue to review full report at Codecov.
    L = phases[:, None] * L
  center_dim = tf.shape(L)[1]
  L = tf.reshape(L, tf.concat([left_dims, [center_dim]], axis=-1))
  return L
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
add a newline here (at the end of the file)
…i/TensorNetwork into CholeskyDecomposition
  #Assured positive-definite hermitian matrixs
  random_matrix = np.random.rand(10, 10)
  random_matrix = random_matrix @ random_matrix.T.conj()
  L = decompositions.cholesky(tf, random_matrix, 1)
you are passing the tensorflow module here, but it should be numpy
  left_dims = np.shape(tensor)[:pivot_axis]
  right_dims = np.shape(tensor)[pivot_axis:]
  tensor = np.reshape(tensor,
                      [np.reduce_prod(left_dims),
np is the numpy module. numpy does not have reduce_prod, that's a tensorflow routine. This only passes the test because in decompositions_test.py you are erroneously passing the tensorflow module instead of the numpy module to numpy.decompositions.cholesky. Pls fix this.
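A sketch of the corrected NumPy-backend function, keeping the signature from the diff (np is the numpy module passed in by the caller; np.prod replaces the TensorFlow-only reduce_prod):

```python
import numpy
from typing import Any

def cholesky(np: Any, tensor, pivot_axis: int):
    # np is the numpy module here, mirroring the other backend functions
    left_dims = np.shape(tensor)[:pivot_axis]
    right_dims = np.shape(tensor)[pivot_axis:]
    # np.prod, not reduce_prod, collapses the pivoted dimensions
    matrix = np.reshape(tensor, [np.prod(left_dims), np.prod(right_dims)])
    L = np.linalg.cholesky(matrix)
    # restore the original left dimensions on the factor
    return np.reshape(L, list(left_dims) + [L.shape[1]])

L = cholesky(numpy, numpy.eye(4), 1)  # pass the numpy module, not tensorflow
```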
  right_dims = list(tensor.shape)[pivot_axis:]

  tensor = torch.reshape(tensor, (np.prod(left_dims), np.prod(right_dims)))
  L = np.linalg.cholesky(tensor)
You have to call the torch version of cholesky here, not numpy.
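A sketch of the PyTorch-backend fix, keeping the reshape logic from the diff above but factoring with torch (torch.linalg.cholesky in current releases) so the result stays a torch tensor:

```python
import numpy as np
import torch
from typing import Any

def cholesky(torch_module: Any, tensor, pivot_axis: int):
    # torch_module is the torch module, mirroring the other backends
    left_dims = list(tensor.shape)[:pivot_axis]
    right_dims = list(tensor.shape)[pivot_axis:]
    matrix = torch_module.reshape(
        tensor, (int(np.prod(left_dims)), int(np.prod(right_dims))))
    # the torch routine, not np.linalg.cholesky, keeps the result a Tensor
    L = torch_module.linalg.cholesky(matrix)
    return torch_module.reshape(L, left_dims + [int(L.shape[1])])

L = cholesky(torch, torch.eye(3, dtype=torch.float64), 1)
```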
Hi @LuiGiovanni, once the comments are fixed we can merge this
Added the decomposition function following the idea of having a pivot on the tensor (in reference to issue #852): the tensor is reshaped into a matrix and passed to the Cholesky routine from NumPy's linear algebra module.
I will start working on adding the function to the missing backend, which is symmetric, since Jax uses NumPy's decomposition file. I await any feedback you may have, and thank you for the help!