Hi,

I'm trying to get my hands on Gaussian processes and JAX; I'm quite new to both fields. Specifically, I'm learning as I go by adding complexity to my models iteratively.
I'm not far into that process: right now I'm trying to understand how to implement a simple multi-task GP with a lookup table giving the covariance between tasks (as in Bonilla et al.). However, I found this particularly hard to work out from the current documentation.
On the other hand, I can see that much more complex topics are already covered in the GPJax guides (sparse GPs, deep kernel learning, ...), so I can only assume I'm missing something obvious and simple.
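For concreteness, here is a minimal plain-JAX sketch of the kind of kernel I mean, K((x, i), (x', j)) = B[i, j] · k_x(x, x'), where B is the task-covariance lookup table. The function names and signatures below are my own, just for illustration; they are not GPJax API.

```python
import jax.numpy as jnp

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on the input locations only.
    sq_dists = jnp.sum((x1[:, None, :] - x2[None, :, :]) ** 2, axis=-1)
    return variance * jnp.exp(-0.5 * sq_dists / lengthscale**2)

def multitask_kernel(X1, t1, X2, t2, B, **rbf_args):
    """Bonilla-style kernel: K[(x, i), (x', j)] = B[i, j] * k_x(x, x').

    X1, X2: (n1, d) and (n2, d) input locations,
    t1, t2: (n1,) and (n2,) integer task indices,
    B: (T, T) positive semi-definite task-covariance "lookup table".
    """
    Kx = rbf(X1, X2, **rbf_args)        # covariance over input locations
    Kt = B[t1[:, None], t2[None, :]]    # covariance over tasks, via lookup
    return Kx * Kt                      # elementwise (Hadamard) product

# Toy check: 3 observations spread over 2 tasks.
X = jnp.array([[0.0], [0.5], [1.0]])
t = jnp.array([0, 1, 0])
B = jnp.array([[1.0, 0.8],
               [0.8, 1.0]])
K = multitask_kernel(X, t, X, t, B)     # (3, 3) joint covariance matrix
```

What I can't figure out is the idiomatic way to express this in GPJax (e.g. as a custom kernel), given the points below.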
Things I'm struggling with:
- the concrete shape of the dataset: GPJax datasets seem to only accept 2D matrices, so how should I express the "task" dimension? (see the sketch after this list)
- the way GPJax handles NaN values in observation points (also touched on in the sketch below)
- the current state of development of GPJax: I appreciate the library's mathematical approach, but I find it hard to tell which parts of the code are already robust and which are still missing.
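On the first two points, my current guess (and it is only a guess, assuming I've understood the `gpx.Dataset` constructor correctly) is to append the task index as an extra column of X so the dataset stays 2D, and to drop NaN observations before building the dataset, roughly like this:

```python
import jax.numpy as jnp
import gpjax as gpx

# Hypothetical raw data: y has one column per task, with NaNs where a task
# was not observed at a given input location.
x = jnp.linspace(0.0, 1.0, 10)[:, None]                            # (10, 1) inputs
y_all = jnp.full((10, 2), jnp.nan).at[:, 0].set(jnp.sin(x[:, 0]))  # task 0 observed only

# Flatten to "one row per (input, task) observation": the task index becomes
# an extra column of X, so both X and y remain 2D.
n, n_tasks = y_all.shape
X_flat = jnp.concatenate(
    [jnp.tile(x, (n_tasks, 1)),                       # repeat inputs for each task
     jnp.repeat(jnp.arange(n_tasks), n)[:, None]],    # task-index column
    axis=1,
)
y_flat = y_all.T.reshape(-1, 1)                       # matching order: task 0 rows, then task 1

# Drop rows whose observation is NaN rather than passing NaNs to the GP.
mask = ~jnp.isnan(y_flat[:, 0])
D = gpx.Dataset(X=X_flat[mask], y=y_flat[mask])
```

Is this the intended pattern, or does GPJax provide something for the task dimension / missing observations that I'm overlooking?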
If anyone has good pointers, for example to implementations of multi-task GPs in GPJax, or any simple advice, that would help me a lot.
Thanks for your consideration.