HSSM Seems to Return Priors as Posterior #606
I've been working on implementing an RLDDM in HSSM following some advice from @krishnbera, which has worked great so far! One issue I've run into, though, is speed, so I've been setting up a PyTensor version of the code to take advantage of the faster samplers and a GPU. To test this approach, I started with a simple RL-only case rather than the full RLDDM. However, whenever I sample this model, I seem to just get the priors back. I've included my code and some other model details below. Any help/advice would be greatly appreciated!
Hi @theonlydvr,
I don't have an immediate answer from looking at this; I'd need to actually look at the likelihood.
I have seen something like this before. In that case there was a mix-up of columns, so the connection between parameters and output didn't respect the expected relation at all: the parameters didn't really affect the likelihood, so the sampler just returned the prior.
However, I can't confirm that this is the case here just from looking at it. Will get back.
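One quick way to diagnose this failure mode (a hypothetical sketch, not HSSM's API): evaluate your log-likelihood directly at a few different parameter values on the same data. If the values barely change, the parameters aren't actually connected to the data, and sampling will just return the prior. Below is a minimal NumPy illustration using a generic 2-armed Q-learning/softmax likelihood; all names, parameter values, and the simulated task are illustrative, not taken from the model in this thread:

```python
import numpy as np

def rl_softmax_loglik(choices, rewards, alpha, beta):
    """Log-likelihood of 2-armed bandit choices under delta-rule Q-learning + softmax."""
    q = np.zeros(2)
    ll = 0.0
    for c, r in zip(choices, rewards):
        logits = beta * q
        # log softmax probability of the observed choice
        ll += logits[c] - np.logaddexp(logits[0], logits[1])
        # delta-rule update of the chosen option's value
        q[c] += alpha * (r - q[c])
    return ll

# Simulate data from known parameters so the likelihood has something to recover.
rng = np.random.default_rng(0)
true_alpha, true_beta = 0.3, 3.0
p_reward = np.array([0.8, 0.2])  # reward probability per arm
choices, rewards, q = [], [], np.zeros(2)
for _ in range(500):
    logits = true_beta * q
    p = np.exp(logits - np.logaddexp(logits[0], logits[1]))
    c = int(rng.choice(2, p=p))
    r = float(rng.random() < p_reward[c])
    q[c] += true_alpha * (r - q[c])
    choices.append(c)
    rewards.append(r)

# The check: the log-likelihood should clearly respond to the learning rate.
lls = {a: rl_softmax_loglik(choices, rewards, a, true_beta) for a in (0.05, 0.3, 0.9)}
print(lls)
```

If a sweep like this comes back essentially flat when run against your actual data pipeline, the column mix-up described above is a likely culprit: check that each parameter and each trial-level column (choice, reward, subject, etc.) is aligned with the rows it is supposed to predict.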