I am opening this issue based on this Discourse thread, in which I asked about estimating intractable likelihood functions with normalizing flows. As described in this paper, a normalizing flow can approximate a model's likelihood function by learning the relationship between the model's parameters and its output. A working example would be of broad interest, given that many scientific fields work with complex models for which the likelihood function is unknown.
The paper is associated with a Python package called `sbi`. Here is a simple working example based on a LogNormal distribution, and I have pasted my attempt at replicating it in Julia below. Note that the package generalizes to processes that emit samples from multiple distributions, but I have used a single distribution for simplicity.
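For concreteness, here is a minimal sketch of the Python side using `sbi`'s single-round SNLE. The prior bounds, simulation budget, and LogNormal parameterization are illustrative assumptions on my part, and the exact call for evaluating the trained flow varies a bit across `sbi` versions:

```python
import torch
from sbi.inference import SNLE
from sbi.utils import BoxUniform

# Illustrative prior over the LogNormal parameters (mu, sigma).
prior = BoxUniform(low=torch.tensor([-1.0, 0.1]),
                   high=torch.tensor([1.0, 2.0]))

def simulator(theta):
    # One LogNormal draw per parameter vector; output shape (batch, 1).
    mu, sigma = theta[:, 0], theta[:, 1]
    return torch.distributions.LogNormal(mu, sigma).sample().unsqueeze(-1)

# Simulate a training set of (parameters, data) pairs.
theta = prior.sample((10_000,))
x = simulator(theta)

# Fit a conditional normalizing flow q(x | theta) as a likelihood surrogate.
inference = SNLE(prior=prior, density_estimator="nsf")
density_estimator = inference.append_simulations(theta, x).train()

# The trained flow gives an approximate log-likelihood log q(x | theta).
log_lik = density_estimator.log_prob(x[:5], context=theta[:5])
```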
The architectural details in the article were a little sparse; the relevant passage reads:
> For the neural spline flow architecture (Durkan et al., 2019), we transformed the reaction time data to the log-domain, used a standard normal base distribution, 2 spline transforms with 5 bins each and conditioning networks with 3 hidden layers and 10 hidden units each, and rectified linear unit activation functions. The neural network training was performed using the sbi package with the following settings: learning rate 0.0005; training batch size 100; 10% of training data as validation data, stop training after 20 epochs without validation loss improvement.
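If it helps, those settings seem to map onto `sbi`'s estimator builder and `train()` keywords roughly as follows (continuing the sketch above; note that the depth of the conditioning network does not appear to be a direct keyword argument in every `sbi` version, so the 3-hidden-layer detail may require a custom builder):

```python
from sbi.inference import SNLE
from sbi.utils import likelihood_nn

# Neural spline flow matching the quoted width, transform, and bin counts.
build_nsf = likelihood_nn(
    model="nsf",
    hidden_features=10,  # 10 hidden units per layer
    num_transforms=2,    # 2 spline transforms
    num_bins=5,          # 5 bins per spline
)

inference = SNLE(prior=prior, density_estimator=build_nsf)
density_estimator = inference.append_simulations(theta, x).train(
    learning_rate=5e-4,       # learning rate 0.0005
    training_batch_size=100,  # training batch size 100
    validation_fraction=0.1,  # 10% of training data held out
    stop_after_epochs=20,     # early-stopping patience of 20 epochs
)
```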
It appears that the architecture is heavily influenced by Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows (Papamakarios et al., 2019). What appears to be the core Python code can be found here.
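For context, the "sequential" part of SNL just alternates simulation, flow training, and MCMC over the current likelihood estimate. In `sbi` that loop looks roughly like this (a sketch; `x_observed` is an assumed observed data point, and `simulator`/`inference` are as above):

```python
# Multi-round SNLE: each round simulates from the current posterior
# estimate instead of the prior, concentrating simulations where they
# matter for the observed data.
num_rounds = 5
proposal = prior
for _ in range(num_rounds):
    theta = proposal.sample((1_000,))
    x = simulator(theta)
    density_estimator = inference.append_simulations(theta, x).train()
    # Posterior is proportional to flow-likelihood x prior, sampled with MCMC.
    posterior = inference.build_posterior(density_estimator)
    proposal = posterior.set_default_x(x_observed)
```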
For additional background, Hossein has started a related package, but it is experimental and has no documentation.
Thank you for looking into this. I don't know much about neural networks or Flux, but let me know if I can help in any way.
WIP Code