Update documentation to reflect choice of autodiff backend
mrazomej committed Dec 28, 2023
1 parent 756068e commit 3c9a274
Showing 2 changed files with 47 additions and 27 deletions.
40 changes: 27 additions & 13 deletions docs/src/examples.md
@@ -26,26 +26,43 @@ import CSV

# Import library to perform Bayesian inference
import Turing
import AdvancedVI

# Import AutoDiff backend
using ReverseDiff

# Import Memoization
using Memoization

# Import statistical libraries
import Random
import StatsBase
import Distributions

Random.seed!(42)
```

## Selecting the AutoDiff backend

Recall that we can use either `ForwardDiff` or `ReverseDiff` as the AutoDiff
backend. The choice usually depends on the number of parameters to be inferred:
`ReverseDiff` is usually faster for a large number of parameters, while
`ForwardDiff` is faster for a small number of parameters. To choose the
backend, we use the `:advi` keyword argument of the `BarBay.vi.advi` function.
For `ForwardDiff`, we use

```julia
:advi => Turing.ADVI(n_samples, n_steps)
```

as `ForwardDiff` is the default backend. For `ReverseDiff`, we use

- # Set AutoDiff backend
- Turing.setadbackend(:reversediff)
- # Allow system to generate cache to speed up computation
- Turing.setrdcache(true)
```julia
:advi => Turing.ADVI{AdvancedVI.ReverseDiffAD{false}}(n_samples, n_steps)
```

where the `false` indicates that we do not cache the tape used by
`ReverseDiff`. See the
[`AdvancedVI.jl`](https://github.com/TuringLang/AdvancedVI.jl/tree/master)
repository for more information.
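
Putting the two options side by side, here is a minimal, self-contained sketch
(the values of `n_samples` and `n_steps` are illustrative, not recommendations):

```julia
import Turing
import AdvancedVI

# Illustrative settings: one MC sample per gradient step, 10,000 iterations
n_samples, n_steps = 1, 10_000

# ForwardDiff: the default backend, typically faster for few parameters
advi_forward = Turing.ADVI(n_samples, n_steps)

# ReverseDiff without a cached tape, typically faster for many parameters
advi_reverse = Turing.ADVI{AdvancedVI.ReverseDiffAD{false}}(n_samples, n_steps)
```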

## Single dataset single environment variational inference

For the case where there is a single dataset produced with a series of
@@ -139,8 +156,7 @@ param = Dict(
:logλ_prior => logλ_prior,
),
:advi => Turing.ADVI(n_samples, n_steps),
- :opt => Turing.TruncatedADAGrad(),
- :fullrank => false
+ :opt => Turing.TruncatedADAGrad()
)

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% #
@@ -333,8 +349,7 @@ param = Dict(
),
:env_col => :env,
:advi => Turing.ADVI(n_samples, n_steps),
- :opt => Turing.TruncatedADAGrad(),
- :fullrank => false
+ :opt => Turing.TruncatedADAGrad()
)

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% #
@@ -544,8 +559,7 @@ param = Dict(
),
:genotype_col => :genotype,
:advi => Turing.ADVI(n_samples, n_steps),
- :opt => Turing.TruncatedADAGrad(),
- :fullrank => false
+ :opt => Turing.TruncatedADAGrad()
)

# %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% #
34 changes: 20 additions & 14 deletions docs/src/index.md
@@ -273,27 +273,33 @@ parameters we need to define are:
samples and steps.
- `opt`: Optimization algorithm for ADVI, as sketched below.
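
For concreteness, here is a hedged sketch of the optimizer entry. The
`DecayedADAGrad` alternative is an assumption about what this Turing version
re-exports from `AdvancedVI`, not something these docs prescribe:

```julia
import Turing

# Optimizer used throughout these docs
opt = Turing.TruncatedADAGrad()

# Possible alternative with a decaying step size (assumption: available here)
opt_alt = Turing.DecayedADAGrad()
```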

- To speed-up the computation, we will use
- [`ReverseDiff.jl`](https://github.com/JuliaDiff/ReverseDiff.jl) as the auto
- differentiation backend (see
+ To speed up the computation for a large number of parameters, we will use
+ [`ReverseDiff.jl`](https://github.com/JuliaDiff/ReverseDiff.jl) as the automatic
+ differentiation backend, also known as backpropagation (see
[`Turing.jl`](https://turing.ml/v0.22/docs/using-turing/autodiff) documentation
- for more information on this). Let's import the necessary packages and set the
- differentiation backend options.
+ for more information on this). This is generally a good practice if the number
+ of barcodes is large. However, for small datasets, we recommend using
+ [`ForwardDiff.jl`](https://github.com/JuliaDiff/ForwardDiff.jl) instead.
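
As a toy illustration of this rule of thumb, one could select the `:advi`
entry based on dataset size. The helper and its threshold are hypothetical,
not part of `BarBay`:

```julia
import Turing
import AdvancedVI

# Hypothetical helper: reverse-mode AD for many barcodes, forward-mode for few.
# The threshold of 100 is arbitrary and only for illustration.
function choose_advi(n_barcodes::Int, n_samples::Int, n_steps::Int)
    if n_barcodes > 100
        return Turing.ADVI{AdvancedVI.ReverseDiffAD{false}}(n_samples, n_steps)
    else
        return Turing.ADVI(n_samples, n_steps)
    end
end
```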

!!! note
    The AutoDiff backend for ADVI is set using the `AdvancedVI` module. This is
    done in the `:advi` option of the `param` dictionary. For `ForwardDiff.jl`,
    we can do `:advi => Turing.ADVI(n_samples, n_steps)`, as `ForwardDiff.jl` is
    the default backend. For `ReverseDiff.jl`, we need to do
    `:advi => Turing.ADVI{AdvancedVI.ReverseDiffAD{false}}(n_samples, n_steps)`,
    where the `false` indicates that we do not cache the tape used by
    `ReverseDiff`. See the
    [`AdvancedVI.jl`](https://github.com/TuringLang/AdvancedVI.jl/tree/master)
    repository for more information.
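
Conversely, to enable the cached tape (an assumption based on the flag's
meaning above: `true` turns the cache on; `Memoization` is loaded because the
caching machinery relies on it):

```julia
import Turing
import AdvancedVI
using ReverseDiff
using Memoization

# Illustrative ADVI settings
n_samples, n_steps = 1, 10_000

# ReverseDiff with tape caching (assumption: `true` enables the cache)
advi_cached = Turing.ADVI{AdvancedVI.ReverseDiffAD{true}}(n_samples, n_steps)
```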

```julia
# Import library to perform Bayesian inference
import Turing
# Import library to set AutoDiff backend for ADVI
import AdvancedVI

# Import AutoDiff backend
using ReverseDiff

# Import Memoization
using Memoization

- # Set AutoDiff backend
- Turing.setadbackend(:reversediff)
- # Allow system to generate cache to speed up computation
- Turing.setrdcache(true)
```

For this dataset, we use the [`BarBay.model.fitness_normal`](@ref)
@@ -318,7 +324,7 @@ param = Dict(
:s_bc_prior => [0.0, 1.0],
:logλ_prior => logλ_prior,
),
- :advi => Turing.ADVI(n_samples, n_steps),
+ :advi => Turing.ADVI{AdvancedVI.ReverseDiffAD{false}}(n_samples, n_steps),
:opt => Turing.TruncatedADAGrad(),
)
```
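
For orientation only, here is a sketch of how a dictionary like this is
typically consumed. The exact call signature, the `data` frame, and the
`outputname` argument are assumptions, so defer to the package examples:

```julia
import BarBay

# Hypothetical invocation: splat the parameter dictionary as keyword arguments
BarBay.vi.advi(; data=data, outputname="advi_output", param...)
```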

2 comments on commit 3c9a274

@mrazomej

@JuliaRegistrator register()

@JuliaRegistrator
Error while trying to register: "Tag with name v0.0.1 already exists and points to a different commit"
