Merge pull request #46 from DARPA-ASKEM/calibrate
Update Calibrate topics
mecrouch authored Jan 22, 2025
2 parents 67f7ba6 + df6e2ae commit c921608
Showing 16 changed files with 291 additions and 29 deletions.
Binary file added docs/img/simulation/calibrate/error.png
Binary file added docs/img/simulation/calibrate/loss.png
Binary file added docs/img/simulation/calibrate/mapping.png
Binary file added docs/img/simulation/calibrate/variables.png
4 changes: 4 additions & 0 deletions docs/simulation/calibrate-ensemble.md
@@ -6,6 +6,10 @@ title: Calibrate ensemble

Calibrate ensemble extends the calibration process by working across multiple models simultaneously, allowing you to explore how different configurations collectively align with historical data. By aggregating results from multiple models, Calibrate ensemble can provide a more comprehensive understanding of system behavior.

???+ tip

You can quickly create an ensemble calibration using the [Calibrate an ensemble model workflow template](../workflows/index.md#create-new-workflows-based-on-templates).

## Calibrate ensemble operator

In a workflow, the Calibrate ensemble operator takes two or more model configurations and a dataset as inputs. It outputs a calibrated dataset.
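
Conceptually, the ensemble forecast is a weighted combination of the individual model outputs, and calibration adjusts both the weights and the per-model parameters against the dataset. The sketch below illustrates only the weighting idea; the trajectories, weights, and function names are made up for illustration and are not Terarium's or PyCIEMSS's internals.

```python
import numpy as np

def ensemble_prediction(model_outputs, weights):
    """Blend per-model trajectories into a single ensemble trajectory.

    model_outputs: list of arrays, one trajectory per model configuration
    weights: one non-negative weight per model
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the weights sum to 1
    return sum(wi * np.asarray(y, dtype=float) for wi, y in zip(w, model_outputs))

# Toy "Infected" trajectories from two hypothetical model configurations.
model_a = [10, 25, 60, 40, 20]
model_b = [12, 30, 70, 50, 25]

# An ensemble calibration would fit these weights (and each model's own
# parameters) against the historical data; 0.7 / 0.3 is purely illustrative.
print(ensemble_prediction([model_a, model_b], weights=[0.7, 0.3]))
```
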
263 changes: 236 additions & 27 deletions docs/simulation/calibrate-model.md
@@ -6,13 +6,17 @@ title: "Calibrate a model"

Calibration lets you improve the performance of a model by updating the value of configuration parameters. You can calibrate a model with a reference dataset of observations and an optional intervention policy.

## Calibrate operator

In a workflow, the Calibrate operator takes a model configuration, a dataset, and optional interventions as inputs. It outputs a calibrated model configuration.

???+ tip

At least one parameter in the configuration must be defined as a [uniform distribution](../config-and-intervention/configure-model.md#edit-or-create-a-model-configuration), and the dataset must have a timestamp column.

Once you've completed the calibration, the thumbnail preview shows the results charts.

<figure markdown>![](../img/simulation/calibrate/calibrate-operator.png)<figcaption markdown>How it works: [PyCIEMSS](https://github.com/ciemss/pyciemss/blob/main/pyciemss/interfaces.py#L529) :octicons-link-external-24:{ alt="External link" title="External link" }</figcaption></figure>
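
If you want to see what the operator drives under the hood, the sketch below shows roughly how a calibration could be invoked through the PyCIEMSS `calibrate` interface linked above. The file paths, the `data_mapping` and `num_iterations` arguments, and the shape of the result are illustrative assumptions rather than a copy of Terarium's actual call.

```python
# Illustrative sketch only -- paths and keyword arguments are assumptions,
# not Terarium's actual call into PyCIEMSS.
from pyciemss.interfaces import calibrate

result = calibrate(
    "sir_config.json",            # hypothetical model configuration (AMR JSON)
    "observed_cases.csv",         # hypothetical dataset with a timestamp column
    data_mapping={"cases": "I"},  # dataset column -> model state, as in the Mapping section
    num_iterations=1000,          # inference iterations (see the run settings below)
)

# The result is expected to hold the inferred parameter values/distributions
# that Terarium turns into the calibrated model configuration.
print(result)
```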

<div class="grid cards" markdown>

@@ -32,33 +36,238 @@

</div>

???+ list "Calibrate a model"
??? list "Add a Calibrate operator to a workflow"

1. Add the model and reference dataset to a workflow graph.
2. Connect the Model operator output to a Configure model operator. See [Configure a model](../config-and-intervention/configure-model.md) for information on selecting a model configuration.

??? tip

At least one parameter in the configuration needs to be defined as a distribution, and there needs to be a timestamp column.
- Do one of the following actions:

3. Right-click anywhere on the workflow graph and select **Simulation** > **Calibrate**.
4. Connect the outputs of the Configure model operator and Dataset resource to the inputs on the Calibrate operator.
5. Click **Edit** on the Calibrate operator.
6. In the Mapping section, map the model variables to the columns in the dataset.

??? tip
- On an operator that outputs a model configuration, click <span class="sr-only" id="link-icon-label">Link</span> :octicons-plus-24:{ title="Link" aria-labelledby="link-icon-label" } > **Calibrate**.
- Right-click anywhere on the workflow graph, select **Simulation** > **Calibrate**, and then connect a model configuration and a dataset to the Calibrate input.

## Calibrate a model

The Calibrate operator lets you:

- [Map your dataset columns and model variables](#map-dataset-columns-and-model-variables).
- [Choose how to run the calibration](#configure-the-run-settings).

??? list "Open a Calibrate operator"

1. Make sure you've connected a model configuration and a dataset to the Calibrate operator.
2. Click **Open**.

### Map dataset columns and model variables

To begin, map the observed data (such as number of cases) to the corresponding model states (such as detected cases).

![](../img/simulation/calibrate/mapping.png)

You only need to map the relevant variables. For example, if the model includes susceptible and recovered states but the dataset only contains infection counts, map only the infected state. States that are typically not observed, such as the susceptible population, may not be mappable.
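
For example, a mapping for a simple SIR model calibrated against daily case counts might look like the hypothetical sketch below (the column and state names are illustrative):

```python
# Hypothetical mapping: keys are dataset columns, values are model variables.
# Unobserved states (for example, Susceptible and Recovered) are simply
# left out of the mapping.
mapping = {
    "timestamp": "Timestamp",  # required time column
    "cases": "I",              # observed case counts -> Infected state
}
```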

??? list "Automatically map the dataset and model configuration"

If you enriched the model and dataset with concepts, click **Auto map** to speed the alignment process.

1. Click **Auto map**.
2. Review and edit the mappings as needed.

??? list "Manually map between the data and model configurations"

1. Select the Timestamp column from the dataset.
2. For each variable of interest:

1. Click :octicons-plus-24:{ aria-hidden="true" } **Add mapping**.
2. Select the corresponding state from the model configuration.

### Configure the run settings

The Calibrate run settings allow you to fine-tune the time frame, solver behavior, and inference process. By adjusting these settings, you can balance performance and precision.

![](../img/simulation/calibrate/calibration-settings.png)

??? list "Configure the run settings"

The run presets help you quickly choose between the Fast setting, which runs quickly but is less accurate, and the Normal setting, which is slower but more precise.

1. Choose the **Start** and **End time**.
2. Select a **Preset**, Fast or Normal.

??? list "Advanced settings"

Using the following advanced settings, you can further optimize the computational efficiency and thoroughness of the calibration:

- **Number of samples**: Number of calibration attempts made to explore the parameter space and identify the best fit.
- ODE solver options determine the approach for solving the system's equations during calibration:
- **Solver method**: *dopri5* provides more accurate results with finer calculations, while *euler* performs simpler, faster calculations.
- **Solver step size**: Interval between calculation steps, influencing precision and computational cost.
- Inference options control how model parameters are estimated during calibration (a sketch of how these settings fit together follows this list):
- **Number of solver iterations**: Number of steps to take to converge on a solution.
- **Learning rate**: Step size for updating parameters during the optimization process.
- **Inference algorithm**: Stochastic Variational Inference (SVI), which estimates parameters probabilistically.
- **Loss function**: Evidence Lower Bound (ELBO), which guides parameter updates by balancing data fit and model complexity.
- **Optimize method**: ADAM, an algorithm for efficient parameter updates.
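
The following minimal sketch shows how these inference options fit together. It is written against Pyro, the probabilistic-programming library that PyCIEMSS builds on; the toy model and guide stand in for the ODE-based likelihood that PyCIEMSS constructs, so treat it as a conceptual illustration rather than Terarium's implementation.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import SVI, Trace_ELBO
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

def model(data):
    # Toy stand-in for the ODE-based likelihood: one uncertain rate parameter.
    # In PyCIEMSS the observations come from solving the ODE system with the
    # chosen solver method (e.g. dopri5 or euler) and step size.
    beta = pyro.sample("beta", dist.Uniform(0.0, 1.0))
    with pyro.plate("obs", len(data)):
        pyro.sample("y", dist.Normal(beta * 100.0, 5.0), obs=data)

guide = AutoNormal(model)                    # variational approximation (SVI)
svi = SVI(model, guide, Adam({"lr": 0.03}),  # "Learning rate"
          loss=Trace_ELBO())                 # "Loss function": ELBO

data = torch.tensor([42.0, 47.0, 51.0])      # stand-in for the mapped dataset
for step in range(1000):                     # "Number of solver iterations"
    loss = svi.step(data)                    # one ADAM update; the loss chart plots this value
```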

## Create the calibrated configuration

Once you've configured all the calibration settings, you can run the operator to generate a new calibrated configuration. The new configuration becomes a temporary output for the Calibrate operator; you can connect it to other operators in the same workflow. If you want to use it in other workflows, you can save it for reuse.

![](../img/config-and-intervention/validate/run.png)

??? list "Create a new calibrated configuration"

- Click :material-play-outline:{ aria-hidden="true" } **Run**.

??? list "Choose a different output for the Calibrate operator"

- Use the **Select an output** dropdown.

## Understand the results

When the calibration is complete, Terarium creates an AI-generated description of the results.

![](../img/simulation/calibrate/calibrate-description.png)

Results are also presented as a series of customizable charts that show:

<div class="grid cards" markdown>

- __Loss__

---

The loss chart shows the error between the model's output and the calibration data. A decreasing loss indicates successful calibration.

![](../img/simulation/calibrate/loss.png)

- __Parameter distributions__

---

The parameter distribution plots show the range of parameter values before (grey) and after (green) calibration. A table below the plot also shows the mean and variance.

![](../img/simulation/calibrate/parameter-distributions.png)

- __Interventions over time__

---

The interventions over time charts show any selected interventions before (grey) and after (green) calibration.


![](../img/simulation/calibrate/interventions-over-time.png)

- __Variables over time__

---

To aid visual validation, the variables over time charts compare the state variables and observables before and after calibration against the historical data.

- The grey line represents the model before calibration.
- The colored line represents the model after calibration.

![](../img/simulation/calibrate/variables.png)

- __Error__

---

The error plots show the mean absolute error (MAE) for each variable of interest; see the formula after these charts.

![](../img/simulation/calibrate/error.png)

- __Comparison charts__

---

The comparison charts let you plot two or more parameters, model states, or observables to visualize how they changed after calibration.

![](../img/simulation/calibrate/comparison-charts.png)

Additional options for comparison charts let you split the selected variables into separate small multiples charts. You can further customize the small multiples charts to show the same Y axis for all charts or incorporate plots of the variables before calibration.

![](../img/simulation/calibrate/comparison-small-multiples.png)

</div>
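
For reference, the mean absolute error shown in the error plots averages, over the n mapped timesteps, the absolute difference between each observed value y and the corresponding calibrated model output ŷ:

$$
\mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} \left| y_i - \hat{y}_i \right|
$$

Lower values indicate a closer fit between the calibrated model and the calibration dataset.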

??? list "Access the Output settings"

Settings for the various chart types are available in the Output settings panel.

- Click <span class="sr-only" id="expand-icon-label">Expand</span> :fontawesome-solid-angles-left:{ title="Expand" aria-labelledby="expand-icon-label" } to expand the Output settings.

??? list "Choose which variables to plot"

- Select the variables from the dropdown list.

??? list "Access additional chart settings"

Some chart sections let you select additional options for each chart or variable. To access these settings:

- Click <span class="sr-only" id="options-icon-label">Options</span> :octicons-gear-24:{ title="Options" aria-labelledby="options-icon-label" }.

#### Annotate charts

Adding annotations to charts helps highlight key insights and guide the interpretation of your data. You can create annotations manually or with AI assistance.

??? list "Add annotations that call out key values and timesteps"

To highlight notable findings, you can manually add annotations that label plotted values at key timesteps on loss, interventions over time, variables over time, and comparison charts.

1. Click anywhere on the chart to add a callout.
2. To add more callouts without clearing the first one, hold down ++shift++ and click a new area of the chart.

??? list "Prompt an AI assistant to add chart annotations"

You can prompt an AI assistant to automatically create annotations on the variables over time and comparison charts. Annotations are labelled or unlabelled lines that mark specific timestamps or peak values. Examples of AI-assisted annotations are listed below.

- Describe the annotations you want to add and press ++enter++.

```{ .text .wrap }
Draw a vertical line at day 100
```
```{ .text .wrap }
Draw a line at the peak S after calibration
```
```{ .text .wrap }
Draw a horizontal line at the peak of default configuration Susceptible after calibration. Label it as "important"
```
```{ .text .wrap }
Draw a vertical line at x is 10. Don't add the label
```
```{ .text .wrap }
Draw a line at x = 40 only after calibration
```

#### Display options

You can customize the appearance of your charts to enhance readability and organization of the results.

??? list "Change the chart scale"

By default, charts use a linear scale. You can switch to a log scale to view large ranges and exponential trends more clearly and to improve the visibility of small variations.

- Select or clear **Use log scale**.

??? list "Hide in node"

The variables you choose to plot appear in the results panel and as thumbnails on the Calibrate operator in the workflow. You can hide the thumbnail preview to minimize the space the Calibrate node takes up.

- Select **Hide in node**.

??? list "Change parameter colors"

You can change the color of any variable on the parameter distribution, interventions over time, and variables over time charts to make your charts easier to read.

- Click the color picker and choose a new color from the palette or use the eye dropper to select a color shown on your screen.

#### Save charts

You can save Calibrate charts for use outside of Terarium. Download charts as images that you can share or include in reports, or access structured JSON that you can edit with [Vega](https://vega.github.io/) :octicons-link-external-24:{ alt="External link" title="External link" }.

???+ list "View calibration results"
??? list "Save a chart for use outside Terarium"

- Click <span class="sr-only" id="menu-icon-label">Menu</span> :fontawesome-solid-ellipsis-vertical:{ title="Menu" aria-labelledby="menu-icon-label" } and then choose one of the following options:
- Save as SVG
- Save as PNG
- View source (Vega-Lite JSON)
- View compiled Vega (JSON)
- Open in [Vega Editor](https://vega.github.io/editor/#/) :octicons-link-external-24:{ alt="External link" title="External link" }
2 changes: 1 addition & 1 deletion docs/simulation/simulate-model.md
@@ -59,7 +59,7 @@ The Simulate run settings allow you to fine-tune the time frame and solver behavior

???+ note

If you included a [starting timestep in your model configuration](../config-and-intervention/configure-model.md#edit-or-create-a-model-configuration), the start and end dates also appear in your simulation.

??? list "Advanced settings"

51 changes: 50 additions & 1 deletion docs/workflows/index.md
@@ -265,6 +265,45 @@ The following workflow templates streamline the process of building common model
3. Open and edit the Configure model operators.
4. Open and run the Simulate operators.

??? list "Calibrate an ensemble model"

Use this template to create a more accurate model by combining multiple models in an ensemble. For example, you can determine how to:

- Leverage the strengths of each model to produce the most accurate predictions possible.

<h3>Fill out the Calibrate an ensemble model template</h3>

To use the Calibrate an ensemble model template, select the following inputs and outputs:

<div class="grid cards" markdown>

- :material-arrow-collapse-right:{ .lg .middle aria-hidden="true" } __Inputs__

---

- A historical dataset
- Two or more models, each with their own model configurations
- A mapping of the timestamp values that the dataset and models share
- Additional mappings for each variable of interest that the dataset and models share

- :material-arrow-expand-right:{ .lg .middle aria-hidden="true" } __Outputs__

---

- Simulation results for each selected model configuration
- Calibrations against the historical dataset for each of the selected model configurations
- A calibrated ensemble model based on the individual calibrations

</div>

<h3>Complete the Calibrate an ensemble model workflow</h3>

The new workflow first simulates and calibrates each model individually, then calibrates the ensemble. To see the results, you first need to:

1. Open and run each Simulate operator.
2. Open and run each Calibrate operator.
3. Open and run the Calibrate ensemble operator.

## Add resources and operators to a workflow

Workflows consist of resources (models, datasets, and documents) that you can feed into a series of operators that transform or simulate them.
@@ -341,4 +380,14 @@ To organize your workflow graph, you can move, rearrange, or remove any of the o

??? list "Remove a workflow operator"

* Click <span class="sr-only" id="menu-icon-label">Menu</span> :fontawesome-solid-ellipsis-vertical:{ title="Menu" aria-labelledBy="menu-icon-label" } > :fontawesome-regular-trash-can:{ aria-hidden="true" } **Remove**.

??? list "Zoom to fit workflow"

You can quickly zoom the canvas to fit your whole workflow to the current window.

???+ note

In some cases, parts of your workflow may be just off screen after the zoom.

- Click :fontawesome-solid-expand:{ aria-hidden="true" } **Reset zoom**.
