V0.13.0 (ThummeTo#133)
* fixed new train method layout in examples

* further adjustments

* allow for DiffEq default solver heuristic

* fix action

* revert change

* Patch example action (ThummeTo#128)

* changes for debugging

* test non escape for awk

* changed escapes for debugging

* cleanup

* doc fix for FMIFlux.train! (ThummeTo#127)

* changes for FMIBase

* adaptions for FMIBase

* adjustment for Julia 1.9+ extension system

* modified examples

* revert tutorial/workshop

* Added checks to Example action (ThummeTo#144)

* Update Example.yml

Added a check for the success of the Jupyter examples; the action now fails if example-building fails, which prevents auto-committing broken examples to the examples branch

* relaxed compats

Update Project.toml

---------

Co-authored-by: ThummeTo <[email protected]>

* minor adaptions

* updated codecov action v4

* testing tests

* modified state change sampling

* test

* switched to linux FMU

* make assert a warning

* fixed some tests

* fixed tests, examples, workshop, typos

* updated tutorials

---------

Co-authored-by: Simon Exner <[email protected]>
ThummeTo and 0815Creeper authored Sep 4, 2024
1 parent dbadbed commit f1db228
Showing 39 changed files with 1,250 additions and 1,289 deletions.
48 changes: 35 additions & 13 deletions .github/workflows/Example.yml
@@ -20,24 +20,24 @@ jobs:
fail-fast: false
matrix:
os: [windows-latest] # , ubuntu-latest]
file-name: [growing_horizon_ME, modelica_conference_2021, simple_hybrid_CS, simple_hybrid_ME, mdpi_2022, juliacon_2023]
file-name: [simple_hybrid_CS, simple_hybrid_ME, juliacon_2023]
julia-version: ['1.8']
julia-arch: [x64]
experimental: [false]

steps:
- name: "Check out repository"
uses: actions/checkout@v3

- name: "Set up Julia"
uses: julia-actions/setup-julia@v1
with:
version: ${{ matrix.julia-version }}
arch: ${{ matrix.julia-arch }}

- name: "Install dependencies"
run: julia --project=examples/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'

- name: "Install packages"
run: pip install jupyter nbconvert

@@ -48,7 +48,7 @@ jobs:
jupyter nbconvert --ExecutePreprocessor.kernel_name="julia-1.8" --to notebook --inplace --execute ${{ env.FILE }}
jupyter nbconvert --to script ${{ env.FILE }}
jupyter nbconvert --to markdown ${{ env.FILE }}
- name: "Fix GIFs"
run: |
echo "starting gif fixing"
@@ -57,7 +57,7 @@
awk '{if($0~/<img src="data:image\/gif;base64,[[:alpha:],[:digit:],\/,+,=]*" \/>/) {sub(/<img src="data:image\/gif;base64,[[:alpha:],[:digit:],\/,+,=]*" \/>/,"![gif](${{ matrix.file-name }}_files\/gif_"++i".gif)")}}1' examples/jupyter-src/${{ matrix.file-name }}.md > examples/jupyter-src/tmp_${{ matrix.file-name }}.md
mv -Force examples/jupyter-src/tmp_${{ matrix.file-name }}.md examples/jupyter-src/${{ matrix.file-name }}.md
echo "gifs should be fixed"
- name: Archive examples artifacts (success)
if: success() && matrix.os == 'windows-latest'
uses: actions/upload-artifact@v3
@@ -70,30 +70,52 @@
steps:
- name: "Check out repository"
uses: actions/checkout@v3

- name: "Set up Julia"
uses: julia-actions/setup-julia@v1
with:
version: '1.10'

- run: julia -e 'using Pkg; Pkg.add("PlutoSliderServer"); Pkg.add("FMIFlux")'
- run: julia -e 'using PlutoSliderServer; PlutoSliderServer.export_directory("examples/pluto-src")'

- name: Archive examples artifacts (success)
if: success()
uses: actions/upload-artifact@v3
with:
name: pluto-examples
path: examples/pluto-src/*

auto-commit:
filecheck:
needs: [jypiter, pluto]
runs-on: ubuntu-latest
steps:
- name: Download jupyter examples
uses: actions/download-artifact@v3
with:
name: jupyter-examples
path: examples/jupyter-src/

- name: Download pluto examples
uses: actions/download-artifact@v3
with:
name: pluto-examples
path: examples/pluto-src/

- name: Check if the example files generated are valid (if jupyter-examples failed, svgs are missing; jupyter command does not fail even if examples fail)
uses: andstor/file-existence-action@v3
with:
files: "examples/jupyter-src/*/*.svg"
fail: true

auto-commit:
needs: [jypiter, pluto, filecheck]
if: github.event_name != 'pull_request'
runs-on: ubuntu-latest
steps:
- name: Check out repository
uses: actions/checkout@v3

- name: Download jupyter examples
uses: actions/download-artifact@v3
with:
@@ -130,7 +152,7 @@ jobs:
git add ${{ env.EXAMPLES_PATH }}
git commit -m "${{ env.CI_COMMIT_MESSAGE }}"
git push origin examples
call-docu:
needs: auto-commit
if: github.event_name != 'pull_request'
12 changes: 1 addition & 11 deletions .github/workflows/TestLTS.yml
@@ -54,14 +54,4 @@ jobs:

# Run the tests
- name: "Run tests"
uses: julia-actions/julia-runtest@v1

# Preprocess Coverage
- name: "Preprocess Coverage"
uses: julia-actions/julia-processcoverage@v1

# Run codecov
- name: "Run CodeCov"
uses: codecov/codecov-action@v3
with:
file: lcov.info
uses: julia-actions/julia-runtest@v1
6 changes: 4 additions & 2 deletions .github/workflows/TestLatest.yml
@@ -62,6 +62,8 @@ jobs:

# Run codecov
- name: "Run CodeCov"
uses: codecov/codecov-action@v3
uses: codecov/codecov-action@v4
env:
CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
with:
file: lcov.info
file: lcov.info
29 changes: 16 additions & 13 deletions Project.toml
@@ -1,30 +1,33 @@
name = "FMIFlux"
uuid = "fabad875-0d53-4e47-9446-963b74cae21f"
version = "0.12.2"
version = "0.13.0"

[deps]
Colors = "5ae59095-9a9b-59fe-a467-6f913c188581"
DifferentiableEigen = "73a20539-4e65-4dcb-a56d-dc20f210a01b"
DifferentialEquations = "0c46a032-eb83-5123-abaf-570d42b7fbaa"
FMIImport = "9fcbc62e-52a0-44e9-a616-1359a0008194"
FMISensitivity = "3e748fe5-cd7f-4615-8419-3159287187d2"
Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
Optim = "429524aa-4258-5aef-a3af-852621145aeb"
OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
ProgressMeter = "92933f4c-e287-5a05-a399-4b506db050ca"
Requires = "ae029012-a4dd-5104-9daa-d747884805df"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
ThreadPools = "b189fb0b-2eb5-4ed4-bc0c-d34c51242431"

[weakdeps]
JLD2 = "033835bb-8acc-5ee8-8aae-3f567f8a3819"

[extensions]
JLD2Ext = ["JLD2"]

[compat]
Colors = "0.12.8"
Colors = "0.12"
DifferentiableEigen = "0.2.0"
DifferentialEquations = "7.7.0 - 7.12"
FMIImport = "0.16.4"
FMISensitivity = "0.1.4"
Flux = "0.13.0 - 0.14"
Optim = "1.7.0"
ProgressMeter = "1.7.0 - 1.9"
Requires = "1.3.0"
ThreadPools = "2.1.1"
FMIImport = "1.0.0"
FMISensitivity = "0.2.0"
Flux = "0.9 - 0.14"
Optim = "1.6"
OrdinaryDiffEq = "6.0"
Statistics = "1"
ThreadPools = "2.1"
julia = "1.6"
27 changes: 18 additions & 9 deletions README.md
@@ -38,15 +38,19 @@ You can evaluate FMUs inside of your loss function.
## What is currently supported in FMIFlux.jl?
- building and training ME-NeuralFMUs (NeuralODEs) with support for event-handling (*DiffEqCallbacks.jl*) and discontinuous sensitivity analysis (*SciMLSensitivity.jl*)
- building and training CS-NeuralFMUs
- building and training NeuralFMUs consisiting of multiple FMUs
- building and training NeuralFMUs consisting of multiple FMUs
- building and training FMUINNs (PINNs)
- different AD-frameworks: ForwardDiff.jl (CI-tested), ReverseDiff.jl (CI-tested, default setting), FiniteDiff.jl (not CI-tested) and Zygote.jl (not CI-tested)
- use `Flux.jl` optimisers as well as the ones from `Optim.jl`
- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers)
- use `Flux.jl` optimizers as well as the ones from `Optim.jl`
- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers, not all are tested, see following section)
- ...

## (Current) Limitations

- Not all implicit solvers work for challenging, hybrid models (stiff FMUs with events), currently tested are: `Rosenbrock23(autodiff=false)`.

- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.

- Sensitivity information over state change by event $\partial x^{+} / \partial x^{-}$ can't be accessed in FMI.
These sensitivities are simplified on basis of one of the following assumptions (defined by user):
(1) the state after event depends on nothing, so sensitivities are zero or
@@ -55,13 +59,11 @@ The second is often correct for e.g. mechanical contacts, but may lead to wrong
However even if the gradient might not be 100% correct in any case, gradients are often usable for optimization tasks.
This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

- Discontinuous systems with implicite solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
This might lead to issues, because FMUs are by design not simulatable backward in time.
On the other hand, many FMUs are capabale of doing so.
- Discontinuous systems with implicit solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
This might lead to issues, because FMUs are by design not capable of being simulated backwards in time.
On the other hand, many FMUs are capable of doing so.
This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.

- For now, only FMI version 2.0 is supported, but FMI 3.0 support is coming with the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

## What is under development in FMIFlux.jl?
@@ -82,12 +84,19 @@ To keep dependencies nice and clean, the original package [*FMI.jl*](https://git
- [*FMI.jl*](https://github.com/ThummeTo/FMI.jl): High level loading, manipulating, saving or building entire FMUs from scratch
- [*FMIImport.jl*](https://github.com/ThummeTo/FMIImport.jl): Importing FMUs into Julia
- [*FMIExport.jl*](https://github.com/ThummeTo/FMIExport.jl): Exporting stand-alone FMUs from Julia Code
- [*FMIBase.jl*](https://github.com/ThummeTo/FMIBase.jl): Common concepts for import and export of FMUs
- [*FMICore.jl*](https://github.com/ThummeTo/FMICore.jl): C-code wrapper for the FMI-standard
- [*FMISensitivity.jl*](https://github.com/ThummeTo/FMISensitivity.jl): Static and dynamic sensitivities over FMUs
- [*FMIBuild.jl*](https://github.com/ThummeTo/FMIBuild.jl): Compiler/Compilation dependencies for FMIExport.jl
- [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs (differentiation over FMUs)
- [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs
- [*FMIZoo.jl*](https://github.com/ThummeTo/FMIZoo.jl): A collection of testing and example FMUs

## Video-Workshops
### JuliaCon 2024 (Eindhoven University of Technology, Netherlands)
[![YouTube Video of Workshop](https://img.youtube.com/vi/sQ2MXSswrSo/0.jpg)](https://www.youtube.com/watch?v=sQ2MXSswrSo)
### JuliaCon 2023 (Massachusetts Institute of Technology, United States)
[![YouTube Video of Workshop](https://img.youtube.com/vi/X_u0KlZizD4/0.jpg)](https://www.youtube.com/watch?v=X_u0KlZizD4)

## How to cite?
Tobias Thummerer, Johannes Stoljar and Lars Mikelsons. 2022. **NeuralFMU: presenting a workflow for integrating hybrid NeuralODEs into real-world applications.** Electronics 11, 19, 3202. [DOI: 10.3390/electronics11193202](https://doi.org/10.3390/electronics11193202)

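
The README hunks above restrict implicit solvers to `autodiff=false` and name `Rosenbrock23(autodiff=false)` as the tested choice for stiff FMUs with events. The following is a minimal sketch of how such a solver would be handed to a ME-NeuralFMU; the FMU-loading call, the FMU-as-layer calling convention, and the `ME_NeuralFMU` signature are assumptions modeled on the FMIFlux.jl examples, not code taken from this commit.

```julia
# Minimal sketch, assuming the API shapes used in the FMIFlux.jl examples
# (loadFMU, the FMU-as-layer call, ME_NeuralFMU); not taken from this commit.
using FMI, FMIFlux, FMIZoo, Flux
using DifferentialEquations   # provides Rosenbrock23 and the other solvers

# Load an example ME-FMU from FMIZoo.jl (two states: position and velocity).
fmu = loadFMU("SpringPendulum1D", "Dymola", "2022x"; type = :ME)

# Wrap the FMU as the first layer of a small network (assumed calling convention).
net = Chain(x -> fmu(; x = x, dx_refs = :all),
            Dense(2, 16, tanh),
            Dense(16, 2))

# Implicit solver with AD through the solver disabled, as recommended above.
solver = Rosenbrock23(autodiff = false)

# Build the ME-NeuralFMU and simulate it over 0..5 s from an initial state.
neuralFMU = ME_NeuralFMU(fmu, net, (0.0, 5.0), solver; saveat = 0.0:0.01:5.0)
solution  = neuralFMU([0.5, 0.0])
```
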
9 changes: 5 additions & 4 deletions docs/src/examples/overview.md
@@ -10,15 +10,16 @@ The examples show how to combine FMUs with machine learning ("NeuralFMU") and il
## Examples
- [__Simple CS-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_CS/): Showing how to train a NeuralFMU in Co-Simulation-Mode.
- [__Simple ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_ME/): Showing how to train a NeuralFMU in Model-Exchange-Mode.
- [__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.

## Advanced examples: Demo applications
- [__JuliaCon 2023: Using NeuralODEs in real life applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/juliacon_2023/): An example for a NeuralODE in a real world engineering scenario.
- [__Modelica Conference 2021: NeuralFMUs__](https://thummeto.github.io/FMIFlux.jl/dev/examples/modelica_conference_2021/): Showing basics on how to train a NeuralFMU (Contribution for the *Modelica Conference 2021*).

## Workshops
[Pluto](https://plutojl.org/) based notebooks, that can easily be executed on your own Pluto-Setup.
- [__Scientific Machine Learning using Functional Mock-up Units__](../pluto-src/SciMLUsingFMUs/SciMLUsingFMUs.html): Workshop at JuliaCon 2024 (Eindhoven University, Netherlands)

## Archived
- [__MDPI 2022: Physics-enhanced NeuralODEs in real-world applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/mdpi_2022/): An example for a NeuralODE in a real world modeling scenario (Contribution in *MDPI Electronics 2022*).

## Workshops
[Pluto](https://plutojl.org/) based notebooks, that can easyly be executed on your own Pluto-Setup.
- [__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.
- [__HybridModelingUsingFMI__](../pluto-src/HybridModelingUsingFMI/HybridModelingUsingFMI.html): Workshop at MODPROD 2024 (Linköping University, Sweden)
3 changes: 2 additions & 1 deletion examples/jupyter-src/.gitignore
@@ -1 +1,2 @@
params/
params/
*.png
6 changes: 6 additions & 0 deletions examples/jupyter-src/growing_horizon_ME.ipynb
@@ -8,6 +8,12 @@
"# ME-NeuralFMUs using Growing Horizon\n",
"Tutorial by Johannes Stoljar, Tobias Thummerer\n",
"\n",
"----------\n",
"\n",
"📚📚📚 This tutorial is archieved (so keeping it runnable is low priority) 📚📚📚\n",
"\n",
"----------\n",
"\n",
"*Last edit: 08.11.2023*\n",
"\n",
"## LICENSE\n"