
Windows Create and Debug Model

esseff edited this page Oct 19, 2021 · 6 revisions

Where is OpenM++

Before you begin

Download and unzip openM++ Windows desktop binaries into any directory, for example: C:\openmpp_win_20210112\

Create New Model

  • create a new directory for your model under the models subfolder, e.g.: C:\openmpp_win_20210112\models\MyModel. It is recommended, though not required, that the folder name match your model name.
  • copy one of the test model VC++ project files into your model subfolder, e.g. from C:\openmpp_win_20210112\models\NewCaseBased\ompp\* into C:\openmpp_win_20210112\models\MyModel\ompp
  • copy your model source files (*.ompp, *.mpp and custom.h) into the C:\openmpp_win_20210112\models\MyModel\code\ subfolder
  • copy your data files (*.odat, *.dat) into the C:\openmpp_win_20210112\models\MyModel\parameters\Default\ subfolder
  • start Visual Studio and open the C:\openmpp_win_20210112\models\MyModel\ompp\Model.vcxproj project
  • save your new Model.sln solution
  • build your model
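The copy steps above can also be scripted. Below is a minimal cross-platform sketch in Python; the function name `create_model_skeleton` and the source-directory argument are ours for illustration, while the models/ompp/code/parameters layout mirrors the steps on this page:

```python
import shutil
from pathlib import Path

def create_model_skeleton(om_root, code_src, model_name, donor="NewCaseBased"):
    """Copy project, source and data files into a new model directory,
    following the steps described on this page."""
    om_root, code_src = Path(om_root), Path(code_src)
    model = om_root / "models" / model_name

    # copy a test model's VC++ project files into <model>/ompp
    shutil.copytree(om_root / "models" / donor / "ompp", model / "ompp")

    # copy model source files (*.ompp, *.mpp, custom.h) into <model>/code
    code_dst = model / "code"
    code_dst.mkdir(parents=True, exist_ok=True)
    for pattern in ("*.ompp", "*.mpp", "custom.h"):
        for f in code_src.glob(pattern):
            shutil.copy2(f, code_dst)

    # copy data files (*.odat, *.dat) into <model>/parameters/Default
    data_dst = model / "parameters" / "Default"
    data_dst.mkdir(parents=True, exist_ok=True)
    for pattern in ("*.odat", "*.dat"):
        for f in code_src.glob(pattern):
            shutil.copy2(f, data_dst)
    return model

# example (paths from this page):
# create_model_skeleton(r"C:\openmpp_win_20210112", r"C:\my-sources", "MyModel")
```

After the skeleton exists, the Visual Studio steps (open Model.vcxproj, save the solution, build) still have to be done by hand.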

You can set the name of your new model using the Visual Studio menu: Project -> Properties -> Configuration Properties -> OpenM++ -> Name -> Model Name: MyModel

Set model name in Visual Studio Project Properties

Create multiple input sets of parameters (multiple scenarios)

In the example above we created only one "Default" scenario for our model, from the *.dat files in the parameters/Default directory. It is also possible to create multiple input sets of parameters (multiple scenarios) when building the model:

  • go to menu: Project -> Properties -> Configuration Properties -> OpenM++
  • change: Names -> Scenario Names -> Default;CSV_Values
  • Rebuild the project

As a result, two input sets of parameters are created in the model.sqlite database:

  • scenario "Default" from *.dat, *.odat, *.csv and *.tsv files in ..\parameters\Default directory
  • scenario "CSV_Values" from *.csv and *.tsv files in ..\parameters\CSV_Values directory

Please note: an additional scenario directory can contain only CSV or TSV files, not .dat or .odat files.

To find out more about CSV and TSV parameter files please read: How to use CSV or TSV files for input parameters values

Visual Studio Multiple Scenarios Project Properties

Debug your Model

  • build your model as described above
  • open any model *.ompp or *.mpp file and put a breakpoint in it
  • start the debugger
  • to inspect model parameters, go to the Watch tab and choose "Add item to watch"

View RiskPaths model parameters in Visual Studio

Model run options

As described in Windows: Quick Start for Model Users, you can run the model with different options. For example, you can calculate 8 sub-values (a.k.a. sub-samples, members, replicas), use 4 threads and simulate 8000 cases:

MyModel.exe -OpenM.SubValues 8 -OpenM.Threads 4 -Parameter.SimulationCases 8000
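For a case-based model the simulated cases are divided among the sub-values, so the run above works out roughly as follows (a back-of-the-envelope sketch; the exact distribution and scheduling are up to openM++):

```python
sub_values = 8          # -OpenM.SubValues 8
threads = 4             # -OpenM.Threads 4
simulation_cases = 8000 # -Parameter.SimulationCases 8000

# cases are split across sub-values
cases_per_sub_value = simulation_cases // sub_values
print(cases_per_sub_value)   # 1000

# sub-values are in turn distributed across the worker threads
sub_values_per_thread = sub_values // threads
print(sub_values_per_thread) # 2
```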

You can supply run options as model command-line arguments or through a model ini-file:

[OpenM]
SubValues = 8
Threads = 4

[Parameter]
SimulationCases = 8000

MyModel.exe -ini MyModel.ini

There are two ways to specify a model ini-file using the Visual Studio menu:

  • Project -> Properties -> Configuration Properties -> OpenM++ -> Run Options
    • Model ini file = MyModel.ini
    • Run scenario after build = Yes
  • Project -> Properties -> Configuration Properties -> Debugging -> Command Arguments = -ini MyModel.ini

Use model ini-file as an openM++ property in Visual Studio

Use model ini-file as a command argument in Visual Studio

Debug Model with microdata files

If your model (BestModel in this example) uses microdata files, it is possible to start the microdata path with an environment variable:

  input_csv in_csv;
  in_csv.open("$OM_BestModel/microdata/OzProj22_5K.csv");
  ............

You may need to export that OM_BestModel variable in order to debug the model under Visual Studio. For example, if your model location is C:\my-models\BestModel, then add OM_BestModel=C:\my-models\BestModel to the model Debugging Environment:
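The substitution involved is analogous to ordinary environment-variable expansion. This Python illustration only shows the idea; openM++ performs its own expansion inside the model:

```python
import os

# the variable the model path refers to (value from this page's example)
os.environ["OM_BestModel"] = r"C:\my-models\BestModel"

# expand $OM_BestModel the same way a shell would
path = os.path.expandvars("$OM_BestModel/microdata/OzProj22_5K.csv")
print(path)   # C:\my-models\BestModel/microdata/OzProj22_5K.csv
```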

Set OM_Model environment variable in Visual Studio project

Debug model c++ code

By default the model is compiled so that only *.ompp and *.mpp source code can be debugged, not the model C++ code. Normally it is enough to debug only the *.ompp and *.mpp code, but in some exceptional circumstances you may also want to debug the model C++ code generated by the openM++ omc compiler.

C++ model files are located in the ompp/src directory. For example, if openM++ is installed in the C:\openmpp_win_20210112 directory, then the Chapter5 model *.cpp and *.h source files are in the C:\openmpp_win_20210112\models\Chapter5\ompp\src folder:

Model c++ source files on Windows

To debug the model C++ code, do the following:

  • go to menu: Project -> Properties -> Configuration Properties -> OpenM++ -> Disable generation of #line directives = Yes

Visual Studio: disable #line directives

  • Rebuild the model project via menu Build -> Rebuild Solution
  • put breakpoints at om_developer.cpp RunSimulation(), or at other entry points of your choice, e.g. om_definitions.cpp RunModel()
  • start the debugger

Debug model in Visual Studio

Use AddressSanitizer to catch memory violation bugs

Starting with version 16.9, Visual Studio includes the AddressSanitizer tool, which can catch most memory violation bugs. For example:

int x[10];
int main (int argc, char ** argv)
{
    x[20] = 20;  // error: global buffer overflow
    return 0;
}
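Under the hood, the supplied ompp-asan project enables the MSVC AddressSanitizer compiler option (/fsanitize=address). If you maintain your own project file, the equivalent setting is an MSBuild property; a sketch (property name per Microsoft's MSVC documentation, verify against your Visual Studio version):

```xml
<!-- sketch: enable AddressSanitizer in a PropertyGroup of your .vcxproj -->
<PropertyGroup>
  <EnableASAN>true</EnableASAN>
</PropertyGroup>
```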

If AddressSanitizer is missing from your existing Visual Studio installation, start the Visual Studio Installer, choose Modify and select "C++ AddressSanitizer":

Add AddressSanitizer to existing Visual Studio installation

To build your model with AddressSanitizer, do the following:

  • exit from Visual Studio
  • copy your existing model project to some backup location:
copy C:\openmpp_win_20210112\models\MyModel\ompp\Model.vcxproj* C:\my\safe\place\
  • copy the AddressSanitizer version of the model project. For example, if openM++ is installed in the C:\openmpp_win_20210112 directory and your model directory is MyModel, then do:
copy C:\openmpp_win_20210112\props\ompp-asan\Model.vcxproj C:\openmpp_win_20210112\models\MyModel\ompp\
copy C:\openmpp_win_20210112\props\ompp-asan\Model.vcxproj.filters C:\openmpp_win_20210112\models\MyModel\ompp\
  • start Visual Studio and open your model's openM++ solution: C:\openmpp_win_20210112\models\MyModel\MyModel-ompp.sln
  • Important: clean the existing model build. You can do it via menu Build -> Clean Solution
  • build your model

Now you can run your model from Visual Studio as usual, with or without the debugger.

To run the model executable with AddressSanitizer from the command line (outside of Visual Studio), use a VS 2019 Native Tools command prompt:

  • open a command prompt
  • set the 64-bit or 32-bit environment:
    • "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars64.bat"
    • "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvars32.bat"
  • run your model.exe:
cd \openmpp_win_20210112\models\MyModel\ompp\bin
MyModel64D.exe

Restore your original Model project from C:\my\safe\place\Model.vcxproj* after you are done with AddressSanitizer.
