Example: matbii

Benedict Wilkins edited this page Jun 25, 2024 · 5 revisions

Matbii

This example can be found here; see its README for additional details. Below is a short guide to getting started with matbii.

Install

This example can be installed as a pip package; it should install all required dependencies and work out of the box. To install from source, clone this repository and navigate to the example directory:

git clone https://github.com/dicelab-rhul/icua2.git
cd icua2/example
pip install ./matbii

If you are interested in using eyetracking, install with the optional extra:

pip install "./matbii[tobii]"

Note

Only Tobii Pro eyetrackers are currently supported.

Run & Configure

Running is as simple as:

python -m matbii --config ./experiment-1-config.json

The --config option specifies the path to a configuration file containing various setup options, summarised in the table below:

| Configuration Option | Type | Description |
|---|---|---|
| experiment_info | dict | experiment configuration (see below) |
| experiment_info.path | string | path to the directory containing task files |
| experiment_info.id | string | ID of this experiment (typically associated with the experiment path) |
| experiment_info.duration | integer | duration of the experiment (seconds); the program auto-closes upon reaching this time |
| experiment_info.meta | dict | any metadata associated with the experiment |
| participant_info | dict | participant information (see below) |
| participant_info.id | string | ID of the participant |
| participant_info.meta | dict | any metadata associated with the participant (e.g. age) |
| enable_tasks | list | tasks to enable, one or more of: "tracking", "system_monitoring", "resource_management" |
| window_x | integer | x position of the UI window on the screen |
| window_y | integer | y position of the UI window on the screen |
| window_width | integer | width of the UI window |
| window_height | integer | height of the UI window |
| window_title | string | title of the UI window |
| window_background_color | color | background color of the UI |
| window_fullscreen | boolean | fullscreen the UI? |
| window_resizable | boolean | allow resizing of the UI? |
| enable_video_recording | boolean | record the screen during the experiment? |
| eyetracking_enabled | boolean | enable eyetracking? (requires a physical eyetracker and the relevant dependencies) |
| eyetracking_calibration_check | boolean | run an eyetracking calibration check before the experiment? |
| eyetracking_throttle | integer | throttle high-sampling-rate eyetrackers to the given value (events/second) |
| eyetracking_moving_average_n | integer | window size for eyetracking moving-average smoothing |
| eyetracking_velocity_threshold | float | velocity threshold for fixation/saccade classification (NOTE: currently specified in normalized screen coordinates; a later version will use the gaze angle) |
| logging_level | string | logging level, one of: "info", "debug" |

See experiment-1-config.json for an example. Note that not all values need to be specified; default values can be found here.
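As an illustration only (this is not the actual experiment-1-config.json shipped with the example), a minimal config could be generated with a short script. The key names follow the table above; the values here are hypothetical placeholders:

```python
import json

# Hypothetical minimal config using option names from the table above;
# the values are illustrative, not matbii's shipped defaults.
config = {
    "experiment_info": {
        "path": "./experiment-1",
        "id": "experiment-1",
        "duration": 600,  # seconds; the program auto-closes at this time
        "meta": {},
    },
    "participant_info": {"id": "participant-0", "meta": {"age": 30}},
    "enable_tasks": ["tracking", "system_monitoring", "resource_management"],
    "window_width": 800,
    "window_height": 600,
    "logging_level": "info",
}

with open("experiment-1-config.json", "w") as f:
    json.dump(config, f, indent=2)
```

Unspecified options fall back to matbii's defaults, so a config file only needs the values you want to change.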

Experiment Configuration

Your experiment configuration directory should look like:

experiment-1
    ├── experiment-1-config.json
    ├── resource_management.json
    ├── resource_management.sch
    ├── system_monitoring.json
    ├── system_monitoring.sch
    ├── tracking.json
    └── tracking.sch

This directory contains files for configuring each task. These will override the default task files that are defined as part of the matbii package.

Note

experiment-1-config.json does not need to be part of this directory, but it is often the most convenient place for it. Additionally, not all files need to be provided, only those that you want to override.
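If it helps, the layout above can be scaffolded with a short script (file and directory names as shown above; remember that only the files you intend to override actually need to exist):

```python
from pathlib import Path

# Create the experiment directory layout shown above.
root = Path("experiment-1")
root.mkdir(exist_ok=True)
for task in ("resource_management", "system_monitoring", "tracking"):
    (root / f"{task}.json").touch()  # task initial state / UI config
    (root / f"{task}.sch").touch()   # task schedule
# The main config file; it may live elsewhere if you prefer.
(root / "experiment-1-config.json").touch()
```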

Task Configuration

Task configuration files (.json extension) are used to configure a task's initial state and UI. The options are documented below:

System Monitoring

COMING SOON

Resource Management

COMING SOON

Tracking

COMING SOON

Task Schedules

Task schedule files define how each task evolves over time. For a comprehensive guide to defining schedules, see pyfuncschedule. Each task has a set of actions that can be taken; these are provided in a table in the respective sections below, along with an example schedule file.

System Monitoring

| Method Name | Description | Example |
|---|---|---|
| `on_light(target)` | Turn on the given light, target = 1 or 2 | `on_light(1) @ [10]:*` |
| `off_light(target)` | Turn off the given light, target = 1 or 2 | `off_light(1) @ [10]:*` |
| `toggle_light(target)` | Toggle (on->off, off->on) the given light, target = 1 or 2 | `toggle_light(2) @ [10]:*` |
| `perturb_slider(target)` | Perturb the given slider by +/- 1 slot, target = 1, 2, 3, or 4 | `perturb_slider(1) @ [uniform(5,6)]:*` |

Example file system_monitoring.sch:

####  System Monitoring Task Schedule ####

# This will flip a light to its failure state at intervals of between 5 and 10 seconds, repeatedly (until the end of the experiment).
off_light(1) @ [uniform(5,10)]:*    # this means failure for light 1
on_light(2) @ [uniform(5,10)]:*     # this means failure for light 2

# This randomly moves the sliders (up/down by 1) at intervals of between 3 and 6 seconds, repeatedly (until the end of the experiment).
perturb_slider(1) @ [uniform(3,6)]:*
perturb_slider(2) @ [uniform(3,6)]:*
perturb_slider(3) @ [uniform(3,6)]:*
perturb_slider(4) @ [uniform(3,6)]:*
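To make the timing semantics concrete, here is a rough simulation (plain Python, not using pyfuncschedule itself) of how a `@ [uniform(3,6)]:*` schedule produces trigger times: a fresh uniform draw separates each repetition, and repetitions stop at the end of the experiment.

```python
import random

def trigger_times(low, high, duration, seed=0):
    """Approximate trigger times of `action @ [uniform(low, high)]:*`
    over `duration` seconds (illustrative sketch only)."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.uniform(low, high)  # fresh interval before each repetition
        if t > duration:             # schedule ends with the experiment
            break
        times.append(t)
    return times

times = trigger_times(3, 6, 60)
# Gaps between consecutive triggers all lie in the [3, 6] second interval.
gaps = [b - a for a, b in zip([0.0] + times, times)]
```

A fixed interval such as `@ [10]:*` is the degenerate case where every gap is exactly 10 seconds.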

Resource Management

| Method Name | Description | Example |
|---|---|---|
| `burn_fuel(target, burn)` | Burns a given amount of fuel in the target tank | COMING SOON |
| `pump_fuel(target, flow)` | Pumps a given amount of fuel to/from the given tanks | COMING SOON |
| `toggle_pump_failure(target)` | Toggle a pump failure (on/off -> failure, failure -> off) | COMING SOON |
| `toggle_pump(target)` | Toggle pump state (on->off, off->on) | COMING SOON |

EXAMPLE COMING SOON

Tracking

| Method Name | Description | Example |
|---|---|---|
| `move_target(direction, speed)` | Move the target in a given direction at a given speed | `move_target((0,1), 5) @ [0.1]:*` |
| `perturb_target(speed)` | Perturb the target in a random direction at a given speed | `perturb_target(5) @ [0.1]:*` |

Note

The speed parameter is treated as a scaling of the direction: each time the action is triggered, the target moves direction * speed units, where the units are defined in the task definition (svg coordinate space). In the examples above the action is triggered every 0.1 seconds, so we can think of the speed as 50 units per second.
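The arithmetic in the note above can be checked directly, using the direction and speed from the `move_target` example:

```python
# move_target((0, 1), 5) @ [0.1]:*  -- the example above.
direction = (0, 1)   # direction in svg coordinate space
speed = 5            # scales the direction each time the action triggers
interval = 0.1       # seconds between triggers

# Displacement per trigger, and the effective speed in units per second.
step = tuple(speed * d for d in direction)   # 5 units along +y per trigger
units_per_second = speed / interval          # ~50 units per second
```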

EXAMPLE COMING SOON

Eyetracking

COMING SOON