diff --git a/alpha-lab/.vitepress/config.mts b/alpha-lab/.vitepress/config.mts
index d5ba84546..e98080c8a 100644
--- a/alpha-lab/.vitepress/config.mts
+++ b/alpha-lab/.vitepress/config.mts
@@ -14,6 +14,7 @@ let theme_config_additions = {
           { text: "Map Gaze Onto a 3D Model of an Environment", link: "/nerfs/" },
           { text: "Map Gaze Into a User-Supplied 3D Model", link: "/tag-aligner/" },
           { text: "Map Gaze Onto Facial Landmarks", link: "/gaze-on-face/" },
+          { text: "Map Gaze Onto Website AOIs", link: "/web-aois/" },
         ],
       },
       {
diff --git a/alpha-lab/public/web-aoi.webp b/alpha-lab/public/web-aoi.webp
new file mode 100644
index 000000000..92f6b4e8f
Binary files /dev/null and b/alpha-lab/public/web-aoi.webp differ
diff --git a/alpha-lab/web-aois/heatmap-gazes-overlaid.png b/alpha-lab/web-aois/heatmap-gazes-overlaid.png
new file mode 100644
index 000000000..d4a0f7c69
Binary files /dev/null and b/alpha-lab/web-aois/heatmap-gazes-overlaid.png differ
diff --git a/alpha-lab/web-aois/heatmap_output.png b/alpha-lab/web-aois/heatmap_output.png
new file mode 100644
index 000000000..a588e32ef
Binary files /dev/null and b/alpha-lab/web-aois/heatmap_output.png differ
diff --git a/alpha-lab/web-aois/index.md b/alpha-lab/web-aois/index.md
new file mode 100644
index 000000000..cc06fa52e
--- /dev/null
+++ b/alpha-lab/web-aois/index.md
@@ -0,0 +1,115 @@
+---
+title: Map Gaze Onto Website AOIs
+description: "Define areas of interest on a website and map gaze onto them using our Web-AOI tool."
+permalink: /alpha-lab/web-aois
+meta:
+  - name: twitter:card
+    content: player
+  - name: twitter:image
+    content: "https://i.ytimg.com/vi/1yJfhtdJoMA/maxresdefault.jpg"
+  - name: twitter:player
+    content: "https://www.youtube.com/embed/DzK055NbRPM"
+  - name: twitter:width
+    content: "1280"
+  - name: twitter:height
+    content: "720"
+  - property: og:image
+    content: "https://i.ytimg.com/vi/1yJfhtdJoMA/maxresdefault.jpg"
+tags: [Neon]
+---
+
+# Map Gaze Onto Website AOIs
+
+::: tip
+Want to see your website through your users' eyes? Discover what really captures their attention as they scroll using our website AOI Tool + Neon eye tracking!
+:::
+
+## Understanding Gaze Patterns in Web Interaction
+
+Understanding where people focus their attention on websites is key to optimizing user interfaces and improving user experiences, whether individuals are making decisions about online shopping or simply browsing content. This knowledge is also useful for educational technology, commonly referred to as EdTech. Consider interactive displays in classrooms or online learning platforms: grasping how users interact with these tools is fundamental for enhancing learning outcomes.
+
+By gathering data on gaze patterns during these interactions, designs can be tailored to prioritize user needs. This results in websites that are easier to navigate, applications with better user interfaces, online stores that offer more intuitive shopping experiences, and EdTech applications that foster more effective learning environments.
+
+In this guide, we will introduce a desktop application featuring an integrated browser that overlays AprilTag markers onto webpages. This tool integrates with Neon, facilitating the recording and visualization of gaze data mapped onto areas of interest on a website.
+
+## Introducing Web Gaze Mapping
+
+Pupil Cloud offers powerful tools like the [Marker Mapper](https://docs.pupil-labs.com/neon/pupil-cloud/enrichments/marker-mapper/) and [Reference Image Mapper](https://docs.pupil-labs.com/neon/pupil-cloud/enrichments/reference-image-mapper/) enrichments, which enable users to map gaze onto areas of interest. However, they do not provide a turnkey solution for defining AOIs on a website and, importantly, maintaining the gaze mapping during scrolling, a behavior typical of regular website usage.
+
+By following this guide, you can easily define AOIs on websites of your choice and record Neon data. Importantly, with this tool, the AOIs are not lost as you scroll. Afterward, you'll receive gaze mapping for each AOI, including individual AOI heatmaps and a full-page heatmap.
+
+### How Does This Tool Work?
+
+We leverage [Playwright](https://playwright.dev/), an open-source automation library for browser testing and web scraping, alongside AprilTags automatically placed on the webpage within the browser interface. Through Playwright, we generate AOIs from selectable web elements, while the AprilTags facilitate the real-time transformation of gaze data from *scene-camera* to *screen-based* coordinates. For a deeper understanding of this transformation, refer to [the documentation](https://docs.pupil-labs.com/alpha-lab/gaze-contingency-assistive/#how-to-use-a-head-mounted-eye-tracker-for-screen-based-interaction).
+
+## Steps To Recreate
+
+Explore the [GitHub repository](https://github.com/pupil-labs/web-aois) and follow these simple steps:
+
+1. Define Areas of Interest (AOIs) on your selected website.
+2. Put on Neon and start collecting data.
+3. Process your recording to generate CSV files with gaze mapped onto AOI coordinates.
+4. Visualize the data with customized heatmaps.
+
+## Visualize Mapped Gaze Data
+
+After running the code, new files will be generated.
+
+- For every AOI, you will get `png` files with transparent and overlaid heatmaps and `csv` files with the gaze data mapped onto the AOI coordinates.
+- A transparent and overlaid heatmap will also be provided for the full page (e.g., see below), along with a `gaze.csv` file that will include the gaze data mapped onto the webpage in full.
+
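+Once these files exist, basic outcome metrics are only a few lines of pandas away. The snippet below is a minimal sketch, not part of the tool itself: the per-AOI file name and the `timestamp [ns]` column are assumptions for illustration, so check the output-file documentation in the repository for the exact schema.
+
+```python
+import pandas as pd
+
+page_gaze = pd.read_csv("gaze.csv")       # all gaze mapped onto the full page
+aoi_gaze = pd.read_csv("aoi_0_gaze.csv")  # gaze mapped onto one AOI (hypothetical name)
+
+t0 = page_gaze["timestamp [ns]"].min()           # first mapped gaze sample
+t_aoi = aoi_gaze["timestamp [ns]"].sort_values()
+
+# Time to first gaze: how long until this AOI was first gazed at.
+time_to_first_gaze_s = (t_aoi.iloc[0] - t0) * 1e-9
+
+# Dwell time: sum of gaze sample durations, i.e. the gaps between
+# consecutive gaze timestamps on the AOI; an ad-hoc 100 ms cutoff
+# drops gaps where gaze had left the AOI entirely.
+deltas_s = t_aoi.diff().dropna() * 1e-9
+dwell_time_s = deltas_s[deltas_s < 0.1].sum()
+
+print(f"time to first gaze: {time_to_first_gaze_s:.2f} s")
+print(f"dwell time: {dwell_time_s:.2f} s")
+```
+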
+<div style="display: flex; gap: 1em;">
+  <img src="heatmap_output.png" alt="Heatmap generated for one AOI" width="49%">
+  <img src="heatmap-gazes-overlaid.png" alt="Heatmap generated over the page" width="49%">
+</div>
+
+*Heatmaps generated for one AOI (left) and for the entire page (right).*
+
+You can find a detailed list of the outputs [in this section of our GitHub repository](https://github.com/pupil-labs/web-aois?tab=readme-ov-file#output-files).
+
+This data can be used to generate outcome metrics like time to first gaze (i.e., how long it took the user to gaze at each AOI for the first time) or dwell time/total gaze duration (i.e., the sum of gaze sample durations, where a sample's duration is the period between the timestamps of consecutive gaze samples), as sketched above.
+
+::: tip
+Need guidance in calculating even more metrics for your website AOIs? Reach out to us [by email](mailto:info@pupil-labs.com), on our [Discord server](https://pupil-labs.com/chat/), or visit our [Support Page](https://pupil-labs.com/products/support/) for dedicated support options.
+:::
diff --git a/neon/.vitepress/config.mts b/neon/.vitepress/config.mts
index a964d0b1e..f16b69cb3 100644
--- a/neon/.vitepress/config.mts
+++ b/neon/.vitepress/config.mts
@@ -103,8 +103,8 @@ let theme_config_additions = {
           link: "/data-collection/offset-correction/",
         },
         {
-          text: "Backlight Compensation",
-          link: "/data-collection/backlight-compensation/",
+          text: "Scene Camera Exposure",
+          link: "/data-collection/scene-camera-exposure/",
         },
       ],
     },
diff --git a/neon/data-collection/backlight-compensation/index.md b/neon/data-collection/backlight-compensation/index.md
deleted file mode 100644
index 46dde1e7b..000000000
--- a/neon/data-collection/backlight-compensation/index.md
+++ /dev/null
@@ -1,2 +0,0 @@
-# Backlight Compensation
-Coming soon!
\ No newline at end of file
diff --git a/neon/data-collection/data-streams/index.md b/neon/data-collection/data-streams/index.md
index d9c8fc3bd..c02327540 100644
--- a/neon/data-collection/data-streams/index.md
+++ b/neon/data-collection/data-streams/index.md
@@ -18,11 +18,11 @@ The Neon Companion app can provide gaze data in real-time at up to 200 Hz. Gaze
 
 ![Gaze](./gaze.jpg)
 
-The achieved framerate can vary based on what Companion device is used and environmental conditions. On the OnePlus 10, the full 200 Hz can generally be achieved outside of especially hot environments. On the OnePlus 8, the framerate typically drops to ~120 Hz within a few minutes of starting a recording. Other apps running simultaneously on the phone may decrease the framerate.
+The achieved framerate can vary based on which Companion device is used and environmental conditions. On the OnePlus 10 and Motorola Edge 40 Pro, the full 200 Hz can generally be achieved outside of especially hot environments. On the OnePlus 8, the framerate typically drops to ~120 Hz within a few minutes of starting a recording. Other apps running simultaneously on the phone may decrease the framerate.
 
-After a recording is uploaded to Pupil Cloud, gaze data is automatically re-computed at the full 200 Hz framerat and can be downloaded from there.
+After a recording is uploaded to Pupil Cloud, gaze data is automatically re-computed at the full 200 Hz framerate and can be downloaded from there.
 
-The gaze estimation algorithm is based on end-2-end deep learning and provides gaze data robustly without requiring a calibration. We are currently working on a white paper that thoroughly evaluated the algorithm and will link it here once it is published.
+The gaze estimation algorithm is based on end-to-end deep learning and provides gaze data robustly without requiring a calibration. You can find a high-level description as well as a thorough evaluation of the accuracy and robustness of the algorithm in our [white paper](https://zenodo.org/doi/10.5281/zenodo.10420388).
 
 ## Fixations & Saccades
 
@@ -31,7 +31,7 @@
 The two primary types of eye movements exhibited by the visual system are fixations and saccades.
 
 ![Fixations](./fixations.jpg)
 
-Fixations and saccades are calculated automatically in Pupil Cloud after uploading a recording and are included in the recording downloads. The deployed fixation detection algorithm was specifically designed for head-mounted eye trackers and offers increased robustness in the presence of head movements. Especially movements due to vestibulo-ocular reflex are compensated for, which is not the case for most other fixation detection algorithms. You can learn more about it in the [Pupil Labs fixation detector whitepaper](https://docs.google.com/document/d/1dTL1VS83F-W1AZfbG-EogYwq2PFk463HqwGgshK3yJE/export?format=pdf) and in our [publication](https://link.springer.com/article/10.3758/s13428-024-02360-0) in *Behavior Research Methods* discussing fixation detection strategies.
+Fixations and saccades are calculated automatically in Pupil Cloud after uploading a recording and are included in the recording downloads. The deployed fixation detection algorithm was specifically designed for head-mounted eye trackers and offers increased robustness in the presence of head movements. In particular, movements due to the vestibulo-ocular reflex are compensated for, which is not the case for most other fixation detection algorithms. You can learn more about it in the [Pupil Labs fixation detector whitepaper](https://docs.google.com/document/d/1CZnjyg4P83QSkfHi_bjwSceWCTWvlVtbGWtuyajv5Jc/export?format=pdf) and in our [publication](https://link.springer.com/article/10.3758/s13428-024-02360-0) in *Behavior Research Methods* discussing fixation detection strategies.
 
 We detect saccades based on the fixation results, considering the gaps between fixations to be saccades. Note that this assumption is only true in the absence of smooth pursuit eye movements. Additionally, the fixation detector does not compensate for blinks, which can cause a break in a fixation and thus introduce a false saccade.
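+
+To make the gap-based definition above concrete, here is a minimal sketch that derives saccade candidates from a `fixations.csv` file. It assumes the `start timestamp [ns]` and `end timestamp [ns]` columns of the recording download; it illustrates the idea only and is not the deployed detector:
+
+```python
+import pandas as pd
+
+fixations = pd.read_csv("fixations.csv").sort_values("start timestamp [ns]")
+
+# A saccade candidate spans from the end of one fixation to the start of the next.
+saccades = pd.DataFrame({
+    "start timestamp [ns]": fixations["end timestamp [ns]"].iloc[:-1].to_numpy(),
+    "end timestamp [ns]": fixations["start timestamp [ns]"].iloc[1:].to_numpy(),
+})
+saccades["duration [ms]"] = (
+    saccades["end timestamp [ns]"] - saccades["start timestamp [ns]"]
+) / 1e6
+
+# Caveat from the text above: smooth pursuit and uncompensated blinks
+# both show up here as spurious, unusually long "saccades".
+print(saccades["duration [ms]"].describe())
+```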
diff --git a/neon/data-collection/scene-camera-exposure/Balance.webp b/neon/data-collection/scene-camera-exposure/Balance.webp
new file mode 100644
index 000000000..07062d06e
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/Balance.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/Highlight.webp b/neon/data-collection/scene-camera-exposure/Highlight.webp
new file mode 100644
index 000000000..3a53b099a
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/Highlight.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/Manual.webp b/neon/data-collection/scene-camera-exposure/Manual.webp
new file mode 100644
index 000000000..52a163b7e
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/Manual.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/Shadow.webp b/neon/data-collection/scene-camera-exposure/Shadow.webp
new file mode 100644
index 000000000..f9a5f531d
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/Shadow.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/UI.webp b/neon/data-collection/scene-camera-exposure/UI.webp
new file mode 100644
index 000000000..ed724355d
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/UI.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/UI_small.webp b/neon/data-collection/scene-camera-exposure/UI_small.webp
new file mode 100644
index 000000000..bccaa32e4
Binary files /dev/null and b/neon/data-collection/scene-camera-exposure/UI_small.webp differ
diff --git a/neon/data-collection/scene-camera-exposure/index.md b/neon/data-collection/scene-camera-exposure/index.md
new file mode 100644
index 000000000..a2276e518
--- /dev/null
+++ b/neon/data-collection/scene-camera-exposure/index.md
@@ -0,0 +1,32 @@
+# Scene Camera Exposure
+The [scene camera's](https://docs.pupil-labs.com/neon/data-collection/data-streams/#scene-video) exposure can be adjusted to improve image quality in different lighting conditions. There are four modes:
+
+- **Manual:** This mode lets you set the exposure time manually.
+- **Automatic:** `Highlights`, `Balanced`, and `Shadows` automatically adjust exposure according to the surrounding lighting.
+
+::: tip
+The mode you choose should depend on the lighting conditions in your environment. The images below provide some examples and important considerations.
+:::
+
+## Changing Exposure Modes
+From the home screen of the Neon Companion app, tap the [Scene and Eye Camera preview](https://docs.pupil-labs.com/neon/data-collection/first-recording/#_4-open-the-live-preview), and then select `Balanced` to reveal all four modes.
+
+## Manual Exposure Mode
+Allows you to set the exposure time between 1 ms and 1000 ms.
+
+::: tip
+Exposure duration is inversely related to camera frame rate: at 30 fps, each frame lasts 1000 ms / 30 ≈ 33 ms, so exposure values above 33 ms will reduce the scene camera frame rate below 30 fps.
+:::
+
+## Automatic Exposure Modes
+`Highlights` - optimizes the exposure to capture bright areas in the environment, while potentially underexposing dark areas.
+![This mode optimizes the exposure to capture bright areas in the environment, while potentially underexposing dark areas.](Highlight.webp)
+
+`Balanced` - optimizes the exposure to capture brighter and darker areas equally.
+![This mode optimizes the exposure to capture brighter and darker areas in the environment equally.](Balance.webp)
+
+`Shadows` - optimizes the exposure to capture darker areas in the environment, while potentially overexposing brighter areas.
+![This mode optimizes the exposure to capture darker areas in the environment, while potentially overexposing bright areas.](Shadow.webp)
diff --git a/neon/hardware/compatible-devices/index.md b/neon/hardware/compatible-devices/index.md
index 9951c2605..e6e59f30a 100644
--- a/neon/hardware/compatible-devices/index.md
+++ b/neon/hardware/compatible-devices/index.md
@@ -1,7 +1,7 @@
 # Companion Device
 The Companion device is a flagship Android smartphone. It is a regular phone that is not customized or modified in any way. To ensure maximum stability and performance we can only support a small number of carefully selected and tested models. The Neon Companion app is tuned to work with these particular models as we require full control over various low-level functions of the hardware.
 
-The supported models are: OnePlus 8, OnePlus 8T, OnePlus 10 Pro, and Motorola Edge 40 Pro. Currently, Neon ships with a Motorola Edge 40 Pro.
+The supported models are: OnePlus 8, OnePlus 8T, OnePlus 10 Pro, and Motorola Edge 40 Pro. Currently, Neon ships with a Motorola Edge 40 Pro. We highly recommend this model, as it offers the best performance, endurance, and stability.
 
 If you want to replace or add an extra Companion device you can purchase it [directly from us](https://pupil-labs.com/products/neon) or from any other distributor. The Neon Companion app is free and can be downloaded from the [Play Store](https://play.google.com/store/apps/details?id=com.pupillabs.neoncomp).
diff --git a/neon/pupil-cloud/enrichments/reference-image-mapper/index.md b/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
index 35024ec56..a7aa6e550 100644
--- a/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
+++ b/neon/pupil-cloud/enrichments/reference-image-mapper/index.md
@@ -177,6 +177,10 @@ This file contains all the mapped gaze data from all sections.
 | **fixation id** | If this gaze sample belongs to a fixation event, this is the corresponding id of the fixation. Otherwise, this field is empty. |
 | **blink id** | If this gaze sample belongs to a blink event, this is the corresponding id of the blink. Otherwise this field is empty. |
 
+::: info
+This CSV file only contains data points where the reference image has been localized in the scene. Looking for all the gaze points? Check [this file](/data-collection/data-format/#gaze-csv).
+:::
+
 ### fixations.csv
 
 This file contains fixation events detected in the gaze data stream and mapped to the reference image.
diff --git a/neon/pupil-cloud/visualizations/areas-of-interest/index.md b/neon/pupil-cloud/visualizations/areas-of-interest/index.md
index 8d5e5f0cd..c20ee5a1c 100644
--- a/neon/pupil-cloud/visualizations/areas-of-interest/index.md
+++ b/neon/pupil-cloud/visualizations/areas-of-interest/index.md
@@ -59,6 +59,5 @@ This file contains standard fixation and gaze metrics on AOIs.
 | **average fixation duration [ms]** | Average fixation duration for the corresponding area of interest in milliseconds. |
 | **total fixations** | Total number of fixations for the corresponding area of interest. |
 | **time to first fixation [ms]** | Average time in milliseconds until the corresponding area of interest gets fixated on for the first time in a recording. |
-| **time to first gaze [ms]** | Average time in milliseconds until the corresponding area of interest gets gazed at for the first time in a recording. |
 | **total fixation duration [ms]** | Total fixation duration for the corresponding area of interest in milliseconds. |
-| **total gaze duration [ms]** | Total fixation duration for the corresponding area of interest in milliseconds. |
\ No newline at end of file
+
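+For readers who want to reproduce such metrics offline, here is a toy pandas sketch. The input file and its columns (`aoi name`, `recording id`, fixation timestamps) are hypothetical stand-ins, not the exact Pupil Cloud export schema:
+
+```python
+import pandas as pd
+
+# Hypothetical input: one row per fixation, already assigned to an AOI.
+fx = pd.read_csv("fixations_on_aois.csv")
+fx["duration [ms]"] = (fx["end timestamp [ns]"] - fx["start timestamp [ns]"]) / 1e6
+
+# Per AOI and per recording, aggregate the raw fixation data ...
+per_recording = fx.groupby(["aoi name", "recording id"]).agg(
+    total_fixation_duration=("duration [ms]", "sum"),
+    average_fixation_duration=("duration [ms]", "mean"),
+    total_fixations=("duration [ms]", "size"),
+)
+
+# ... then average over recordings to obtain aggregate per-AOI metrics,
+# analogous to the table above.
+print(per_recording.groupby("aoi name").mean())
+```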