🔥 Official repository for the paper "Meta-trainer: an augmented reality trainer for home fitness with real-time feedback", published at the IEEE STAR 2023 conference. 🌟 this repo!
🔥 Honorable mention in Xreal (formerly Nreal) AR JAM 2022 Challenge
The project aims to create an indoor fitness application for the AR JAM Challenge 2022 proposed by Nreal.
We were invited to join this challenge after our final submission of the Computer Vision project available at github.com/laitifranz/AR-SMPLX.
We had the opportunity to work with the latest AR technologies and gain knowledge and experience with Unity.
The Nreal AR Jam is an international online challenge proposed to attract AR developers to Nreal's growing community. The competition is divided into 8 macro categories: At-Home Fitness, Art, Games, Screen 2.0, Port, Social, NFT and Student.
We decided to apply for the At-Home Fitness category because we believe the COVID pandemic drastically changed our habits: many of us took the isolation as a chance to improve our fitness, but improvising can be ineffective or, even worse, dangerous.
We presented our idea with this incipit:
> You are at home, alone. You are not motivated. You feel pain during exercises and wonder if you are doing them correctly. Have you ever had a personal trainer in your pocket? Improve your fitness with your metatrainer: we present a revolutionary way to work out.
Therefore, we developed an idea to help people work out at home without feeling alone.
This is also the official repository of the IEEE STAR 2023 paper "Meta-trainer: an augmented reality trainer for home fitness with real-time feedback", presented at the International Workshop on Sport, Technology and Research in Cavalese, Italy, on September 15th, 2023.
The challenge was divided into 3 milestones:
- Milestone 1: propose an idea and a pitch describing our application.
- Milestone 2: provide concept art (screenshots and videos of the application) and update the description.
- Milestone 3: submit the final build traiNreal.apk with an updated description.
In this section we provide some information about how we obtained the characters and how we implemented the animations.
We thank the YouTube channel iHeartGameDev for the amazing tutorials. We suggest watching these videos to learn more about Animation Controllers and how to manage characters, in particular this playlist.
In the source code we left references, as comments, to the resources where we learned how to accomplish certain tasks.
We used two main sources for downloading our characters:
- Mixamo from Adobe - mixamo.com
  - Settings:
    - Format: FBX for Unity
    - Pose: T-Pose
- Unity Asset Store - assetstore.unity.com
We implemented 4 characters, available here:
- Adam from Mixamo
- Sophie from Mixamo
- Mousey from Mixamo
- Space Robot Kyle from Unity
To correctly import a character, it is important to follow these steps (a scripted alternative is sketched after the list):
- Import the .fbx file into the project
- Open the Inspector of the game object
- Set the import options as shown in the image
- If you would like to add the default skin (this works with Mixamo characters), select Material, apply the settings as the image explains, and then click Apply
- You should now find a new prefab inside the same folder as your character!
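If you import characters often, the same Inspector settings can also be applied automatically from an editor script. This is a minimal sketch under our own assumptions (the folder path is hypothetical, and only the rig type is covered), not part of the original project:

```csharp
// Hypothetical editor script (place it under an "Editor" folder).
using UnityEditor;

public class CharacterImportPostprocessor : AssetPostprocessor
{
    // Called by Unity before each model (.fbx) import.
    void OnPreprocessModel()
    {
        // Hypothetical folder: adjust to wherever you keep the characters.
        if (!assetPath.StartsWith("Assets/Characters"))
            return;

        var importer = (ModelImporter)assetImporter;
        // Equivalent to setting Rig > Animation Type to "Humanoid" in the Inspector.
        importer.animationType = ModelImporterAnimationType.Human;
    }
}
```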
We used two main sources for our animations:
- Mixamo from Adobe - mixamo.com
  - Settings:
    - Format: FBX for Unity
    - Skin: without skin
    - FPS: 30
    - Keyframe Reduction: none
- OptiTrack systems @ Multisensory Interactions Lab UniTN - optitrack.com
We implemented several animations for our project, available here.
To properly use an animation from Mixamo, it is important to follow these steps:
- Go to the animation file you downloaded from Mixamo
- Expand the object by using the arrow
- Select the animation object (teal triangle)
- Copy and paste the animation into a separate folder dedicated to animations
  ⚠️ Be aware that some animations may have the Loop Time checkbox selected. This is essential for animations that need to repeat forever
- Create an Animation Controller by right-clicking on the Controller folder and choosing Create. You need this because you have to create a step-by-step animation plan for your characters (see the examples we provide)
- Add your sequence of animations in the Animation Controller (tutorial); a minimal driver script is sketched after this list
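As referenced above, here is a minimal sketch of how a script could drive the step-by-step animation plan. The class and trigger names are hypothetical and must match the transitions you define in your Animation Controller:

```csharp
using UnityEngine;

// Hypothetical driver: "NextExercise" must exist as a Trigger parameter
// on the Animation Controller, used as a transition condition.
public class TrainerAnimationDriver : MonoBehaviour
{
    [SerializeField] private Animator animator; // the character's Animator component

    // Call this to advance the character to the next exercise animation.
    public void PlayNextExercise()
    {
        animator.SetTrigger("NextExercise");
    }
}
```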
We exploit the NRSDK provided by Nreal, available at developer.nreal.ai/download. The SDK package is already included in our project, so you should be able to run it without reinstalling.
If you would like to update the NRSDK (you can check the version we used in the README file), follow these steps:
- Download the new version from the official site of Nreal
- Open your project in Unity
- Right-click and choose Assets > Import Package > Custom Package
- Select the new package you just downloaded
- Apply
To use the Nreal technology, it is important to add to the scene two objects:
- NRCameraRig
- NRInput
We focused our attention on NRInput, because we needed to switch the input source type from Controller to Hands. We did it via code:

```csharp
// Switch the NRSDK input source from the controller to hand tracking.
bool switchedToHandTracking = NRInput.SetInputSource(InputSourceEnum.Hands);
```
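For context, a minimal sketch of where this call could live, assuming the NRKernal namespace used by the NRSDK scripts and checking the returned flag:

```csharp
using UnityEngine;
using NRKernal; // assumed NRSDK namespace

public class HandTrackingSwitcher : MonoBehaviour
{
    void Start()
    {
        // Switch from the default controller input to hand tracking.
        bool switched = NRInput.SetInputSource(InputSourceEnum.Hands);
        if (!switched)
        {
            Debug.LogWarning("Could not switch NRInput to hand tracking.");
        }
    }
}
```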
We structured our project in scenes, and we move the player across scenes during the workout.
This is the scheme of the scenes that you can find in Assets/Scenes:
```
.
├── Start
│   ├── Start              # menu
│   ├── Settings           # change preferences
│   └── Helper             # brief recap of the app and how to use it
├── Exercises
│   ├── Warm up            # pre-training warm-up exercises to follow
│   ├── Main               # core exercises of our app
│   │   ├── Lunges         # 3D visualization of the exercise
│   │   ├── SquatView      # see the squat movement from different viewpoints
│   │   └── SquatAnalysis  # real-time feedback on squat execution
│   └── Stretching         # final training exercises
└── End                    # gamification and summary moment
```
The scenes are connected as follows (a minimal scene-loading sketch follows the diagram):
```
                 Start
                   |
        +----------+----------+
        |          |          |
    Settings    Warm up     Helper
                   |
               SquatView
                   |
             SquatAnalysis
                   |
                Lunges
                   |
              Stretching
                   |
                  End
                   |
                 Start
                   |
                  ...        # loop
```
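A minimal sketch of how a script could walk this chain with Unity's SceneManager; the helper below is hypothetical and assumes the scene names from the tree above, each listed in the build settings:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Hypothetical helper: every scene must be listed in
// File > Build Settings > Scenes In Build.
public class WorkoutFlow : MonoBehaviour
{
    // The main chain of the workout loop (Settings/Helper branches are omitted).
    private static readonly string[] Flow =
    {
        "Start", "Warm up", "SquatView", "SquatAnalysis",
        "Lunges", "Stretching", "End"
    };

    // Loads the scene that follows the currently active one,
    // wrapping around after "End" to restart the loop.
    public void LoadNext()
    {
        int index = System.Array.IndexOf(Flow, SceneManager.GetActiveScene().name);
        SceneManager.LoadScene(Flow[(index + 1) % Flow.Length]);
    }
}
```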
We saved the user's preferences using the PlayerPrefs class (reference: docs.unity3d.com/ScriptReference/PlayerPrefs.html).
With this class we easily created key-value entries to store information such as the age, the name, the volume level, and so on.
We faced reference errors when launching the app for the first time, due to uninitialized values. To overcome this, we used the Awake() callback to set default values before they are first read.
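A minimal sketch of such an initializer, with illustrative key names and default values:

```csharp
using UnityEngine;

// Sketch of the default-value initialization described above. The key names
// ("name", "age", "volume") and defaults are illustrative assumptions.
public class PreferencesInitializer : MonoBehaviour
{
    void Awake()
    {
        // Write defaults only on the very first launch, so saved values survive.
        if (!PlayerPrefs.HasKey("name"))   PlayerPrefs.SetString("name", "Athlete");
        if (!PlayerPrefs.HasKey("age"))    PlayerPrefs.SetInt("age", 25);
        if (!PlayerPrefs.HasKey("volume")) PlayerPrefs.SetFloat("volume", 1.0f);
        PlayerPrefs.Save();
    }
}
```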
Gamification is a fundamental factor in creating an engaging experience. In our version we added the possibility to use fantasy avatars as personal trainers. Moreover, the final reward given to the user is weighted by the quality of the exercises.
We presented the final reward as a score analysing the quality of the squat. We adopted a scoring function (reported in the paper) of d, the distance between the center of the circle shown in the Squat Analysis scene and the current position of the head; this value is collected every half second, and the final score lies in a bounded range.
We chose that function because its non-linear behaviour penalizes the user's error more when they fail to stay in the green area. We found the boundaries empirically and adjusted the function for our scope. A sketch of the sampling loop is shown below.
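The following sketch illustrates the half-second sampling loop only; the penalty function, the bounds, and the head/circle references are placeholders, not the exact function from the paper:

```csharp
using UnityEngine;

// Illustrative sketch of the sampling loop described above.
public class SquatScorer : MonoBehaviour
{
    [SerializeField] private Transform head;         // tracked head position
    [SerializeField] private Transform circleCenter; // center of the target circle

    private float penaltySum;
    private int sampleCount;

    void Start()
    {
        // Collect a distance sample every half second.
        InvokeRepeating(nameof(Sample), 0.5f, 0.5f);
    }

    void Sample()
    {
        float d = Vector3.Distance(head.position, circleCenter.position);
        penaltySum += d * d; // hypothetical non-linear penalty: larger errors cost more
        sampleCount++;
    }

    // Maps the average penalty to a bounded score (bounds are illustrative).
    public float FinalScore()
    {
        float avg = sampleCount > 0 ? penaltySum / sampleCount : 0f;
        return Mathf.Clamp01(1f - avg) * 100f;
    }
}
```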
We provide real-time feedback to the user to evaluate how well the athlete is performing the squat exercise. For the evaluation, we consider the movement of the head, as suggested in the scientific paper "The back squat: A proposed assessment of functional deficits and technical factors that limit performance".
With this feature, we provide the possibility to explore the 3D exercises from all perspectives, helping the user understand how to perform each exercise correctly. The personal trainer can be paused in a given position to inspect the exact body pose.
The trainer has to behave like a real human being, emulating a real workout with a personal trainer.
We decided to build an AR experience instead of a VR one because VR detaches the user from reality, with the loss of perception of the surrounding environment; the user might lose orientation and awareness of the room while moving and performing the exercises.
- Install Unity version 2020.3
  NOTE: a different version is not guaranteed to work properly
- Clone this repository
- Open the scene Start in Unity
- Go to File > Build Settings, choose the Android platform and click Switch Platform
- Go to File > Build Settings > Player Settings > Player
- Check that all the scenes mentioned above are listed in File > Build Settings > Scenes In Build
- Build your application and save the .apk on your computer
- Now you are ready to deploy the application on your device! 🚀
- We used adb from the terminal to run the app on the Android device (example below; more info available at developer.android.com/studio/command-line/adb)
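For example, assuming the build was saved as traiNreal.apk in the current directory, a typical deploy command is `adb install traiNreal.apk` (or `adb install -r traiNreal.apk` to replace an already-installed build).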
If you would like to test the scenes in the Unity simulator, you can run the project on your local machine. You can find how to use the Nreal emulator here.
NOTE:
- For iOS, we are waiting for an update from Nreal to make the technology compatible with Apple devices, as they mentioned on Twitter
- On Android, it is already possible to use the Nreal smart glasses. Please refer to this link for compatible Android versions and devices
- The application was tested on a OnePlus 8T with Nreal Light - nreal.ai/light
- Open the application. If this is the first time you open the app, you will be guided through it
- Choose your personal trainer, among those proposed, in Settings
- Start the workout, enjoy the hands-free experience, and do your best! ⚡
Here is what the app looks like when wearing the glasses:
warmup_init.mp4
warmup_stop_go.mp4
settings.mp4
squat_analysis.mp4
end_scene.mp4
If for some reason the above section does not work, please follow these links for the demo:
| Video | Link |
| --- | --- |
| Warm up init | Link 1 |
| Warm up stop and go | Link 2 |
| Settings | Link 3 |
| Squat analysis | Link 4 |
| End scene | Link 5 |
- Add plane detection
- Improve the interface
- Support for MRTK input systems
- Create a social part inside the app
- Extend the project to more AR glasses
- Publish the app on Google Play Store
Honorable mention by Nreal during the announcement of the final winner list! Twitter post
Distributed under the MIT License. See LICENSE for more information.
Lorenzo Orlandi - GitHub - LinkedIn - Mail: [email protected]
Francesco Laiti - GitHub - LinkedIn
Davide Lobba - GitHub - LinkedIn
This project is for educational purposes only. We do not monetize it or profit from it.
If you use this code, or you find some of this repository helpful, please cite us:
```bibtex
@INPROCEEDINGS{10302670,
  author={Orlandi, Lorenzo and Martinelli, Giulia and Laiti, Francesco and Lobba, Davide and Bisagno, Niccoló and Conci, Nicola},
  booktitle={2023 IEEE International Workshop on Sport, Technology and Research (STAR)},
  title={Meta-Trainer: An Augmented Reality Trainer for Home Fitness with Real-Time Feedback},
  year={2023},
  volume={},
  number={},
  pages={90-93},
  doi={10.1109/STAR58331.2023.10302670}}
```
We thank the team from Arcoda s.r.l., Terranova Software and MMLab@UniTN for the active collaboration and the opportunity to use the OptiTrack system available at the Multisensory Interactions Lab.
This work is the result of a collaboration between Lorenzo Orlandi and Giulia Martinelli (PhD students at the University of Trento) and Francesco Laiti and Davide Lobba (Master's degree students at the University of Trento). Thanks also to Giuseppe Spallita (PhD student) for helping us with the first and third milestones of the challenge.
Background music: a mixture of NCS music from YouTube.
“No man has the right to be an amateur in the matter of physical training. It is a shame for a man to grow old without seeing the beauty and strength of which his body is capable.”
- Socrates -