Link to our official documentation for the MIT 2023 CRE[AT]E CHALLENGE
This is where we organize the code for our project. This README file explains what this repository is all about. Repositories track the code and its history, so code can be managed and changed across different versions. To navigate changes and versions, GitHub uses a system of branches along with pull requests and commits. The most up-to-date branch is the main branch.
For running machine learning
- Upload FallAllD2.csv and machine learning.ipynb to Google Colab
- Run machine learning.ipynb to create the machine learning model. The models are already created, so you don't need to do this step.

For running the prediction script
- Run sensordata.py to start the prediction script. It will automatically collect data from the Raspberry Pi's Sense HAT (a rough sketch of the collection loop is shown below).
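As a rough illustration only, the collection step presumably polls the Sense HAT something like this; the sampling rate, window size, and buffer layout here are assumptions, not the script's actual code:

```python
# Hedged sketch of Sense HAT polling; rate and buffer size are assumptions.
import time
from sense_hat import SenseHat

sense = SenseHat()
samples = []
for _ in range(100):                     # hypothetical 100-sample window
    acc = sense.get_accelerometer_raw()  # returns {'x': ..., 'y': ..., 'z': ...} in g
    gyr = sense.get_gyroscope_raw()      # radians per second
    samples.append([acc['x'], acc['y'], acc['z'],
                    gyr['x'], gyr['y'], gyr['z']])
    time.sleep(0.02)                     # ~50 Hz, an assumed rate
```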
For connecting the Raspberry Pi
link to instructions
- Open the VS Code command palette: Ctrl+Shift+P (Windows/Linux) or Cmd+Shift+P (macOS)
- Run Remote-SSH: Connect to Host... (or add a new host)
- Enter username@ip address
- Input the password when prompted
- Select Linux as the remote platform
- Enter the password again if asked

On the Raspberry Pi terminal:
- Connect to Wi-Fi
- Run ifconfig to find the Pi's IP address

To run a file in the local terminal:
- Open a terminal and type: python3 <name of file> (for example, python3 sensordata.py)
- To stop the file: Ctrl+C
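If you want to quickly confirm the Sense HAT is detected before running the full script, a tiny check like this works (illustrative only, not a file in this repository):

```python
# One-off Sense HAT sanity check; not part of the repo's code.
from sense_hat import SenseHat

sense = SenseHat()
print(sense.get_accelerometer_raw())  # should print an x/y/z dict
print(sense.get_gyroscope_raw())
```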
folder that holds all the models

Teacher models
- teacher_model4_device2_aggregate is the current teacher model and the model being tested in sensordata.py
- teacher_model0_device2_aggregate is the teacher model with all three sensors
Student models
- They all hold aggregate data from device 2 (wrist data) and are numbered 0-6, with each number corresponding to a different sensor configuration. Below is the pseudocode snippet (in Python) that you can also find in machine learning.ipynb, which explains the layout:

```python
# Sensor-column configuration for each student model number
# (the dict name `configs` is illustrative; the mapping is from the notebook)
configs = {
    0: ['Device', 'Acc_x', 'Acc_y', 'Acc_z', 'Gyr_x', 'Gyr_y', 'Gyr_z', 'Bar_x', 'Bar_y'],
    1: ['Device', 'Acc_x', 'Acc_y', 'Acc_z', 'Gyr_x', 'Gyr_y', 'Gyr_z'],
    2: ['Device', 'Gyr_x', 'Gyr_y', 'Gyr_z', 'Bar_x', 'Bar_y'],
    3: ['Device', 'Acc_x', 'Acc_y', 'Acc_z', 'Bar_x', 'Bar_y'],
    4: ['Device', 'Acc_x', 'Acc_y', 'Acc_z'],
    5: ['Device', 'Gyr_x', 'Gyr_y', 'Gyr_z'],
    6: ['Device', 'Bar_x', 'Bar_y'],
}
```
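For example (the DataFrame name `df` here is hypothetical, as is `configs` above), selecting the columns for student model 4 would just be:

```python
# Illustrative: columns for student model 4 (accelerometer only)
student4_data = df[configs[4]]
```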
a folder that holds the raw data from the FallAllD dataset, stored in CSV files
M. Saleh, M. Abbas and R. L. B. Jeannès, "FallAllD: An Open Dataset of Human Falls and Activities of Daily Living for Classical and Deep Learning Applications," in IEEE Sensors Journal, doi: 10.1109/JSEN.2020.3018335.
a folder that holds the description and code for the data extraction from the FallAllD dataset
extracts data from all CSV files from FallAllD and turns it into a pandas DataFrame saved as a CSV file for machine learning.ipynb
This was modified from FallAllD's data extraction file.
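As a rough sketch of that extraction flow (the directory name and glob pattern are assumptions, not the repository's actual code, though FallAllD2.csv matches the file uploaded to Colab above):

```python
# Hedged sketch: merge all FallAllD CSVs into one DataFrame and save it
# for the notebook. Paths and filenames are assumptions.
import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob('FallAllD/*.csv')]
combined = pd.concat(frames, ignore_index=True)
combined.to_csv('FallAllD2.csv', index=False)
```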
a file from the original FallAllD website that provides an example of the data extraction
a text file that describes the methodology of the sensors and sound in our product
machine learning modeling on a Jupyter notebook. If your device doesn't have a GPU, you can upload the Jupyter notebook to Google Colab. Different branches contain different models. Right now there are 4 models:
- Simple Neural Network Model
- LSTM Model with Kaggle Data
- LSTM Model with FallAllD data
- LSTM Model with Knowledge Distillation [MAIN BRANCH] (sketched below)
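For context on the main-branch model: knowledge distillation trains the smaller student network to mimic the teacher's temperature-softened outputs alongside the true labels. A minimal sketch of the standard distillation loss, assuming a TensorFlow/Keras setup (this is the generic technique, not the notebook's exact code):

```python
# Standard knowledge-distillation loss (Hinton-style); the temperature,
# alpha weighting, and Keras setup are assumptions, not the notebook's code.
import tensorflow as tf

def distillation_loss(y_true, student_logits, teacher_logits,
                      temperature=4.0, alpha=0.5):
    # Soft targets: teacher/student probabilities at raised temperature
    soft_teacher = tf.nn.softmax(teacher_logits / temperature)
    soft_student = tf.nn.softmax(student_logits / temperature)
    soft_loss = tf.keras.losses.kl_divergence(soft_teacher, soft_student)
    # Hard targets: ordinary cross-entropy on the true labels
    hard_loss = tf.keras.losses.sparse_categorical_crossentropy(
        y_true, tf.nn.softmax(student_logits))
    # temperature**2 keeps the soft-loss gradients on a comparable scale
    return alpha * hard_loss + (1.0 - alpha) * (temperature ** 2) * soft_loss
```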
where accelerometer and gyroscope data is collected using the Raspberry Pi's Sense HAT; the data is then run through the model and predictions are made
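A hedged sketch of that predict loop (the model directory name comes from this repo, but the window length, input shape, and preprocessing are assumptions):

```python
# Illustrative sensordata.py flow; windowing and preprocessing are assumptions.
import numpy as np
import tensorflow as tf
from sense_hat import SenseHat

model = tf.keras.models.load_model('teacher_model4_device2_aggregate')
sense = SenseHat()

window = []
while True:
    acc = sense.get_accelerometer_raw()
    window.append([acc['x'], acc['y'], acc['z']])
    if len(window) == 100:                     # assumed window length
        x = np.array(window)[np.newaxis, ...]  # shape (1, 100, 3), assumed
        print('fall probability:', model.predict(x))
        window = []
```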
a Python test file used to debug our student model
What is the MIT Create Challenge?
Create an assistive piece of technology that can sense when a senior is in danger and contact help.
The base product works without a Wi-Fi connection. The entire product will work with a Wi-Fi connection.

There are three major components:
- Sensors that detect falling movement
- A device that connects the sensor to a communicative application
- The program
To articulate: as of now we are planning to program on a Raspberry Pi 4 with Python.
Please look at these sample projects to understand what we are doing:
- Our recent slideshow
- Understanding Raspberry Pi