This project aims to decipher human emotions through real-time facial expressions. It encompasses steps to acquire the dataset, train and validate the model, develop a Streamlit application for real-time emotion analysis, and deploy the application on Google Cloud.
- Download the AffectNet dataset from the provided link: AffectNet Dataset.
- Extract the dataset files and split them into training and validation sets.
- Map the numeric labels to human-readable emotion tags for easier interpretation.
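The split and the label-to-tag mapping might look like the sketch below. It assumes the extracted images sit in one sub-folder per numeric label and that the dataset uses AffectNet's eight basic categories; the paths, the 80/20 ratio, and the index order are assumptions to adjust against the annotations shipped with your download.

```python
import random
import shutil
from pathlib import Path

# Hypothetical layout: one sub-folder per numeric label, e.g. data/raw/0/img_001.jpg.
# Adjust RAW_DIR, OUT_DIR and the mapping below to match your copy of AffectNet.
RAW_DIR = Path("data/raw")
OUT_DIR = Path("data")
VAL_RATIO = 0.2  # 80/20 train/validation split

# Label-to-tag mapping (AffectNet's eight basic categories; verify the index
# order against the annotation files in your download).
LABEL_TO_EMOTION = {
    0: "neutral", 1: "happy", 2: "sad", 3: "surprise",
    4: "fear", 5: "disgust", 6: "anger", 7: "contempt",
}

random.seed(42)
for label_dir in RAW_DIR.iterdir():
    if not label_dir.is_dir():
        continue
    emotion = LABEL_TO_EMOTION[int(label_dir.name)]
    images = list(label_dir.glob("*.jpg"))
    random.shuffle(images)
    n_val = int(len(images) * VAL_RATIO)
    for split, subset in (("val", images[:n_val]), ("train", images[n_val:])):
        dest = OUT_DIR / split / emotion
        dest.mkdir(parents=True, exist_ok=True)
        for img in subset:
            shutil.copy(img, dest / img.name)
```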
=> The model underwent multiple training sessions to obtain the weights used in utils.py, which is in turn called in app.py.
- Utilize Google Colab's computational resources for training and validation.
- Implement necessary pre-processing steps and utilize deep learning frameworks for fine-tuning.
- Preserve the architecture and weights of your trained model for future use.
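As a rough illustration, the fine-tuning and saving steps could look like the sketch below. It assumes a Keras workflow with an ImageNet-pretrained MobileNetV2 backbone, 224×224 inputs, eight emotion classes, and the folder layout produced by the split above; the original project may well use a different framework, architecture, or file name.

```python
import tensorflow as tf

IMG_SIZE = (224, 224)
NUM_CLASSES = 8  # eight AffectNet emotion categories

# Load the folders produced by the split step above.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=32)

# Fine-tune an ImageNet-pretrained backbone with a fresh classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for the first training pass

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 preprocessing
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=10)

# Preserve both the architecture and the learned weights for utils.py / app.py.
model.save("emotion_model.h5")
```

Freezing the backbone keeps the first pass fast; the top backbone layers can then be unfrozen for a second pass at a lower learning rate if validation accuracy plateaus.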
- Use Streamlit to create an interactive web application for real-time emotion analysis.
- Integrate your trained emotion detection model for real-time analysis via webcam input.
- Enhance the user experience by incorporating visual elements such as bounding boxes and emotion labels.
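A minimal version of the Streamlit app might look like the sketch below. The model file name, the 224×224 input size, the label order, and the use of OpenCV's Haar cascade for face detection are assumptions carried over from the training sketch; a production app would more likely use streamlit-webrtc for smoother video.

```python
import cv2
import numpy as np
import streamlit as st
import tensorflow as tf

EMOTIONS = ["neutral", "happy", "sad", "surprise",
            "fear", "disgust", "anger", "contempt"]

@st.cache_resource
def load_model():
    # Weights saved by the training step; adjust the path to your own file.
    return tf.keras.models.load_model("emotion_model.h5")

model = load_model()
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

st.title("Real-Time Emotion Decoder")
run = st.checkbox("Start webcam")
frame_slot = st.empty()

cap = cv2.VideoCapture(0)
while run:
    ok, frame = cap.read()
    if not ok:
        st.warning("Could not read from the webcam.")
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
        face = cv2.resize(face, (224, 224))
        probs = model.predict(face[np.newaxis, ...], verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        # Bounding box and emotion label overlay.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
    frame_slot.image(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
cap.release()
```

Saved as app.py, the sketch runs locally with `streamlit run app.py`.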
- Prepare your Streamlit application for deployment by specifying dependencies in a requirements.txt file (a minimal example is sketched after this list).
- Utilize Google Cloud's infrastructure to deploy your Streamlit app, making it accessible via a web URL.
- Share your deployed app with the world to explore the fascinating realm of emotion decoding.
This project is dedicated to individuals with autism, demonstrating the potential of AI to enhance emotional understanding and interaction.