# camemoji 😄🙂😐🙁🙁😡

An experimental project that uses the Google Cloud Vision API in Node.js to recognize the emotions of faces and map the results to emojis.

## Demo

*(camemoji demo GIF)*

## How it works

Pictures from the webcam are uploaded via Firebase, then sent to the Vision API, which returns the detected emotion.

Currently the emotion mapper looks like this:

```js
{
  "NORMAL": "😐",
  "HAPPY": "🙂",
  "VERY_HAPPY": "😄",
  "SAD": "🙁",
  "VERY_SAD": "😭",
  "ANGRY": "😠",
  "VERY_ANGRY": "😡",
  "SURPRISED": "😮",
  "HEADWEAR": "🤠"
}
```
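As a rough sketch, applying this mapper to a Vision API face annotation could look like the following. The likelihood fields (`joyLikelihood`, `sorrowLikelihood`, etc.) are real fields on Vision `faceDetection` results, but the threshold rules and helper names here are assumptions, not the project's actual code:

```javascript
// Emotion labels → emojis, as listed in the README.
const EMOJI_MAP = {
  NORMAL: "😐",
  HAPPY: "🙂",
  VERY_HAPPY: "😄",
  SAD: "🙁",
  VERY_SAD: "😭",
  ANGRY: "😠",
  VERY_ANGRY: "😡",
  SURPRISED: "😮",
  HEADWEAR: "🤠",
};

// Hypothetical helper: pick a label from a Vision face annotation's
// likelihood fields. The priority order is an assumption.
function toLabel(face) {
  if (face.headwearLikelihood === "VERY_LIKELY") return "HEADWEAR";
  if (face.joyLikelihood === "VERY_LIKELY") return "VERY_HAPPY";
  if (face.joyLikelihood === "LIKELY") return "HAPPY";
  if (face.sorrowLikelihood === "VERY_LIKELY") return "VERY_SAD";
  if (face.sorrowLikelihood === "LIKELY") return "SAD";
  if (face.angerLikelihood === "VERY_LIKELY") return "VERY_ANGRY";
  if (face.angerLikelihood === "LIKELY") return "ANGRY";
  if (face.surpriseLikelihood === "LIKELY" || face.surpriseLikelihood === "VERY_LIKELY") {
    return "SURPRISED";
  }
  return "NORMAL";
}

const toEmoji = (face) => EMOJI_MAP[toLabel(face)];
```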

## Running the project

### Google Vision API

To run this project you'll need a Google Cloud account with the Vision API enabled. Create a service account credential file and save it in the project root as `auth.json`.
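With `auth.json` in place, the Vision client can be pointed at it like this. This is a minimal sketch using the official `@google-cloud/vision` package; the project's actual wiring may differ, and it requires valid credentials and network access to run:

```javascript
const vision = require("@google-cloud/vision");

// Point the client at the service account credentials in the project root.
const client = new vision.ImageAnnotatorClient({
  keyFilename: "auth.json",
});

// Run face detection on an image; each annotation carries likelihood
// fields such as joyLikelihood, sorrowLikelihood, angerLikelihood, etc.
async function detectEmotions(imagePath) {
  const [result] = await client.faceDetection(imagePath);
  return result.faceAnnotations;
}
```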

### Firebase

Additionally, file uploads go through a Firebase project set up for storage. You'll need the API Key and the Storage Bucket in a `firebase_auth.js` file in the root of the project.
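The expected shape of `firebase_auth.js` is presumably something like the following config fragment (the field names follow the standard Firebase web config; the exact fields this project reads are an assumption, and the values are placeholders):

```javascript
// firebase_auth.js — placeholder values; replace with your project's config.
module.exports = {
  apiKey: "YOUR_FIREBASE_API_KEY",
  storageBucket: "your-project.appspot.com",
};
```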