Project for Hack The North 2021 ~ Touch screen capabilities on desktop
- Includes the SavedModel export of the model created with customvision.ai (not used in the project itself; included for reference)
- Includes the script used to collect data from the webcam
- Includes the hand-detection and mouse-control ("Magic Mouse") script
We used customvision.ai to create an object detection model, which we trained to detect hands on a custom dataset collected with a Python script (a sketch of such a script follows the list below).
- We created 7 iterations
- 758 tagged images in total
- Well over 1,000 images used to train and test the project
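A minimal sketch of what such a collection script can look like, assuming OpenCV; the `dataset/` output folder, filenames, and key bindings are placeholders, not the project's exact code:

```python
# capture_images.py -- minimal webcam data-collection sketch
import os
import cv2

SAVE_DIR = "dataset"               # hypothetical output folder
os.makedirs(SAVE_DIR, exist_ok=True)

cap = cv2.VideoCapture(0)          # open the default webcam
count = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("capture", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == ord("s"):            # press 's' to save the current frame
        cv2.imwrite(os.path.join(SAVE_DIR, f"hand_{count:04d}.jpg"), frame)
        count += 1
    elif key == ord("q"):          # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Each saved frame can then be uploaded and tagged on customvision.ai to build the training set.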
Reference link(s): https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/export-model-python?WT.mc_id=academic-10877-marouill
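Following that guide, here is a hedged sketch of loading the exported SavedModel locally with TensorFlow. The folder name, input size, and dtype all depend on the particular export, so inspect the printed signature and adjust before relying on it:

```python
import cv2
import numpy as np
import tensorflow as tf

model = tf.saved_model.load("savedmodel")       # path to the exported folder (assumed)
infer = model.signatures["serving_default"]
print(infer.structured_input_signature)         # expected input name/shape/dtype
print(infer.structured_outputs)                 # available output tensors

img = cv2.imread("hand.jpg")                    # hypothetical test image
img = cv2.resize(img, (320, 320))               # input size depends on the export
batch = np.expand_dims(img, 0).astype(np.float32)  # dtype may also differ; match the signature

input_name = list(infer.structured_input_signature[1].keys())[0]
outputs = infer(**{input_name: tf.constant(batch)})
print({name: t.shape for name, t in outputs.items()})
```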
We mainly used the following Python libraries (the core control loop is sketched after this list):
- The Azure-provided Custom Vision library (to get predictions)
- OpenCV
- PyAutoGUI
- imutils
- os (standard library)
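To show how these pieces fit together, here is a minimal, hedged sketch of the core loop, not the exact project code. The endpoint, keys, project ID, iteration name, and the 0.6 probability threshold are all placeholders, and the SDK calls assume the azure-cognitiveservices-vision-customvision package; clicking is omitted to keep the sketch short:

```python
import cv2
import imutils
import pyautogui
from msrest.authentication import ApiKeyCredentials
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient

# Placeholder credentials -- never commit real keys (see the note below)
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/"
credentials = ApiKeyCredentials(in_headers={"Prediction-key": "<prediction-key>"})
predictor = CustomVisionPredictionClient(ENDPOINT, credentials)

PROJECT_ID = "<project-id>"          # placeholder
PUBLISHED_NAME = "<iteration-name>"  # placeholder

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Mirror the frame so moving your hand right moves the cursor right
    frame = imutils.resize(cv2.flip(frame, 1), width=640)

    # One network call per frame; real use would throttle or batch this
    _, jpg = cv2.imencode(".jpg", frame)
    results = predictor.detect_image(PROJECT_ID, PUBLISHED_NAME, jpg.tobytes())
    hands = [p for p in results.predictions if p.probability > 0.6]
    if hands:
        best = max(hands, key=lambda p: p.probability)
        box = best.bounding_box              # normalized (0..1) coordinates
        cx = box.left + box.width / 2
        cy = box.top + box.height / 2
        pyautogui.moveTo(cx * screen_w, cy * screen_h)

    cv2.imshow("magic mouse", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Mapping the normalized box centre straight to screen coordinates may be why the screen edges are hard to reach (the hand's centre can never sit at the very edge of the frame), matching the limitation noted in the list below.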
Pros:
- Fun to play with :)
- Works best with sites that do not require precision, e.g. YouTube
- Easy to use: raise your hand so the webcam can see it, then move it to move the mouse

Cons:
- Takes some getting used to
- Clicking can be buggy
- Cannot reach the screen edges with it
- Extremely imprecise
- The webcam is always on
- Crashes and lags constantly
Since the code includes the prediction and training keys, which should not be shared, we could not share the live model itself. Instead, we provide the SavedModel download and the image-capture Python script, so you can collect your own dataset and try training your own model on the excellent customvision.ai website.
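One common way to keep such keys out of a public repo is to read them from the environment instead of hardcoding them; the variable names here are assumptions, not what the project used:

```python
import os

# Assumed variable names -- set these in your shell, not in the source
PREDICTION_KEY = os.environ["CUSTOMVISION_PREDICTION_KEY"]
TRAINING_KEY = os.environ["CUSTOMVISION_TRAINING_KEY"]
ENDPOINT = os.environ["CUSTOMVISION_ENDPOINT"]
```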
Feel free to reach out to us on Discord if you have questions or want to learn more about the project.