- The front end was built with Angular v18. The application requires access to a webcam/camera.
- Run `npm install` to install dependencies. To run the front end, run `ng serve` or `npm start`.
- The back end uses FastAPI and Uvicorn together with OpenCV, GestureRecognizer, and Keras. Run `pip install -r requirements.txt` to install all dependencies. Then run `main.py` to start the API server.
- Note: the backend folder contains a notebook folder with two notebooks that are not used in the API calls/model prediction.
`main.ipynb` uses the OpenCV capture so that hand prediction can be done locally. `crop.ipynb` uses GestureRecognizer to crop hands out of a dataset, which was used to train the model. Neither notebook is part of the running model-prediction pipeline.
- Raise a hand and form paper, scissors, or rock with it. Click `Take Picture` and face off against the computer in a simple game of Rock Paper Scissors!
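Once the gesture has been classified, deciding the round is straightforward. A minimal sketch of that round logic, assuming the classifier returns one of the labels `rock`, `paper`, or `scissors` (the function and label names here are illustrative, not taken from the repository):

```python
import random

LABELS = ["rock", "paper", "scissors"]
# Each key beats the move it maps to.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def decide(player: str, computer: str) -> str:
    """Return the outcome of one round from the player's perspective."""
    if player == computer:
        return "draw"
    return "win" if BEATS[player] == computer else "lose"

def play_round(player_move: str) -> tuple[str, str]:
    """Pick a random computer move and score the round."""
    computer_move = random.choice(LABELS)
    return computer_move, decide(player_move, computer_move)
```

A lookup table like `BEATS` keeps the win condition to a single dictionary access rather than a chain of if/else comparisons.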
- The model is a CNN trained for image classification on a variety of Rock Paper Scissors datasets. The Kaggle/Jupyter notebook can be found here.
- Overall, the model was trained several times and reached 92.73% accuracy on the test data:
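For reference, test accuracy is simply the fraction of images the model classifies correctly. A minimal sketch of that computation, with made-up labels for a three-class problem (these lists are illustrative, not the project's actual data):

```python
def accuracy(y_true: list[int], y_pred: list[int]) -> float:
    """Fraction of predictions that match the true labels."""
    assert len(y_true) == len(y_pred) and y_true, "label lists must match and be non-empty"
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

# Hypothetical labels (0 = rock, 1 = paper, 2 = scissors).
y_true = [0, 1, 2, 0, 1, 2, 0, 1]
y_pred = [0, 1, 2, 0, 2, 2, 0, 1]
print(accuracy(y_true, y_pred))  # → 0.875
```

The reported 92.73% is this same metric evaluated over the held-out test split.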