First official release of hamoco

hamoco v1.0.1 is now available on PyPI.

hamoco (handy mouse controller) lets you take control of your mouse using hand gestures captured in real time by your webcam. It relies on MediaPipe to track hands, and the different hand poses are classified by a small neural network built with TensorFlow. Basically, I thought it might be fun to try and replicate the famous scene from the movie Minority Report (spoiler: it’s cooler when Tom Cruise does it).

You can perform all the basic mouse actions: motion, left/right click, vertical scrolling, and drag & drop. There are many options to adjust the experience to your liking (e.g. sensitivity, motion smoothing), and there is even an automated pipeline to record your own data and train a custom neural network tailored to your needs.

The package can be installed from PyPI, and you can also check out the project page on GitHub.