Overview
With smartphones and tablets, touch interactions have become commonplace, and users are accustomed to them. ABUSIZZ wants to bring touch interaction to the table without resorting to expensive hardware built into the table surface. Instead, the display is projected by an overhead projector built into a lamp that also houses the sensors and computing needed to recognise hands and gestures and to interact with the user.
This is technically demanding: range sensors have limited spatial and depth resolution, which makes natural interaction with the user difficult to achieve. Preliminary work has demonstrated that it is possible to track hand movement and to recognise the finger movement a user makes to click, using classic image-processing techniques.
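To make the classic image-processing part concrete, the sketch below shows one plausible pipeline of this kind: depth-threshold hand segmentation, contour extraction, a simple fingertip heuristic, and a touch test against the table plane. All constants, the table calibration, and the fingertip heuristic are illustrative assumptions, not the project's actual implementation.

```python
"""Minimal sketch of a classic image-processing pipeline for hand tracking
and click detection above a table. All thresholds, the depth-camera setup,
and the table-plane calibration are illustrative assumptions."""
import cv2
import numpy as np

TABLE_DEPTH_MM = 1000      # assumed calibrated distance from lamp to table surface
HAND_BAND_MM = (30, 300)   # assume the hand sits 30-300 mm above the table
TOUCH_TOLERANCE_MM = 15    # fingertip closer than this to the table counts as a click

def segment_hand(depth_mm: np.ndarray) -> np.ndarray:
    """Binary mask of pixels whose elevation above the table is in the hand band."""
    height = TABLE_DEPTH_MM - depth_mm.astype(np.int32)
    mask = ((height > HAND_BAND_MM[0]) & (height < HAND_BAND_MM[1])).astype(np.uint8) * 255
    # Morphological opening removes sensor speckle noise.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

def detect_click(depth_mm: np.ndarray):
    """Return (fingertip_xy, clicked), or (None, False) if no hand is visible."""
    mask = segment_hand(depth_mm)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, False
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 500:  # too small to be a hand (assumed threshold)
        return None, False
    # Crude fingertip heuristic: the contour point farthest from the hand centroid.
    m = cv2.moments(hand)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    pts = hand.reshape(-1, 2)
    tip = pts[np.argmax(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy))]
    # A click occurs when the fingertip depth (almost) reaches the table plane.
    tip_depth = int(depth_mm[tip[1], tip[0]])
    clicked = TABLE_DEPTH_MM - tip_depth < TOUCH_TOLERANCE_MM
    return (int(tip[0]), int(tip[1])), clicked
```

Approaches like this are fast but fragile, which is exactly the limitation the machine learning approach described next is meant to overcome.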
We aim to extend this preliminary work by making the solution more robust and more natural, and by adding further gestures for controlling the interface, such as zoom and scroll. We will address these issues with a machine learning approach. In a training app, we will ask people to perform certain tasks, such as selecting a GUI element on the projected image, moving an object, or zooming in and out, and thereby collect labelled interaction data for deep learning approaches.
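To illustrate what the labelled output of such a training app could look like, here is a small sketch of a logging schema. The label set, field names, and JSON-lines format are assumptions made for illustration; the text only specifies that prompted tasks yield labelled interaction data.

```python
"""Sketch of how a training app could log labelled interaction data.
The schema and gesture labels are assumptions for illustration."""
import json
import time
from dataclasses import dataclass, asdict
from pathlib import Path

GESTURES = ("select", "move", "zoom_in", "zoom_out", "scroll")  # assumed label set

@dataclass
class InteractionSample:
    subject_id: str          # anonymous participant identifier
    gesture: str             # prompted task label, one of GESTURES
    started_at: float        # UNIX timestamp of the prompt
    frames_path: str         # where the recorded depth frames are stored
    fingertip_track: list    # [(t, x, y, depth_mm), ...] from the tracker

def log_sample(sample: InteractionSample, out_file: Path) -> None:
    """Append one labelled sample as a JSON line to the dataset file."""
    assert sample.gesture in GESTURES, f"unknown label: {sample.gesture}"
    with out_file.open("a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(sample)) + "\n")

# Example: one participant performing a prompted zoom gesture.
log_sample(
    InteractionSample(
        subject_id="p007",
        gesture="zoom_in",
        started_at=time.time(),
        frames_path="recordings/p007/zoom_in_001.npz",
        fingertip_track=[(0.00, 412, 305, 987), (0.03, 410, 301, 985)],
    ),
    Path("interaction_dataset.jsonl"),
)
```

Each prompted task then becomes one supervised training example for a gesture classifier.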
The second part of the project aims to observe the interaction between two people, for example in a sales conversation, to automatically provide a summary of it, and to recognise some typical behaviours.