Researchers claim to have found a unique way to allow smartphone owners to operate their devices with hand gestures, using a pair of sunglasses, the in-built front-facing camera, and software.
The project – dubbed GlassHands – is led by an associate professor of human-computer interaction in the internet of things at Germany’s Coburg University, and counts two Microsoft research affiliates among its co-contributors.
Their method involves using the front-facing camera of a device to capture and recognise the gestures of the user reflected off the surface of sunglasses they are wearing.
“We propose to enrich the sensing capabilities of unmodified mobiles by everyday common apparels such as sunglasses or common reflective visors,” the researchers said in a paper.
“The reflection will contain the phone itself, the surface around the phone, and the user’s hands.
“To use the reflection image as an input modality, we need to be able to detect the reflected image in the camera image and extract the relative location of objects that need to be sensed.”
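The mapping step the researchers describe – turning an object’s position within the detected reflection into a location around the device – could be sketched roughly as follows. This is an illustrative approximation, not the researchers’ code: the function name, the simple linear mapping, and the bounding-box representation are all assumptions; it also assumes the reflection region has already been located in the camera frame.

```python
# Hypothetical sketch: map a point detected inside the reflection
# region of the front-camera frame to coordinates on the surface
# around the phone. A mirror reverses left/right, so the x axis
# must be flipped when undoing the reflection.

def reflection_to_surface(px, py, region, surface_w, surface_h):
    """Map pixel (px, py) inside the reflection's bounding box
    `region` = (x, y, w, h) to a point on a surface of size
    surface_w x surface_h around the device."""
    x, y, w, h = region
    # Normalise the point within the reflection region.
    nx = (px - x) / w
    ny = (py - y) / h
    # Undo the horizontal flip introduced by the mirrored reflection.
    nx = 1.0 - nx
    return nx * surface_w, ny * surface_h

# Example: a fingertip seen at pixel (150, 300) inside a reflection
# bounded by (100, 200, 200, 200), mapped to a 30 cm x 20 cm surface.
sx, sy = reflection_to_surface(150, 300, (100, 200, 200, 200), 30.0, 20.0)
# sx = 22.5 cm, sy = 10.0 cm
```

In practice the curvature of sunglass lenses would distort the reflection, so a real implementation would need a calibrated, non-linear mapping rather than this linear one.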
While the idea isn’t new, the implementation is: previous attempts have involved modifying the smart device or making use of its other sensors.
The researchers said they wanted to overcome the limited “interactive surface area” of smart devices by allowing them to recognise gestures made around the device.
“While our system was implemented on an Amazon Fire phone, it can be employed on other commodity smartphones as well,” they said.
The researchers demonstrated using the system to navigate maps, multi-task, and browse music on the devices.
They said GlassHands could be particularly useful to people who already wear reflective eyewear for work or play, such as "safety goggles, skiers, divers, [and] motorcyclists".
Though early days, the researchers said they hoped to make GlassHands compatible with “a wider variety of glasses models with different reflection and curvature properties”.
“Moreover, the use of reflections directly off the user’s eye using corneal imaging could overcome the need for eyewear,” they noted.
The paper also said the algorithms behind GlassHands would need to be made more robust to recognise hand gestures in more complex, real-world conditions.
The research was originally presented in Canada in November.