Precise touch screens thanks to AI
Computer scientists from ETH Zurich have developed an AI solution that enables touchscreens to sense touch with eight times the resolution of current devices.
While the touch sensors that detect finger input have changed little since they were first introduced in the mid-2000s, smartphone and tablet screens now provide unprecedented visual quality, which becomes even more evident with each new generation of devices, whether through higher colour fidelity, higher resolution or crisper contrast.
Christian Holz, ETH computer science professor at the Sensing, Interaction & Perception Lab (SIPLAB), together with doctoral student Paul Streli, has developed an artificial intelligence (AI) system called CapContact that gives touchscreens super-resolution, so that they can reliably detect when and where fingers actually touch the display surface, with much higher accuracy than current devices.
The AI was designed for capacitive touchscreens, the type used in most mobile phones, tablets, and laptops. Here the sensor detects the position of the fingers through changes in the electrical field between the sensor lines caused by a finger's proximity to the screen surface; it cannot, however, detect true contact.
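The limitation can be illustrated with a toy model (an assumption for illustration only, not the real sensor physics): because the capacitance change grows smoothly as a finger approaches the glass, there is no sharp jump in the raw reading at the moment of contact, so amplitude alone cannot separate "hovering just above" from "touching".

```python
# Toy model of a capacitive proximity response (illustrative only):
# the reading rises smoothly as the finger nears the screen, with no
# discontinuity at contact, so a threshold cannot detect true touch.
def sensor_reading(distance_mm: float) -> float:
    # Saturating 1/d-style response; the 0.5 mm offset is arbitrary.
    return 1.0 / (distance_mm + 0.5)

readings = [sensor_reading(d / 10) for d in range(10, -1, -1)]
# Readings increase monotonically right down to d = 0 (contact),
# with no distinctive step that would mark the moment of touch.
print([round(r, 2) for r in readings])
```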
CapContact uses the touchscreen as an image sensor. According to Holz, a touchscreen is essentially a very low-resolution depth camera that can see about eight millimetres away. A depth camera does not capture coloured images, but records how close objects are. CapContact exploits this to accurately detect the contact areas between fingers and surfaces through a novel deep learning algorithm that the researchers developed.
“First, ‘CapContact’ estimates the actual contact areas between fingers and touchscreens upon touch,” said Holz. “Second, it generates these contact areas at eight times the resolution of current touch sensors, enabling our touch devices to detect touch much more precisely.”
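The data flow can be sketched as follows. This is a naive stand-in, not the researchers' method: where CapContact uses a trained deep network, the sketch simply upsamples a coarse capacitance grid by a factor of eight with bilinear interpolation and thresholds it into a binary contact mask; the grid values and threshold are invented for illustration.

```python
import numpy as np

def upscale_contact_map(cap_grid, factor=8, threshold=0.5):
    """Upsample a coarse capacitance grid by `factor` with bilinear
    interpolation, then threshold into a binary contact mask.
    (A learned model would replace the interpolation step.)"""
    h, w = cap_grid.shape
    # High-resolution sample coordinates mapped into the coarse grid.
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Blend the four surrounding coarse samples for each fine pixel.
    top = cap_grid[np.ix_(y0, x0)] * (1 - wx) + cap_grid[np.ix_(y0, x1)] * wx
    bot = cap_grid[np.ix_(y1, x0)] * (1 - wx) + cap_grid[np.ix_(y1, x1)] * wx
    fine = top * (1 - wy) + bot * wy
    return fine > threshold

# A 4x4 coarse capacitance reading with one touch blob.
coarse = np.array([[0.0, 0.1, 0.0, 0.0],
                   [0.1, 0.9, 0.3, 0.0],
                   [0.0, 0.3, 0.1, 0.0],
                   [0.0, 0.0, 0.0, 0.0]])
mask = upscale_contact_map(coarse)
print(mask.shape)  # (32, 32): eight times the sensor resolution
```

The point of the super-resolution step is that the final contact mask lives on a grid eight times finer than the physical sensor, which is what allows the contact area to be localised more precisely.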
To train the AI, the researchers captured a training dataset from a multitude of touches by several test participants, from which CapContact learned to predict super-resolution contact areas from the coarse, low-resolution sensor data of today's touch devices.
The researchers were able to show that one-third of the errors on current devices stem from low-resolution input sensing; their deep learning approach removes these errors.
They were also able to demonstrate that CapContact reliably distinguishes adjacent touch areas even when fingers touch the screen very close together. This is the case, for example, with the pinch gesture, when you move your thumb and index finger across the screen to enlarge text or images. Today's devices can hardly distinguish such closely adjacent touches.
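Why resolution matters here can be shown with a hypothetical example: once two fingertips appear as separate blobs in a high-resolution contact mask, a standard connected-components pass can count them as two touches. The mask layout below is invented for illustration; the labelling routine is a generic flood fill, not CapContact's pipeline.

```python
import numpy as np
from collections import deque

def count_touches(mask):
    """Count distinct contact blobs in a binary mask via 4-connected
    flood fill (a simple connected-components pass)."""
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1
                q = deque([(i, j)])
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
    return count

# Two fingertips pinching: at high resolution the contact areas are
# separated by a one-pixel gap that a coarse sensor would merge.
mask = np.zeros((16, 16), dtype=bool)
mask[4:8, 3:7] = True    # thumb contact area
mask[4:8, 8:12] = True   # index-finger contact area
print(count_touches(mask))  # 2
```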
This research suggests that the AI solution could pave the way for a new generation of touch sensing in mobile phones and tablets that operates more reliably and precisely, yet with a reduced footprint and less complexity in sensor manufacturing.