Because of its accessibility and ease of use, AI has been integrated into a wide range of industries. AI gives machines the ability to mimic human intelligence and helps them learn both from the data they are fed and from experience. Artificial intelligence has proven beneficial in a variety of industries, facilitating work and reducing the burden on human workers.
Computer scientists at ETH have created a new AI solution that allows touchscreens to sense with eight times the resolution of current devices. Their solution can infer much more precisely where fingers touch the screen thanks to AI.
When typing a message quickly on a smartphone, it is common to hit the wrong letters on the small keyboard or the wrong buttons in an app. The touch sensors that detect finger input on the screen haven’t changed much since they were first deployed in mobile phones in the mid-2000s.
In contrast, smartphone and tablet screens now offer unprecedented visual quality, which becomes more apparent with each new generation of devices: higher color fidelity, higher resolution, and crisper contrast. The display of the most recent iPhone, for example, has a resolution of 2532×1170 pixels. But the touch sensor it integrates can only detect input with a resolution of around 32×15 pixels, almost 80 times lower than the display resolution along each axis: “And here we are, wondering why we make so many typing errors on the small keyboard? We think that we should be able to select objects with pixel accuracy through touch, but that’s certainly not the case,” says Christian Holz, ETH computer science professor from the Sensing, Interaction & Perception Lab (SIPLAB), in an interview in the ETH Computer Science Department’s “Spotlights” series.
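To make the gap concrete, here is a quick back-of-the-envelope calculation in Python using the figures quoted above (the resolutions are taken directly from the article; everything else is arithmetic):

```python
# Display vs. touch-sensor resolution, using the iPhone figures quoted above.
display_w, display_h = 2532, 1170  # display pixels
sensor_w, sensor_h = 32, 15        # capacitive sensing cells

print(f"horizontal ratio: {display_w / sensor_w:.1f}x")  # ~79.1x
print(f"vertical ratio:   {display_h / sensor_h:.1f}x")  # 78.0x
```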
Together with his doctoral student Paul Streli, Holz has now developed CapContact, an artificial intelligence (AI) that gives touch screens super-resolution, allowing them to detect when and where fingers actually touch the display surface with far greater accuracy than current devices. They presented their new AI solution at ACM CHI 2021, the premier conference on Human Factors in Computing Systems, earlier this week.
Recognizing where fingers touch the screen
The ETH researchers developed the AI for capacitive touch screens, the type used in virtually all of our mobile phones, tablets, and laptops. The sensor detects the position of the fingers from the changes a finger induces in the electrical field between the sensor lines as it approaches the screen surface. Capacitive sensing thus inherently captures proximity rather than true contact, which is what interaction requires; moreover, the measured intensity decreases exponentially as the finger’s distance from the surface increases.
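A toy model illustrates the falloff the article describes; note that the decay constant below is an invented placeholder, not a real sensor parameter:

```python
from math import exp

# Toy exponential model of capacitive intensity vs. hover distance, per the
# article's description; the decay rate per millimeter is a made-up assumption.
def capacitive_intensity(distance_mm: float, decay_per_mm: float = 1.0) -> float:
    return exp(-decay_per_mm * distance_mm)

for d in (0.0, 0.5, 1.0, 2.0, 4.0, 8.0):
    print(f"{d:4.1f} mm -> relative intensity {capacitive_intensity(d):.4f}")
```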
According to Holz, capacitive sensing was never intended to pinpoint the exact location of a touch on the screen: “It only detects the proximity of our fingers.” Today’s touch screens therefore interpolate the input location from these coarse proximity measurements. In their project, the researchers set out to address both shortcomings of this ubiquitous sensor: on the one hand, to improve its currently low resolution, and on the other, to precisely infer the contact area between finger and display surface from the capacitive measurements.
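For intuition, that interpolation step might resemble an intensity-weighted centroid over the coarse sensing grid. The following is a sketch of the general idea only, not the firmware algorithm of any actual device, and the grid values are invented:

```python
import numpy as np

# Invented 3x3 patch of a coarse capacitive proximity map.
proximity = np.array([
    [0.0, 0.2, 0.0],
    [0.1, 0.9, 0.3],
    [0.0, 0.4, 0.1],
])

# An intensity-weighted centroid yields a sub-cell estimate of the touch location.
ys, xs = np.indices(proximity.shape)
total = proximity.sum()
touch_x = (xs * proximity).sum() / total
touch_y = (ys * proximity).sum() / total
print(touch_x, touch_y)  # estimated position in sensor-grid coordinates
```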
CapContact, the novel method Streli and Holz developed for this purpose, combines two ideas. First, it uses the touch screen as an image sensor: a touch screen, according to Holz, is essentially a very low-resolution depth camera that can see about eight millimeters away. A depth camera does not capture color images; it records how close objects are. Second, CapContact exploits this insight to accurately infer the contact areas between fingers and surfaces using a novel deep learning algorithm the researchers developed.
“First, CapContact estimates the actual contact areas between fingers and touchscreens upon touch,” says Holz. “Second, it generates these contact areas at eight times the resolution of current touch sensors, enabling our touch devices to detect touch much more precisely.”
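In spirit, such a model maps a coarse capacitive frame to a contact mask at eight times the resolution. The PyTorch sketch below shows the general shape of that mapping; the architecture, layer sizes, and output convention are assumptions for illustration and do not reproduce the paper’s actual network:

```python
import torch
import torch.nn as nn

# Hypothetical super-resolution network: coarse capacitive frame in,
# 8x-upsampled per-pixel contact probability map out.
class SuperResContactNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=8, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
            nn.Sigmoid(),  # probability that each fine-grid pixel is true contact
        )

    def forward(self, capacitive_frame):
        return self.net(capacitive_frame)

model = SuperResContactNet()
coarse = torch.rand(1, 1, 15, 32)   # one 32x15 capacitive frame (H x W)
contact = model(coarse)             # 8x super-resolved contact map
print(contact.shape)                # torch.Size([1, 1, 120, 256])
```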
To train the AI, the researchers built a custom apparatus that records capacitive intensities, the kind of measurements our phones and tablets take, alongside true contact maps captured by a high-resolution optical pressure sensor. By capturing a large number of touches from several test participants, they created a training dataset from which CapContact learned to predict super-resolution contact areas from the coarse, low-resolution sensor data of today’s touch devices.
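Given such paired data, training could follow a standard supervised loop, with the optical pressure sensor’s contact maps serving as ground truth. This sketch reuses the hypothetical SuperResContactNet above and substitutes synthetic stand-in data; it is not the authors’ training code:

```python
import torch
import torch.nn as nn

model = SuperResContactNet()  # hypothetical network from the sketch above
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.BCELoss()        # contact maps are per-pixel contact/no-contact

# Synthetic stand-ins for the recorded data: 8 capacitive frames and
# 8 matching high-resolution ground-truth contact maps.
frames = torch.rand(8, 1, 15, 32)
targets = (torch.rand(8, 1, 120, 256) > 0.95).float()

for epoch in range(3):
    pred = model(frames)           # predicted contact probabilities
    loss = loss_fn(pred, targets)  # compare against ground truth
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```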
Low touch screen resolution as a source of error
“In our paper, we show that we can derive touch-input locations with higher accuracy than current devices from the contact area between your finger and a smartphone’s screen as estimated by CapContact,” Paul Streli adds. According to the researchers, one-third of errors on current devices stem from their low-resolution input sensing. Using the researchers’ novel deep learning approach, CapContact can eliminate these errors.
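Once a super-resolved contact mask is available, a touch location can, for instance, be read off as the mask’s centroid. This is a toy illustration; the mask below is hand-made, and the paper may derive locations differently:

```python
import numpy as np

# Hand-made binary contact mask on the 8x super-resolved grid.
mask = np.zeros((120, 256), dtype=bool)
mask[40:60, 100:130] = True  # hypothetical fingertip contact area

# Centroid of the contact area as the touch-input location.
ys, xs = np.nonzero(mask)
touch = (xs.mean(), ys.mean())
print(touch)  # (114.5, 49.5) in super-resolved pixel coordinates
```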
The researchers also show that CapContact reliably distinguishes touches even when fingers touch the screen very close together. This is the case, for example, with the pinch gesture, in which thumb and index finger move across the screen to enlarge text or images. Such closely adjacent touches are difficult to tell apart on today’s devices.
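The effect of resolution on a pinch gesture is easy to demonstrate: two contact blobs that are clearly separate on a fine grid merge into one on a coarse grid. The maps below are toy data, illustrative only:

```python
import numpy as np
from scipy import ndimage

# Two nearby fingertip contacts on a fine 12x12 grid.
fine = np.zeros((12, 12), dtype=int)
fine[4:7, 2:5] = 1    # thumb contact area
fine[4:7, 7:10] = 1   # index-finger contact area, two cells away

_, n_fine = ndimage.label(fine)
print(n_fine)  # 2 -> two distinct touches on the fine grid

# Downsample 4x to mimic a coarse sensor: the gap between fingers disappears.
coarse = fine.reshape(3, 4, 3, 4).max(axis=(1, 3))
_, n_coarse = ndimage.label(coarse)
print(n_coarse)  # 1 -> the two touches merge into a single blob
```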
The project’s findings call the current industry standard for touch screens into question. In a further experiment, the researchers used a sensor with an even lower resolution than those found in today’s smartphones. Even so, CapContact detected touches more reliably and derived the input locations more accurately than current devices do at today’s standard resolution. This suggests that the researchers’ AI solution could pave the way for touch sensing in future mobile phones and tablets that works more reliably and precisely while requiring smaller, less complex sensors.