Researchers are working hard to improve touch screens, which have become essential to the modern technology industry.
Researchers from the Computer Science Department at ETH Zurich, the Swiss Federal Institute of Technology, have come up with a new solution based on artificial-intelligence software that lets touch screens sense touch with far greater accuracy: it locates where fingers touch the screen up to eight times more precisely than the sensor systems in current touch screens.
On Friday, Tech Xplore reported that with current touch-screen systems, quickly typing a message on a smartphone screen can lead to pressing the wrong letters on the small keyboard, or the wrong input buttons in an application.
The researchers said that although the visual quality of smartphone and tablet screens keeps improving with each new generation of devices, with better color accuracy and more pronounced contrast, the touch sensors that detect finger input on those screens have changed little since they first appeared in mobile phones in the mid-2000s.
For his part, Christian Holz, the lead author of the study, noted that the latest generation of iPhones offers displays with a resolution of 2532 x 1170 pixels, but the built-in touch sensor detects input at a resolution of only about 32 x 15 pixels, roughly 80 times lower than the display resolution.
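The roughly 80-times figure can be checked from the numbers quoted in the article; it is a per-axis ratio of display pixels to touch-sensor cells:

```python
# Resolution gap described in the article (figures quoted by Holz).
display_px = (2532, 1170)  # iPhone display resolution
sensor_px = (32, 15)       # capacitive touch-sensor resolution

# Per-axis ratio: how many display pixels map onto one sensor cell
ratio_x = display_px[0] / sensor_px[0]
ratio_y = display_px[1] / sensor_px[1]

print(round(ratio_x, 1), round(ratio_y, 1))  # 79.1 78.0 -- about 80x
```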
Holz said that he and his team developed a new deep-learning algorithm, for an artificial-intelligence program called CapContact, to overcome this: it allows touch-screen sensors to detect when and where fingers touch the surface of the screen with much higher accuracy than current devices.
To train the artificial-intelligence program to predict contact areas with high accuracy, the researchers built a dedicated apparatus that records the measurements taken by smartphones and tablets alongside real contact maps captured by a high-resolution optical pressure sensor.
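The training setup pairs each low-resolution capacitive frame with the ground-truth contact map from the optical sensor. A minimal sketch of that supervised pairing follows; the variable names and the simple mean-squared-error loss are our illustration, not the paper's actual training code:

```python
# Hedged sketch of the supervised setup described above: the model sees a
# capacitive frame and is trained to reproduce the ground-truth contact
# map recorded by the optical pressure sensor. Values are made up.

def mse(pred, target):
    """Mean squared error between two equally sized 2-D maps."""
    flat_p = [v for row in pred for v in row]
    flat_t = [v for row in target for v in row]
    return sum((p - t) ** 2 for p, t in zip(flat_p, flat_t)) / len(flat_p)

# One training pair: capacitive frame (input) and optical map (label)
capacitive_frame = [[0.1, 0.2], [0.3, 0.4]]     # what the device senses
optical_contact_map = [[0.0, 0.0], [0.0, 1.0]]  # ground-truth contact

loss = mse(capacitive_frame, optical_contact_map)  # drives the training
print(loss)  # 0.125
```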
The research team designed the new program for the capacitive sensors used in the touch screens of all smartphones, tablets, and laptops.
In these touch screens, the sensor detects the position of the fingers through an electric field between the sensor lines, which changes as a finger approaches and touches the screen surface.
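The principle can be sketched in a few lines: scan the sensor grid, compare each cell against a no-touch baseline, and report the cell where the field changed most. This is our illustrative toy, with made-up numbers, not code from the study:

```python
# Illustrative sketch of capacitive sensing: a finger is located at the
# grid cell with the largest capacitance change from a baseline scan.
# All values below are invented for demonstration.

def locate_finger(baseline, scan):
    """Return the (row, col) sensor cell with the largest capacitance
    change, i.e. the cell the finger is closest to."""
    best, best_delta = None, 0.0
    for r, (b_row, s_row) in enumerate(zip(baseline, scan)):
        for c, (b, s) in enumerate(zip(b_row, s_row)):
            delta = abs(s - b)  # field change caused by finger proximity
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best

baseline = [[1.00] * 4 for _ in range(3)]
scan = [[1.00, 1.00, 1.00, 1.00],
        [1.00, 0.80, 0.95, 1.00],  # finger nearest cell (1, 1)
        [1.00, 0.97, 1.00, 1.00]]

print(locate_finger(baseline, scan))  # (1, 1)
```

Because the grid is so coarse, the answer is only ever "somewhere near this cell", which is exactly the imprecision the ETH team set out to fix.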
Holz explained that the capacitive sensor in current touch screens is not designed to determine precisely where a touch actually lands on the screen; it only detects the proximity of the fingers, and infers the input location from those coarse proximity measurements.
The researchers aimed to address this in their project: they had to increase the sensors' currently low resolution, and learn how to infer the precise contact area between finger and display surface from the capacitive measurements.
Holz said that CapContact estimates the physical contact areas between fingers and the touch screen at the moment of touch, and then senses these contact areas eight times more accurately than current systems.
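CapContact itself uses a trained deep network to estimate the contact area; the final step, turning an estimated contact area into a precise touch point, can be sketched without the network by thresholding the contact map and taking the centroid of the cells in contact. The threshold and the toy values are our assumptions for illustration:

```python
# Hedged sketch: deriving a precise touch point from a contact area.
# CapContact estimates the contact map with a deep network; here that
# step is replaced by a ready-made toy map and a simple threshold.

def contact_centroid(contact_map, threshold=0.5):
    """Threshold an estimated contact map and return the centroid
    (row, col) of the cells judged to be in contact."""
    cells = [(r, c)
             for r, row in enumerate(contact_map)
             for c, v in enumerate(row) if v >= threshold]
    n = len(cells)
    return (sum(r for r, _ in cells) / n,
            sum(c for _, c in cells) / n)

# Toy super-resolved contact map (values are illustrative only)
contact_map = [[0.0, 0.1, 0.0, 0.0],
               [0.1, 0.9, 0.8, 0.0],
               [0.0, 0.7, 0.6, 0.1]]

print(contact_centroid(contact_map))  # (1.5, 1.5)
```

The finer the estimated contact area, the finer the centroid, which is why super-resolving the sensed map translates directly into more accurate touch locations.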
The researchers showed that one-third of the errors on current devices' screens are due to the low resolution of the input sensor. CapContact can remove these errors and accurately identify touches even when several fingers touch the screen at once, such as when moving the thumb and forefinger across the screen to zoom into text or pictures.