Friday, 22 October 2021

Improved Touch Screen Devices Could Enable Users to ‘Feel’ Objects

The next time you buy a new couch, you may never have to leave your old one to get a feel for the texture of the new material.

Dr. Cynthia Hipwell, Oscar S. Wyatt Jr. ’45 Chair II Professor in the J. Mike Walker ’66 Department of Mechanical Engineering at Texas A&M University, is leading a team working to better define how the finger interacts with a device, with the aim of advancing technology that goes beyond simply sensing and reacting to your touch.

The ultimate goal of advancing this human-machine interface is to give touch devices the ability to offer users a richer touch-based experience by mimicking the feel of physical objects. Hipwell shared examples of potential applications ranging from more immersive virtual reality platforms to tactile display interfaces, such as those in a motor vehicle dashboard, to a virtual shopping experience that would let users feel the texture of materials before purchasing them.

“This could allow you to actually feel textures, buttons, slides and knobs on the screen,” Hipwell said. “It can be used for interactive touchscreen-based displays, but one holy grail would certainly be being able to bring touch into shopping so that you could feel the texture of fabrics and other products while you’re shopping online.”

Hipwell explained that, at its essence, the “touch” in current touch screen technology serves the screen more than the user. With the emergence and refinement of increasingly sophisticated haptic technology, the relationship between user and device can become more reciprocal.

She added that bringing in touch as a sensory input would ultimately enrich virtual environments and lighten the communication burden currently carried by audio and visuals.

“When we look at virtual experiences, they’re primarily audio and visual right now and we can get audio and visual overload,” Hipwell said. “Being able to bring touch into the human-machine interface can bring a lot more capability, much more realism, and it can reduce that overload. Haptic effects can be used to draw your attention to make something easier to find or easier to do using a lower cognitive load.”

Hipwell and her team are approaching the research by looking at the multiphysics—the coupled processes or systems involving multiple physical fields occurring at the same time—of the interface between the user’s finger and the device. This interface is incredibly complex and changes with different users and environmental conditions.

“We’re looking at electro-wetting effects (the forces that result from an applied electric field), electrostatic effects, changes in properties of the finger, the material properties and surface geometry of the device, the contact mechanics, the fluid motion, charge transport—really, everything that’s going on in the interface to understand how the device can be designed to be more reliable and higher performing,” Hipwell said. “Ultimately, our goal is to create predictive models that enable a designer to create devices with maximum haptic effect and minimum sensitivity to the user and environmental variation.”
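As a rough illustration of one of the couplings Hipwell lists, the electrostatic attraction between a fingertip and an excited screen is commonly approximated with a simple parallel-plate capacitor model in the surface-haptics literature. The sketch below is not the team’s model; the lumped effective gap and every numeric value are illustrative assumptions only.

```python
# A minimal sketch, not the team's model: a first-order parallel-plate estimate
# of the electrostatic attraction between a fingertip and an excited touch
# surface. All parameter values below are illustrative assumptions.

EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m


def electrostatic_attraction(voltage_v: float, contact_area_m2: float,
                             effective_gap_m: float) -> float:
    """Normal electrostatic force (N) from a parallel-plate approximation.

    effective_gap_m lumps the air gap and the skin's outer dielectric layer
    into a single equivalent spacing, a common simplification.
    """
    return EPSILON_0 * contact_area_m2 * voltage_v ** 2 / (2 * effective_gap_m ** 2)


def added_friction(voltage_v: float, contact_area_m2: float,
                   effective_gap_m: float, mu: float = 0.5) -> float:
    """Extra sliding friction the finger feels: mu times the added normal force."""
    return mu * electrostatic_attraction(voltage_v, contact_area_m2, effective_gap_m)


if __name__ == "__main__":
    # Assumed values: ~100 V excitation, ~1 cm^2 contact patch, ~10 um effective gap
    f_n = electrostatic_attraction(100.0, 1e-4, 10e-6)
    f_t = added_friction(100.0, 1e-4, 10e-6)
    print(f"Added normal force:   {f_n * 1e3:.1f} mN")
    print(f"Added friction force: {f_t * 1e3:.1f} mN")
```

Under these assumed values the toy model predicts an added friction force of a few tens of millinewtons; in practice the result shifts with skin properties, humidity and contact conditions, which is exactly the user and environmental variation the team’s predictive models aim to handle.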

As research and development of the technology progress, Hipwell predicts consumers will begin to see early elements implemented in common devices over the next few years, with some products already in development.

“I think early elements of it will definitely be within the next five years,” Hipwell said. “Then, it will just be a matter of maturing the technology and how advanced, how realistic and how widespread it becomes.”

The post Improved Touch Screen Devices Could Enable Users to ‘Feel’ Objects appeared first on ELE Times.
