(Inside Science) — The electronic revolution has engaged the human senses of vision and hearing. Until recently, it has mostly ignored smell, taste, and touch.
But the sense of touch is on the cusp of its own revolution. The technology is called haptics, a broad term covering anything you can feel.
“Haptics is to touch what optics is to sight,” said Will Provancher, an associate professor of mechanical engineering at the University of Utah in Salt Lake City and CEO of a company building haptic devices.
If you are playing a video game, you may be using haptics in your joystick or gamepad. If your cellphone vibrates in your pocket, that’s haptics. If you’ve ever thought it was vibrating when it wasn’t, that shows how strong an impression the technology can leave.
Piloting a fly-by-wire airliner like an Airbus A320 makes use of haptics. The pilot moves a sidestick that sends signals to a computer, which moves the flight control surfaces. Haptics is also used in planes to shake the control yoke – a sign pilots cannot ignore – when the aircraft is about to stall.
Haptics is responsible for much of the buzz behind smart watches, including the upcoming Apple Watch, which senses vibrations on the front and back. It can send silent vibrations to wake you up without disturbing others, and you can tap on the watch to communicate with other wearers through the sense of touch.
The first haptics patents go back to the 1960s and 1970s, out of places like Northrop Grumman and the legendary Bell Labs. The theory behind current haptics applications draws on what is called multiple resource theory. Humans are visual animals, but when that sense gets overwhelmed, other senses can compensate. Think of a blind person learning what another person’s face looks like by touching it.
Now imagine your car has finger grooves in the steering wheel that are connected by software to a GPS navigation system. Just before you are supposed to make a turn, plates on the wheel vibrate under your fingers. The closer you get, the faster and the greater the magnitude of the vibration. You can spend more attention driving than navigating because haptics will guide you.
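The steering-wheel idea amounts to a simple mapping: as the distance to the turn shrinks, the vibration gets faster and stronger. A minimal sketch of such a mapping, with all parameter names and numbers purely illustrative (nothing here comes from an actual navigation system):

```python
def vibration_for_distance(distance_m, max_distance_m=200.0,
                           max_freq_hz=40.0, max_amplitude=1.0):
    """Map the distance to an upcoming turn onto vibration parameters.

    Closer to the turn -> faster and stronger vibration, as the
    article describes. The 200 m range, 40 Hz ceiling, and linear
    ramp are illustrative assumptions, not real device values.
    """
    # Proximity runs from 0 (far away) to 1 (at the turn), clamped.
    proximity = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    frequency_hz = max_freq_hz * proximity
    amplitude = max_amplitude * proximity
    return frequency_hz, amplitude
```

A real system would likely use a nonlinear ramp and discrete pulse patterns, but the principle – encoding urgency in frequency and magnitude – is the same.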
The haptics field has several subdivisions, including output and input haptics. The pilot in the Airbus is putting in information and the plane is reacting. The pilot nearing a stall is getting information and, hopefully, reacting.
There is also a distinction between tactile and force feedback, said Sriram Subramanian of the University of Bristol in the United Kingdom. His group uses ultrasound to produce “focal points” – concentrated sound waves suspended in midair – that are linked to what is shown on a nearby computer display. The points essentially create a floating, invisible button that provides tactile feedback. The user can “feel” the points and can even feel differences between points.
“With ultrasound, we create patterns with a strong intensity and feel to it,” he said.
Moving the focal points moves the corresponding objects on the display, or lets the user read off physical attributes by touch.
In one demonstration from Subramanian’s team, a user was shown a map of population densities of cities. As the user moved a hand across the screen, a city’s density was indicated by the “feel” of its point on the map. Placing the hand over a sparsely populated area felt different from holding it over London or New York.
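The map demonstration boils down to translating a data value into a haptic intensity. One hedged sketch of how such a translation might look – the log scale and density bounds are my assumptions for illustration, not details of Subramanian’s system:

```python
import math

def focal_point_intensity(density, min_density=1.0, max_density=30000.0):
    """Map population density (people per square km) onto a
    normalized focal-point intensity in [0, 1].

    A logarithmic scale keeps sparse rural areas and dense cities
    like London or New York distinguishable by feel. The bounds
    here are illustrative assumptions.
    """
    # Clamp to the assumed range, then normalize on a log scale.
    density = max(min_density, min(density, max_density))
    return math.log(density / min_density) / math.log(max_density / min_density)
```

Fed to an ultrasound array, the resulting value would set how strongly the midair focal point presses on the user’s palm at that spot on the map.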
An object created through haptics is a tactile illusion, and you can move your hand right through it. But another kind of haptic object, created through force feedback, can actually present a barrier you cannot pass through, Subramanian said.
For this, the user has to wear a device that translates the haptic message into commands that control or restrict motion, he said. If a game requires a character to touch a barrier it cannot get through, the device restricts the player’s motion.
Disney Research, a division of the film company, is particularly active. One system it developed, called REVEL, projects texture onto smooth objects; with that technology, it might be possible to feel the material in a sweater before you buy it online. It can work on any surface, including furniture, walls, tabletops, or human skin. Another Disney system, with vibrators embedded in a chair, can make a player feel as if he or she is driving a race car, complete with skids and collisions. Haptic keyboards are also on the way.
In medicine it may soon be possible for a doctor to look at a 3D MRI or CT image, stick his or her hand into the image and feel for things like tumors, blockages, or blood clots.
Haptics is already being used in robotic surgery. The surgeon, holding controls, can feel what the robot is feeling and react properly, a variant on fly-by-wire controls. The da Vinci surgical robot, in use since 2000, is one example. One system can work with Braille, converting ordinary text into Braille characters a blind person can feel.
Much of the future technology will come from game development.
Provancher’s company, Tactical Haptics, has handheld devices for games that simulate weight and inertia. If a player lifts a sword in a battle game, swings it and strikes a character on the screen, the player will feel the weight of the sword, the motion of the swing and the impact.
What would happen if the same technology were combined with holography to produce images, and with artificial intelligence that could react to participants? It would be a first step toward creating Star Trek’s holodeck, where participants see and feel environments that do not exist, including places and times that aren’t really there.
“It’s a strong possibility,” Subramanian said. “There’s a lot of excitement.”