In recent years, design engineers have expended considerable time and resources to impart human-like senses into new products, with some notable successes. Natural language processing has helped develop intelligent virtual assistants that enable increasingly rich interaction between humans and machines. At the same time, computer vision has played an important role in bringing amazing applications to life, such as the autonomous car.
However, attempts to give devices the sense of touch have proven more elusive. Scientists and engineers have struggled not only to replicate our skin’s ability to precisely detect tactile information—like smoothness, hardness and pain—but also to mimic its signaling and decision-making capacity.
Two new technologies point to a potential shift. These touch systems could well represent building blocks for designers to create products that interact with humans and their operating environment in an entirely new way.
A New Tactile Sensor
In the most recent development, researchers from Daegu Gyeongbuk Institute of Science and Technology, ASML Korea Co., Dongguk University-Seoul, Sungkyunkwan University and the University of Oxford have developed a tactile sensor that aims to measure surface textures with high accuracy.
The device consists of an array of piezoelectric receptors, which generate electrical responses proportional to applied stress, enabling it to identify surface characteristics of objects, such as width and pitch.
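To make the idea concrete, here is a minimal sketch of how an array of pressure receptors could recover the width and pitch of ridges on a surface. The receptor spacing, activation threshold and grouping logic are illustrative assumptions, not details of the published device.

```python
# Hedged sketch: recovering ridge width and pitch from one row of
# receptor readings. Spacing and threshold values are assumptions
# for illustration, not the actual sensor's specifications.

RECEPTOR_SPACING_UM = 50.0  # assumed center-to-center receptor spacing
THRESHOLD = 0.5             # assumed activation threshold (arbitrary units)

def ridge_width_and_pitch(readings):
    """Return (ridge widths in um, pitch in um or None) from one
    row of receptor readings pressed against a ridged surface."""
    active = [r > THRESHOLD for r in readings]
    # Group consecutive activated receptors into ridges.
    ridges, start = [], None
    for i, is_on in enumerate(active):
        if is_on and start is None:
            start = i
        elif not is_on and start is not None:
            ridges.append((start, i - 1))
            start = None
    if start is not None:
        ridges.append((start, len(active) - 1))
    widths = [(end - begin + 1) * RECEPTOR_SPACING_UM for begin, end in ridges]
    # Pitch: distance between the leading edges of successive ridges.
    pitch = None
    if len(ridges) > 1:
        pitch = (ridges[1][0] - ridges[0][0]) * RECEPTOR_SPACING_UM
    return widths, pitch
```

For example, a reading pattern with two two-receptor ridges starting four receptors apart would yield 100 µm widths and a 200 µm pitch under these assumed values.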
The sensor offers a number of features that differentiate it from similar existing sensors. For one, it mimics the way that humans sense surface characteristics, detecting tactile information by touch and sliding. Most competing technologies use only one of these methods.
In addition, the sensor’s receptor array can calculate sliding speed, using data on the time interval between two receptor signals and their distance. Other devices use a single receptor and, as a result, require an external speedometer.
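The speed calculation described above is simple in principle: divide the known distance between two receptors by the time interval separating their signals. A minimal sketch, with hypothetical timestamps and spacing:

```python
# Illustrative sketch of the two-receptor speed estimate described
# above. The spacing and timestamps are hypothetical examples, not
# values from the published sensor.

def sliding_speed(t_first: float, t_second: float, spacing_mm: float) -> float:
    """Speed (mm/s) of an object sliding across two receptors
    separated by spacing_mm, triggering at t_first and t_second (s)."""
    dt = t_second - t_first
    if dt <= 0:
        raise ValueError("second receptor must trigger after the first")
    return spacing_mm / dt

# Receptors 2 mm apart triggering 0.05 s apart -> ~40 mm/s
speed = sliding_speed(0.00, 0.05, 2.0)
```

Because both the distance and the timing come from the receptor array itself, no external speedometer is needed.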
The researchers tested the sensor by pressing square, triangular and dome-shaped surfaces against it. They also placed soft material against the sensor to see whether it could measure depth, raising the prospect of three-dimensional sensing. Although the test results were encouraging, the sensor could not yet accurately distinguish among 3D shapes.
The sensor is still in the early stages of development, but it brings robots, prosthetics and electronic devices a step closer to the sense of touch. In time, machines could “feel” sensations like roughness, smoothness and even pain, expanding the list of tasks and services that they could perform.
An Artificial Sensory Nerve
In November 2017, researchers at Stanford University and Seoul National University announced the development of an artificial sensory nerve circuit that can be embedded in skin-like coverings for neuroprosthetic devices and soft robotics.
The circuit consists of a touch sensor that detects force and a flexible electronic neuron that relays signals to a synaptic transistor. The transistor is modeled after human synapses, and performs similar functions, such as relaying signals and storing information to make simple decisions. The developers have engineered the synaptic transistor to recognize and react to sensory inputs based on the intensity and frequency of low-power signals.
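The behavior described above—a component that fires based on both the intensity and the frequency of incoming low-power signals—can be sketched as a leaky integrator. This is a speculative digital caricature of an analog device; the decay rate and firing threshold are illustrative assumptions.

```python
# Speculative sketch of intensity- and frequency-dependent firing.
# The real synaptic transistor is an analog device; this digital
# leaky-integrator model and its constants are illustrative only.

DECAY = 0.9       # assumed fraction of state retained per time step
FIRE_LEVEL = 1.0  # assumed firing threshold

def responds(pulses):
    """pulses: list of (time_step, intensity) in chronological order.
    Return True if the accumulated state ever reaches the threshold."""
    state, last_t = 0.0, 0
    for t, intensity in pulses:
        state *= DECAY ** (t - last_t)  # state leaks away between pulses
        state += intensity              # each pulse adds its intensity
        last_t = t
        if state >= FIRE_LEVEL:
            return True
    return False
```

Under this model, two moderate pulses in quick succession cross the threshold, while the same pulses spaced far apart decay away without firing—capturing, in miniature, how frequency and intensity together gate the response.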
Tests evaluating the circuit indicate that the artificial nerve can detect various tactile sensations. It was able to differentiate Braille letters and accurately detect the direction of a cylinder rolled over the sensor.
Tactile Sensors Not Quite There Yet
Touch technology developers will tell you that these sensing systems are still in their infancy. The work of the two research teams discussed here, however, does represent a technological foundation upon which designers can build future touch-enabled systems.
To better impart the sense of touch in applications such as prosthetics and robotics, scientists still have challenges to overcome. To give devices access to a full range of tactile information, they will have to incorporate technologies that can detect hot and cold sensations. Serving these applications also calls for ways to embed the sensing technologies into flexible circuits and to interface them with the brain.
About the Author
Tom Kevan is a freelance writer/editor specializing in engineering and communications technology.