Faking Feelings: Haptic Technology & Simulated Sensation
by Michele Baker
You’ll have to bear with me on this. I tend to take my articles on the TDMB blog off on a tangent. If you’ve read any of my previous pieces, you’ll probably already be aware that I like to bring a bit of popular sci-fi into my explorations of new technological developments. What can I say? It’s my USP. So, let’s get started with a bit of a weird question… What do we think about the concept of artificial intelligence that has human-like sensation?
In the first season of Humans, Channel 4’s incredible series about AI, we meet a group of sentient robots, or ‘synths’ as they are generally known in the show. One of these conscious synths is Niska. When we meet Niska, she is working as a robot prostitute. Whilst all the other ‘women’ in the brothel are non-sentient, Niska is a very rare exception. As such, her pain and anger are all the more visceral.
Obviously, a conscious robot like Niska is still the stuff of science fiction, but one that can sense touch could be just around the corner.
Forbes published an article last week, entitled ‘Artificial Hairs Could Give Robots New Senses’.
Scientists at China’s Harbin Institute of Technology (HIT) have built a device which mimics the fine hairs that cover the human body. It is these hairs that relay sensory information through the skin’s surface to our nerves. The team at HIT have replicated them using wires just 30 micrometres thick, which is about the same thickness as a human hair.
These ‘hairs’ have a small electric current running through them, creating a magnetic field that is disturbed whenever one of the ‘hairs’ is pushed out of its original position by any kind of external pressure.
It was found that the synthetic skin could withstand a little over 11lbs of pressure and could detect a fly landing on its surface. It could also sense objects being dragged across it in different directions, and even registered a light breeze.
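To make the sensing principle concrete: in software terms, a touch would register as a deviation in the magnetic-field reading of an individual ‘hair’ from its resting value. Here is a minimal, purely illustrative sketch of that idea – the sensor model, units, and threshold are my own assumptions, not figures from the HIT research:

```python
# Hypothetical sketch of interpreting readings from HIT-style "hair" sensors.
# Each hair reports a magnetic-field value; bending the hair disturbs the
# field, so a touch shows up as a deviation from the resting baseline.
# All values and the threshold are illustrative assumptions.

def detect_touches(baseline, readings, threshold=0.05):
    """Return the indices of hairs whose reading deviates from its
    resting baseline by more than `threshold` (arbitrary units)."""
    return [
        i for i, (rest, now) in enumerate(zip(baseline, readings))
        if abs(now - rest) > threshold
    ]

# Resting field values for four hairs, then a new sample in which
# hair 2 has been bent by a light touch.
baseline = [1.00, 1.00, 1.00, 1.00]
sample   = [1.01, 0.99, 1.20, 1.00]

print(detect_touches(baseline, sample))  # → [2]
```

A real sensor array would of course involve calibration, noise filtering, and far more hairs, but the core logic – compare each hair’s field against its baseline and flag the outliers – would look something like this.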
The wires are made from a conductive cobalt alloy, coated in a thin layer of glass. They are both strong and flexible, reportedly withstanding being tied in knots. Even when durability was tested by cutting the ends off several wires, and by punching holes in the silicone rubber ‘skin’ to break wires, the sensors continued to work.
So, what’s the point?
Well, the most obvious application of such technology is in prosthetics for healthcare. Giving wearers of prosthetic limbs the gift of sensation will be a remarkable, welcome breakthrough, and one that will improve the lives of those affected by loss of limbs considerably. For those who will be, or indeed already are, using robotic arms, this technology will allow them a better sense of how tightly to grip things. Indeed, the team experimented with using the sensor on a robotic arm.
Currently, the sensor is almost two inches long, which means it could be some time before it can easily be integrated into the human body. Its integration into androids, however, could come sooner.
Are we really creating robots like the synths in Humans?
Well, yes. Let me introduce you to Sophia:
Two teams, Hanson Robotics and Hiroshi Ishiguro Laboratories, are working together to build the world’s most advanced android. At present, she’s not exactly on a par with Niska (who, shock horror, is played by a human). But Dr David Hanson, who leads the team that created Sophia, stresses that it’s early days.
“Our goal is that she will be as conscious, creative and capable as any human,” says Hanson. “We are designing these robots to serve in healthcare, therapy, education and customer service applications.”
Hanson went on to state that one day, he believes, robots will be indistinguishable from humans. He anticipates a world in which robots will walk, play, teach, help, and form real relationships with us.
Sophia can currently emulate over sixty human facial expressions. A combination of computer algorithms and cameras in her eyes allows Sophia to see, follow faces, make eye contact, and recognise individuals. This is combined with voice recognition software from Alphabet, Google’s parent company, along with other tools (potentially including IBM and Intel technology in the near future, according to Hanson), which allow Sophia to process speech, chat, and – crucially – improve her intelligence over time.
If the synthetic, sensing skin being developed by HIT were integrated into Sophia, she would also be able to physically feel.
That, however, does not mean consciousness. As I’ve written about before, we can’t expect to really see that any time soon. Unless, of course, like in Humans, someone is able to devise a ‘consciousness code’, and we are able to, at any level, work out what consciousness actually is.
“The artificial intelligence will evolve to the point where they will truly be our friends,” Hanson has said. “Not in ways that dehumanize us, but in ways that rehumanize us, that decrease the trend of the distance between people and instead connect us with people as well as with robots.”
This may be a somewhat rose-tinted view, particularly when we consider the issues that arise if such humanoids are able to replace humans in the workplace. James has written about examples that show that not to be the case. However, if the technology James mentions is just the tip of the iceberg, and we start adopting humanoid robots into the world, and the workplace, this may change.
Though Sophia is one of the most well-known examples of this kind of technology, she is not entirely alone. Hiroshi Ishiguro, Sophia’s co-creator, is one of the world’s leading developers of androids, and Sophia is just one of his creations – but probably the most advanced.
Sexbots: Simulated Sex and Sensation
Sex robots have already been launched that use technology to create the illusion of sentience, and examples of these robots are among the most lifelike to be seen at present. However, they are unlikely to be endowed with IBM Watson machine learning technology, or anything similar. After all, who wants a robot prostitute that wants to talk and learn?
It all feels a bit sad, really. But Dr Helen Driscoll, from the University of Sunderland, a leading authority on the psychology of sex and relationships, believes that robotic, interactive, motion sensing sex tech is likely to move to the mainstream sex industry soon enough. She adds:
“We tend to think about issues such as virtual reality and robotic sex within the context of current norms. But if we think back to the social norms about sex that existed just 100 years ago, it is obvious that they have changed rapidly and radically.”
Indeed, Driscoll’s views appear to welcome a future in which sexbots like Niska are employed for sexual pleasure. The idea is further explored in Humans, when the father of the central human family uses a code printed on a flyer to unlock ‘adult mode’ on Anita, his home assistance synth. And perhaps offering a sexual outlet for the lonely, or those with niche sexual appetites, is an appealing concept. After all, unless the robotic partner gains consciousness, what harm is there?
However, the sort of synthetic skin being developed by HIT remains principally for healthcare, and even if we were to reach the point at which we create sexbots of the standard seen in Humans, they would not necessarily have much need for sensation of their own. Unless, that is, it becomes a marketable concept that proves appealing to the sexbot consumer, which is a possibility.
Sensation, however, would be a handy tool if we were to create fully-functional humanoids for use within society, as Dr Hanson envisages. The ability to physically feel is inexorably linked to many aspects of a being’s engagement with the world, from self-preservation to – as explored in HIT’s research – gripping and moving objects.
Haptic Technology and Virtual Reality
The ability to create sensation where there is none is not limited to HIT’s research, nor – indeed – to my digression into the world of AI and sextech. If we consider the use of haptic technology in virtual reality, we open whole new doors.
In fact, those doors are already wide open. Haptic technology is emerging as a groundbreaking development set to have widespread applications – again, principally within healthcare. The use of haptic technology in surgery, dentistry, and other areas of practical medicine, specifically for training, is being picked up by a number of companies. But perhaps the most fascinating is the work of Generic Robotics, a company whose haptic technology is set to have a tremendous impact in its sector.
Haptic Technology in Gaming
Of course, haptic technology is not limited to its functions within medicine, and industry as a whole. Haptic gloves, suits, and so on, are an obvious extension for virtual reality gaming.
At the moment, most manual engagement with virtual worlds in the gaming and experiential arena is limited to hand-held controllers. These are an obvious symptom of the relative immaturity of virtual reality, considering that donning a pair of haptic gloves is clearly a better solution.
We already have the most basic haptic technology in modern games controllers, which vibrate in response to certain stimuli within gameplay. But this is small fry. It’s one of those things that, in just a few years’ time, we will look back on with wry nostalgia – like the way we used to type texts using the letters on a numeric keypad. Those were the days.
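That ‘vibrate in response to stimuli’ model is, at heart, just a lookup from gameplay events to rumble intensity. A toy sketch of the idea – the event names and intensity values here are entirely made up for illustration, and real game engines expose their own controller APIs:

```python
# Toy illustration of event-driven rumble, the basic haptic feedback found
# in modern games controllers. Event names and intensities are invented
# for this example; real engines provide their own controller APIs.

RUMBLE_LEVELS = {
    "footstep": 0.1,
    "gunshot": 0.6,
    "explosion": 1.0,
}

def rumble_for(event, levels=RUMBLE_LEVELS):
    """Return a rumble intensity in [0, 1] for a gameplay event;
    unknown events produce no vibration."""
    return levels.get(event, 0.0)

print(rumble_for("explosion"))  # → 1.0
print(rumble_for("dialogue"))   # → 0.0
```

Gloves and suits generalise this same idea from one coarse motor to many localised actuators, which is exactly why they feel like the obvious next step.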
Even these early haptic gloves that we are currently seeing emerging in the VR space are still heavily wired. It’ll probably be a little while before we have wireless haptic gloves. Or maybe not. Things are moving pretty fast right now.
As for haptic suits, I can’t help thinking of Ready Player One, the novel I mentioned in my last blog post. I’ll reiterate the point I made there: read it. In the novel, haptic suits are central to the main character’s immersive engagement in the simulated world of the Oasis, and there are certainly many practical applications of such tech in the real world. However, donning a full haptic suit to play games isn’t something most people are comfortable with right now. It’s probably a technology coveted most by hardcore gamers, if anything. And if it’s not marketed correctly, or timed badly, haptic suits could be one massive, ‘Google Glass’-esque flop.
One aspect of haptics that is still in development is the capability to reproduce the sensation of texture. It’s something that Disney, apparently, have been working on for years with touchscreens, but it’s not yet totally perfected.
Consumer adoption of haptic technology is likely to be a couple of years away at least. We are still at the point where virtual reality systems are only seeing tentative adoption, with figures failing to meet projections during 2016. It will be the job of major hardware developers to cement the place of virtual reality within mainstream consumption before we are ready to see haptic technology feature more prominently in the consumer market.
Nonetheless, in medicine at least, the simulation of sensation through haptic technology is a major breakthrough that could change how we approach medical training, and also revolutionise the world of prosthetics.