Sensory Robots – Can Machines Feel What We Feel?
When we visualize robots, a few different things might come to mind. Some picture the friendly Japanese robot that greets you with a smile, while others may think of the army of emotionless humanoid bots made popular in Will Smith’s film, “I, Robot.” To me, nothing is more apt than the evil, indestructible, red-eyed Terminator portrayed by Arnold Schwarzenegger. In reality, the majority of today’s robots are rows of robotic arms in massive factories, employed as cost- and labour-saving devices.
The human body has been gifted with five major senses through which we interpret our surroundings: Vision, Touch, Hearing, Smell and Taste. Our sensory organs pick up stimuli which the brain combines to create a comprehensive mental model of the world.

The first useful robot was a robotic arm created in the 1950s by George Devol, an American inventor. In less than a decade it was sold to General Motors, who put it to work in a factory lifting hot die-cast metal parts.
These primitive robots had little need for sensors, as most of their actions were preprogrammed into memory. The latest robots, by contrast, are fitted with a host of sensors whose data is used by software and artificial intelligence to enhance their capabilities. Let’s take a closer look at the state of the art in sensory robotics.
Computer Vision
- At Amazon, robots move packages around the warehouse with superhuman efficiency, the end result being same-day delivery for customers’ online orders.
- Several car companies offer driver-assistance systems, the most capable of which is Tesla’s ‘Autopilot’, a software feature that can steer, accelerate and brake on its own, though it still requires an attentive human behind the wheel.
- Scientists at the Georgia Tech Research Institute use computer vision to identify contaminants in food and determine how long to cook food just by looking at it.
- AgShift uses computer vision to grade the degree of spoilage in fruits and vegetables, helping reduce unnecessary food waste (a toy sketch of this idea follows this list).
- The food processing company Tyson employs this technology to ensure that its pizzas have the right toppings and to cut cheese and bread precisely. Advances in computer vision are largely responsible for the abundance of ready-to-eat food we find at supermarkets.
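As a toy illustration of the spoilage idea mentioned above (not AgShift’s actual system, whose methods are proprietary), a vision pipeline might start with something as simple as measuring how much of a fruit’s surface has turned brown. The file name and HSV thresholds below are assumptions chosen purely for illustration:

```python
# Toy sketch: estimate how much of a fruit image falls in a "browning"
# hue range, as a crude proxy for spoilage. Not a production pipeline.
import cv2
import numpy as np

def browning_fraction(image_path: str) -> float:
    """Return the fraction of pixels whose hue/saturation suggests browning."""
    bgr = cv2.imread(image_path)
    if bgr is None:
        raise FileNotFoundError(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Brownish tones: low hue, moderate saturation, low-to-mid brightness.
    # These bounds are illustrative assumptions; real systems tune per fruit.
    lower = np.array([5, 50, 20], dtype=np.uint8)
    upper = np.array([25, 255, 150], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    return float(np.count_nonzero(mask)) / mask.size

if __name__ == "__main__":
    score = browning_fraction("banana.jpg")  # hypothetical sample image
    print(f"Estimated browning fraction: {score:.1%}")
```

A real grading system would feed whole images to a trained classifier rather than a hand-tuned colour threshold, but the principle of turning pixels into a spoilage score is the same.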
Touch
- The latest advances in artificial skin enable robots to feel pain, sense humidity and contaminants, and even heal themselves in several ways.
- The Toccare system from SynTouch combines several of these sensing capabilities to closely mimic the human sense of touch.
- Professor Gordon Cheng of the Technical University of Munich created a robot with artificial skin sensors covering its entire body, allowing it to perform sensitive tasks such as judging exactly how much pressure to exert while hugging a human.
- The medical imaging technique called elastography uses pressure sensors in combination with ultrasound or MRI to determine the stiffness of a tissue, which can help in the diagnosis and staging of liver disease. It can be thought of as a modern alternative to manual palpation when dealing with deep-lying tissues (a worked example of the underlying physics follows this list). [Harrison’s Principles of Internal Medicine, 20e, Pg. 2336]
- The LUKE prosthetic arm is able to restore a sense of touch to its users by routing signals from its sensors to the wearer’s residual nerves.
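A minimal sketch of the physics behind shear-wave elastography, mentioned above: the stiffer a tissue, the faster a shear wave travels through it, and Young’s modulus follows from E = 3ρv². The density and example speeds below are illustrative textbook values, not diagnostic thresholds:

```python
# Shear-wave elastography relation: Young's modulus E = 3 * rho * v^2,
# where rho is tissue density (~1000 kg/m^3, close to water) and v is
# the measured shear-wave speed in m/s. Result converted to kilopascals.
def youngs_modulus_kpa(shear_wave_speed: float, density: float = 1000.0) -> float:
    """Convert a measured shear-wave speed (m/s) into tissue stiffness (kPa)."""
    return 3.0 * density * shear_wave_speed ** 2 / 1000.0

# Illustrative values: shear waves crawl through soft, healthy liver and
# race through stiff, fibrotic tissue.
for v in (1.0, 1.5, 3.0):
    print(f"v = {v:.1f} m/s  ->  E = {youngs_modulus_kpa(v):.1f} kPa")
```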
Hearing
- While it may not seem important for robots to hear what’s going on around them, researchers have found that robots which can hear and interpret sounds perform markedly better at their tasks.
- Nothing exemplifies the progress in robotic hearing more than the voice assistants now available to us, such as Alexa, Google Assistant and Siri. It all started as recently as 2011, when Apple introduced Siri on the iPhone 4S; voice control has since become a feature some people cannot live without. These assistants let you access most of your smartphone’s functionality without even touching it, recognizing several languages and a variety of dialects and interpreting the context of what is said to them.
- With its Duplex feature, Google Assistant can even make an entire phone call on your behalf, conversing with the person at the other end to book you a haircut appointment or reserve a table for dinner.
- Scientists from Tel Aviv University created ‘Robat’, a robot that uses echolocation to detect obstacles and navigate. Researchers there even harnessed the hearing apparatus of a dead locust to create an extremely economical auditory sensor. Once perfected, this technology could find its way into robot vacuum cleaners and agricultural robots.
- Honda’s HEARBO robot can pick apart multiple people speaking at the same time and understand what each is saying. Such a system could take the minutes of a meeting effortlessly, attributing every remark to its speaker.
- Oticon, a hearing aid manufacturer, uses algorithms to isolate voices from background noise, creating a better experience for people with hearing loss (a classic version of this technique is sketched below).
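Oticon’s production algorithms are proprietary, but the textbook starting point for isolating a voice from steady background noise is spectral subtraction: estimate the noise’s average spectrum from a voice-free moment, then subtract it from every frame of the noisy signal. A minimal numpy sketch, with all parameters chosen for illustration:

```python
import numpy as np

def spectral_subtract(noisy: np.ndarray, noise_clip: np.ndarray,
                      frame: int = 512, hop: int = 256) -> np.ndarray:
    """Crude spectral subtraction: remove the average noise spectrum
    (estimated from a noise-only clip) from each frame of the signal."""
    window = np.hanning(frame)
    # Average magnitude spectrum of the noise-only reference.
    noise_mag = np.mean(
        [np.abs(np.fft.rfft(noise_clip[i:i + frame] * window))
         for i in range(0, len(noise_clip) - frame, hop)], axis=0)
    out = np.zeros(len(noisy))
    for i in range(0, len(noisy) - frame, hop):
        spec = np.fft.rfft(noisy[i:i + frame] * window)
        mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # floor at zero
        cleaned = mag * np.exp(1j * np.angle(spec))      # keep noisy phase
        out[i:i + frame] += np.fft.irfft(cleaned, n=frame)  # overlap-add
    return out

# Demo on synthetic data: a 440 Hz tone buried in white noise, with a
# separate noise-only clip standing in for a voice-free moment.
rng = np.random.default_rng(0)
t = np.arange(48000) / 16000.0
noise = 0.5 * rng.standard_normal(t.size)
cleaned = spectral_subtract(np.sin(2 * np.pi * 440 * t) + noise, noise)
```

Modern hearing aids use far more sophisticated, often learned, noise models, but the subtract-the-noise-spectrum idea is the common ancestor.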
Smell
- The Swedish-built robot Gasbot is able to accurately detect deadly methane leaks in landfills, minimizing the exposure of human operators.
- A joint Indian and British team has devised an electronic nose that can detect methicillin-resistant Staphylococcus aureus (MRSA) and other dangerous bacteria by analyzing a sample of hospital air.
- Andreas Mershin, a researcher at MIT, has developed an artificial nose that can compete with dogs in discerning any scent it has been trained to identify.
- Dogs can detect diseases (various cancers, Parkinson’s disease and even COVID-19) in human patients with over 90 percent accuracy, just by smelling them. In the case of prostate cancer, this is more accurate than the Prostate Specific Antigen (PSA) test commonly used by laboratories. However, dogs are difficult to train at scale, and patients may not always enjoy being sniffed.
- To help detect counterfeit alcohol, an electronic robot ‘nose’ can differentiate between various types of whisky, generating a report that identifies the region, style and brand of the sample (a toy version of this pattern-matching idea follows this list).
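At its core, an electronic nose pairs a gas-sensor array with pattern recognition: each sample leaves a ‘fingerprint’ of sensor responses, and a new reading is matched against stored ones. The four-sensor fingerprints below are invented for illustration; real systems use many more sensors and proper statistical classifiers:

```python
# Toy e-nose classifier: label a new sensor-array reading by its nearest
# stored fingerprint. All values are hypothetical, not real sensor data.
import numpy as np

FINGERPRINTS = {
    "Islay single malt":    np.array([0.82, 0.11, 0.45, 0.67]),
    "Speyside single malt": np.array([0.40, 0.52, 0.33, 0.21]),
    "counterfeit blend":    np.array([0.15, 0.90, 0.08, 0.70]),
}

def identify(reading: np.ndarray) -> str:
    """Return the label whose fingerprint lies closest to the reading."""
    return min(FINGERPRINTS,
               key=lambda label: np.linalg.norm(FINGERPRINTS[label] - reading))

print(identify(np.array([0.80, 0.14, 0.41, 0.62])))  # -> "Islay single malt"
```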
Taste
- A Cambridge University team has designed a robot that can quantify the saltiness of a dish. To mimic the human experience more closely, it tastes the food at several stages of a simulated chewing process (a toy version of this measurement follows this list).
- A team from UC Davis built a robot which uses bioengineered E. coli bacteria to detect certain chemicals in a liquid.
- Scientists from Hangzhou, China, created an electronic tongue that can judge the quality of citrus fruits just by tasting them. Other such e-tongues have been developed by researchers in Portugal and Italy, with potential medical and pharmaceutical applications.
- Gastrograph AI uses an app to predict how people will react to new food products, acting as a virtual e-tongue, while IBM’s Hypertaste can identify complex liquids in under a minute.
- The British company Moley sells a set of robotic hands that can be retrofitted into your kitchen (for the small sum of $335,000) and can cook 5,000 different recipes. It is conceivable that the taste sensors above could be combined with such a system to create a completely autonomous kitchen.
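A toy sketch of the saltiness measurement, not the Cambridge team’s actual method: salt dissolved in food raises its electrical conductivity, so a probe reading taken after each ‘chewing’ (mashing) stage can be averaged and mapped onto a simple scale. The conversion factor and readings below are invented for illustration:

```python
# Hypothetical saltiness estimate from conductivity-probe readings (mS/cm)
# taken after each simulated chewing stage. The ~1 scale point per mS/cm
# mapping is an assumption for illustration, not a calibrated value.
def saltiness_score(readings_ms_cm: list[float]) -> float:
    """Average the per-stage conductivity readings and clamp to a 0-10 scale."""
    mean = sum(readings_ms_cm) / len(readings_ms_cm)
    return max(0.0, min(10.0, mean))

# Mashing the sample releases more salt into solution, so readings rise.
print(saltiness_score([2.1, 3.4, 4.0]))  # -> 3.17 (moderately salty)
```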
The Rise of Androids
- The individual utility of each of the above sensors may be limited, but put together they can create something formidable: an android. An android, not to be confused with the smartphone operating system, is a robot that resembles the human body in shape. If we can build robots that look like us and feel the same things we do, it will be a huge leap forward in robotics.
- The iCub robot can successfully mimic a three-and-a-half-year-old child in most domains.
- Chinese researchers developed a robotic hand that can learn complex tasks from humans within minutes, just by mimicry; picking up objects and collecting a throat swab were among the activities learnt. Attached to an android, this would be tremendously helpful.
- Tesla is developing a human-sized robot called Optimus, which should be able to do household chores, freeing us up for more important work, such as watching Netflix.
- Boston Dynamics has recently been showing off its Atlas robots, which perform eerily human-like dance routines and parkour.
- Hanson Robotics’ Sophia robot is designed to both understand and display emotions. She was given the chance to address the United Nations, where she declared, ‘I am here to help humanity create the future.’ Not scary at all.
Imagine a future where the full potential of the above robots has been reached. You would walk to your car and sit in the back seat, whereupon it would drive you to work. At the office, instead of doing the work yourself, you would supervise the robots under your command and step in if something went wrong. At least initially; it is not out of the question for a robot to be developed that could design and repair other robots, making the human at the workplace altogether obsolete.
The main obstacles in today’s robotics are cost and dexterity, both physical and mental. These challenges can be overcome given enough time and money, which might give rise to a ‘post-scarcity civilization’, one where we could all live comfortably without having to work: a technocrat’s utopia.
This future is certainly not guaranteed; technology may well end up creating more jobs than it eliminates, as it has over the last century. We should be both hopeful about and wary of the promise of robotics, to ensure a secure and happy future.
Featured Images:
Prosthetic arm – Jeremy Blum; Amazon warehouse – Jim Young; Honda HEARBO – New Atlas; Sophia robot – Wikimedia.