Conventional robots lack a comprehensive sensory system and hence cannot collect data from the world around them. Humans can see, hear, touch, smell and taste. Robots do not need to smell, taste or process sound, but it is essential that they can see and feel objects.
How do Robots See?
A photoelectric cell is the simplest optical system used in a robot. The cell enables the robot to resolve yes/no questions in its field of vision, such as whether a particular piece of equipment is present. For instance, if the robot looks at a particular location to find a tool and the tool is present, light reflected from the tool reaches the robot's photoelectric cell. The light is then converted to an electric current, which is transmitted to the robot's brain.
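The yes/no decision described above amounts to thresholding a light reading. A minimal sketch, with invented light levels and threshold:

```python
# Sketch (assumed values): a photoelectric cell reduces a presence check
# to a yes/no decision by thresholding the reflected-light reading.

def tool_present(light_level, threshold=0.5):
    """Return True if the reflected-light reading exceeds the threshold."""
    return light_level >= threshold

# A bright reflection means the tool is in place; near-darkness means it is not.
print(tool_present(0.8))  # tool reflects light -> True
print(tool_present(0.1))  # no reflection -> False
```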
Robots also use television cameras. An image received by the robot is processed by comparing it with images previously stored in the robotic system.
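One simple way to compare a received image against stored ones is a nearest-match by pixel difference. This is only an illustrative sketch; the tiny "images", labels and metric are invented, not the system the article describes:

```python
# Hedged sketch: match a camera image to stored reference images by
# mean absolute pixel difference (images here are flat lists of pixels).

def mean_abs_diff(img_a, img_b):
    """Average per-pixel absolute difference between two equal-size images."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b)) / len(img_a)

def best_match(received, stored):
    """Return the label of the stored image closest to the received one."""
    return min(stored, key=lambda label: mean_abs_diff(received, stored[label]))

stored = {"wrench": [0, 10, 200, 10], "bolt": [200, 200, 0, 0]}
print(best_match([5, 12, 190, 8], stored))  # -> wrench
```

Real systems use far more robust comparisons, but the principle is the same: the stored image with the smallest difference wins.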
How do Robots Feel?
Tactile sensors simulate the human sense of touch. The simplest tactile sensor is a switch that moves from one position to another when a finger of the robot contacts a solid object; the switch may close on contact, allowing an electric current to flow to the brain. Combining several tactile sensors produces a more sophisticated sense of touch, allowing the robot to estimate the size, shape and contours of the object being examined.
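Combining switch readings into a size estimate can be sketched as follows; the grid of switch states is invented for illustration:

```python
# Sketch under assumed data: each tactile switch reports 1 when its patch
# of the fingertip touches a surface; combining switches gives a crude
# estimate of the contact area and hence the object's extent.

def contact_extent(switches):
    """Rows of 0/1 switch states -> (closed_count, bounding height, bounding width)."""
    hits = [(r, c) for r, row in enumerate(switches) for c, v in enumerate(row) if v]
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    return len(hits), height, width

grid = [[0, 1, 1],
        [0, 1, 1],
        [0, 0, 0]]
print(contact_extent(grid))  # -> (4, 2, 2): four switches closed over a 2x2 patch
```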
In 2010, the Cognition for Technical Systems (CoTeSys) excellence cluster at the Technische Universitaet Muenchen (TUM) in Munich designed an underwater robot with a sensory system that is reliable, effective and energy-efficient in environments ranging from turbid waters to sewer pipes to the sea floor.
Their model is based on the lateral line system, an organ that helps amphibians and fish avoid danger, orient themselves and hunt prey in murky or dark waters. The research was aimed at enabling autonomous robots to react intelligently to their surroundings and perform tasks on their own.
These robots are not rigidly programmed; instead, they depend on their own sensory input to decide how to act.
The following video by CoTeSys demonstrates an underwater vehicle called Snookie – an automated system with water-velocity sensors on its nose. This sensor system mimics the lateral-line system of a fish, which allows the fish to navigate and locate objects in the water by detecting changes in water flow.
Underwater Vehicle Snookie
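Detecting "changes in water flow" can be sketched as flagging readings that deviate from a recent running average. The readings, window and tolerance below are invented, not Snookie's actual parameters:

```python
# Illustrative sketch (values assumed): a lateral-line-style sensor flags a
# possible obstacle when the water-velocity reading deviates sharply from
# the trailing mean, mimicking how flow disturbances reveal nearby objects.

def flow_anomalies(readings, window=3, tolerance=0.5):
    """Return indices where a reading deviates from the trailing mean."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if abs(readings[i] - baseline) > tolerance:
            anomalies.append(i)
    return anomalies

# Steady flow, then a spike as the vehicle nears an obstacle.
print(flow_anomalies([1.0, 1.1, 0.9, 1.0, 2.5, 1.0]))  # -> [4]
```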
Brooks, R. A., et al. (1999) designed Cog, a humanoid robot with a range of sensory systems including auditory, vestibular, visual, kinesthetic and tactile senses.
The visual system of Cog is designed to imitate features of the human eye, including space-variant sensing and binocularity. Each eye can rotate about a coupled horizontal axis and an independent vertical axis, and is equipped with two grayscale cameras whose NTSC signals are digitized by a frame grabber and passed to the digital signal processor network.
The vestibular system of Cog has three rate gyroscopes mounted on orthogonal axes and two linear accelerometers, which mimic the human semicircular canals and otolith organs, respectively. These devices are mounted on the robot's head just below eye level. The analog signal from each sensor is amplified on board the robot and processed by a commercial analog-to-digital converter fitted to one of the brain PC nodes.
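A standard way to fuse rate-gyro and accelerometer data into a single tilt estimate is a complementary filter. This is a hedged sketch of that general technique, not Cog's actual processing; the rates, angles and blend factor are assumed:

```python
# Hedged sketch, not Cog's code: a complementary filter blends the
# integrated gyro rate (fast, drifts over time) with the accelerometer
# tilt angle (noisy, but drift-free) into one orientation estimate.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer tilt angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0
# One step: gyro reads 10 deg/s for 0.1 s; accelerometer reads 1 deg of tilt.
angle = complementary_filter(angle, 10.0, 1.0, 0.1)
print(round(angle, 3))  # -> 1.0
```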
The auditory system of Cog includes two omnidirectional microphones mounted on the robot's head, surrounded by crude pinnae built to aid localization. A commercial A/D board interfaced to the digital signal processing network processes the analog auditory signals.
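The basis of two-microphone localization is that a sound reaches the nearer microphone first; the arrival-time difference implies a bearing. A rough sketch of that geometry, with assumed microphone spacing and speed of sound (not Cog's actual values):

```python
# Rough sketch (spacing and speed of sound assumed): the inter-microphone
# delay of a sound implies its direction, the basis of the localization
# that the pinnae around the microphones help with.

import math

def bearing_deg(time_delta_s, mic_spacing_m=0.15, speed_of_sound=343.0):
    """Angle from straight ahead implied by the inter-microphone delay."""
    ratio = max(-1.0, min(1.0, time_delta_s * speed_of_sound / mic_spacing_m))
    return math.degrees(math.asin(ratio))

print(bearing_deg(0.0))     # no delay: sound dead ahead -> 0.0
print(bearing_deg(0.0002))  # positive delay: sound off to one side
```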
The tactile system of Cog uses resistive force sensors, each of which measures the force applied to its sensing surface. For experimental purposes, a 6x4 array of sensors was fitted to the torso of the robot. A single 6811 microcontroller multiplexes the signals from these sensors, providing position and force measurements.
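The position-and-force measurement such an array provides can be sketched as a total force plus a contact centroid; the readings below are invented for illustration:

```python
# Sketch with assumed readings: a grid of resistive force sensors can
# report both the total applied force and the centroid of the contact.

def force_centroid(grid):
    """Return (total_force, row_centroid, col_centroid) for a force grid."""
    total = sum(sum(row) for row in grid)
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * v for row in grid for j, v in enumerate(row)) / total
    return total, r, c

readings = [[0, 0, 0, 0],
            [0, 2, 2, 0],
            [0, 2, 2, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
print(force_centroid(readings))  # -> (8, 1.5, 1.5): contact centered on the 2x2 patch
```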
The kinesthetic system of Cog includes sensors at each joint that provide feedback on the state of Cog's motor system. The eye axes use the simplest form of feedback: each actuator includes a digital encoder that reports position. The torso and neck joints have encoders and motor-current sensing, limit switches at the end-points of joint travel, and temperature sensors on the driver chips and motors. The most extensive kinesthetic sensing is in the arms: each of the 12 arm joints has strain gauges for precise measurements.
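The encoder feedback mentioned above boils down to converting a pulse count into a joint angle. A minimal sketch, with a hypothetical counts-per-revolution value:

```python
# Hypothetical numbers: a digital encoder reports position as a count of
# pulses; dividing by counts-per-revolution turns this into a joint angle,
# the simple positional feedback the eye axes rely on.

def encoder_angle_deg(counts, counts_per_rev=1024):
    """Convert raw encoder counts into a joint angle in degrees."""
    return (counts % counts_per_rev) * 360.0 / counts_per_rev

print(encoder_angle_deg(256))   # quarter revolution -> 90.0
print(encoder_angle_deg(1280))  # wraps past one full revolution -> 90.0
```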
MIT Humanoid Robot Cog - Learning with HumanCaregiver
Incorporating a sensory system in a robot enables its use in a wide range of applications:
- In medicine, robots can help children and adults with mental and physical disabilities
- Robots with sensory systems similar to those of fish and amphibians can work underwater and withstand harsh conditions
- Robots equipped with advanced sensory systems can be used for search and rescue operations.
Researchers are working towards a robot that will have all the capabilities of a human being, and perhaps more. Incorporating suitable sensory systems is the only way to achieve this goal, which is why this research is ongoing. Researchers are striving towards a robotic system that can recognize and engage in joint-attention behaviors, enabling social interaction between humans and robots in a way that has not been possible before.
The robot of the future will learn from an observer through normal social signals, much as human infants do. It will express its desires, ambitions and goals through social interaction without depending on an artificial vocabulary. Hopefully, future robots will also recognize a human's desires and ambitions and adjust their behavior accordingly.