Researchers at Cornell's Personal Robotics Lab have programmed a robot to navigate across a room, detect objects, and put them away in their proper places.
A robot places an item in a refrigerator
For this task, the robot was programmed with new algorithms described in the May online edition of the International Journal of Robotics. Aspects of the work were also presented at the International Conference on Robotics and Automation in St. Paul, Minn., May 14-18.
In tests, the robot moved and placed objects such as dishes in refrigerators, books on bookshelves, toys on tables and clothes in closets. With familiar objects it succeeded 98 percent of the time; with new objects the success rate dropped to 80 percent, and the robot sometimes misidentified ambiguously shaped objects such as shoes and clothing.
A Microsoft Kinect 3-D camera plays a vital role in this operation. Several images are stitched together to build a complete view of the room. The robot's computer then divides the images into blocks based on color and shape. Having been trained on several examples of each object type, the robot computes, for every block, the probability of a match against each object in its database and keeps the best match.
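The matching step described above can be sketched in a few lines. This is an illustrative toy model, not the researchers' actual code: the object names, the three-element feature vectors, and the use of cosine similarity with a softmax are all assumptions made for the example.

```python
import math

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_probabilities(block, database):
    # Turn similarity scores into a probability of a match
    # for each object type via a softmax.
    sims = {name: cosine(block, feats) for name, feats in database.items()}
    z = sum(math.exp(s) for s in sims.values())
    return {name: math.exp(s) / z for name, s in sims.items()}

# Toy database: made-up colour/shape features for three object types.
database = {
    "plate": [0.9, 0.1, 0.3],
    "book":  [0.2, 0.8, 0.5],
    "shoe":  [0.4, 0.4, 0.9],
}

block = [0.85, 0.15, 0.35]        # features of one segmented block
probs = match_probabilities(block, database)
best = max(probs, key=probs.get)  # the robot keeps the best match
```

With these made-up numbers the block is closest to "plate", so that label wins; in the real system the feature vectors would come from the Kinect's color and depth data rather than hand-written lists.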
To place each object, the robot scans the target area for a suitable location. The 3-D image of the target space is divided into small chunks, and a series of features of each chunk is analyzed against the shape of the object being placed, using graphic simulations.
The researchers also built a model from placement sites labelled as good or bad. The chunk of space that best matches the model is selected, and a graphic simulation then plans an efficient motion for the robot. The work was supported by a Microsoft Faculty Fellowship.
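The placement step can be sketched the same way: score every chunk of the target space with a model trained on good/bad examples and pick the highest-scoring one. A simple linear scorer stands in here for the paper's learned classifier; the feature names (flatness, clearance, support), weights, and chunk labels are all hypothetical.

```python
def score(chunk_features, weights, bias=0.0):
    # Higher scores mean the chunk looks more like sites labelled "good".
    return sum(w * f for w, f in zip(weights, chunk_features)) + bias

# Illustrative weights a trained good/bad model might produce,
# over the made-up features (flatness, clearance, support).
weights = [1.5, 0.8, 2.0]

# Candidate chunks of the target space with toy feature values.
chunks = {
    "shelf-left": [0.9, 0.7, 0.95],
    "shelf-edge": [0.9, 0.2, 0.30],
    "table-top":  [0.8, 0.9, 0.60],
}

scores = {name: score(f, weights) for name, f in chunks.items()}
best_site = max(scores, key=scores.get)  # chunk chosen for placement
```

Here "shelf-left" wins because it combines high flatness with strong support underneath; the real system would then hand the chosen chunk to the graphic simulation that plans the arm's motion.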