Neurotechnology Introduces SentiBotics Development Kit

Neurotechnology, a provider of robotics and high-precision object recognition and biometric identification technologies, today announced the introduction of the SentiBotics Development Kit. SentiBotics includes a mobile robotic platform with a 3D vision system, a modular robotic arm and accompanying ROS-based software, developed by Neurotechnology, with complete source code.

Documentation features detailed descriptions of the robotics algorithms. The programming samples included in the development kit demonstrate a number of capabilities, including how to teach the robot to recognize and grasp objects and how to have the robot build a map of its local environment, which it can then use for autonomous navigation.

Based on years of dedicated robotics research and algorithm development, Neurotechnology's SentiBotics Development Kit provides researchers, academic institutions, robotics developers and hobbyists with a "ready-to-go" mobile robotic platform that can dramatically reduce the time and effort required to create the infrastructure, hardware, component tuning and software functionality needed for robotics research and development. All platform components can be easily obtained from manufacturers and suppliers worldwide, so developers can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials.

"We designed SentiBotics to be a compact, integral and computationally capable robotics system that allows our customers to rapidly test their ideas in real world environments," said Dr. Povilas Daniusis, leader of the Neurotechnology robotics team. "With the included software, SentiBotics not only provides working examples of autonomous navigation, object recognition and grasping algorithms, it also allows users to immediately concentrate on their own algorithm development."

SentiBotics robot hardware includes the following components:

  • Tracked platform, capable of carrying a payload of up to 10 kg.
  • Modular robotic arm, capable of lifting objects up to 0.5 kg.
  • Two 3D cameras that allow the robot to "see" and recognize objects at a range of 0.15 to 3.5 meters.
  • Powerful onboard computer (Intel NUC i5 with 8 GB of RAM, a 64 GB SSD and an 802.11n wireless network interface).
  • Durable 20 Ah lithium battery with charger.
  • Control pad.

SentiBotics software includes source code with the following components that are tuned to work with the SentiBotics hardware platform:

  • Manual control of all degrees of movement via a control pad. For example, the user can manually drive the robot, rotate the joints of the robotic arm and grasp objects.
  • Manual online environment map building. This functionality allows the user to build a map of the environment while driving the robot manually (via joystick). Among other things, the robot may then use the map for autonomous navigation back to a previously-visited place.
  • Environment exploration with obstacle avoidance. When switched on, environment exploration mode drives the robot randomly, continuously trying to recognize and avoid obstacles.
  • Object learning and recognition. The user may teach the robot to recognize a previously unknown object by placing it in front of the robot's 3D camera "eyes" and assigning an identifier to it (e.g. "cup"). Once learned, the robot can recognize the object from different angles and distances.
  • Autonomous navigation to a specified, previously-visited place. This functionality allows the robot to travel by itself, using a previously-constructed map and information from its sensors, to a location that is specified by the user.
  • Basic grasping of an object within reach. Once properly calibrated, the robot can grasp simple objects if they are recognized by the object recognition system and are reachable by the arm.
  • Seek-and-grasp of an object located in a previously-visited place. The robot navigates through its previously-mapped locations until it directly recognizes the assigned object, then grasps it using the robotic arm.
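The autonomous-navigation capability above follows a standard pattern: the robot plans a route through the occupancy map it built earlier, then follows that route using its sensors. The kit's actual algorithms are described in its documentation; purely as an illustration of the idea (not Neurotechnology's implementation), a minimal grid-based path planner might look like this:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search for a shortest path on a 4-connected
    occupancy grid. grid[r][c] == 1 marks an obstacle cell.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}          # predecessor of each visited cell
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk back through predecessors to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal unreachable

# A small map with walls; the planner routes around them.
occupancy = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]
path = plan_path(occupancy, (0, 0), (3, 0))
```

A real system layers much more on top of this (localization, path smoothing, obstacle re-detection while driving), but the map-then-plan structure is the same.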

The SentiBotics Development Kit also includes:

  • Details of all algorithms used, including descriptions and full documentation.
  • ROS-based infrastructure that allows users to rapidly integrate third-party robotics algorithms and migrate to other hardware (or modify existing hardware), and that provides a unified framework for robotics algorithm development.
  • Programming samples that can be used for testing or demonstration of the robot's capabilities, including:
    • How to drive the robot platform and control the robotic arm with a joystick.
    • How to build a map of the environment by simply driving the robot around.
    • How to use the map for autonomous robot navigation.
    • How to teach the robot to recognize objects.
    • How to grasp a recognized object that is reachable by the robotic arm.
    • How to grasp an object that is located in a previously-visited place.
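The teach-and-recognize workflow in the samples above follows a common recognition pattern: extract a feature descriptor from the camera view, store it under the user-assigned identifier (e.g. "cup"), and at recognition time match new descriptors against the stored ones. The kit's actual recognition algorithms are documented in the SDK; the sketch below only illustrates the pattern, using made-up feature vectors and a cosine-similarity nearest-neighbor matcher:

```python
import math

def _cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class ObjectMemory:
    """Toy teach-and-recognize store: descriptors are plain feature
    vectors; matching is nearest neighbor by cosine similarity."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold     # minimum similarity to accept a match
        self.known = {}                # identifier -> stored descriptor

    def teach(self, identifier, descriptor):
        self.known[identifier] = descriptor

    def recognize(self, descriptor):
        best_id, best_sim = None, self.threshold
        for identifier, stored in self.known.items():
            sim = _cosine(descriptor, stored)
            if sim > best_sim:
                best_id, best_sim = identifier, sim
        return best_id                 # None if nothing is similar enough

memory = ObjectMemory()
memory.teach("cup", [0.9, 0.1, 0.3])
memory.teach("box", [0.1, 0.8, 0.2])
label = memory.recognize([0.85, 0.15, 0.28])  # a slightly different view
```

In practice the descriptors would come from the 3D cameras and be far richer, and views from multiple angles would be stored per object, which is what lets the robot recognize a learned object from different angles and distances.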

SentiBotics and the entire line of Neurotechnology products for AI, robotics, object recognition and biometric identification are available through Neurotechnology or from distributors worldwide. For more information, go to: www.Neurotechnology.com.
