CartMan Logistics Robot Could Enhance Amazon’s Automated Warehousing Process

QUT’s custom-built robot could be the winning solution to save billions in logistics costs. The robot will compete in the coveted Amazon Robotics Challenge.

  • Australian Centre for Robotic Vision and QUT are contesting the popular Amazon Robotics Challenge.
  • CartMan is a custom-built robot capable of autonomously picking and packing items using machine learning and robotic vision.
  • The robot’s state-of-the-art design could fill the gap in Amazon’s automated warehousing process, worth billions in logistics savings.
  • The competition will take place from 27th-30th July at RoboCup 2017 in Japan.

Will CartMan be Amazon's 'pick' of the bunch? Credit: QUT

CartMan, the logistics robot, has been built by a team of roboticists from the Australian Centre for Robotic Vision (the Centre), based at QUT. The robot will showcase its item-picking skills against 15 other robots from around the world in the third annual Amazon Robotics Challenge, part of RoboCup 2017 in Nagoya, Japan, beginning Thursday 27th July.

Dr Juxi Leitner, team leader, said the competition will be tough, with a prize pool of US$250,000 to be awarded to teams that succeed in the assigned task of picking and stowing objects from a storage system.

Our robot has a vision system to recognise a specific item in a crowded container, and a mechanical system to retrieve that item and stow it into a shipping box. You won’t believe how hard it is to teach a robot to see a clear bottle of water among a bunch of groceries, or teach it the best way to pick up a bag of marbles. We opted to build our own robot from scratch – a three-axis Cartesian robot that acts much like a gantry crane you see at ports. With six degrees of articulation and both a claw and suction gripper, CartMan gives us more flexibility to complete the tasks than an off-the-shelf robot can offer.

Dr Leitner, a roboticist from QUT
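The pick-and-stow behaviour Dr Leitner describes – locate the requested item with vision, then choose between the suction and claw grippers before retrieving it – can be sketched in a few lines of Python. This is purely an illustrative sketch; none of the names below are CartMan’s actual software, and the gripper-selection rule is an assumption inferred from the quote (suction suits rigid surfaces, a bag of marbles needs the claw).

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical output of the vision system for one candidate item."""
    item: str
    confidence: float
    graspable_by_suction: bool  # flat/rigid surfaces suit a suction cup

def choose_gripper(det: Detection) -> str:
    # Deformable or porous items (e.g. a bag of marbles) defeat suction,
    # so fall back to the claw gripper (assumed rule, not CartMan's).
    return "suction" if det.graspable_by_suction else "claw"

def pick_and_stow(detections, target_item):
    # Select the highest-confidence detection of the requested item,
    # then report which gripper to use; None means the item wasn't seen
    # (a real system would re-scan the container).
    candidates = [d for d in detections if d.item == target_item]
    if not candidates:
        return None
    best = max(candidates, key=lambda d: d.confidence)
    return choose_gripper(best)

# Example scene: a clear water bottle is hard to see (low confidence)
# but easy to suction; a bag of marbles is visible but needs the claw.
scene = [
    Detection("water_bottle", 0.42, True),
    Detection("bag_of_marbles", 0.91, False),
]
print(pick_and_stow(scene, "bag_of_marbles"))  # claw
```

The point of the sketch is the two-stage structure from the quote: perception narrows the scene to one target, and a simple per-item policy picks the end effector before the Cartesian gantry executes the grasp.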

More than 15,000 hours have been invested in the project by the Centre’s team of 27 roboticists from QUT, The University of Adelaide and the Australian National University.

The team placed sixth in last year’s challenge, which is considered a major accomplishment in this highly competitive field, and is striving to improve on that result.

We are world leaders in robotic vision and we’re pushing the boundaries of computer vision and machine learning to complete these tasks in an unstructured environment – we won’t even be told which items CartMan must pick and stow until just before our heats. But I think we stand a good chance – the robot is robust and tackles the task in an innovative way we hope will give us the advantage.

Dr Leitner, a roboticist from QUT

Unlike standard bricks-and-mortar retailers, which use warehouses and distribution centers to move products to stores, Amazon and other online retailers rely on fulfillment centers: facilities full of shelving, from which human workers pick and stow individual items to fill customer orders.

Although Amazon has mastered using robots to shift products around its fulfillment centers, picking and stowing items remains the glaring gap in its automated logistics system.

The global giant is thus banking on the challenge to help discover open-source solutions.

Amazon, in its current form, isn’t likely to make a massive impact on Australia’s retail market because we only buy about seven per cent of our goods online, and because geographical distances and higher wages increase logistics costs. If Amazon can reduce the cost of doing business in Australia by automating the picking and stowing process, it could very well increase its market penetration.

Gary Mortimer, QUT Retail and Logistics Expert Associate Professor

The Centre’s team is sponsored by the Australian Centre for Robotic Vision, Amazon Robotics, Osaro and QUT.

The Amazon Robotics Challenge runs 27th-30th July.
