Description

We attached an NVIDIA Jetson TX2 to a PAL Robotics TIAGo robot. The Jetson development board extends the robot by serving as a vision module. We developed a custom approach for segmenting objects in image space, inspired by current segmentation networks. Using the robot's RGB-D camera, we project the resulting segmentation into a local robot coordinate frame for manipulating objects. We integrated the semantic segmentation into a practical use case for which we propose a state dependent segmentation method: the robot navigates to a fridge and searches for a handle to open it. Once the fridge is open, the robot takes another look to segment the beer bottles and cans inside and grasps one. Finally, the fridge is closed and the beer is delivered. The Jetson TX2 is currently used mainly to run our custom semantic segmentation network, but other custom robot modules such as navigation, mapping, and speech recognition can be executed on the Jetson as well.
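As an illustration of the projection step, here is a minimal sketch of back-projecting a segmentation mask into the camera frame using a pinhole model and a registered depth image. The intrinsics and the function name are hypothetical placeholders, not taken from our code; on the real robot, the intrinsics come from the camera's calibration (e.g. the ROS camera_info topic), and the resulting points are then transformed into the robot's base frame.

```python
import numpy as np

# Hypothetical intrinsics for TIAGo's RGB-D camera; substitute the values
# reported by the actual sensor's calibration.
FX, FY = 522.0, 522.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 240.0   # principal point (assumed)

def mask_to_points(mask, depth):
    """Back-project all pixels of a binary segmentation mask into the
    camera frame using the pinhole model and the registered depth image.

    mask  -- (H, W) boolean array, True where the target object was segmented
    depth -- (H, W) float array, depth in meters (0 where invalid)
    Returns an (N, 3) array of 3D points in the camera coordinate frame.
    """
    v, u = np.nonzero(mask & (depth > 0))  # pixel rows/cols inside the mask
    z = depth[v, u]
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=1)

# The centroid of the back-projected points gives a grasp target that can
# then be transformed into the robot's base frame (e.g. with tf in ROS):
# centroid = mask_to_points(mask, depth).mean(axis=0)
```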

Video

The video below shows TIAGo autonomously opening the fridge and fetching a beer to deliver it (click to play):

Bring me a beer from the fridge

Pictures

Opening the fridge.
Getting the beer.
Closing the fridge.

home_net

home_net architecture.
Prediction result.

Along with this challenge, we released home_net, the segmentation network used throughout the project. For more information, head to the documentation linked below.
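As a concrete illustration of the state dependent segmentation described above, the following sketch reduces a per-pixel class prediction to a mask containing only the classes relevant to the current task state. The class names, indices, and state labels here are illustrative assumptions, not taken from the home_net code; refer to the linked documentation for the actual label set.

```python
import numpy as np

# Hypothetical class indices; the actual label set is defined by home_net
# and documented in the linked repository.
CLASSES = {"background": 0, "handle": 1, "bottle": 2, "can": 3}

# Classes that matter in each task state -- the "state dependent
# segmentation" from the use case above (state names are illustrative).
STATE_TARGETS = {
    "open_fridge":  {"handle"},
    "grasp_beer":   {"bottle", "can"},
    "close_fridge": {"handle"},
}

def target_mask(label_image, state):
    """Reduce a per-pixel class prediction (H, W int array) to a binary
    mask containing only the classes relevant to the current state."""
    wanted = [CLASSES[name] for name in STATE_TARGETS[state]]
    return np.isin(label_image, wanted)

# Example: once the fridge is open, keep only bottle and can pixels
# before back-projecting them into the robot's coordinate frame:
# mask = target_mask(label_image, "grasp_beer")
```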

Code, documentation, and additional data

Team

Raphael Memmesheimer - Team Leader
Lukas Buchhold - Semantic Segmentation
Ivanna Mykhalchyshyna - Manipulation