ROS nodes developed by the Active Vision Group (AGAS) at the University of Koblenz-Landau

AGAS ROS Packages

Packages and Stacks

Object recognition

The object recognition approach is based on feature extraction from RGB scene images. Features are matched against features in learned object models and clustered in Hough space to find a consistent object pose. Using this approach, team homer@UniKoblenz won the Technical Challenge of the RoboCup@Home league in 2012. The approach is available online as open-source software provided as a ROS package. The accompanying paper can be found here: Object Recognition Using Hough-transform Clustering of SURF Features
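The core idea can be illustrated with a short, hedged sketch (this is not the or_nodes implementation): SURF descriptors from a learned model are matched against the scene, and each match casts a vote in a coarse Hough accumulator; the densest bin gives a consistent object hypothesis. The real approach also votes over rotation and scale, while this sketch only bins translation and assumes OpenCV with the contrib modules installed.

# Illustrative sketch of Hough-space clustering of SURF matches.
# Not the or_nodes code; assumes opencv-contrib-python provides SURF.
import cv2
from collections import Counter

def find_object(model_img, scene_img, bin_size=20):
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_m, des_m = surf.detectAndCompute(model_img, None)
    kp_s, des_s = surf.detectAndCompute(scene_img, None)

    # Match model descriptors against the scene and keep unambiguous matches.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des_m, des_s, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    # Each match votes for the translation of the model origin in the scene.
    votes = Counter()
    for m in good:
        dx = kp_s[m.trainIdx].pt[0] - kp_m[m.queryIdx].pt[0]
        dy = kp_s[m.trainIdx].pt[1] - kp_m[m.queryIdx].pt[1]
        votes[(int(dx) // bin_size, int(dy) // bin_size)] += 1

    # The bin with the most votes is the consistent object pose hypothesis.
    (bx, by), count = votes.most_common(1)[0]
    return (bx * bin_size, by * bin_size), count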

Install ROS package: sudo apt-get install ros-indigo-or-nodes

Source:

https://gitlab.uni-koblenz.de/robbie/homer_object_recognition

Robot face

Our highly customizable robot face can show 7 different facial expressions and thus provides Lisa with the ability to express emotions. This capability is crucial for robots to be accepted as everyday companions in domestic environments. Aiming towards a more realistic interaction experience, our robot face moves its lips synchronously to the synthesized speech. The robot face is available for ROS and can be used with any robot that integrates ROS in its architecture. The accompanying paper can be found here: Enhancing Human-Robot Interaction by a Robot Face with Facial Expressions and Synchronized Lip Movements
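Since the face is a regular ROS node, other nodes drive it over topics. The following minimal sketch publishes a text message for the face to speak; the topic name is an assumption for illustration, so check the package documentation for the actual interface.

#!/usr/bin/env python
# Hedged example: make the robot face speak with synchronized lip movements.
# The topic name below is an assumption, not taken from the package docs.
import rospy
from std_msgs.msg import String

rospy.init_node('face_demo')
pub = rospy.Publisher('/robot_face/text_out', String, queue_size=1)  # assumed topic
rospy.sleep(1.0)  # give the publisher time to connect to the face node
pub.publish(String(data='Hello, I am Lisa.'))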

ROS package: sudo apt-get install ros-indigo-robot-face

Source:

https://gitlab.uni-koblenz.de/robbie/homer_robot_face


Mapping - Navigation

Our mapping node works on odometry from the robot platform on /odom and a laser scan. The created map is published on /homer_mapping/slam_map and the calculated pose is published on /pose. The map_manager subscribes to all map topics and the laser scans, merges them and publishes the merged map on /map. Our navigation uses this map on topic /map and the pose on topic /pose for the main navigation. A path is planned on the map from the latest pose to the target. While following this path the navigation also uses the laser scan on topic /scan to avoid obstacles.
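The topic interface described above can be exercised with a minimal listener node; the sketch below only subscribes to the merged map and the pose estimate named in the text. The message types are assumed to be the standard ones (nav_msgs/OccupancyGrid and geometry_msgs/PoseStamped).

#!/usr/bin/env python
# Minimal listener for the mapping/navigation topics described above.
# Message types are assumptions based on common ROS conventions.
import rospy
from nav_msgs.msg import OccupancyGrid
from geometry_msgs.msg import PoseStamped

def on_map(msg):
    rospy.loginfo('merged map: %dx%d cells, resolution %.3f m',
                  msg.info.width, msg.info.height, msg.info.resolution)

def on_pose(msg):
    rospy.loginfo('pose estimate: x=%.2f y=%.2f',
                  msg.pose.position.x, msg.pose.position.y)

rospy.init_node('mapnav_listener')
rospy.Subscriber('/map', OccupancyGrid, on_map)
rospy.Subscriber('/pose', PoseStamped, on_pose)
rospy.spin()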

ROS packages: sudo apt-get install ros-indigo-homer-mapping

sudo apt-get install ros-indigo-homer-navigation

Source:

https://gitlab.uni-koblenz.de/robbie/homer_mapnav


homer_gui

This is the main GUI of our codebase. It has a map tab where points of interest and regions of interest can be defined and navigated to. The object recognition also has its own tab, which is used for learning objects and testing the recognition with a recognition loop. We created a tab for sending test messages, e.g. for offline testing of nodes. Finally, there is a tab for robot control: it shows the latest laser scan and camera data and offers simple hardware control options. At the bottom of the GUI a hardware status bar gives live information about our hardware modules.
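The same kind of offline testing can also be done from a script instead of the test-message tab. As a hedged example, the sketch below publishes a fake laser scan so that nodes consuming /scan can be tested without the robot; the frame id and range values are made up for illustration.

#!/usr/bin/env python
# Publish a synthetic laser scan for offline testing of downstream nodes.
# Values and frame id are illustrative assumptions.
import math
import rospy
from sensor_msgs.msg import LaserScan

rospy.init_node('fake_scan')
pub = rospy.Publisher('/scan', LaserScan, queue_size=1)
rate = rospy.Rate(10)
while not rospy.is_shutdown():
    scan = LaserScan()
    scan.header.stamp = rospy.Time.now()
    scan.header.frame_id = 'laser'
    scan.angle_min = -math.pi / 2
    scan.angle_max = math.pi / 2
    scan.angle_increment = math.pi / 180
    scan.range_min = 0.1
    scan.range_max = 10.0
    scan.ranges = [2.0] * 181  # flat wall two metres in front of the robot
    pub.publish(scan)
    rate.sleep()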

ROS Package:

sudo apt-get install ros-indigo-homer-gui

Source:

https://gitlab.uni-koblenz.de/robbie/homer_gui


robot_platform

The package holds all code and information needed to get your robot platform going with an Arduino. The included documentation shows the setup with a Pioneer 3-AT platform as an example. We use nearly the same technique in our CU-2WD platform with our main robot. The main difference is that the CU-2WD platform is equipped with absolute encoders, while the Pioneer 3-AT platform uses incremental encoders.

Documentation: arduino-robot-platform-EN
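The encoder difference mainly affects how the per-cycle tick delta for odometry is computed. The sketch below is illustrative only (it is not code from the robot_platform package, and the tick resolution is an assumed value): with incremental encoders the counter already accumulates, while absolute encoders report a position within one revolution and need wrap-around handling.

# Illustrative comparison of the two encoder types mentioned above.
# Not taken from the robot_platform package; ticks_per_rev is an assumption.

def delta_incremental(ticks_now, ticks_prev):
    # Incremental encoders (Pioneer 3-AT): the counter accumulates ticks,
    # so the per-cycle delta is a plain difference.
    return ticks_now - ticks_prev

def delta_absolute(angle_now, angle_prev, ticks_per_rev=4096):
    # Absolute encoders (CU-2WD): each reading is a position within one
    # revolution, so wrap-around must be handled when differencing.
    d = angle_now - angle_prev
    if d > ticks_per_rev // 2:
        d -= ticks_per_rev
    elif d < -ticks_per_rev // 2:
        d += ticks_per_rev
    return d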

3D printable Objects


Asus Xtion Tripod Mount

This little part makes it easy to mount your Asus Xtion Pro Live on a normal tripod with a 1/4"-20 screw. To use it, pop off the caps on the sides of the main hinge and take the underlying screw out. Now you should be able to swap the packaged desk stand for your printed mount. Afterwards tighten the screw and put the caps back on.


Pan-tilt-unit with Servos

This is a four-part design. On the top you can see our Asus Xtion mount; it is glued to the tilt unit. Inside the tilt unit we use a 9 g servo for the up-and-down movement. This servo is screwed in place and its shaft is connected to the middle part. The bottom of the middle part is connected to the shaft of the second servo, so the whole tilt mechanism is moved to get the pan movement. The pan servo is screwed to the base.

all 3D designs