RoCKIn Day 2 and Day 3

After we had a base that was able to map, localize and drive autonomously, we had to add ears for speech understanding and vision for object perception and face detection. We brainstormed about what we could use to mount the ASUS Xtion (which, luckily, is powered via USB only). While walking through the RoCKIn tent we asked some @Work guys whether they had a spare part we could use for this. Sven from the b-it bots team gave us a square, with which we were able to mount a rail that holds an old version of our pan-tilt unit setup, originally designed for use with a Kinect. After mounting it we added some tape - et voilà: a basic mount for the ASUS Xtion and the microphone was built.

Because we didn't have a PTU on the new robot, we had to change some parts of the code. For instance, while tracking people the PTU used to keep looking at the tracked person; we changed this code so that the youBot itself turns towards the tracked person. The nice thing about the new setup was that we could rely on components that don't need an external power supply, which made things a bit easier. So we were able to use the new youBot setup completely on the second competition day and could compete in all challenges from then on. Some minor navigation issues were fixed and we had a competition-ready robot. Only a manipulator was missing, but since we had more important problems we did not pursue porting the Katana arm to our setup as well. We preferred to invest some time in getting voice output on the robot, which sounds like an easy problem to solve. But we quickly realized that it gets hard if you don't have components like battery-powered speakers and have to use the notebook's internal audio jack, which blocks the audio output as soon as a microphone is plugged in.
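The tracking change itself is small: the person tracker already gives us the position of the tracked person relative to the robot, so instead of panning a camera we rotate the whole base until the person is roughly centered in front of it. Below is only a minimal sketch of that idea, not our actual code; the /tracked_person and /cmd_vel topic names, the PointStamped message type, and the assumption that the person position is expressed in the robot's base frame are all placeholders.

```python
#!/usr/bin/env python
# Minimal sketch: rotate the youBot base towards a tracked person instead of
# panning a PTU. Topic names and the PointStamped person message are assumed.
import math
import rospy
from geometry_msgs.msg import PointStamped, Twist

class TurnToPerson(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/tracked_person', PointStamped, self.person_callback)

    def person_callback(self, person):
        # Angle of the person relative to the robot's forward (x) axis,
        # assuming the point is given in the robot's base frame.
        yaw_error = math.atan2(person.point.y, person.point.x)
        cmd = Twist()
        if abs(yaw_error) > 0.1:  # ~6 degree dead band so we don't oscillate
            # Simple proportional controller, clamped to a safe turn rate.
            cmd.angular.z = max(-0.5, min(0.5, 1.0 * yaw_error))
        self.cmd_pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('turn_to_person')
    TurnToPerson()
    rospy.spin()
```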

So we tried to use a second notebook that should act as the face, using ROS's ability to connect to an external ROS core. But this turned out to block some messages we needed for our state machines and also seemed to be very unstable due to network issues. Then we tried to stream the audio via RDP to the other notebook, which worked but caused other problems. Finally, we just wrote a node that sends a command via SSH to the external notebook, which then publishes a message that is processed by its own ROS core and a running instance of our robot_face. A very hackish solution, but it worked out fine for us. Once you have had natural voice output on your robot, you definitely miss it afterwards when debugging state machines.
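For the curious, a stripped-down version of that hack could look roughly like the following. The hostname, the helper script on the second notebook, and the /robot_face/speak topic are made-up placeholders, not our real setup.

```python
#!/usr/bin/env python
# Minimal sketch of the SSH voice-output hack: a node on the robot listens for
# text to speak and forwards it via ssh to a second notebook, where a small
# script publishes the text to the local ROS core that runs robot_face.
# Hostname, script path and topic name are placeholders.
import subprocess
import rospy
from std_msgs.msg import String

REMOTE_HOST = 'face-notebook'            # assumed hostname of the second laptop
REMOTE_CMD = '~/speak_on_local_core.sh'  # assumed helper script on that laptop

def speak_callback(msg):
    text = msg.data.replace('"', '')     # keep the remote shell command simple
    # Fire and forget: the remote script publishes to its own ROS core.
    subprocess.Popen(['ssh', REMOTE_HOST, REMOTE_CMD, '"%s"' % text])

if __name__ == '__main__':
    rospy.init_node('remote_speech_bridge')
    rospy.Subscriber('/robot_face/speak', String, speak_callback)
    rospy.spin()
```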

All in all, we could use all this to test the robot on the second competition day. With some improvements in the code we started the third and last RoCKIn day, and (almost) everything went really smoothly. Almost all of our points were scored on that last day, but it was still enough to win the competition! Thanks again to team b-it bots for the robot platform - it would not have been possible without you!


First Day of RoCKIn

Spoiler alert! The story that starts in this post ends well: thanks to team b-it bots, who lent us a robot, we were able to win the competition and to take 2nd place in the object perception benchmark.

But read the story first …

We were very excited to go to RoCKIn. Some things had changed in our hardware and software and we were eager to test them. Among others: we integrated the sponsored IDS UI-5580CP camera for object recognition, replaced the Kinect with an ASUS Xtion, and integrated the Vocon speech recognition from Nuance, which we obtained under an academic license. With all this news and excitement, why have there been no new blog posts since the preparation days? There is one simple reason for this: something happened that left us no time for anything other than building up a setup that was at least somewhat ready for the RoCKIn competition days. So what could that be? Yes, of course: our robot burned. Lisa started smoking when we inserted the first pair of batteries on the first competition day. There is hardly anything worse that could happen two hours before a competition (in this case Getting to know my home).

After Lisa burned we immediately started first-aid measures by performing a critical operation, and we found an error in the brain of our robot platform. The microcontroller had a messy black frame around its center, which basically means "S%!T". Where are fuses when you need them?

Fortunately we had a second microcontroller with us, so we replaced the broken one and started flashing it with the firmware. Everything looked fine at this point, but when we tried to talk to the platform via the serial connection, we did not get any response. Shortly after, the fuse blew that should have kept the microcontroller from blowing in the first place. A bit late, little fuse. So we replaced the fuse in the hope of getting some sign in the form of serial bits and bytes. Still no sign. After some measurements we noticed a connection between the left motor encoder and a green flashing LED: the LED constantly got darker and darker when the encoder cable was connected. A sign … but a bad one. We realized there was little hope of getting this machine driving in this competition - a burned encoder is not something you can simply pick up at a consumer electronics market.

So while Viktor went on with the CU2WD revival, Nico and Raphael tried to find an alternative way to compete. We took a look around our area and saw a team with two iRobot Roombas - those robots that may clean your rooms. We kindly asked the Watermelon team whether they needed both, and the response was: not really. They allowed us to take one with us, and we tried to integrate it with our mapping and navigation. This was quite easy since we had switched to ROS last year. Since we needed quick results, we tried to use the ASUS Xtion as a 2D laser scanner. This worked out quite well, but after having the system up we realized that charging the Roomba's batteries takes a long time and that we weren't able to establish a connection while charging. Definitely not good for testing purposes.
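The idea behind the fake laser scanner is simple: take one horizontal row of the Xtion depth image and republish it as a sensor_msgs/LaserScan that the mapping and localization stack can consume. In ROS the depthimage_to_laserscan package does this job properly; the sketch below only illustrates the principle, and the topic names, the roughly 58 degree field of view and the float32-metre depth encoding are rough assumptions.

```python
#!/usr/bin/env python
# Minimal sketch: turn the middle row of the Xtion depth image into a fake
# sensor_msgs/LaserScan. Topic names, field of view and depth encoding are
# assumptions; the depthimage_to_laserscan package does this more carefully.
import math
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image, LaserScan

H_FOV = math.radians(58.0)  # approximate horizontal field of view of the Xtion

class FakeScan(object):
    def __init__(self):
        self.bridge = CvBridge()
        self.pub = rospy.Publisher('/scan', LaserScan, queue_size=1)
        rospy.Subscriber('/camera/depth/image', Image, self.depth_callback)

    def depth_callback(self, msg):
        depth = self.bridge.imgmsg_to_cv2(msg)   # assumed float32 depth in metres
        row = depth[depth.shape[0] // 2]         # centre row of the image
        n = len(row)
        scan = LaserScan()
        scan.header = msg.header
        scan.angle_min = -H_FOV / 2.0
        scan.angle_max = H_FOV / 2.0
        scan.angle_increment = H_FOV / (n - 1)
        scan.range_min, scan.range_max = 0.45, 5.0
        ranges = []
        for i in range(n):
            angle = scan.angle_min + i * scan.angle_increment
            z = float(row[n - 1 - i])            # image columns run right to left
            # Depth is measured along the optical axis; convert it to a ray length.
            ranges.append(z / math.cos(angle) if z > 0 else float('inf'))
        scan.ranges = ranges
        self.pub.publish(scan)

if __name__ == '__main__':
    rospy.init_node('xtion_fake_scan')
    FakeScan()
    rospy.spin()
```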

So while the batteries were charging, we were searching for alternatives. Then we saw Jakob, who works at KUKA; since Raphael knows him from the last RoCKIn Camp, we asked him whether he had a youBot available that he could lend us. He said: no … but the b-it bots team has two youBots and might need only one.

The b-it bots team then helped us to set up the robot and establish a connection with it, gave us a brief introduction to how to use it, and also gave us all the code that enabled us to send velocity commands and to get the wheel odometry back. That is, besides some distance measurements from, for instance, a laser scanner, all we need to let the robot drive autonomously. With their help we were able to set up a new platform that maps and localizes itself within 3 hours. Wow, not bad. From then on our robot had the most basic skill for at-home environments back. So let's add all the other skills and we are back in the competition. The story will be continued in the next blog post.
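In ROS terms this interface boils down to one published and one subscribed topic. Here is a minimal sketch, assuming the usual /cmd_vel and /odom topic names, which may well differ on a real youBot setup:

```python
#!/usr/bin/env python
# Minimal sketch of the interface we got: publish velocity commands and read
# the wheel odometry back. Topic names are the common ROS defaults (assumed).
import rospy
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry

def odom_callback(odom):
    pos = odom.pose.pose.position
    rospy.loginfo('odometry: x=%.2f m, y=%.2f m', pos.x, pos.y)

if __name__ == '__main__':
    rospy.init_node('youbot_drive_demo')
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('/odom', Odometry, odom_callback)

    cmd = Twist()
    cmd.linear.x = 0.1   # drive slowly forward (m/s)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        cmd_pub.publish(cmd)
        rate.sleep()
```

Everything else, i.e. the mapping and localization stack, sits on top of exactly these two topics plus a laser scan (or the fake one from the Xtion).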


Our borrowed robot "youBart", based on a KUKA youBot that we got from team b-it bots.

RoCKIn - preparation days

Hello everyone, here is our report on the first two preparation days (Wednesday and Thursday):

Shortly after our arrival in Toulouse on Tuesday, we spent the evening making plans for the next day - the first of two preparation days.

On Wednesday, unpacking and putting Lisa back together went surprisingly smoothly, so we could immediately start testing our prepared state machines. But we spent the rest of the morning solving several merge problems until we had a solid basis for future development. Nevertheless, Lisa was the first robot to enter the newly built competition apartment and used the opportunity to test her current abilities.

Thursday: We spent a lot of time establishing connections to a few networked devices, e.g. an IP camera and the lighting. At the moment Lisa still has some trouble moving autonomously and following us in the new and quite narrow environment, but we are working on it until she gets used to it. Lisa also had to learn a lot of new objects, faces and uniforms in order to be able to identify them correctly.

There is still a lot of detail work to do and much room for optimization, but we are looking forward to taking part in the next three exciting competition days.


Here you can see us working. By the way: we are in Toulouse at the space museum Cité de l'Espace - so we are working directly next to an Ariane 5!

ProboCups and RoCKIn

Hello guys,

Lisa here ;). These past two weeks were really exciting for me. My team and I had the first two ProboCups, a sort of test run for the real competition next year. The different teams had to test the games they had built for me under real RoboCup conditions.

Taking into consideration that it was their first time doing this, they did really well. Of course there were some problems and some games didn't work well, but I think it was a good experience for everyone, including me. At our second ProboCup today we managed to score points in games where we didn't in the first one. So some improvements were made, but unfortunately we lost some points in previously successful games. That was a bit of a bummer, but nevertheless I'm happy and I think my team is getting better and better.

But that's not the end of my report for today. Next week I will travel to France together with my team from Robbie23 to compete in the RoCKIn competition.

This team worked really hard and my expectations for winning some of the challenges are really high. I hope that I will be able to tell you good news next time, and maybe I will learn some French words while I'm there :).

GoodBye and Salut

Lisa