News

Hey guys,

2015 started very well. Next week we will have a “Probocup” to check our progress. Thanks to the hardware team, Lisa is finally capable of driving and she can move her arm again. Now we feel confident enough to start working on the challenges of the new rulebook.

Bye!


Christmas special

Hey there, Lisa here :)

This week was really exciting for me. On Monday I met my mother and grandmother at the Christmas lecture. It was really touching. I even talked to my mother.


Everyone had fun. Over the last weeks I did not feel that well, but our hardware team helped me ;-).

On Friday the people from SWR came to make a documentary about me :D . It will be broadcast on 10th January 2015 at 6:15 pm.


After the SWR team was gone it was time for „Schrottwichteln“ (a Secret Santa exchange with junk gifts).

There were some quite weird presents among them.


Well, some of us took the word „Schrott“ (junk) really seriously.

But this was not the end of the day! Later we had a Christmas party for all Robbie participants of the last years :-)


Even the people from Robbie 20 and 22 were there. Denise talked about her time in Robbie 22 and last year’s RoboCup.


… with live music :-)


… and Viktor even read a poem.


All in all, it was a very exciting week, and I am looking forward to next year and Sprint 3.

Bye bye Lisa

RoCKIn Day 2 and Day 3

After we got a base that was able to map, localize and drive autonomously, we had to add the ears for speech understanding and the vision for object perception and face detection. We brainstormed about what we could use to mount the ASUS Xtion (which, luckily, is powered via USB only). While walking through the RoCKIn tent we asked some @Work guys whether they had a spare part we could use for this. Sven from the b-it bots team gave us an angle bracket. With that we were able to mount a rail that holds an old version of our pan-tilt unit setup, which was designed for use with a Kinect. After we mounted it we added some tape, et voilà: a basic mount for the ASUS Xtion and the microphone was built.

Because we didn’t have a PTU on the new robot, we had to change some parts of the code. While tracking people, for instance, the PTU used to keep looking at the tracked person; we had to change this code so that the youBot base turns toward the tracked person instead. The cool thing about the new setup was that we could rely on components that don’t need an external power supply, which made things a bit easier. So we were able to use the new youBot setup completely on the second competition day and could compete in all competitions from then on. Some minor navigation issues were fixed and we had a competition-ready robot. Only a manipulator was missing, but as we had more important problems we did not pursue porting the Katana arm to our setup as well. We preferred to invest some time in getting voice output on the robot, which sounds like an easy problem to solve. But we quickly realized that it gets hard if you don’t have components like battery-powered speakers and have to use the internal microphone plug, which blocks the audio out when a microphone is plugged in.
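Replacing the PTU pan with a base rotation boils down to a small proportional controller on the person’s bearing. A minimal sketch in plain Python (function and parameter names are our own for illustration; on the real robot the result would feed the angular part of a velocity command):

```python
import math

def turn_toward_person(person_x, person_y, max_yaw_rate=0.6, gain=1.0):
    """Compute an angular velocity (rad/s) that turns the base toward a
    tracked person, given their position in metres in the robot frame
    (x forward, y left). This replaces the old PTU pan behaviour:
    instead of panning the camera, the whole youBot base rotates."""
    bearing = math.atan2(person_y, person_x)  # 0 rad = straight ahead
    yaw_rate = gain * bearing                 # simple P-controller
    # clamp to the platform's rotation limit
    return max(-max_yaw_rate, min(max_yaw_rate, yaw_rate))
```

In a ROS node this yaw rate would go into the `angular.z` field of a `geometry_msgs/Twist` message published on the base’s velocity topic.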

So we tried to use a second notebook that would act as the face, using the ability of ROS to connect to an external ROS core. But this turned out to block some messages we needed for our state machines and also seemed very unstable due to network issues. Then we tried to stream the audio via RDP to the other notebook, which worked but caused other problems. In the end we just wrote a node that sends a command via ssh to the external notebook, which then publishes a message that is processed by its own ROS core and a running instance of our robot_face. A very hackish solution, but it worked out fine for us. Once you have had natural voice output on your robot, you will definitely miss it afterwards when debugging state machines.
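The ssh workaround essentially reduces to building one remote command line. A rough sketch of the idea, assuming a hypothetical host name and topic (the `rostopic pub` call on the remote side is what would trigger the robot_face speech there):

```python
import subprocess

def build_say_command(host, text, topic="/robot_face/text_out"):
    """Build the ssh command line that makes the external notebook
    publish a speech message on its own ROS core. Host and topic
    names here are placeholders, not our actual configuration."""
    # -1 makes rostopic publish the message once and exit
    remote = "rostopic pub -1 {} std_msgs/String \"data: '{}'\"".format(topic, text)
    return ["ssh", host, remote]

# the node itself then only has to run the command, e.g.:
# subprocess.call(build_say_command("face-notebook", "Hello, I am Lisa"))
```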

All in all, we could use all of this to test the robot on the second competition day. With some improvements in the code we started the third and last RoCKIn day, and (almost) everything went really smoothly. Almost all of our points were scored on that last day, but it was still enough to win the competition! Thanks again to team b-it bots for the robot platform - it would not have been possible without you!

trophy

First Day of RoCKIn

Spoiler Alert! The story that starts in this post ends well: thanks to team b-it bots, who lent us a robot, we were able to win the competition and to take 2nd place in the object perception benchmark.

But read the story first …

We were very excited to go to RoCKIn. Some things had changed in our hardware and software and we were eager to test them. Among others: we integrated the sponsored IDS UI-5580CP camera for object recognition, replaced the Kinect with an ASUS Xtion, and also integrated the Vocon speech recognition that we got from Nuance under an academic license. With all this news and excitement, why were there no new blog posts since the preparation days? There is one simple reason: one thing happened that didn’t leave us any time for anything other than building something that was at least somewhat ready for the RoCKIn competition days. So what could that be? Yes, of course: our robot burned. Lisa started smoking when we inserted the first pair of batteries on the first competition day. There is nothing worse that could happen two hours before a competition (in this case Getting to know my home).

After Lisa burned, we immediately started first aid measures by performing a critical operation and found an error in the brain of our robot platform. The microcontroller had a messy black frame around its center, which basically means “S%!T”. Where are fuses when you need them?

Fortunately we had a second microcontroller with us, so we replaced it and flashed it with the firmware. Everything looked fine at this point, but when we tried to connect to the platform via a serial connection, we did not get any response. Shortly after, the fuse that was supposed to keep the microcontroller from blowing finally blew itself. A bit late, little fuse. So we replaced the fuse in the hope of getting some sign of life in the form of serial bits and bytes. Still nothing. After some measurements we recognized that there was a connection between the left motor encoder and a green flashing LED: the LED constantly got darker when the encoder cable was connected. A sign … but a bad one. We realized that there was little hope of getting this machine driving in this competition. A burned encoder is not something you can simply pick up in a consumer electronics market.

So while Viktor went on with the CU2WD revival, Nico and Raphael tried to find an alternative way to compete. We took a look around and saw a team with two iRobot Roombas: those are the robots that may clean your rooms. So we kindly asked the Watermelon team whether they needed both, and the response was: not really. They allowed us to take one, and we tried to integrate it with our mapping and navigation. This was quite easy, since we had switched to ROS last year. Since we needed quick results, we tried to use the ASUS Xtion as a 2D laser scanner. This worked out quite well, but after having the system up we realized that it takes a long time to charge the Roomba’s batteries, and we weren’t able to establish a connection while charging. Definitely not good for testing purposes.
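Using a depth camera as a 2D laser scanner essentially means taking one horizontal row of the depth image and projecting each pixel’s depth onto its viewing ray; this is roughly what the ROS depthimage_to_laserscan package does. A simplified sketch, assuming the Xtion’s roughly 58-degree horizontal field of view and made-up range limits:

```python
import math

def depth_row_to_scan(depth_row, fov_rad=1.0122, min_range=0.45, max_range=4.0):
    """Convert one horizontal row of Xtion depth readings (metres, measured
    along the optical axis) into (angle, range) pairs, mimicking a planar
    laser scan. fov_rad approximates the Xtion's horizontal field of view."""
    n = len(depth_row)
    scan = []
    for i, depth in enumerate(depth_row):
        # angle of this pixel column relative to the optical axis
        angle = -fov_rad / 2 + fov_rad * i / (n - 1)
        # project the forward depth onto the viewing ray
        r = depth / math.cos(angle)
        # drop readings outside the usable range, like a real scanner driver
        if min_range <= r <= max_range:
            scan.append((angle, r))
    return scan
```

In practice we would hand such (angle, range) data to the ROS navigation stack as a `sensor_msgs/LaserScan` message instead of building it by hand.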

So while the batteries were charging, we searched for alternatives. Then we saw Jakob, who works at KUKA; Raphael knows him from the last RoCKIn Camp, so we asked him if he had a youBot available that he could lend us. He said: no … but the b-it bots team has two youBots and may only need one.

The b-it bots team then helped us to set up the robot and establish a connection with it, gave us a brief introduction on how to use it, and also gave us all the code that enabled us to send velocity commands and to get the wheel odometry back. Together with some distance measurements, from a laser scanner for instance, that is all we need to let the robot drive autonomously. With their help we were able to set up a new platform that maps and localizes itself within three hours. Wow, not bad. From then on, our robot had the most basic skill for at-home environments back. So let’s add all the other skills and we are back in the competition. The story will be continued in the next blog post.
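Wheel odometry is enough of a starting point because it already gives a (drifting) pose estimate by dead reckoning; the laser scanner then corrects that estimate against the map. A minimal sketch of the integration step for an omnidirectional base like the youBot (plain Python, our own hypothetical names):

```python
import math

def integrate_odometry(pose, vx, vy, wz, dt):
    """Dead-reckon the robot pose (x, y, theta) from one step of
    body-frame velocities, as reported by the wheel odometry.
    vx/vy in m/s (the omnidirectional base can also move sideways),
    wz in rad/s, dt in seconds."""
    x, y, theta = pose
    # rotate the body-frame velocity into the world frame
    x += (vx * math.cos(theta) - vy * math.sin(theta)) * dt
    y += (vx * math.sin(theta) + vy * math.cos(theta)) * dt
    theta += wz * dt
    return (x, y, theta)
```

Calling this once per odometry update accumulates the pose; the drift that accumulates with it is exactly what the laser-based localization has to compensate.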

youbert

Our borrowed robot “youBart”, based on a KUKA youBot that we got from the b-it bots team