Robotics in the Arctic - with Bernardine Dias

 

Background: Bernardine Dias is a fourth-year Ph.D. student in the Robotics Institute at Carnegie Mellon University. Her interests and activities are wide and diverse. Recently, she moderated the 2002 Alumni Seminar in the Liberal Arts. In this interview, she shares her amazing field trip to the Canadian Arctic, where she not only had the opportunity to work on her robotics project, but also to meet scientists from all over the world!

 

What did you do this summer?

Bernardine: I went with a team of roboticists to the Canadian Arctic, to an island called Devon Island, which is within the Arctic Circle. We took our NASA-funded robot, Hyperion, which we had built the year before. The idea was to prove the concept of sun-synchronous circumnavigation, which means that you have a robot that plans its path and executes a loop in synchronization with the sun. Thus, it is entirely powered by the sun. It is cognizant of where the sun is and makes sure it always chooses paths that will allow it to remain powered by the sun. So if it has to go into a shadowed region for some science experiment, it will plan to be fully charged when it enters the shadowed region, and it will leave the shadowed region and get back in synchrony with the sun when it needs power again. So the guarantee is that it will not run out of power. It finishes the mission 24 hours later in the place that it started, with full energy capacity.
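The scheduling idea described here can be sketched as a simple energy-feasibility check: simulate the battery along a candidate timed loop, charging on sunlit legs and discharging on shadowed ones, and accept the loop only if the charge never drops below a safety floor and returns to full at the end. All names and numbers below are illustrative, not from the actual Hyperion code.

```python
# Illustrative energy-feasibility check for a sun-synchronous loop.
# Each leg of the candidate route is (duration_hours, in_shadow).
# The charge/discharge rates and capacity are made-up numbers.

CAPACITY = 100.0      # battery capacity (arbitrary units)
CHARGE_RATE = 8.0     # net gain per hour while in sunlight
DRAIN_RATE = 12.0     # net loss per hour while shadowed
FLOOR = 10.0          # never let charge fall below this

def loop_is_feasible(legs, start_charge=CAPACITY):
    """Return True if the timed loop never depletes the battery
    and ends with at least the charge it started with."""
    charge = start_charge
    for duration, in_shadow in legs:
        rate = -DRAIN_RATE if in_shadow else CHARGE_RATE
        charge = min(CAPACITY, charge + rate * duration)
        if charge < FLOOR:
            return False
    return charge >= start_charge

# A loop that charges fully before a shadowed science stop, as described:
route = [(4, False), (2, True), (18, False)]
print(loop_is_feasible(route))  # True
```

A planner would run a check like this over many candidate timed loops and keep only the ones that close with full energy capacity.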

How long was the trip?

Bernardine: The trip itself was about three weeks. We were supposed to be there for only a little less than two weeks, but we ended up staying longer because our experiments were delayed by bad weather. When it was raining and cloudy, we could not use the sun to power the robot.

What did you do when you were not busy with work?

Bernardine: We hung out with the media (such as the Discovery Channel, CNN, Popular Mechanics). So we spent some time talking to them. We also spent a lot of time learning about what other scientists were doing. We went on hikes, cooked, ate, and played cards and all sorts of fun things.

How was the food?

Bernardine: Food was great!!

What kind of food did you have?

Bernardine: All sorts of things! The cook was great! He would get up in the morning and make us crêpes for breakfast. He would make coffee cakes, Mexican food, Italian food, and all sorts of food. The last day we were there, I cooked a Sri Lankan meal with him for the entire camp. It was fun!

How were your housing arrangements?

Bernardine: We lived in tents. There were basically no people there, except for the people that came with the project and the people from NASA. So there were scientists from all around the world who had come to participate in the experiment (NASA's Martian habitat simulation experiments). It was pretty cool to meet such a wide variety of people from around the world!

When you were sleeping, did you need to wear a lot of clothes?

Bernardine: Since I'm from the tropics, it was very cold for me. But we had very warm sleeping bags. I didn't have to wear a jacket, just thermal underwear, and that was sufficient. We even had a thermal pad underneath the sleeping bag. My sleeping bag was actually a little too big, so I had to stuff it with other things to get rid of the air pockets. It took a little getting used to. That was the longest time I've ever camped.

How did you get interested in this specific project?

Bernardine: I was working on a project called "Cognitive Colonies," which formed the basis of my thesis work ("How do you coordinate several robots to work together?"). That project was funded by the military, and I decided about six months to a year into the project that I did not want my thesis funded by the military, for ethical reasons. So I spoke with my advisor, and he promised to get me funding for my thesis work from NASA; in the meantime, I would shift to a different project. And this project was up for grabs. It was something new; I hadn't worked much with navigation before. Actually, my advisor wrote the base code for the navigation, so I used that and applied it to this project. It was a great experience for me. It was a chance to go on a field trip! It was really an amazing experience, unlike anything else, to get the opportunity to take your robot out there and see what it's like to code on the spot. You get to interact with scientists from all over the world.

Why did you go to the Arctic?

Bernardine: We needed to go to a polar region, so it was either the Arctic or the Antarctic. The Arctic was a better choice for several reasons. For one, NASA already had a base there, where they do simulations and experiments about what people would do if they went to Mars. Having the facilities there already made it easier because we didn't have to do everything by ourselves. Also, for this project, it is important to note that the earth rotates once in one day (so from our point of view, the sun goes around once in one day). If we were down at the equator, we would have to go around the earth, which is a long way to go. But at the Arctic, the circular path is relatively small, and the robot could actually go at a reasonable speed and still keep up with the sun. The other important thing about the pole is that there was no vegetation. And that's key, because this robot is a prototype for going to other planets (so it's geared to a place where there is no vegetation). The terrain in the Arctic was much like planetary terrain.
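A quick back-of-the-envelope check makes this point concrete: at the robot's average speed of 30 cm/sec (mentioned below), the longest loop it can close in one 24-hour solar day is about 26 km, far shorter than the tens of thousands of kilometers of a circle of latitude near the equator, but easily achievable near a pole.

```python
# How long a loop can a 30 cm/sec robot close in one 24-hour solar day?
speed_m_per_s = 0.30
seconds_per_day = 24 * 3600
max_loop_km = speed_m_per_s * seconds_per_day / 1000
print(max_loop_km)  # 25.92 km: feasible near a pole, hopeless at the equator
```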

How fast did the robot go?

Bernardine: On average, it went 30 cm/sec.

How big is the robot?

Bernardine: It's pretty big. It's about the size of a vehicle, but much taller because it has a huge solar panel attached.

How expensive is the robot?

Bernardine: It costs about a million dollars.

Did the robot move based on preprogrammed code, or based on where the sun is?

Bernardine: What I worked on was the navigation software on the small scale. Paul's software, the mission-planning software, gave us an array of points in the circle that the robot had to move to, and the robot had to be at those points at a certain time in order to synchronize with the sun. These points were roughly 50 to 100 m apart. Chris's software was basically the "eyes" of the robot. The two stereo cameras on the robot took pictures and sent back images. I took the images, put them on a map, and planned the paths so that the robot would reach the goals corresponding to the points set by Paul.
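The division of labor described here (a mission planner that emits timed waypoints, and a local navigator that plans short obstacle-free paths between them over a map built from stereo imagery) might be sketched as below. The occupancy grid, cell sizes, and interfaces are invented for illustration; the real Hyperion navigator was more sophisticated.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid (True = obstacle).
    Returns a list of cells from start to goal, or None if unreachable.
    A stand-in for the local navigator that steers between the mission
    planner's timed waypoints."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:   # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols
                    and not grid[nr][nc] and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None

# Waypoints from the mission planner were roughly 50-100 m apart;
# here each cell might represent a few meters of mapped terrain.
occupancy = [[False, True,  False],
             [False, True,  False],
             [False, False, False]]
print(plan_path(occupancy, (0, 0), (0, 2)))
```

Obstacles detected in the stereo images would be marked in the grid, and the navigator would replan whenever new imagery changed the map.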

How many experiments did you do? And what were the results of those experiments?

Bernardine: We only managed to do two experiments, because it takes 24 hours to do an experiment and we didn't have very good luck with the weather. The first experiment was over 90% autonomous, so that was very exciting for us. Even the parts that we did do manually were more because we were cautious and didn't want to let the robot do it by itself. The second experiment had a whole lot of problems. Partly it was because it had rained before, and so it was muddy. We also went a different route, and we lost contact with the robot halfway through the route. There were all these delays, and we were constantly running behind the planned time. The sun was sinking really low, and it was shining right into the camera. So the autonomy system could not function correctly, because the camera could not see anything and was putting false obstacles everywhere. Also, because of the sun problem, we had to drive the robot manually a lot, and we managed to ram it into a rock and got it stuck. So we had to manually pick it up.

What would you want to improve on the robot, after this experience?

Bernardine: Basically, my software is the autonomy software, which allowed the robot to drive around autonomously. We had a sliding autonomy system, which means that we could have an operator operate the robot or put the robot in autonomous mode. There were three different modes: monitored, safeguarded, and autonomous. Monitored was the basic mode; it allowed the user to drive the robot blindly using a GUI. Safeguarded mode gave you an alarm when the robot was about to do something bad (like hit a rock), but would not override you. Autonomous mode allowed the robot to do everything by itself. We could set the speed manually (it defaulted to 30 cm/sec), and we could slow it down in certain places, by sending the robot a command, so that the robot had a more suitable speed. We could improve the robot by making the speed change autonomous.