Adding autonomy to a small tugboat
This is Lesley, an autonomous tugboat capable of navigating around foam icebergs and chasing down a stuffed pink narwhal on a remotely controlled tugboat.
Timeline: 2.5-week Final Project for Fall 2018 Fundamentals of Robotics Course at Olin College
Collaborators: Amy Phung, Robert Wechsler, Jordan Leadley, w. Professor David Barrett
The goal of this project was to add sensing capabilities and software to a remote-controlled tugboat in order to autonomously complete four missions. The missions took place in an indoor circular pool, and the robot could be reset in between missions. Mission descriptions are listed below, and the GitHub repo is here.
These are videos of Lesley, our autonomous tugboat, completing each mission during the final demo.
Here we see Lesley completing her first mission - circle the pool. The objective was to undock, and follow the pool wall until getting back to the dock, at which point the mission was complete. Here, we see Lesley take a sharp left out of the dock until she gets to the edge of the pool, at which point she begins following the pool wall. As she nears the dock, she takes a hard right to avoid a collision, and completes the mission.
Here we see Lesley completing her second mission - three figure 8’s around the pool. The objective was to undock, and then do three consecutive figure 8’s around the foam icebergs in the pool. Once again, Lesley takes a sharp left out of the dock until she sees the pool wall. She follows the wall until her heading goes past a certain threshold. At that point, she begins circling the iceberg until her heading hits another threshold, letting her know it’s time to straighten out. She repeats this process until she’s performed the three figure 8’s, and successfully completes the mission.
Here we see Lesley completing her third mission - one figure 8 around the pool, then dock. The objective was to undock, as per usual, perform a single figure 8 around the foam icebergs, and then dock. Lesley uses the same strategy for performing her figure 8 as before, but now breaks out of doing figure 8’s after her first one. At that point, she straightens out towards the dock, and uses her PixyCam to align herself with the big red dot above the dock. She straightens herself towards the dock, and slows down as she reaches her destination, successfully completing mission 3.
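The docking maneuver described above can be sketched as a simple proportional controller: steer to center the red dot in the PixyCam's frame, and throttle down as the dot grows larger (i.e., as the dock gets closer). This is a minimal illustration, not the team's actual firmware; the function name, gains, and thresholds are all assumptions.

```cpp
#include <algorithm>

// Hypothetical docking controller. blobX and blobWidth stand in for the
// position and size (in pixels) of the red marker as reported by the
// PixyCam; frameWidth is the camera's horizontal resolution.
struct DockCommand {
    int rudder;   // -100 (hard left) .. 100 (hard right)
    int throttle; //    0 (stop)      .. 100 (full ahead)
};

DockCommand dockingStep(int blobX, int blobWidth, int frameWidth) {
    const double kSteer = 0.8; // illustrative proportional steering gain
    const int nearWidth = 120; // blob width (px) treated as "at the dock"

    // Steer proportionally to the marker's offset from image center.
    int error = blobX - frameWidth / 2;
    int rudder = std::clamp(static_cast<int>(kSteer * error), -100, 100);

    // Throttle tapers linearly to zero as the marker fills the frame,
    // so the boat eases into the dock instead of ramming it.
    int throttle = std::clamp(100 - (100 * blobWidth) / nearWidth, 0, 100);
    return {rudder, throttle};
}
```

The taper is what produces the "slows down as she reaches her destination" behavior: the closer the dock, the larger the marker appears, and the lower the commanded throttle.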
Here we see Lesley on her fourth and final mission - undock and chase the pink narwhal! The objective was to undock, locate, and tag the boat carrying the pink narwhal. Alas, I don’t have video of the entire mission, so we’ll have to settle for a small snippet. Lesley isn’t particularly successful in this portion of the mission, as she fails to actually locate the narwhal, instead perceiving it as an obstacle and attempting to avoid it. This is because Lesley can only see the narwhal with her PixyCam, which is mounted on her front. Only later does she finally see the narwhal and tag it.
I managed this team to ensure that we could deliver a robot boat that could accomplish all of our missions. I planned with my team how to set up our sensing array to best accommodate our autonomy algorithms, and created our sensor mounts. My technical contributions also included setting up remote communication with the robot, and writing and testing control software for the first two missions.
At the start, we were given a small tugboat with batteries and motors already integrated, and access to high-quality 3D printers, various sensors, and various microcontrollers. Over the following weeks, we had to decide how we would approach the various missions and how our system would be architected, create sensor mounts, develop control software, test our algorithms, etc. The final assessment of the project was how well it completed the missions, so the pressure to deliver was on!
I spent a lot of the first week planning with my team how we would approach the various missions, how we were going to design our system, and how we would manage ourselves as a team. This meant deciding what sensors we were using, where we were putting them, how we were mounting them, how we were dividing up tasks, and how we were going to approach the different missions. My primary technical task was to design and fabricate our sensor mounts. You can see these mounts as the orange parts of the boat with the sensors attached. I designed the mounts in SolidWorks and 3D printed them in Olin’s robotics lab.
With a fleshed-out plan for system development in place, we divided and conquered subsystems in our second week. I set up radio communication between our robot boat and an offshore laptop via an XBee radio. This let us remotely e-stop our robot, as well as set it to different modes based on what mission it needed to accomplish. This is also where I was tasked with creating a communications protocol between three Arduino Uno microcontrollers - Arduino 1 gets camera data, Arduino 2 gets low-level sensor data plus radio messages, and Arduino 3 actuates our rudder and propellers. This was a result of a single Arduino Uno not having enough pins for all of our intended electronics. While this architecture seemed promising, we noticed that our system exhibited significant latency when we tried to accomplish the first mission: circle the pool. At that point, Amy suggested switching all of our firmware to a single Arduino Mega. When we tested this, we no longer saw such high latency, and decided to scrap the original multi-Arduino architecture.
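A minimal way to picture the shore-to-boat link is a single-byte command protocol: the laptop sends one character over the XBee, and the firmware maps it to a mode or an e-stop. This sketch is an assumption about how such a protocol could look; the mode names and byte values are illustrative, not our actual message format.

```cpp
// Hypothetical one-byte command protocol for the XBee link.
// Each mission gets a mode; 'x' is the emergency stop from shore.
enum class Mode {
    Idle,
    CirclePool,      // mission 1
    FigureEights,    // mission 2
    FigureEightDock, // mission 3
    ChaseNarwhal,    // mission 4
    EStop
};

// Map an incoming radio byte to a mode, ignoring unknown bytes so that
// line noise on the radio can't put the boat into a bogus state.
Mode parseCommand(char c, Mode current) {
    switch (c) {
        case '1': return Mode::CirclePool;
        case '2': return Mode::FigureEights;
        case '3': return Mode::FigureEightDock;
        case '4': return Mode::ChaseNarwhal;
        case 'x': return Mode::EStop;
        default:  return current; // unknown byte: keep the current mode
    }
}
```

Keeping the protocol to single bytes has a nice property on a half-duplex serial radio: there is no framing to get out of sync, so a dropped character costs one command rather than corrupting the stream.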
For the last portion of the project, we had a fully integrated robotic system that was ready to take on the missions. Except, we still needed to write the control software to make sure the robot boat wouldn’t bump into walls, over steer, go the wrong direction, etc. I developed a few different control schemes for tackling the first two missions - circle the pool, and do three consecutive figure 8’s around the pool. Jordan and I tested these control schemes on our system, and iterated on our controllers until we achieved functional behavior. Amy and Robert tackled the last two missions.
These are videos I took while testing out the controllers for the first two missions.
Lesley uses an on-board gyroscope to determine her heading at all times during her trip, as well as Sharp IR sensors on her port side to determine where she is relative to the pool wall. Her voyage begins with an undocking phase that ends once her heading is beyond a certain threshold. Then, she switches into a wall-following state. Once she gets close to the dock, which is determined via heading, she makes a sharp right to avoid colliding with the dock. She straightens out to swim right past the dock, but unfortunately keeps going towards the pool wall. We never fixed this bug because perfect was the enemy of done on this project, and all she needed to do was get past the dock.
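The circle-the-pool behavior just described is essentially a three-phase state machine: undock until a heading threshold, follow the wall at a target distance, then hard right when the heading says the dock is ahead. Here is a sketch under stated assumptions; the phase names, thresholds, and gain are all hypothetical stand-ins for whatever the real firmware used.

```cpp
#include <algorithm>

// Hypothetical controller for mission 1. headingDeg comes from the
// gyroscope (degrees accumulated since undocking); wallDistCm from the
// port-side IR sensors. All constants are illustrative.
enum class Phase { Undock, FollowWall, AvoidDock };

struct BoatState {
    Phase phase;
    int rudder; // -100 (hard left) .. 100 (hard right)
};

BoatState circlePoolStep(BoatState s, double headingDeg, double wallDistCm) {
    const double undockHeading = 80.0;  // heading at which undocking ends
    const double dockHeading   = 350.0; // heading that means "dock ahead"
    const double targetDist    = 40.0;  // desired distance to wall (cm)
    const double kWall         = 2.0;   // proportional wall-following gain

    switch (s.phase) {
        case Phase::Undock:
            s.rudder = -100; // sharp left out of the dock
            if (headingDeg >= undockHeading) s.phase = Phase::FollowWall;
            break;
        case Phase::FollowWall: {
            // Wall is to port: steer left when too far, right when too close.
            double err = wallDistCm - targetDist;
            s.rudder = std::clamp(static_cast<int>(-kWall * err), -100, 100);
            if (headingDeg >= dockHeading) s.phase = Phase::AvoidDock;
            break;
        }
        case Phase::AvoidDock:
            s.rudder = 100; // hard right past the dock
            break;
    }
    return s;
}
```

Driving every transition off heading thresholds rather than timers is what makes the run repeatable: the boat's speed can vary, but the gyro-integrated heading tells it unambiguously how far around the pool it has come.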
And wait, there’s more! Lesley can also run figure 8s around the icebergs in the pool. Here you can see a preliminary test of a more complex algorithm we designed.
Lesley begins once again by undocking. When her heading hits a certain threshold, she switches into wall-following mode, and begins chugging alongside the pool wall. Her heading hits another threshold, prompting her to make a hard right around the iceberg. She straightens out with yet another heading trigger. Lesley chugs along until her left Sharp IR sensors see the next iceberg. She makes a hard left, and the process continues like this until we send her a stop command.
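The sequence of triggers above can be sketched as a transition function: heading thresholds move the boat from undocking, to wall following, to circling, to straightening, and the left IR sensors fire the hard left at the next iceberg. The state names and threshold values here are illustrative assumptions, not the team's actual numbers.

```cpp
// Hypothetical figure-8 transition function. headingDeg is the
// gyro-derived heading; leftIrSeesIceberg is true when the left Sharp
// IR sensors detect the next iceberg. Thresholds are made up.
enum class F8State { Undock, WallFollow, CircleRight, Straight, CircleLeft };

F8State figureEightNext(F8State s, double headingDeg, bool leftIrSeesIceberg) {
    const double undockDone  = 80.0;  // end of the undocking turn
    const double startCircle = 180.0; // begin circling the first iceberg
    const double straightOut = 270.0; // time to straighten out

    switch (s) {
        case F8State::Undock:
            return headingDeg >= undockDone ? F8State::WallFollow : s;
        case F8State::WallFollow:
            return headingDeg >= startCircle ? F8State::CircleRight : s;
        case F8State::CircleRight:
            return headingDeg >= straightOut ? F8State::Straight : s;
        case F8State::Straight:
            return leftIrSeesIceberg ? F8State::CircleLeft : s;
        case F8State::CircleLeft:
            return s; // loops back via further heading triggers (omitted)
    }
    return s;
}
```

Mixing the two trigger types is the interesting part of this design: heading thresholds work well on the open arcs where there is nothing to sense, while the IR trigger handles the one event, reaching the next iceberg, whose timing depends on the boat's speed.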