The Autonomous Systems class at Michigan Tech introduces students to autonomous robot operation using both real TurtleBots and simulated ones in the Gazebo environment, all running on ROS.
Throughout the class we implemented various algorithms for perception, planning, and control, including a Hough Transform for object identification, the potential field method for path planning, and a PI controller for motion control.
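For reference, a minimal PI controller in Python looks roughly like the sketch below. The gain values, the anti-windup clamp, and the heading-error usage example are assumptions for illustration, not the exact controller we ran on the robot.

```python
class PIController:
    """Minimal PI controller: output = Kp * error + Ki * (integral of error)."""

    def __init__(self, kp, ki, output_limit):
        self.kp = kp
        self.ki = ki
        self.output_limit = output_limit
        self.integral = 0.0

    def update(self, error, dt):
        # Accumulate the integral term, clamping it to limit windup.
        self.integral += error * dt
        self.integral = max(-self.output_limit, min(self.output_limit, self.integral))

        # Proportional term reacts to the current error; the integral term
        # removes steady-state offset.
        output = self.kp * error + self.ki * self.integral
        return max(-self.output_limit, min(self.output_limit, output))


# Illustrative use: steer toward a desired heading at a 10 Hz update rate.
heading_controller = PIController(kp=1.5, ki=0.1, output_limit=1.0)
angular_velocity = heading_controller.update(error=0.3, dt=0.1)
```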
The final project was a two-stage competition. The first stage was exiting a simple maze that we had previously mapped with lidar scans; the second was detecting, approaching, and following a cylinder while avoiding collisions with the other robot, which was also trying to complete its objective.
After a decent amount of tuning, we achieved a near-perfect stage 1 run, completing it in 12 seconds, the fastest run in the class.
To accomplish this, our robot used the potential field method for navigation, using its position relative to the obstacles to calculate the best direction to move. This, along with an aggressive controller, allowed the robot to stay at top speed for almost the entire run.
In the video, our robot is on the right side.
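Below is a minimal sketch of the potential field calculation described above, assuming the obstacle points from the lidar and the goal are both expressed in the robot's frame. The gains and influence radius are illustrative values, not our tuned parameters.

```python
import numpy as np

def potential_field_direction(goal, obstacles, k_att=1.0, k_rep=0.5, influence=1.0):
    """Return a unit vector in the direction the combined potential pushes the robot.

    goal: (x, y) of the goal in the robot frame.
    obstacles: iterable of (x, y) obstacle points, e.g. from a lidar scan.
    """
    goal = np.asarray(goal, dtype=float)

    # Attractive force pulls the robot straight toward the goal.
    force = k_att * goal

    # Each obstacle within the influence radius pushes the robot away,
    # more strongly the closer it is.
    for obs in obstacles:
        obs = np.asarray(obs, dtype=float)
        dist = np.linalg.norm(obs)
        if 1e-6 < dist < influence:
            away = -obs / dist
            force += k_rep * (1.0 / dist - 1.0 / influence) / dist**2 * away

    norm = np.linalg.norm(force)
    return force / norm if norm > 1e-6 else np.zeros(2)


# Illustrative use: goal 2 m ahead, one obstacle 0.5 m ahead and slightly left.
direction = potential_field_direction(goal=(2.0, 0.0), obstacles=[(0.5, 0.2)])
```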
In the second stage, our robot was able to consistently identify and follow the cylinder at the center of the maze, but due to a lack of testing (probably because too much time was spent tuning stage 1), our robot-avoidance behavior did not work as we had hoped.
A video of the simulated robot following the cylinder until it is time to exit the maze is shown to the right.
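For illustration, one simple way to pick a cylinder out of a lidar scan is to cluster consecutive points and compare each cluster's width to the expected cylinder diameter. The sketch below shows that idea; it is not necessarily the Hough-based detector we actually used, and the diameter and gap threshold are assumed values.

```python
import math

def find_cylinder(ranges, angle_min, angle_increment,
                  expected_diameter=0.3, gap_threshold=0.2):
    """Return (distance, bearing) to the best cylinder candidate, or None.

    ranges: list of lidar range readings in meters.
    angle_min, angle_increment: scan geometry, as in a LaserScan message.
    expected_diameter: approximate cylinder diameter to match against.
    """
    # Split the scan into clusters wherever the range jumps sharply.
    clusters, current = [], []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            if current:
                clusters.append(current)
                current = []
            continue
        if current and abs(r - ranges[current[-1]]) > gap_threshold:
            clusters.append(current)
            current = []
        current.append(i)
    if current:
        clusters.append(current)

    # Score each cluster by how closely its arc width matches the cylinder.
    best, best_err = None, float("inf")
    for cluster in clusters:
        dist = sum(ranges[i] for i in cluster) / len(cluster)
        width = dist * angle_increment * (len(cluster) - 1)
        err = abs(width - expected_diameter)
        if err < best_err and err < expected_diameter / 2:
            bearing = angle_min + angle_increment * cluster[len(cluster) // 2]
            best, best_err = (dist, bearing), err
    return best
```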