Autonomous Object Collecting and Depositing Robot
Academic Level at Time of Presentation
Senior
Major
Engineering Physics
List all Project Mentors & Advisor(s)
Dr. Aleck Leedy; Dr. Ted Thiede
Presentation Format
Oral Presentation
Abstract/Description
The Institute of Electrical and Electronics Engineers (IEEE) is hosting SoutheastCon 2019, and one of the conference's focuses is a robotics competition. The competition simulates the task of removing space debris near a space station and depositing it in predetermined areas. This task must be completed by an autonomous robot that meets a variety of physical design criteria. Performance is measured in points: accomplishing desired tasks earns points, undesired actions cost them, and the robot that scores points most consistently wins.
In response to the competition, three other engineering physics majors and I have built a robot that uses LiDAR and cameras to accomplish the goals set forth by IEEE. The LiDAR detects the geometry of the course and, combined with known information about the course layout, allows the robot to maneuver without striking course features. Meanwhile, one camera lets the robot determine where it is on the course by reading the colors of the tape marking the course, since each color is unique to a specific area. A second camera observes the colors of the objects the robot has collected. Whenever the colors detected by the two cameras match closely enough, the robot knows it can deposit its collected objects and maximize its score. The robot runs on two Raspberry Pi computers and is programmed in Python, using the open-source OpenCV library for auxiliary camera-based functions. The robot has produced promising results and is being further refined.
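As a rough illustration of the color-matching step described above (not the team's actual code), the Python sketch below uses OpenCV to find the dominant tape color seen by one camera and the dominant object color seen by the other, then checks whether they agree before a deposit. The HSV color ranges, camera indices, and function names are illustrative assumptions and would need calibration against the real course.

# Illustrative sketch only: compare the tape color seen by the course-facing
# camera with the color of the collected object seen by the payload camera.
# The HSV ranges below are placeholder values, not calibrated course values.
import cv2
import numpy as np

# Hypothetical HSV ranges for course colors (placeholders for illustration).
COLOR_RANGES = {
    "red":   (np.array([0, 120, 70]),   np.array([10, 255, 255])),
    "green": (np.array([40, 70, 70]),   np.array([80, 255, 255])),
    "blue":  (np.array([100, 150, 0]),  np.array([140, 255, 255])),
}

def dominant_color(frame):
    """Return the color label whose HSV range covers the most pixels."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    counts = {
        name: cv2.countNonZero(cv2.inRange(hsv, lo, hi))
        for name, (lo, hi) in COLOR_RANGES.items()
    }
    return max(counts, key=counts.get)

def should_deposit(course_cam, payload_cam):
    """True when the tape color in view matches the collected object's color."""
    ok1, course_frame = course_cam.read()
    ok2, payload_frame = payload_cam.read()
    if not (ok1 and ok2):
        return False
    return dominant_color(course_frame) == dominant_color(payload_frame)

if __name__ == "__main__":
    course_cam = cv2.VideoCapture(0)   # camera watching the course tape
    payload_cam = cv2.VideoCapture(1)  # camera watching collected objects
    if should_deposit(course_cam, payload_cam):
        print("Colors match: deposit objects here.")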
Spring Scholars Week 2019 Event
Honors College Senior Thesis