Mars Rover Team

Skills Used: ROS, Python, Linux, Qt, Localization, GPS, Computer Vision, Nvidia Jetson, Team Management, Testing, Documentation

I worked on BYU's Mars Rover Team for close to two years. The first year, we were unable to participate in the competition due to COVID.

Both years I worked as a member of the Autonomy team, which focused on the Autonomy Task portion of the competition. I developed for our onboard computer, an Nvidia Jetson, using a ZED 2 stereo camera for vision and magnetometer data. We also had RTK GPS units, which let us localize the rover with respect to target coordinates.
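To make the GPS-based localization concrete, here is a minimal sketch of how a rover might turn its current fix and a target coordinate into a distance and heading. This is an illustration, not the team's actual code: the function names are hypothetical, and it uses a flat-earth (equirectangular) approximation, which is reasonable over the short distances involved in a rover competition.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def gps_offset_m(lat, lon, target_lat, target_lon):
    """Return the (east, north) offset in meters from (lat, lon) to the
    target, using a flat-earth approximation that is accurate over the
    short ranges RTK GPS is typically used for."""
    d_lat = math.radians(target_lat - lat)
    d_lon = math.radians(target_lon - lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(lat))
    return east, north

def distance_and_bearing(lat, lon, target_lat, target_lon):
    """Straight-line distance (m) and compass bearing (deg, 0 = north,
    90 = east) from the rover's fix to the target coordinate."""
    east, north = gps_offset_m(lat, lon, target_lat, target_lon)
    distance = math.hypot(east, north)
    bearing = math.degrees(math.atan2(east, north)) % 360
    return distance, bearing
```

With centimeter-level RTK fixes, the dominant error in this computation is the flat-earth approximation itself, which stays negligible for targets within a few kilometers.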

For background, the autonomous competition task consisted of 7 legs.

The first year I worked on the Mars Rover team, I was a volunteer helping other students with their capstone project. I was first assigned to work on AR tag detection. We used an open-source library (aruco_detect) to find AR tags and estimate the Euclidean distance to them, and we could reliably detect tags at up to 8 m under several lighting conditions. For legs 4-6, I programmed a simple algorithm that navigated to the given GPS coordinate and then performed a box search to find the AR tag (see image). This gave us a tunable, robust way to locate the tag.
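The box search described above can be sketched as an expanding-square waypoint generator: the rover drives ever-larger square legs around the GPS coordinate until the tag detector fires. This is an illustrative sketch rather than the team's implementation; the function name and the `step`/`rings` tuning parameters are hypothetical, but they show the sense in which the search is tunable.

```python
def box_search_waypoints(cx, cy, step, rings):
    """Generate (x, y) waypoints for an expanding square ("box") search
    around the center point (cx, cy).

    step  -- spacing between adjacent legs in meters (tunes coverage
             density against the tag detector's ~8 m reliable range)
    rings -- how many square "rings" to sweep before giving up

    Legs cycle east, north, west, south with lengths growing
    1, 1, 2, 2, 3, 3, ... steps -- the classic expanding-square pattern.
    """
    waypoints = [(cx, cy)]
    x, y = cx, cy
    directions = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # E, N, W, S
    leg = 0
    length = 1
    while length <= rings:
        for _ in range(2):  # two legs of each length before growing
            dx, dy = directions[leg % 4]
            x += dx * length * step
            y += dy * length * step
            waypoints.append((x, y))
            leg += 1
        length += 1
    return waypoints
```

In practice the rover would drive toward each waypoint in turn, breaking out of the pattern as soon as the AR tag detector reports a hit; `step` and `rings` are the knobs that trade search time against the chance of driving past the tag.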

In my second year on the team, the rover was my capstone project. My main contributions that year were:

I also had the chance to serve as Team Lead for the Autonomy team during the second half of the school year. This was a unique experience because it was the semester before the competition, and we had a lot of work to do to bring our best rover. Some things I learned: