Building our robots requires investment from individuals who come from a wide range of disciplines. ARUW is split into sub-teams according to these key disciplines. The sections below outline the skill areas and projects that each sub-team is working on.

Note for applicants: your major does not need to match the focus area you choose to work on. If you apply, apply to the sub-team(s) you most want to join, regardless of your academic degree.


Administrative Team

Performs the wide range of tasks that keep our club running smoothly, most of which involve managing our business, outreach, and PR needs.

Key Projects:

  • Reaching out to companies and acquiring sponsorships.

  • Maintaining club relations with current sponsors.

  • Media production: photography, videography, editing, etc.

  • Running our social media accounts.

  • Maintaining our website.

  • Organizing community outreach events.


Mechanical Team

Designs our robots using computer-aided design (CAD), prototypes mechanisms, manufactures components, and assembles and wires the final robots.

CAD is done in SolidWorks, and parts are manufactured using 3D printers, mills, lathes, and more.

The mechanical sub-team is split into squads of 4-7 people per robot, each led by a veteran member.

Key Projects:

  • Completing one robot per squad.

  • Improving design and manufacturing processes.


Software Team

Implements software systems for our main robot controllers, auto-aiming & auto-navigation capabilities, and electrical subsystems. Develops control loops for actuator control and designs neural networks for detection tasks. This work spans both embedded microcontrollers and desktop development.
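
To give applicants a flavor of this work, below is a minimal sketch of a PID controller, the kind of control loop used to drive an actuator toward a commanded setpoint. It is illustrative only; the class, gains, and limits are placeholders rather than code from our repositories.

    #include <algorithm>

    // Minimal PID controller sketch (illustrative, not ARUW production code).
    class PidController
    {
    public:
        PidController(float kp, float ki, float kd, float maxOutput)
            : kp(kp), ki(ki), kd(kd), maxOutput(maxOutput) {}

        // error = desired value - measured value; dt = time step in seconds.
        float update(float error, float dt)
        {
            integral += error * dt;
            float derivative = (error - lastError) / dt;
            lastError = error;
            float output = kp * error + ki * integral + kd * derivative;
            // Saturate so we never command more than the actuator can handle.
            return std::clamp(output, -maxOutput, maxOutput);
        }

    private:
        float kp, ki, kd, maxOutput;
        float integral = 0.0f;
        float lastError = 0.0f;
    };

In practice a loop like this runs at a fixed rate on the embedded controller, with its output sent to a motor driver each iteration.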

Additionally, we develop and maintain in-house platforms for vision model annotation/training and real-time robot debug rendering.

Our onboard robot systems are written primarily in C++, Python, and Rust. PyTorch and TensorRT are used for our machine learning models, and our full-stack web-based vision platforms are built with React.js and JavaScript.

Some of our recent work:

  • Deadwheel and LIDAR-based Odometry

  • IMU Sensor Fusion Algorithm (see the sketch after this list)

  • Particle Filter for Plate Detection

  • Real-time wireless vision debug rendering platform
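
As a small taste of the sensor-fusion work above, here is a minimal complementary filter that fuses a gyroscope rate with an accelerometer-derived angle to estimate orientation about one axis. This is a simplified illustration under assumed units (degrees and seconds), not our actual IMU fusion algorithm, and all names are placeholders.

    // Complementary filter sketch (illustrative, not ARUW production code).
    // alpha close to 1 trusts the gyro for fast changes; the accelerometer
    // term slowly corrects the drift that integrating the gyro accumulates.
    class ComplementaryFilter
    {
    public:
        explicit ComplementaryFilter(float alpha) : alpha(alpha) {}

        // gyroRate in deg/s, accelAngle in degrees, dt in seconds.
        float update(float gyroRate, float accelAngle, float dt)
        {
            angle = alpha * (angle + gyroRate * dt) + (1.0f - alpha) * accelAngle;
            return angle;
        }

    private:
        float alpha;
        float angle = 0.0f;
    };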

Key Projects:

  • Maintaining and expanding our open-source robot control systems library, Taproot.

  • Improving performance and accuracy of machine learning-based target classification & detection systems.

  • Refining localization, mapping, path planning, and autonomous navigation systems.

  • Designing full-stack vision tooling platforms, including model annotation/training and debug rendering.


Electrical Hardware Team

Designs power and signal systems for all robots, including custom printed circuit boards (PCBs) and human-scale wiring systems.

PCB designs are done in Altium, with electrical simulations in LTspice and PLECS.

Key Projects:

  • Designing, building, and testing an onboard capacitor bank that stores 2000 joules of energy, with an embedded microcontroller, synchronous buck/boost controllers, and dynamic software control.

  • Designing, building, and testing a brushless Electronic Speed Controller (ESC) with regenerative braking capabilities.

  • Designing power & signal multiplexers and distribution boards.