
Robot overview

amessing edited this page May 20, 2016 · 3 revisions

This page will ideally serve as a starting point for new members to learn about the structure of our robots.

Hardware

We tend to borrow the same basic hardware design from previous years, but modify many elements to fit the current year's challenge.

Wheel bases

Four Wheel Tank

Mecanum

Swerve
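As an illustration of how a holonomic base like the mecanum one is driven, here is a minimal sketch of the standard mecanum mixing math. The function and variable names are ours for illustration, not taken from our codebase.

```python
# Hypothetical sketch: standard mecanum wheel mixing.
# vx = forward speed, vy = strafe speed, omega = rotation rate,
# all normalized to [-1, 1].

def mecanum_mix(vx, vy, omega):
    """Return (front_left, front_right, back_left, back_right) wheel powers."""
    fl = vx + vy + omega
    fr = vx - vy - omega
    bl = vx - vy + omega
    br = vx + vy - omega
    # Scale so that no wheel power exceeds 1.0 in magnitude.
    m = max(1.0, abs(fl), abs(fr), abs(bl), abs(br))
    return (fl / m, fr / m, bl / m, br / m)
```

For example, a pure forward command drives all four wheels equally, while a pure strafe command drives the diagonal pairs in opposite directions.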

Appendages

Multiple Degree of Freedom Arm

We have used a Lynxmotion AL5B. This can use either a gripper or a vacuum suction cup.

Sensors

Ultrasonic

I2C Encoder

Encoder

Line Sensor

Switch

Actuators

Servo

Servo Motor

Stepper Motor

Electronics

Power

The robot is currently powered by a 7.2V NiMH battery fed through an adjustable switching power supply set to 5V. The battery voltage can be checked with a button on this power supply; any reading below 7.2V suggests the battery needs charging or replacement. Many high-power devices are connected directly to the battery, while the logic devices are generally on the 5V rail.

BeagleBone Black or Raspberry Pi

The main logic device is either a BeagleBone Black or a Raspberry Pi. It serves as the main processor for the robot and is accessible over WiFi. It connects over USB to several secondary devices and processors, which are outlined below.

Teensy++ 2.0

Used with the 2016 robot, but unlikely to see much future use because Arduinos are easier to work with.

Arduino Mega

Popular microcontroller used for connecting to sensors and actuators.

The Mega 2560 is a microcontroller board based on the ATmega2560. It has 54 digital input/output pins (of which 15 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button.

Arduino Uno

Popular microcontroller used for connecting to sensors and actuators.

The Uno is a microcontroller board based on the ATmega328P. It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz quartz crystal, a USB connection, a power jack, an ICSP header and a reset button.

Software

Currently the software is organized into a few main sections. By analogy to the human body, each section is named after a body part.

Head

The head is the software that runs on the BBB or RPi, effectively the brain of the robot. It handles all the high-level operations of the robot, as it is the easiest layer to change and work with. Currently this part is written in Python, which has its pros and cons but tends to work well in robotics. It is nice to be able to open a Python shell and run the same commands used in the main program for testing and debugging. Python also offers a great logging module that helps us analyze problems that occurred in previous runs.

Spine

The spine section of the software is essentially what commands the microcontrollers, such as the Teensy and the Arduinos. These micros implement a simple serial interface that accepts ASCII commands with arguments. Examples of such commands are:

  • Return the state of a certain limit switch.
  • Set a certain motor's power to 128 out of 256.
  • Set a servo's position to 58.

Just like an actual spine, these commands simply relay information to and from various parts external to the main processor. They should not do any processing beyond parsing the command, and they should execute quickly, without any delays. The only processing that belongs on the microcontrollers is a velocity PID loop for the motors.

Because these micros implement the same standard command-response protocol, the same code can be reused for each one, and they can be easily debugged with a command like the following:

picocom /dev/mega -b 115200 --echo
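The same ASCII framing can also be exercised from the head's Python code. Below is a minimal sketch of protocol helpers, assuming newline-terminated, space-separated commands; the command name "mp" and the pyserial usage shown in comments are illustrative, not the exact firmware grammar.

```python
# Hypothetical helpers for the ASCII spine protocol described above.
# The framing ("\n"-terminated, space-separated arguments) is an
# assumption based on the command examples, not the real grammar.

def build_command(name, *args):
    """Format a command line, e.g. build_command("mp", 0, 128) -> "mp 0 128\n"."""
    return " ".join([name] + [str(a) for a in args]) + "\n"

def parse_response(line):
    """Split a newline-terminated response into (keyword, argument list)."""
    parts = line.strip().split()
    return parts[0], parts[1:]

# With pyserial installed, the same strings would go over the wire:
#   port = serial.Serial("/dev/mega", 115200, timeout=1.0)
#   port.write(build_command("mp", 0, 128).encode())
#   keyword, args = parse_response(port.readline().decode())
```

Keeping the framing in two small pure functions makes it easy to unit-test the protocol without a robot attached.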

Imaging

The imaging section of the software is where we store code for image processing. This can run either on a pre-saved image or on a live video feed from a USB camera (we typically use a Logitech C270) or a Raspberry Pi Camera. There is currently a Target Tracking class, which finds targets based on color and size and determines the robot's angle and distance from them, and a Block Detection class, which determines the size and color of blocks in an image.
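The angle-and-distance estimate mentioned above can be sketched with a pinhole-camera model. The focal length and target width below are made-up example values, and the function names are ours, not those of the Target Tracking class.

```python
import math

# Hypothetical sketch: estimating angle and distance to a detected target
# from its pixel position and apparent width, via a pinhole-camera model.

FOCAL_PX = 600.0        # focal length in pixels (assumed calibration value)
TARGET_WIDTH_M = 0.10   # real-world target width in meters (assumed)

def target_angle(center_x, image_width):
    """Horizontal angle (radians) from the camera axis to the target center."""
    return math.atan((center_x - image_width / 2.0) / FOCAL_PX)

def target_distance(pixel_width):
    """Distance (meters) from apparent width, using similar triangles."""
    return TARGET_WIDTH_M * FOCAL_PX / pixel_width
```

A target centered in a 640-pixel-wide frame gives an angle of zero, and a smaller apparent width means a proportionally larger distance.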

Navigation

Navigation is a huge part of the robot each year, since the robot cannot do anything if it cannot reach the areas where it needs to work. Although it is part of the competition every year, navigating without error remains a difficult challenge.

Pathfinder

This is a library used for motion planning. It is still experimental, but has been used successfully in FIRST Robotics. The user inputs waypoints for where the robot needs to be, and the program generates splines connecting the waypoints such that the robot arrives at each one with the correct heading. These splines are then discretized into small time increments, with the position, velocity, acceleration, jerk, and heading of the robot recorded for each time step. The resulting files can then be used to make the robot follow the path.
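To make the discretization concrete, here is a sketch of what one recorded time step might hold and how a follower could turn it into a motor command with a simple feedforward. The field names mirror the quantities listed above; the gains and function are illustrative assumptions, not the library's API.

```python
from collections import namedtuple

# Hypothetical record for one discretized time step of a trajectory:
# dt is the step length, the rest are the quantities listed above.
Step = namedtuple("Step", "dt position velocity acceleration jerk heading")

def feedforward_power(step, kv=0.8, ka=0.1):
    """Velocity/acceleration feedforward for one step (gains are assumed)."""
    return kv * step.velocity + ka * step.acceleration
```

A real follower would loop over the steps every `dt` seconds, add a position-error feedback term, and use the heading to steer.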

NavX

The navX-micro (http://pdocs.kauailabs.com/navx-micro/) is a 9-axis inertial/magnetic sensor and motion processor. It can be used for field-oriented driving, auto-balancing, rotating to an exact angle, and collision detection.
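The core of field-oriented driving is rotating the driver's translation request by the heading the sensor reports, so "forward" stays fixed relative to the field rather than the robot. A minimal sketch (sign convention and names are our assumptions):

```python
import math

# Hypothetical sketch of field-oriented driving with a heading sensor:
# rotate a field-frame (vx, vy) command into the robot's frame.
# Assumes heading is measured counterclockwise in radians.

def field_oriented(vx, vy, heading_rad):
    """Return the (vx, vy) command expressed in the robot frame."""
    cos_h = math.cos(heading_rad)
    sin_h = math.sin(heading_rad)
    return (vx * cos_h + vy * sin_h, -vx * sin_h + vy * cos_h)
```

With a heading of zero the command passes through unchanged; after the robot turns, the same stick input still moves it in the same field direction.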

Torso

Torso contains all the code that runs on the microcontrollers, such as the Arduinos or Teensy. Most of this code receives and parses commands sent from the main processor and then works with the sensors and actuators.

ArduinoGen

This is a new project that generates Arduino code from a config file describing which sensors, actuators, and systems are connected to the Arduino.