
Introducing

RAMU
Really Awesome Mobile Unit

OPEN SOURCE

RAMU's design and codebase are completely open source

LOW-COST

RAMU is built from a mixture of low-cost components and 3D-printed parts

SLAM

RAMU is capable of performing Simultaneous Localization and Mapping

Autonomous Navigation by RAMU

Documentation of Build

RAMU is a low-cost, wheel-encoder-less, holonomic mobile base capable of autonomous navigation. RAMU is powered by ROS 2 and the Navigation 2 stack. RAMU is equipped with a Raspberry Pi 3B for most of its processing and an Arduino Uno for controlling its motors; the RPi communicates with the Uno over a serial connection. For more information, please follow the link to the open-source project below. The rest of this page documents the build process for RAMU.

As with most design projects, RAMU started off in the digital CAD world. One of the intentions with RAMU was to make it a holonomic robot using four mecanum wheels. As a means of reducing cost, 3D printing the mecanum wheels was explored first. However, after a few prototypes, it became clear that rollers made of PLA would not have sufficient traction on most surfaces. Rubber bands were stretched over the rollers, but this did not improve performance much, and the bands tore off over time. Hence, the decision was made to swap the 3D-printed wheels for ones procured online.


Slowly, the rest of the design took shape, including the chassis, motor mounts, sensor mounts and electronics mounts.


While waiting for parts to arrive or finish printing, the Raspberry Pi was set up with Ubuntu 20.04 and ROS 2 Foxy, and serial communication between the RPi and the Uno was prototyped. As seen in the video below, the brightness of the LED is set based on the displacement of a joystick axis. Here, a ROS 2 node publishes the state of the joystick. A second node receives the joystick state and maps the displacement to a brightness value in [0, 255]. The value is then sent as a binary string to the Uno over a serial port. While this example looks simple, it is rather significant: by replacing the LED with a DC motor, we have a system where a wireless joystick controls the speed of a motor. If we scale the system with three more motors, and add logic on the RPi to map the joystick displacement to motor velocities for the four motors (through inverse kinematics), we have the basics of a teleoperable robot.
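To make this concrete, here is a minimal sketch of what that second node could look like, written with rclpy and pyserial. The topic name, axis index, and serial device are my assumptions for illustration, not necessarily what RAMU actually uses:

```python
# Hypothetical sketch of the "second node" described above: subscribe to
# the joystick state and forward a brightness byte to the Uno over serial.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
import serial


class JoyToBrightness(Node):
    def __init__(self):
        super().__init__('joy_to_brightness')
        # /dev/ttyACM0 and 9600 baud are typical Uno defaults; adjust as needed.
        self.port = serial.Serial('/dev/ttyACM0', 9600, timeout=0.1)
        self.create_subscription(Joy, 'joy', self.on_joy, 10)

    def on_joy(self, msg: Joy):
        # Map axis displacement [-1, 1] to a brightness value [0, 255].
        brightness = int((msg.axes[1] + 1.0) * 127.5)
        self.port.write(bytes([brightness]))


def main():
    rclpy.init()
    rclpy.spin(JoyToBrightness())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

On the Uno side, the matching sketch only needs to read a byte from Serial and write it to a PWM pin with analogWrite.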

Once the parts started trickling in, construction of RAMU began.


Teleoperation logic was written for the RPi and the Uno, and RAMU was ready to take his first steps, as seen below.
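The heart of that logic is the mecanum inverse kinematics mentioned earlier: mapping a desired body velocity to four wheel speeds. Below is a minimal sketch using the standard mecanum formulation, with placeholder geometry constants rather than RAMU's actual dimensions:

```python
# Sketch of standard mecanum-wheel inverse kinematics: body velocities
# (vx, vy, wz) to the four wheel angular velocities. The geometry
# constants are placeholders, not RAMU's actual dimensions.

WHEEL_RADIUS = 0.03   # wheel radius in metres (assumed)
HALF_LENGTH = 0.10    # half the wheelbase (assumed)
HALF_WIDTH = 0.10     # half the track width (assumed)


def mecanum_ik(vx: float, vy: float, wz: float):
    """Return (front_left, front_right, rear_left, rear_right) wheel
    speeds in rad/s for the requested body twist."""
    k = HALF_LENGTH + HALF_WIDTH
    fl = (vx - vy - k * wz) / WHEEL_RADIUS
    fr = (vx + vy + k * wz) / WHEEL_RADIUS
    rl = (vx + vy - k * wz) / WHEEL_RADIUS
    rr = (vx - vy + k * wz) / WHEEL_RADIUS
    return fl, fr, rl, rr
```

Setting vy to a non-zero value while vx is zero is what lets the base strafe sideways, which is exactly what makes mecanum wheels worth the trouble.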

Next up was the addition of a low-cost 2D lidar, which gives RAMU a means to perceive the external world. A custom ROS 2 driver had to be written to publish the laser scan data as a ROS 2 message. The scan data is visualized in RViz below.
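The publishing side of such a driver is the easy part; the real work is decoding the lidar's own serial protocol, which is omitted in this sketch. The frame name and scan parameters here are illustrative assumptions:

```python
# Sketch of the publishing side of a 2D-lidar driver. Decoding the
# sensor's serial protocol is omitted; the flat 360-point scan below
# is a stand-in for real measurements.
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan


class LidarDriver(Node):
    def __init__(self):
        super().__init__('lidar_driver')
        self.pub = self.create_publisher(LaserScan, 'scan', 10)
        self.create_timer(0.1, self.publish_scan)  # ~10 Hz

    def publish_scan(self):
        scan = LaserScan()
        scan.header.stamp = self.get_clock().now().to_msg()
        scan.header.frame_id = 'laser'  # assumed frame name
        scan.angle_min = 0.0
        scan.angle_max = 2.0 * math.pi
        scan.angle_increment = 2.0 * math.pi / 360
        scan.range_min = 0.15
        scan.range_max = 12.0
        # Replace with ranges decoded from the lidar; 1 m everywhere here.
        scan.ranges = [1.0] * 360
        self.pub.publish(scan)


def main():
    rclpy.init()
    rclpy.spin(LidarDriver())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```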


With the 2D lidar mounted, RAMU was ready to create a map of my living room using SLAM. But first, a very naive state-estimation node was written to publish RAMU's odometry. Without wheel encoders, this meant relying directly on the velocity commands from the joystick publisher to estimate how far RAMU had moved. This is of course a highly unreliable source of odometry, but something is better than nothing. For creating the map, I planned to use Cartographer; however, several key parameters had to be tuned first to compensate for the poor odometry. Surprisingly, I was able to generate a fairly decent map of the environment. It definitely exceeded expectations.
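As a sketch of what such dead reckoning could look like, the node below integrates the latest commanded twist over time. Topic and frame names are assumptions for illustration:

```python
# Sketch of dead-reckoned odometry: integrate the commanded body
# velocities over time. Topic and frame names are assumptions.
import math
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry


class NaiveOdometry(Node):
    def __init__(self):
        super().__init__('naive_odometry')
        self.x = self.y = self.yaw = 0.0
        self.cmd = Twist()
        self.create_subscription(Twist, 'cmd_vel', self.on_cmd, 10)
        self.pub = self.create_publisher(Odometry, 'odom', 10)
        self.dt = 0.05
        self.create_timer(self.dt, self.integrate)

    def on_cmd(self, msg: Twist):
        self.cmd = msg  # latch the latest commanded twist

    def integrate(self):
        # Rotate the body-frame command into the odom frame and integrate.
        c, s = math.cos(self.yaw), math.sin(self.yaw)
        self.x += (self.cmd.linear.x * c - self.cmd.linear.y * s) * self.dt
        self.y += (self.cmd.linear.x * s + self.cmd.linear.y * c) * self.dt
        self.yaw += self.cmd.angular.z * self.dt

        odom = Odometry()
        odom.header.stamp = self.get_clock().now().to_msg()
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        # Yaw expressed as a quaternion about the z-axis.
        odom.pose.pose.orientation.z = math.sin(self.yaw / 2.0)
        odom.pose.pose.orientation.w = math.cos(self.yaw / 2.0)
        odom.twist.twist = self.cmd
        self.pub.publish(odom)


def main():
    rclpy.init()
    rclpy.spin(NaiveOdometry())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

The obvious flaw is that the robot is assumed to do exactly what it is told: wheel slip, stalls and collisions all go unnoticed, which is why the downstream SLAM and localization parameters needed so much compensation.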


Next up was configuring various parameters in the ROS 2 Navigation stack and testing out autonomous navigation. Most of the tuning centred on AMCL and on compensating for the poor odometry. The yaw tolerance at the goal was set fairly high, and the velocity parameters were set to modest values so the robot travels at a reasonable speed: too fast and the poor odometry would make it hard for AMCL to localize the robot; too slow and it would just be painful to watch. The perfect configuration file simply does not exist, as tuning can go on indefinitely. In the end, I arrived at a satisfactory level of performance. The video below captures RAMU's first autonomous mission.
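For flavour, here is a hypothetical launch-file fragment showing the kind of AMCL tuning involved: inflating the odometry noise parameters so the filter leans on the laser rather than on the dead-reckoned odometry. The values are illustrative only, not RAMU's actual configuration:

```python
# Hypothetical launch fragment illustrating the kind of AMCL tuning
# described above. Values are illustrative, not RAMU's actual config.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    amcl = Node(
        package='nav2_amcl',
        executable='amcl',
        name='amcl',
        parameters=[{
            # An omnidirectional motion model suits a holonomic base
            # (the exact string for this parameter varies across
            # Navigation 2 releases).
            'robot_model_type': 'omnidirectional',
            # Inflate the odometry noise model (alpha1..alpha5) so AMCL
            # trusts the laser more than the dead-reckoned odometry.
            'alpha1': 0.8,
            'alpha2': 0.8,
            'alpha3': 0.8,
            'alpha4': 0.8,
            'alpha5': 0.8,
        }],
    )
    return LaunchDescription([amcl])
```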

Not long after, RAMU was autonomously navigating much bigger maps, like the one in the video at the top of the page. This is merely the start of RAMU's growth: many more features, including a "follow-me" mode, are in the works. The project has been a humble introduction to mobile robotics and the power of open-source software. The use of ROS 2 and Navigation 2 shortened development time enormously; these works truly let developers like me stand on the shoulders of giants. The concepts of mapping, localization, path planning and control have strongly captured my interest, and I hope to delve deeper and grow my knowledge in these fields through working with RAMU.
