
6‐DOFenshmirtz: Functionality


6-DOF Arm 🦾🤖

Introduction

Welcome to our 6-DOF Robot Arm project! We're developing a 6-Degree-of-Freedom (DOF) robotic arm using NEMA 17 motors and TMC2099 drivers. The primary goal of this project is to create a versatile desktop helper arm that can assist in various tasks.

Yes, we're that lazy. We don't want to reach for the switch to turn off the lights. We don't want to unplug our phones from charging. And most importantly, we're lonely enough to need a robotic arm to high-five 🙌

Photos of us using our prototype model for high fives:


What we've accomplished

Hardware:

We've engineered the robotic arm using NEMA 17 motors and TMC2099 drivers to ensure accurate and smooth movements. The choice of components allows for precise control and positioning while maximizing torque output.

Software:

To achieve seamless movements, we developed a multi-axis stepper motor control library that uses timer interrupts and schedules motor steps with a heap. This approach ensures that the arm moves efficiently and smoothly across all six degrees of freedom with minimal latency.

Robot Kinematics:

We've also developed a custom forward and inverse kinematics library in Python, entirely from scratch, to position the end-effector as accurately as we can. This library is crucial for translating desired end-effector positions into motor movements. All functions in this library were designed using the Modern Robotics course lecture notes & videos as a reference.

PCB Design:

The customization extends further: we've designed a 4-layer PCB in Altium that integrates the TMC2099 drivers, an ATMEGA328 microcontroller, buck converters for power management, and an FTDI chip for USB-to-serial communication. This integration streamlines both the control and communication components of the arm.

Gear Ratio:

We designed custom gears that run at a 51:1 reduction. We fine-tuned this ratio to optimize the arm's performance, balancing speed and torque for enhanced functionality.
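
To see roughly why 51:1 is a good balance, here's a back-of-the-envelope calculation. The motor figures below are generic NEMA 17 datasheet numbers used purely for illustration, not measurements from our arm:

$$
\omega_{\text{joint}} = \frac{\omega_{\text{motor}}}{51}, \qquad \tau_{\text{joint}} \approx 51\,\eta\,\tau_{\text{motor}}
$$

With a nominal motor speed of 300 RPM and 40 N·cm of holding torque (and ignoring gear losses, so η ≈ 1), the joint turns at roughly 5.9 RPM (about 35°/s) while the available torque climbs to roughly 20 N·m: slow enough to control precisely, strong enough to move the arm's own links.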


Hardware (in-depth)


Software (in-depth) - MultiStepper

Our goal here was to create a MultiStepper library from scratch that can control 6 separate motors simultaneously. Challenge: when controlling each motor separately, it's hard to ensure that they all move in perfect synchronization (an Arduino can't precisely time 6 stepper motors on independent schedules). Timing differences in motor steps can quickly accumulate, leading to misalignment between the arm's end-effector and the desired target positions. This is problematic when executing complex movements that involve multiple axes moving simultaneously.

Our Solution - Motor Control and Heap Scheduling

Again, the goal here is precise, coordinated motion while optimizing performance. We've constructed an elegant system of motor control using the classes defined under arduino/motor-heap/. The key concept is the management of motor tasks through a heap-based scheduling mechanism. Let's take a closer look at how this works:

Motor Initialization and Heap Setup

Each motor is initialized using the StepperMotor class, with the necessary parameters such as step and direction pins and microstep configurations. Each motor is then connected to a shared MotorHeap instance, which manages motor tasks with a time-based priority approach.
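
A minimal sketch of what that setup looks like. The header names, pin numbers, constructor arguments, and the attachHeap() helper are hypothetical stand-ins for illustration; the real signatures live in arduino/motor-heap/:

```cpp
// Illustrative sketch only - pin numbers, constructor arguments and
// attachHeap() are hypothetical; see arduino/motor-heap/ for the real API.
#include "StepperMotor.h"
#include "MotorHeap.h"

MotorHeap heap;  // one shared scheduler for all six joints

// step pin, direction pin, microstep setting (example values, not our wiring)
StepperMotor base(2, 3, 16);
StepperMotor shoulder(4, 5, 16);
StepperMotor elbow(6, 7, 16);
// ...the two wrist joints and the gripper follow the same pattern

void setup() {
  base.init();
  shoulder.init();
  elbow.init();

  // each motor registers with the shared heap so its steps can be
  // scheduled by priority (i.e. time until its next step is due)
  base.attachHeap(&heap);
  shoulder.attachHeap(&heap);
  elbow.attachHeap(&heap);
}

void loop() {}  // filled in below
```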

1. Motor Control Functions

Under the StepperMotor class:

  • init(): Initializes the motor's pins for step and direction control.
  • getAngle() and getAngleDeg(): Provide the current angle of the motor in radians and degrees, respectively.
  • controlMethod(): Allows for different control modes, such as setting the motor's target position in degrees or radians, setting a specific step position, or controlling the motor's speed.
  • driveSpeed(): Adjusts the motor's target speed in rad/s.
  • getRPM(): Calculates and returns the motor's revolutions per minute (RPM) based on the current speed.
  • stepMotor() and run(): Control the actual stepping of the motor - managing step pulses, delays, and calculations based on the target speed.
  • runHeap(): Interfaces with the heap scheduling mechanism, determining when and how the motor should step, optimizing its timing and ensuring efficient execution.
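
Continuing the sketch above (and filling in its loop()), commanding a joint might look like this. The argument order and units here are guesses on our part; the real entry points are the functions listed above:

```cpp
// Hypothetical usage - argument order/units are guesses, not the actual API.
void loop() {
  base.controlMethod(90.0);   // e.g. "go to 90 degrees" in position mode
  shoulder.driveSpeed(0.5);   // spin the shoulder joint at 0.5 rad/s

  // let the shared heap decide exactly when each motor's next step fires
  base.runHeap();
  shoulder.runHeap();
}
```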

2. Motor Heap Scheduling

The magic happens with the MotorHeap class and its related member functions:

  • dequeue() and enqueue(): Manage the priority queue of motors waiting to execute their steps. Motors are dequeued from the heap when they're ready to step, and they're enqueued back with adjusted priority counts based on their timing needs.
  • heapUp() and heapDown(): Helper functions - maintain the heap structure by adjusting the order of motors based on their priority counts.
  • decrement(): Decrements the priority counts of motors still waiting to step, ensuring they move up in the heap queue over time.

Note: Sound unfamiliar? A quick overview of the heap data structure can be found here.
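
To make the priority-count idea concrete, here's a tiny desktop-C++ demo of the same scheduling pattern: every tick, all counts go down by one; whoever hits zero gets stepped and re-enqueued with a fresh count. This is a conceptual sketch, not the MotorHeap implementation itself:

```cpp
// Conceptual demo of heap-based step scheduling (not the real MotorHeap code).
#include <algorithm>
#include <cstdio>
#include <vector>

struct Entry {
  int motor;
  int ticksUntilStep;  // the "priority count"; smallest value sits at the root
  // invert the comparison so std::make_heap behaves as a min-heap
  bool operator<(const Entry& o) const { return ticksUntilStep > o.ticksUntilStep; }
};

int main() {
  // demo numbers: motor 0 steps every 2 ticks, motor 1 every 5 ticks
  std::vector<Entry> heap = {{0, 2}, {1, 5}};
  std::make_heap(heap.begin(), heap.end());

  for (int tick = 1; tick <= 10; ++tick) {
    // decrement(): every waiting motor gets one tick closer to its next step
    for (Entry& e : heap) --e.ticksUntilStep;  // uniform decrement keeps heap order valid

    // dequeue() motors whose count hit zero, "step" them, enqueue() with a new count
    while (!heap.empty() && heap.front().ticksUntilStep <= 0) {
      std::pop_heap(heap.begin(), heap.end());
      Entry e = heap.back(); heap.pop_back();
      std::printf("tick %d: step motor %d\n", tick, e.motor);
      e.ticksUntilStep = (e.motor == 0) ? 2 : 5;  // next deadline from its target speed
      heap.push_back(e); std::push_heap(heap.begin(), heap.end());
    }
  }
}
```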

3. Multi-Motor Coordination

Yeah. 2 classes still aren't enough. To drive multiple motors simultaneously, we've introduced the MultiStepperDrive class. It encapsulates an array of motor pointers and provides functions to initialize and drive them concurrently - basically, it acts as the central coordinator for all six axes.

  • init(): Initializes Timer 1 and sets up the interrupt-driven mechanism for controlling motor steps.
  • driveSpeed(): Enables coordinated speed adjustments for multiple motors, ensuring synchronized motion across all six axes.
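
In use, the coordinator boils down to a couple of calls. The constructor and the driveSpeed() argument shown here are hypothetical; the real signatures are in arduino/motor-heap/:

```cpp
// Hypothetical usage of MultiStepperDrive - names and signatures are guesses.
StepperMotor* joints[6] = {&base, &shoulder, &elbow, &wristPitch, &wristRoll, &gripper};
MultiStepperDrive arm(joints, 6);

void setup() {
  arm.init();  // configures Timer 1 and enables the step interrupt
}

void loop() {
  // one coordinated speed command for all six axes (rad/s per joint)
  float speeds[6] = {0.2, 0.1, 0.1, 0.3, 0.3, 0.0};
  arm.driveSpeed(speeds);
}
```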

4. Tying it together

In our motor-heap.ino file, we have an ISR (Interrupt Service Routine) TIMER1_COMPA_vect which keeps time and manages motor priorities. It decrements priority counts for motors awaiting execution and dequeues motors when they're ready to step. This ensures that the motors move in a choreographed way, executing complex tasks with precision.
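
Roughly, the ISR reduces to the following shape, using the shared heap from the earlier sketch. This is a paraphrase of the idea, not a copy of motor-heap.ino, and readyToStep() is a hypothetical helper name:

```cpp
// Sketch of the idea behind TIMER1_COMPA_vect - not the actual motor-heap.ino code.
ISR(TIMER1_COMPA_vect) {
  // every waiting motor is now one timer tick closer to its next step
  heap.decrement();

  // any motor whose count reached zero is due: pull it off the heap,
  // fire its step, and put it back with a new count based on its target speed
  while (heap.readyToStep()) {      // readyToStep() is a hypothetical name
    StepperMotor* m = heap.dequeue();
    m->runHeap();                   // issues the step pulse, computes the next deadline
    heap.enqueue(m);
  }
}
```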


Software (in-depth) - Forward and Inverse Kinematics

Still not done with the software side of things! How are we going to make the arm move so the end-effector reaches a certain (x, y, z) coordinate? Can there be multiple ways to achieve the same position? How will we constrain it to only one? There's a LOT to come your way. Here's some background that will make the code in our Python/kinematics/ package, which we've implemented from scratch, easier to follow. No NumPy even ;)

  • SO(3): Special Orthogonal group in three dimensions represents the group of all 3x3 rotation matrices with determinant 1. In robotics, it is used to represent the orientation of rigid bodies in 3D space.
  • SO(6): Special Orthogonal group in six dimensions represents the group of all 6x6 rotation matrices with determinant 1. It is used to represent the orientation of rigid bodies in 6D space, which includes both rotational and translational components.
  • SE(3): Special Euclidean group in three dimensions represents the group of all 4x4 transformation matrices with orthogonal rotation submatrix and a translational component. It is used to represent rigid body transformations (both rotation and translation) in 3D space.
  • SE(6): Special Euclidean group in six dimensions represents the group of all 6x6 transformation matrices with orthogonal rotation submatrix and a translational component. It is used to represent rigid body transformations (both rotation and translation) in 6D space, which includes both the position and orientation of a rigid body.
  • Exp3: This function calculates the matrix exponential of a 3x3 skew-symmetric matrix, which represents a rotation axis and angle (exponential coordinates). It is commonly used in robotics to compute the rotation matrix for a rotation about a fixed axis.
  • Exp6: Similarly, exp6 computes the matrix exponential of a 4x4 matrix built from the twist coordinates of a spatial motion. It is used to compute the transformation matrix corresponding to a spatial displacement (rotation and translation) in 3D space. This function is often utilized in forward kinematics algorithms for robotic manipulators.
  • Joint Angles: In robotics, joint angles refer to the angles of rotation of each joint in a robot manipulator. These angles determine the configuration of the robot arm and directly affect its end-effector position and orientation. 6 DOF means we will have 6 joint angles.
  • Forward Kinematics: Given the joint angles of a robot, forward kinematics calculates the position and orientation of the end-effector in the robot's workspace using kinematic equations. It outputs a 4x4 SE(3) transformation matrix representing the end-effector's pose.
  • Inverse Kinematics: Determines the joint angles needed to achieve a desired end-effector pose. It reverses the forward kinematics process (big surprise), computing the joint angles that result in the desired pose. The output is a set of joint angles corresponding to the desired end-effector position and orientation.
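
For orientation, here is the standard Modern Robotics formulation that these functions point toward (background math, not a transcript of our code). Exp3 corresponds to Rodrigues' formula for a rotation about a unit axis ω̂ by angle θ:

$$
e^{[\hat{\omega}]\theta} = I + \sin\theta\,[\hat{\omega}] + (1-\cos\theta)\,[\hat{\omega}]^2
$$

and forward kinematics is the product of exponentials over the six joint screw axes $S_1,\dots,S_6$, with $M \in SE(3)$ the end-effector's home pose:

$$
T(\theta_1,\dots,\theta_6) = e^{[S_1]\theta_1} e^{[S_2]\theta_2} \cdots e^{[S_6]\theta_6}\, M
$$

Inverse kinematics then solves this equation for the joint angles (typically iteratively) given a desired end-effector pose $T$.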

Constraints

Mechanical Limitations: The mechanical design of the arm imposed limitations on its reach and payload capacity. These constraints influenced the arm's overall dimensions and range of motion across the desk.

Accuracy Trade-offs: Balancing speed and accuracy was challenging - we had to strike the right trade-off so the arm can perform tasks efficiently while maintaining precision. More testing is needed before we can guarantee highly accurate motion.

Cost Constraints: The choice of materials, motors, sensors, and other components was influenced by budget limitations. This impacted the overall quality and performance of the arm, as higher-quality components might have been preferred but were restricted due to cost considerations.


Future Plans

Our team has tossed a few ideas around, but most are still up in the air. Nevertheless, some of these include:

  • OpenCV integration with a camera (track human arm movements & mirror them on 6DOF)
  • Transitioning from Arduino to STM32 (significantly more efficient)
  • SLAM techniques & algorithms (to enable the arm to build a map of its surroundings while localizing itself within that map)
  • 3D Object Reconstruction via depth-sensing cameras or LiDAR sensors (to reconstruct 3D models of objects in the environment)
  • Integration with IoT platforms like the Raspberry Pi (for remote monitoring, control, and data logging as a way of interacting with 6DOF over the internet)