
05 July 2020

SRC2 - Explicit Steering - Wheel Speed

SRC2 Rover
This fourth post about the qualifying round of the NASA Space Robotics Challenge - Phase 2 (SRC2) addresses the speed of the wheels of the rover, shown to the right, under the various possible motions. The rover uses Explicit Four Wheel Steering which allows the orientation of the wheels to be independently changed. The second and third posts explored the geometry to determine the position of the wheels for a turn, pivoting in place, and crab / straight movement. See the first post for the basics of the competition. 

Wheel Orientation

The orientation of the wheels determines the speed required of each wheel. In straight or crab movement the speed is the same for all wheels. When turning, shown in the diagram below, the speeds differ for the inner and outer wheels. The requested overall speed of the rover, measured at the center of the rover, is used to calculate the inner and outer speeds.



 Term        Description
 ICR         Instantaneous Center of Rotation
 Rr          Radius from the ICR to the center of the rover
 Ri, Ro      Radii from the ICR to the rover's inner (i) and outer (o) sides
 Wb, Wt      Wheel base and wheel track of the rover. Lengths are representative of actual size.
 WRi, WRo    Turning radii of the inner (i) and outer (o) wheels
 δi, δo      Steering angles for the inner (i) and outer (o) wheels

Visualize three concentric circles on the diagram, centered on the ICR. One circle passes through the center of the rover while the others pass through the inner and outer corners, or wheels, of the rover. The second post calculated the wheel's turning radius as:
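From the right triangle formed by half the wheel base and the inner or outer side radius, the wheel turning radii work out to:

\[
WR_i = \sqrt{\left(R_r - \tfrac{W_t}{2}\right)^2 + \left(\tfrac{W_b}{2}\right)^2}
\qquad
WR_o = \sqrt{\left(R_r + \tfrac{W_t}{2}\right)^2 + \left(\tfrac{W_b}{2}\right)^2}
\]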


The rover radius is Rr, the distance from the ICR to the center of the rover. 

The speed (Sr) and turn radius (Rr) of the rover determine the time (Tr) to complete a full circle, as shown in the first equation below. The next equation calculates the speed of either set of wheels (WR) using the circumference of the respective circles. That equation then simplifies as shown in the formulations that follow.
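In symbols, with WS denoting the speed of either the inner or outer wheels:

\[
T_r = \frac{2\pi R_r}{S_r}, \qquad
WS = \frac{2\pi \cdot WR}{T_r} = \frac{2\pi \cdot WR \cdot S_r}{2\pi R_r} = S_r \cdot \frac{WR}{R_r}
\]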



Twist Calculation

The standard ROS movement command is the twist message. It contains two 3-dimensional vectors. One specifies the linear movement in the x, y, and z dimensions. The other specifies the angular movement, also as x, y, and z, meaning roll, pitch, and yaw respectively.

The calculations for steering orientation and speed are all based on the radius of the turn. That turn radius needs to be calculated using the X velocity and the Yaw from the message. Recall from post three that turning during a crab movement is not under consideration so the linear Y value is ignored. 

Getting to the calculation requires some interesting analysis, but the final calculation is extremely simple. The starting point is the Yaw in radians/second. The first equation determines the time it would take to turn a full 2𝜋 radians at the Yaw rate, i.e. how long it takes to traverse a full circle.
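With Yaw in radians per second:

\[
T = \frac{2\pi\ \text{rad}}{\text{Yaw}\ \text{rad/s}} = \frac{2\pi}{\text{Yaw}}\ \text{s}
\]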



Next, that time is used to determine the circumference of the circle using the X speed. Knowing the circumference, the radius is determined. The equations show the individual steps and then combine them to reduce everything to a simple calculation. Everything should reduce to such simplicity. Note that dimensional units are included to assure the final units are valid.
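Written out, those steps are:

\[
C = S_x\,\tfrac{\text{m}}{\text{s}} \cdot T\,\text{s}, \qquad
R_r = \frac{C}{2\pi} = \frac{S_x \cdot \frac{2\pi}{\text{Yaw}}}{2\pi} = \frac{S_x}{\text{Yaw}}\ \text{m}
\]

Putting the whole series together, here is a minimal sketch in C++ of going from a twist to steering angles and wheel speeds. The struct and function names are illustrative rather than the actual ROS message types or my node's code, and sign handling for right turns and the pivot case is simplified.

#include <cmath>

// Illustrative input mirroring twist.linear.x and twist.angular.z (not the ROS message type).
struct TwistInput { double linearX; double angularZ; };

struct WheelCommand {
    double steerInner, steerOuter;   // steering angles, radians
    double speedInner, speedOuter;   // wheel speeds, m/s
};

WheelCommand steerFromTwist(const TwistInput& t, double wheelBase, double wheelTrack)
{
    WheelCommand cmd{0.0, 0.0, t.linearX, t.linearX};
    if (std::fabs(t.angularZ) < 1e-6)               // no yaw requested: straight or crab movement
        return cmd;

    double Rr  = t.linearX / t.angularZ;            // turn radius at the rover center
    double Ri  = Rr - wheelTrack / 2.0;             // radius to the inner side
    double Ro  = Rr + wheelTrack / 2.0;             // radius to the outer side
    double WRi = std::hypot(Ri, wheelBase / 2.0);   // turning radius of the inner wheels
    double WRo = std::hypot(Ro, wheelBase / 2.0);   // turning radius of the outer wheels

    cmd.steerInner = std::atan2(wheelBase / 2.0, Ri);   // wheel tangent to its circle around the ICR
    cmd.steerOuter = std::atan2(wheelBase / 2.0, Ro);
    cmd.speedInner = t.angularZ * WRi;              // WS = Sr * WR / Rr = Yaw * WR
    cmd.speedOuter = t.angularZ * WRo;
    return cmd;
}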

03 July 2020

SRC2 - Explicit Steering - Crab, Straight, and Pivot Movements


SRC2 Rover
This is the third in a series of posts about my involvement with the qualifying round of the NASA Space Robotics Challenge - Phase 2 (SRC2). The first post introduced the basics of the competition. One aspect of the challenge is that there is no controller for the rover depicted to the right. It uses Explicit Four Wheel Steering, which allows the orientation of each wheel to be changed independently. This provides multiple ways for the rover to move, e.g. straight, crab, turn, pivot.

The second post explored the geometry of positioning the wheels for a turn. This post will address pivoting in place and crab movement, i.e. moving sideways. It also addresses the trivial crab case of moving straight forward or back.

29 June 2020

SRC2 - Explicit Steering - Wheel Orientation for Turning

SRC2 Rover

The first post in this series explained that I'm currently involved with the qualifying round of the NASA Space Robotics Challenge - Phase 2 (SRC2). The competition requires controlling the rover to the right. It uses Explicit Four Wheel Steering, which allows the orientation of each wheel to be changed independently. This provides multiple ways for the rover to move, e.g. straight, crab, turn, pivot.

The challenge is there is no controller for the rover in the Robot Operating System (ROS) because the rover wheels are controlled by effort, not the typical speed control.
This article addresses the geometry of controlling the rover when it is turning. The diagram below illustrates the rover making a counter-clockwise turn around the Instantaneous Center of Rotation (ICR). The arrows represent the wheel orientation. The dotted box is drawn proportional to the wheel base and track of the rover. Note the orientation of the X/Y axes, which is the ROS standard for robots.
Explicit Steering - Rover Turning


 Term        Description
 ICR         Instantaneous Center of Rotation
 Rr          Radius from the ICR to the center of the rover
 Ri, Ro      Radii from the ICR to the rover's inner (i) and outer (o) sides
 Wb, Wt      Wheel base and wheel track of the rover. Lengths are representative of actual size.
 WRi, WRo    Turning radii of the inner (i) and outer (o) wheels
 δi, δo      Steering angles for the inner (i) and outer (o) wheels
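From the diagram, each wheel is oriented tangent to its circle around the ICR, which puts the steering angles at:

\[
\delta_i = \arctan\!\left(\frac{W_b/2}{R_r - W_t/2}\right), \qquad
\delta_o = \arctan\!\left(\frac{W_b/2}{R_r + W_t/2}\right)
\]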


NASA Space Robotics Challenge - Phase 2


Another NASA Centennial Challenge began earlier this year. It is the third Centennial Challenge I've entered; counting the 2019 ARIAC competition, that makes four competitions overall. The current competition is the Space Robotics Challenge - Phase 2 (SRC2). In this competition robotic mining rovers explore the Moon to detect and collect volatiles, detect a low-orbiting CubeSat, and position one rover aligned with a fiducial on a processing station.

Links to Follow-on Posts


In the following posts I'll explain what I can about this topic. There are a number of research papers available; however, my posts will provide a simplified explanation.

Explicit Steering

The biggest challenge in this competition is controlling the rover. It is the size of a small SUV with Explicit Four Wheel Steering, i.e. each of the four wheels is steered separately. If you've seen the Mars rovers, it is a similar design. That's the base rover, above.

Explicit steering allows flexibility in the movement of the rover. It can turn all four wheels to the same angle to move sideways in a crab movement; with the angle at zero this gives straight forward movement. By orienting the wheels at different angles the rover can turn. An extreme example of this is pivoting in place.

Robot Operating System

The base software for the competition is the Robot Operating System (ROS) which consists of the fundamentals for communicating amongst software nodes and a large number of packages that provide useful capabilities. Unfortunately there isn't one for controlling the SRC2 rover.

There are two ways of controlling the wheels for locomotion. The predominant one is issuing a speed command; a number of ROS controller packages provide this capability. The other specifies the effort, or torque. There are effort controllers, but none directly apply to the SRC2 rover.

The location of the rover on the surface is required for reporting the position of volatiles, moving to them, and generally controlling the rover's movement. This requires deriving odometry from wheel movement, from vision processing via the stereo cameras, and from an inertial measurement unit (IMU). Again, this is not provided. It is especially challenging due to the low friction between the wheels and the simulated Moon surface, which allows the wheels to slip.

07 August 2013

Programming Languages Used for SRR

I asked at the SRR Challenge about the languages and vision processing used by each team. Here is what I found:

Team              Language                      Vision Processing
Intrepid          C++ / Matlab
Kuukulgur         C++                           OpenCV
Mystic            C++                           RoboRealm
SpacePride        RoboRealm state machine       RoboRealm
Survey            C++, Python                   OpenCV
Middleman         LabView                       LabView
UCSC              C/C++                         OpenCV
Waterloo          C++, Python
WPI               C++
Wunderkammer      Python                        ROS vision packages

Here is a rough synopsis of how the teams fared:

Team Intrepid was the first to leave and return to the platform. It thought it picked up the sample but actually did not.

Team Kuukulgur (it means Moon or Lunar Rover), a demonstration team from Estonia, was the first to pick up the sample but did not make it back to the starting platform. They had the slickest looking robots, but then three of the team are mechanical engineers. They brought a swarm of four, but one failed so only three took the field.

Team Waterloo, a demonstration team from Canada, also picked up the sample and were the first to return it to the starting platform but the sample was just outside the 1.5 meter square area. It did not hurt them financially since they were a demonstration team and thus ineligible for the NASA money. They did receive $500 from WPI for picking up the sample.

Team Survey won the Phase I competition this year and will take home $6,000 for that effort. ($5,000 from NASA and $1,000 from WPI.)

Team Mystic Lake, myself, did not do that well, but I consider it a "building year", to borrow from sports terminology. Mystic Two traveled the furthest distance of any robot in the SRR to date. It just kept trekking. I proved out much of my code and the ability of my very small rovers to handle the terrain.

SpacePride fielded two rovers but were unable to accomplish much. Their software developer dropped out near the end so they had to scramble to get something working via a state-machine in RoboRealm.

I will update the table if more information becomes available.

Just after I returned from the challenge, an email on a robotics mailing list asked for advice on languages to use for robots. Since I had almost all of the information posted above I put it into a reply and received a nice thanks in return. Hopefully someone will find this interesting.

(Updated UCSC - UC Santa Cruz from comment. Thanks.)
(Updated Middleman aka RoBear from comment. Thanks.)



17 June 2013

Rover Names

When I got the 4 computers for the rovers I still had not come up with good names for them. Loading Windows XP, I simply called the first one Mystic One. That led to the others being Two, Three, and Four. The more I used that name, the more I liked it, and I started referring to them collectively as The Mystics. That is their name now and I will use it for the team name next year: Team Mystic.

The name obviously comes from Mystic Lake Software, my DBA. After Shari and I decided I was retired - I promised to have a warm supper on the table for her every night - I wanted to create a DBA just in case I did pick up some kind of work. Behind our house - and across a street - is a small park with a lake - Mystic Lake. That felt like a good name so I used it by simply adding the "Software". It conveys that aura of mystery that pervades software.

Back from the SRR, links and an EMMY!

Back home after an awesome experience at the 2013 NASA Sample Return Robot Centennial Challenge. I posted on Facebook to capture the activities of myself and the other teams. It was an intense week, and with seven days of travel getting there and back, I am still not ready to jump full speed into much of anything.

If you visit the Mystic Lake FB page take a look at the pages I liked from there. They are either vendors I have used or additional FB pages associated with the SRR. A few of the teams are there. If others have FB pages I am not aware of them.

I do not use Twitter but if you search for #srrbot you can see what was tweeted by others.

The SRR is one of many NASA Centennial Challenges. Challenges of this type have a long history in aerospace. Lindbergh's crossing of the Atlantic is probably the one most well known. He won $33,000 for meeting the challenge.

Worcester Polytechnic Institute hosted the challenge. They were very good hosts providing 3 meals each day so the teams could work continuously. That also provided an opportunity to talk with some of the other teams. At all the other times the teams were heads-down working on their robots.

On Friday, local school kids visited, saw NASA exhibits and the teams demonstrating robots, and were generally exposed to technology. A group of high visibility social media users also toured the robot pits to talk with the teams and see the robots.

On Saturday, NASA and WPI hosted Touch Tomorrow to showcase NASA and robotics. All the teams demonstrated their robots at times throughout the day. The crowds, and especially the kids, liked seeing the robots perform.

All through the week NASA360 was there talking with us and taking pictures and videos. They were a great bunch of guys to have poking their lenses at us. There are some terrific photos and videos. To cap it off, NASA360 won an Emmy for their TV show about last year's SRR: Robots, Rocks and Rovers. In May they won a Telly award for the same episode.

Here are some links to photos and videos:
https://pictures.lytro.com/NASAHQPHOTO/pictures/658265
NASA photos from all days.

(Tom had way too much fun putting shots of the Mystics in trouble in these.)
NASA360 Rover Madness
NASA360 Kicking 'Bot

Video from Mystic Two - taken from the camera on the rover.
Photos of the other teams.
Photos of us and the Mystics (pending getting them organized.)


18 September 2012

DARPA LAGR Project

Today I was looking through some robotics papers applicable to Sample Return that I previously found on the web. One mentioned they were using a DARPA LAGR robot, so I looked to see what it was. I found that Carnegie Mellon was involved in producing the standardized robots. The idea was to provide these robots to different researchers, have them develop navigation software, and have them compete on real-world runs to see which approaches worked better. The robots only had vision, GPS, and bumper sensors. The outcome of this project seems very applicable to the SRR competition.

One of the researchers at NYU has a long list of papers on navigation.

05 February 2010

RoboRealm Vision Processing - Wrappers Classes

I've been working with RoboRealm over the last week. It is a vision processing application. One of its nice features is being able to access it from another program. You can let it do the heavy lifting of extracting information from a web cam image and then your program just gets a few important data points for analysis.

The module I've been working with is Center of Gravity which locates a blob in the image and reports its size and location. In particular, I'm looking for a red circle.

The interface I've used is the RR_API, which is XML over a socket connection. Reading a single variable is straightforward, but reading multiple variables with one request is a lot of detail chasing. I hate chasing details over and over again. That is why they originally created subroutines and, more recently, classes. So I wrote some classes to wrap the read-variable routines. I haven't needed to write information yet, so that will wait until needed.

The files are in Google Code.

Individual variables are handled through the RoboRealmVar class and its base class RoboRealmVarBase. The base class is needed to provide an interface for reading multiple variables. More on that below.

RoboRealmVar is a template class to allow for handling different data types. One of the details with the RR interface is that all data is returned as a char string, so it has to be converted to the correct data type. The class handles that automatically. The header file has instances of the template for int, float, and string. Other types could be added but may need a char* to data type conversion routine. See the string instantiation for how that is done.

Variables are declared by:

rrIntVar mCogX;
rrIntVar mCogBoxSize;
rrIntVar mImageWidth;

The examples are all class members, hence the prefix 'm' on their names.

Initialize the variables with the instance of the RR class. In the example mRoboRealm is the instance of RR opened through RR_API:

mCogX("COG_X", mRoboRealm),
mImageWidth("IMAGE_WIDTH", mRoboRealm),
mCogBoxSize("COG_BOX_SIZE", mRoboRealm),


and then read them using an overload of operator():

int cogx = mCogX();

Multiple variables are read using the RoboRealmVars class. Declare it and instantiate it with:

RoboRealmVars mCogVars;
mCogVars(mRoboRealm)

Again, my examples are from inside a class.

Then add the individual variables to the list by:

mCogVars.add(mCogX);
mCogVars.add(mImageWidth);
mCogVars.add(mCogBoxSize);

then read them through:

mCogVars();

You can access their values just as shown above through the individual variables.
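Tying those snippets together, here is a minimal sketch of a class that uses the wrappers, assuming the RR_API connection (mRoboRealm) is opened elsewhere. The class name and the update() method are illustrative, not part of the actual code.

// Illustrative only: CogTracker and update() are made-up names; RR_API, rrIntVar,
// and RoboRealmVars are the pieces described above.
class CogTracker
{
public:
    CogTracker()
        : mCogX("COG_X", mRoboRealm),
          mCogBoxSize("COG_BOX_SIZE", mRoboRealm),
          mImageWidth("IMAGE_WIDTH", mRoboRealm),
          mCogVars(mRoboRealm)
    {
        // Register the variables so a single request reads them all.
        mCogVars.add(mCogX);
        mCogVars.add(mCogBoxSize);
        mCogVars.add(mImageWidth);
    }

    void update()
    {
        mCogVars();                 // one round trip to RoboRealm
        int x     = mCogX();        // values are now cached in each variable
        int size  = mCogBoxSize();
        int width = mImageWidth();
        // ... use x, size, and width to track the red circle ...
    }

private:
    RR_API        mRoboRealm;       // connection to RoboRealm (setup omitted)
    rrIntVar      mCogX, mCogBoxSize, mImageWidth;
    RoboRealmVars mCogVars;
};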

Hopefully this will be useful to others.

25 January 2010

Robot Components

Time to explain the components of the robot a bit more. The diagram provides an overview.

The main platform is the iRobot Create. It is an autonomous robot by itself but provides control through a serial port connection using a protocol called the Open Interface (OI). The OI can read the sensors and control the actuators of the Create.

The Fit PC Slim is a compact, low power PC with 3 USB ports and Wifi, plus the usual PC components. It is powered from the Create through a voltage regulator on the Interface Board (IB). The IB also carries the USB interfaces for the serial port and I2C.

I2C is a standard 2 wire bus for controlling actuators and accessing sensor input. I'm not totally sure what is going to be on the bus. I expect a compass module, at least, to provide orientation. I have sonar and IR distance sensors working on I2C but am not sure which to use. These would be backup for detecting obstacles via vision processing. A main goal is for the robot to move around without bumping into obstacles. I also have a digital I/O board that could be used to provide LED indicators of what the robot is doing.

The reason for the Wifi on the Slim is to download software and to allow monitoring from the desktop or laptop, especially in the field.

RoboRealm (RR) is a software package whose main purpose is vision processing. It also has a lot of robot control capability, including a plug-in for the Create. I decided not to use that plug-in after some issues figuring out exactly how it worked. That may have been a mistake. My other concern was the latency of sensor information being collected by RR and then retrieved from RR by the control program. RR will be used to handle the camera and vision processing.

Shifting Gears - iRobot Create

I'm shifting gears to robotics. Awhile ago I got an iRobot Create. It's basically a Roomba vacuum cleaner with the guts removed to make a cargo area. In this area is a 25-pin connector that provides power, a TTL serial port, and digital and analog I/O.

I also got a Command Module (CM), which fits onto the connector. The CM is an Atmega 168 processor that adds some additional I/O. It can be programmed to control the Create. I did so and basically reproduced the wandering behavior of the Create. It moves around, bumps into things, and turns away from what it hit. I added some additional behaviors, such as: if it got trapped, i.e. caught in the same place for a period of 10 seconds, it would move to extract itself.

I want to do more with robots, such as entering a RoboMagellan contest. That requires an outdoor-capable robot that does a lot more than bump into things. A key component to me is vision. Maybe I could do that with the CM and another processor (like the CMUCam), but I really didn't want to learn YAPE (yet another programming environment).

Around the time I got to thinking seriously about this I looked at ITX boards. Then the Fit PC computers became available, specifically the Fit PC Slim. The PC form and wireless sold me on trying to use it. The one drawback might be the processor speed when trying to do vision processing. That is acceptable because the Create with the Slim is a testbed for RoboMagellan, where an entirely new, slightly larger platform will be used. By going with the PC as the base there are a large number of possibilities, including laptops and netbooks. If the processor is slow for vision, the Create simply won't move as quickly or smoothly.

I have the Slim hooked up to the Create, drawing power, and running most of the behaviors previously implemented with the CM. Once I got the basic threading, serial communications, and Create interface working, the behaviors started working within minutes since they ported easily from the CM versions. All the code is C++. The threading and serial port routines are all from previous projects, so it's all come together with a few days' work.

19 January 2010

Subsumption Architecture - Introduction

The brain of a robot is the software. The software has to take in the sensor data, interpret it, and generate commands to the actuators. One architecture for robot software is called subsumption. It came out of MIT and Prof. Rodney Brooks, who is a founder of iRobot, the company that makes my Create robot. The idea is to break the robot's activities into small pieces.

Let me build up the concept by example. A fundamental activity of a robot is to cruise around. If nothing else is to be done, just let the robot drive straight ahead. So we create an activity called Cruise. It simply actuates the motors to go straight at a safe speed. It is easy to write and test.

After driving straight ahead for awhile the robot bumps into something. And continues to try to go straight ahead. This is not good for the robot, or for the cat or furniture it bumped into. So we write a Bump activity using the sensors on the robot - the Create has a right and a left bump sensor on the front that wrap around a bit toward the sides. Bump determines if a bump sensor was triggered, in which case the robot should stop.

How do Bump and Drive get put together in the software? This is the subsumption architecture part. Initially it's easy. First call Bump. If it doesn't activate, call Drive. If Bump does activate, don't call Drive. Let this run. The robot goes merrily off, bumps into a cat, and stops. The cat gets up, the robot continues straight ahead, hits a chair, stops, and stops and stops and stops. Not very interesting.

What we'd like is for the robot to back up a little bit, turn, and then continue straight ahead. Hopefully that will clear the obstacle, and it will if the robot just brushed a wall. But even if it doesn't, repeating that behavior will eventually turn the robot toward an open area. (Well, many times it will. More later...) This new behavior is somewhat different from Bump and Drive because we want it to do the back up and turn without interruption. The term used for this is a ballistic behavior.

Do we add this to Bump or Drive, or create a new behavior? The texts I've read added it to Bump. But based on my experience I created a new behavior called Flee. Flee works with what could be called an internal sensor. This internal sensor tells Flee how much to back up and turn. So Bump sets this internal sensor to back up a little bit (40 mm) and turn a little (20 degrees). Since Bump can tell whether the left or right bump sensor (or both) was hit, it also sets the direction of the turn so the robot will turn away from the bump.

Now the activities are called in the order: Flee, Bump, Drive. Remember that if Flee is active the later activities aren't called. If Bump is active, Drive is not called. So the robot Drives ahead, Bumps into something, the internal flee sensor is set, and Flee backs up and turns the robot. Then, with both Flee and Bump inactive, Drive engages and the robot moves ahead.

Just for completeness, I have another activity called Trapped. It is added between Flee and Bump. Every time Trapped is called it records the time and distance moved. If the robot has not moved very far (80 mm) in a certain period of time (10 seconds) then Trapped sets the flee sensor to back up a little bit and turn 180 degrees. The idea is that by turning 180 degrees the robot can get out of a bad situation. One such situation is the legs on a rolling desk chair, or a corner.

With these behaviors my Create wanders around the house pretty well. The actual implementation needs a couple more details.
Here is some pseudo code:

preempt = false;
preempt = Flee(preempt);
preempt = Trapped(preempt);
preempt = Bump(preempt);
preempt = Drive(preempt);

The variable preempt is set to true by an activity if it is active. If Bump senses a bump then it sets preempt. When Drive sees preempt is set it does not activate, i.e. does not drive forward. If Bump sees preempt is set it does not bother checking the bump sensors, because presumably Flee is now handling the bump condition.

Why bother calling activities if they are preempted? Look back at how Trapped works. It is monitoring the distance traveled. If it is not called because Flee is active... And there I'm going to let it hang because I don't remember why. But I'm going to publish this now, leave it as is, and resume in another posting. This is software development as it is, folks. There are a few possibilities here:
  • I simply don't recall the reason so have to remember it or rethink it. That is why you should document things.
  • There was a valid reason that is no longer valid. Boy, that happens all the time in development. A good habit to develop is to revisit assumptions regularly to see how they've changed.
  • I simply blew it when writing the code many months ago.
  • ...or some totally different situation that I can't think of right now.

That is the basics of subsumption, though. A good book on robot programming that covers subsumption is "Robot Programming - A Practical Guide to Behavior-Based Robotics" by Joseph L. Jones.
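To make the loop concrete, here is a minimal sketch in C++ of the arbitration and a couple of the behaviors. The low-level helpers (driveStraight, stopMotors, and so on) are hypothetical stand-ins for the Create Open Interface calls, Trapped is omitted for brevity, and my actual code differs in the details.

// Hypothetical low-level helpers standing in for the Create Open Interface calls.
bool leftBumper();
bool rightBumper();
void stopMotors();
void driveStraight(int speedMmPerSec);
void driveDistance(double mm);          // negative backs up
void turnInPlace(double degrees);       // positive turns left

const int SAFE_SPEED = 200;             // mm/s, illustrative

struct FleeRequest {                    // the "internal sensor" that Bump (and Trapped) set
    bool   pending;
    double backupMm;
    double turnDeg;
};

FleeRequest fleeRequest = { false, 0.0, 0.0 };

// Each activity returns true if it, or an earlier activity, has taken control.
bool Flee(bool preempt)
{
    if (preempt || !fleeRequest.pending) return preempt;
    driveDistance(-fleeRequest.backupMm);   // ballistic: back up and turn without interruption
    turnInPlace(fleeRequest.turnDeg);
    fleeRequest.pending = false;
    return true;
}

bool Bump(bool preempt)
{
    if (preempt) return true;               // Flee is handling things; skip the sensors
    if (!leftBumper() && !rightBumper()) return false;
    stopMotors();
    // Back up 40 mm and turn 20 degrees away from the side that was hit.
    fleeRequest = { true, 40.0, leftBumper() ? -20.0 : 20.0 };
    return true;
}

bool Drive(bool preempt)
{
    if (preempt) return true;
    driveStraight(SAFE_SPEED);
    return true;
}

void controlLoop()
{
    for (;;) {
        bool preempt = false;
        preempt = Flee(preempt);
        preempt = Bump(preempt);
        preempt = Drive(preempt);
    }
}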

...sine die
