When I got the 4 computers for the rovers I still had not come up with good names for them. Loading Windows XP, I simply called the first one Mystic One. That led to the others being Two, Three, and Four. The more I used that name, the more I liked it, and I started referring to them collectively as The Mystics. That is their name now and I will use it for the team name next year: Team Mystic.
The name obviously comes from Mystic Lake Software, my DBA. After Shari and I decided I was retired - I promised to have a warm supper on the table for her every night - I wanted to create a DBA just in case I did pick up some kind of work. Behind our house - and across a street - is a small park with a lake - Mystic Lake. That felt like a good name so I used it by simply adding the "Software". It conveys that aura of mystery that pervades software.
17 June 2013
Back from the SRR, links and an EMMY!
Back home after an awesome experience at the 2013 NASA Sample Return Robot Centennial Challenge. I posted on Facebook to capture the activities of myself and the other teams. It was an intense week, and with seven days of travel getting there and back, I am still not ready to jump full speed into much of anything.
If you visit the Mystic Lake FB page take a look at the pages I liked from there. They are either vendors I have used or additional FB pages associated with the SRR. A few of the teams are there. If others have FB pages I am not aware of them.
I do not use Twitter but if you search for #srrbot you can see what was tweeted by others.
The SRR is one of many NASA Centennial Challenges. Challenges of this type have a long history in aerospace. Lindbergh's crossing of the Atlantic is probably the one most well known. He won $33,000 for meeting the challenge.
Worcester Polytechnic Institute hosted the challenge. They were very good hosts, providing three meals each day so the teams could work continuously. That also provided an opportunity to talk with some of the other teams; the rest of the time the teams were heads-down working on their robots.
On Friday, local school kids visited, saw NASA exhibits, watched the teams demonstrate robots, and were generally exposed to technology. A group of high-visibility social media users also toured the robot pits to talk with the teams and see the robots.
On Saturday, NASA and WPI hosted Touch Tomorrow to showcase NASA and robotics. All the teams demonstrated their robots at times throughout the day. The crowds, and especially the kids, liked seeing the robots perform.
All through the week NASA360 was there talking with us and taking pictures and videos. They were a great bunch of guys to have poking their lenses at us. There are some terrific photos and videos. To cap it off, NASA360 won an Emmy for their TV show about last year's SRR: Robots, Rocks and Rovers. In May they won a Telly award for the same episode.
Here are some links to photos and videos:
https://pictures.lytro.com/NASAHQPHOTO/pictures/658265
NASA photos from all days.
(Tom had way too much fun putting shots of the Mystics in trouble in these.)
NASA360 Rover Madness
NASA360 Kicking 'Bot
Video from Mystic Two - taken from the camera on the rover.
Photos of the other teams.
Photos of us and the Mystics (pending getting them organized.)
18 September 2012
DARPA LAGR Project
Today I was looking through some robotics papers applicable to Sample Return that I previously found on the web. One mentioned using a DARPA LAGR robot, so I looked to see what it was. I found that Carnegie Mellon was involved in producing the standardized robots. The idea was to provide these robots to different researchers, have them develop navigation software, and have them compete in real-world runs to see which approaches worked better. The robots only had vision, GPS, and bumper sensors. The outcome of this project seems very applicable to the SRR competition.
One of the researchers at NYU has a long list of papers on navigation.
03 February 2010
Create Fun with Grandson
Last weekend two grandkids were here. The girl, Dorian, is a teenager. The boy, Kade, is six. Just before Christmas I was working on the Fit PC Slim to iRobot Create interface when they visited. He had his nose up close asking when it would be done. He asked the same thing on another visit since then. I had to reply it was not done but I was working on it.
So this visit I just had to have something working. I got the basic wander-and-bump routines working with the Slim, a reproduction of the Create demo 1 behavior. I figured that would be good for about 2 minutes of interest, so I needed more.
Since this project will be using a web camera for vision, I used some Velcro to plunk the camera onto the Create just behind the IR sensor. The Velcro raised it enough to see over the top of the sensor. I brought up RoboRealm and set up its built-in web page viewer, which lets you see the camera's images. I pointed the laptop at the web pages to display what the Create was seeing.
That was good for about 20 minutes and we got called for lunch and told to put the robot away. Awwww!!!
Next visit is in a couple weeks - they are visiting here pretty regularly now. The goal is to have the Create follow a "leash" - a red dot mounted on the end of a stick. That means getting the software to talk with RoboRealm (RR) to get the center of gravity (COG) of the red dot, then sending drive commands to the Create to center on the dot and drive toward it. It should stop when it gets a little bit away from the dot, and even back up if the dot gets closer.
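The steering logic could look something like this minimal sketch. The image width, the turning gain, and the blob-size thresholds here are guesses for illustration, not measured values, and the COG would come from RoboRealm however it is queried:

```python
# Hedged sketch of the leash-following decision logic: given the red
# dot's center-of-gravity X position and apparent size (both from
# RoboRealm), choose a velocity and turn radius for the Create's
# Drive command. All constants are assumptions, not calibrated values.

IMAGE_WIDTH = 320   # camera frame width in pixels (assumed)
NEAR_SIZE = 60      # blob size meaning "close enough, stop" (assumed)
TOO_CLOSE = 90      # blob size meaning "dot moved closer, back up" (assumed)

def leash_command(cog_x, cog_size):
    """Return (velocity_mm_s, radius_mm) for the Create's Drive command.

    In the Open Interface, radius 32767 means drive straight;
    a negative radius turns clockwise (to the right).
    """
    if cog_size >= TOO_CLOSE:
        return (-100, 32767)      # dot got closer: back straight up
    if cog_size >= NEAR_SIZE:
        return (0, 32767)         # close enough: stop
    error = cog_x - IMAGE_WIDTH // 2   # >0 means dot is to the right
    if abs(error) < 10:
        return (200, 32767)       # roughly centered: drive straight
    # Turn with a radius inversely related to the pixel error,
    # clamped to a sane range (gain of 20000 is a guess).
    radius = max(-2000, min(2000, -int(20000 / error)))
    return (150, radius)
</imports>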
25 January 2010
Robot Components
Time to explain the components of the robot a bit more. The diagram provides an overview.
The main platform is the iRobot Create. It is an autonomous robot by itself but provides control through a serial port connection using a protocol called the Open Interface (OI). The OI can read the sensors and control the actuators of the Create.
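As a quick illustration, each OI command is a single-byte opcode followed by data bytes. A sketch of building the packets that start the interface, take full control, and drive the robot (byte values per the Create OI specification; the helper names are mine):

```python
# Sketch of building iRobot Create Open Interface (OI) command packets.
# These functions only construct the bytes; in use they would be written
# to the Create's serial port (57600 baud by default on the Create).
import struct

def oi_start():
    # Opcode 128: Start - puts the OI into Passive mode
    return bytes([128])

def oi_full_mode():
    # Opcode 132: Full mode - gives complete control of the robot
    return bytes([132])

def oi_drive(velocity_mm_s, radius_mm):
    # Opcode 137: Drive - 16-bit big-endian signed velocity (mm/s)
    # and turn radius (mm); radius 32767 means drive straight
    return struct.pack(">Bhh", 137, velocity_mm_s, radius_mm)
```

With a serial library such as pyserial, these would be sent with something like `ser.write(oi_start())` followed by `ser.write(oi_drive(200, 32767))` to roll forward at 200 mm/s.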
The Fit PC Slim is a compact, low power PC with 3 USB ports and a Wifi, plus the usual PC components. It is powered from the Create through a voltage regulator on the Interface Board (IB). The IB also carries the USB interfaces for the serial port and I2C.
I2C is a standard 2 wire bus for controlling actuators and accessing sensor input. I'm not totally sure what is going to be on the bus. I expect a compass module, at least, to provide orientation. I have sonar and IR distance sensors working on I2C but am not sure which to use. These would be backup for detecting obstacles via vision processing. A main goal is for the robot to move around without bumping into obstacles. I also have a digital I/O board that could be used to provide LED indicators of what the robot is doing.
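For a flavor of what reading one of those I2C sensors involves: compass modules such as the HMC6352 return the heading as a 16-bit big-endian value in tenths of a degree, read as two bytes over the bus. Decoding that is trivial (the bus-transfer code itself depends on the USB-I2C adapter, so it is omitted here):

```python
# Hedged sketch: decode a two-byte compass reading into degrees.
# Modules like the HMC6352 report heading * 10 as a 16-bit
# big-endian value; msb and lsb are the two bytes read over I2C.

def compass_heading(msb, lsb):
    """Combine the two raw bytes and scale to degrees (0.0-359.9)."""
    return ((msb << 8) | lsb) / 10.0
```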
The reason for the Wifi on the Slim is to download software and allow monitoring from the desktop or laptop, especially in the field.
RoboRealm (RR) is a software package whose main purpose is vision processing. It also has a lot of robot control capability, including a plug-in for the Create. I decided not to use that plug-in after some issues figuring out exactly how it worked. That may have been a mistake. My other concern was latency: sensor information would first be collected by RR and then collected from RR by the control program. RR will be used to handle the camera and vision processing.