I'm an Electrical Engineering major, and each year my college's branch of IEEE competes in a student hardware competition. Last year's competition was inspired by the natural disasters in Haiti and Chile (the competition was held one week after the earthquake in Japan). This was a very large project tackled by a group of people. I was the programmer of the group, but I also helped with some of the sensors that the Arduino would be directly interfacing with. I will only cover the parts that I was involved with; however, I will briefly describe the other parts so that readers can get a better understanding of the scale of the competition.
The playing course was meant to imitate a hotel that had been damaged during an earthquake. As seen below, there were 4 rooms with a central hallway. In each room there could be up to 4 "victims", as well as various pieces of "debris" and a "hazard". Each victim was composed of a 3" PVC cap with a magnetic coil and a status indicator LED. The debris was various sizes of 2x4 and 1x1 lumber painted white. The hazard was a large magnetic field at a different frequency than the victims. One of the biggest challenges was being able to navigate the course with the possibility of your path being blocked by debris. That's where I came in!
This is my first Instructable (hopefully not the last). Unfortunately, I wasn't really planning to write it at the time (I was too focused on the competition), so I don't have any pictures of the robot under construction. I finally have some time to do a write-up, and the Microcontroller contest was just the incentive I needed to complete it. I'd really appreciate any comments and suggestions, and I'll try to answer any questions you have. If you like my Instructable, please vote for me in the Microcontroller and/or Make It Move contest!
Step 1: The Sensors
During our early meetings, we decided that the most efficient way to navigate was to follow the wall around the course. To do this we needed a way to keep track of not only the wall but also any obstacles that could be placed anywhere around the course. We build a lot of robots and other projects at my college, so we know quite a bit about sensors. What we needed was a way to accurately measure the distance to the walls.
We considered both infrared and ultrasonic sensors. There are infrared sensors that can read the distances we needed; however, we didn't know many details about the paint that was to be used on the course walls or the ambient lighting, and we were afraid that either could throw off our distance readings. We decided on ultrasonic sensors. There are dozens if not hundreds of ultrasonic sensors, but one of the most well known is the PING sensor from Parallax. We looked at several models but finally decided on the PINGs because they have the shortest minimum read distance, at 2 cm. This way we could hug the walls pretty closely.
The first thing we needed to do was figure out how many sensors we needed. We determined that we would need 2 sensors to square up against the wall. We also needed 2 sensors on the front for obstacle avoidance and navigation, plus sensors on each side of the robot to locate the victims. One of the requirements when finding the victims was to announce each victim's location on an invisible X and Y grid in each room. To do this we would need to be able to figure out our distance to each of the 4 walls (really we could do this with only 2 walls, but 4 walls provided some redundancy). In the end we used 7 sensors: 2 on the front, 2 on the right side, 2 on the back, and 1 on the left side. We only needed 1 on the left to measure distance to the wall; the pairs are used together to make sure the robot is square against the wall, either while turning or driving straight.
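To give a sense of how wall distances translate into the announced grid location, here's a minimal sketch. Every number in it (room size, grid resolution, robot footprint, noise tolerance) is a hypothetical placeholder, not an actual competition value:

```cpp
// Hypothetical course dimensions -- NOT the real competition values.
const int ROOM_SIZE_CM  = 120;  // assume a square room
const int GRID_CELLS    = 4;    // assume a 4x4 invisible grid
const int ROBOT_SIZE_CM = 30;   // assumed robot footprint

// Distance to one wall maps directly onto a 0-based grid cell.
int gridCell(int distToWallCm) {
    int cell = distToWallCm / (ROOM_SIZE_CM / GRID_CELLS);
    if (cell >= GRID_CELLS) cell = GRID_CELLS - 1;  // clamp at the far edge
    return cell;
}

// The redundancy mentioned above: distances to opposite walls plus the
// robot's own size should add up to the room dimension, within noise.
bool readingsConsistent(int distA, int distB) {
    int error = distA + distB + ROBOT_SIZE_CM - ROOM_SIZE_CM;
    if (error < 0) error = -error;
    return error <= 5;  // allow a few cm of sensor noise
}
```

Doing the same thing on both axes gives the X and Y cell to announce, and the opposite-wall check catches a sensor that is reading an obstacle instead of the wall.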
Step 2: What is a PING sensor?
Basically, a PING sensor uses high-frequency sound to accurately measure distance. When sent a command, it emits a pulse, and when the echo of that pulse returns, it sends a high signal back to the microcontroller. Since the speed of sound is basically constant (it varies very slightly with temperature, humidity, and elevation), the distance to an object can be calculated from the time it takes for the pulse to return. It takes 29 microseconds for sound to travel 1 cm, and 74 microseconds to travel 1 inch. But that is only one way; for the pulse to travel to an object and back to the sensor takes twice as long. Luckily for us, the Arduino language has a built-in command that measures this for you, called pulseIn. You can divide the result from pulseIn by 58 for cm or 148 for inches, or just use the raw microseconds as a judge of distance. For general-purpose distance measurements I used the cm conversion; however, for some portions of the code that needed sub-cm accuracy, I used the microsecond values.
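The arithmetic above looks like this in code (the function names are my own; on the Arduino you would feed these the value returned by pulseIn):

```cpp
// 29 us/cm one way, so 58 us/cm for the round trip.
long microsecondsToCentimeters(long microseconds) {
    return microseconds / 58;
}

// 74 us/inch one way, so 148 us/inch for the round trip.
long microsecondsToInches(long microseconds) {
    return microseconds / 148;
}
```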
The gif below shows a dolphin sending out an ultrasonic pulse, which bounces off of a fish and returns to the dolphin. This is the same principle as a PING sensor.
Step 3: Finding the victims
I mentioned earlier that I didn't originally plan to make an Instructable about our robot, so unfortunately the pictures don't show the camera module that we used. We ordered one early on but never received it; luckily, the staff engineer who helped us a lot offered his own private camera to use during the competition, which we returned afterward. We used a CMU-Cam 1, which was developed by students at Carnegie Mellon University. We had a Parallax BasicStamp 2 controlling the CMU-Cam, which we trained to only see the green light reflecting off of the white PVC caps. During early testing with the CMU-Cam, we noticed that white light would cause false readings even on black surfaces when close enough. We were worried that if glossy paint was used, it would cause false readings and severely slow us down. The rules said the walls would be purple, so we decided to find a color that would be absorbed by the walls to prevent problems. We had an assortment of different color LED clusters from another project and realized that green was the best choice.
Unfortunately, the obstacles were also painted white (they didn't want to make it too easy). That is where the EM field sensors came in. Each victim produced an electromagnetic field, which would distinguish it from the obstacles. Using the knowledge we gained in our early electronics courses and our EM fields class, we were able to build a fairly simple EM field detector that we tuned to only detect the victims, and a second to only detect the hazard field. The sensors are based on the fact that if you expose a wire to a varying magnetic field, it will induce a current in that wire. This current also produces a voltage based on the resistance of the coil, which was very small. We hand-wrapped our own coils around some PVC pipe, then amplified the signal many times with op-amps. Next we passed the signal through a band-pass filter, which only allowed the frequency of the victims to pass. Finally, the signal was passed through a rectifier, which converted the AC signal into a DC signal. Both of these circuits output an analog voltage that we tied to the Mega's analog ports.
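On the software side, reading those two analog voltages comes down to a threshold comparison. The ADC counts below are invented placeholders; our real cutoffs came from tuning against the actual fields:

```cpp
// Placeholder thresholds in 0-1023 ADC counts (10-bit analogRead range).
// The real values came from tuning against the actual fields.
const int VICTIM_THRESHOLD = 300;
const int HAZARD_THRESHOLD = 400;

// Averaging a few readings smooths out any ripple the rectifier
// left behind before comparing against the threshold.
int average3(int a, int b, int c) { return (a + b + c) / 3; }

bool victimDetected(int victimAdc) { return victimAdc > VICTIM_THRESHOLD; }
bool hazardDetected(int hazardAdc) { return hazardAdc > HAZARD_THRESHOLD; }
```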
Once a victim was found, the CMU-Cam was programmed to switch to detecting the status of the LED on the victim. By taking a series of pictures fast enough, we could determine if the LED was always on, always off, or blinking. This indicated the status of the victim (unconscious, dead, or alive).
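The classification step can be sketched like this. Each camera frame is reduced to one bit (was the LED lit?), and a burst of frames is packed into an unsigned int; the names and frame count are my own, and sampling faster than the blink rate is assumed to be handled by the camera loop:

```cpp
enum LedState { ALWAYS_ON, ALWAYS_OFF, BLINKING };

// 'frames' packs one bit per camera frame: 1 = LED lit in that frame.
LedState classifyLed(unsigned frames, int frameCount) {
    int lit = 0;
    for (int i = 0; i < frameCount; i++)
        if (frames & (1u << i)) lit++;
    if (lit == frameCount) return ALWAYS_ON;
    if (lit == 0)          return ALWAYS_OFF;
    return BLINKING;  // some frames lit, some not
}
```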
Step 4: The Drive System
Now that we knew how we would find our way around the course, we needed a way to get there. Because we knew there would be small obstacles that we would need to drive over, we chose 4 DC gear motors (2 on each side). They are strong and run at about the right speed to get us around the course fast enough. Unfortunately, gear motors take quite a bit of current and run on 12 volts, so we couldn't just wire them to the Arduino.
There are several ways to drive them from a microcontroller (relays, MOSFETs, etc.), but we needed precise control over the motors so that we could navigate correctly. We needed a motor controller. We chose a Sabertooth 2X10 (2 motors, 10 A each). Yes, our motors only draw about 1/2 amp each, but you don't want your motor controller pushed near its limits (very bad things happen!). The Sabertooth is controlled by the Arduino through the serial port. You have to send it 2 numbers: the first is between 1-127 and controls the speed and direction of the first motor; the second is between 128-255 and controls the speed and direction of the second motor. Make sure to always send 2 numbers, or else your robot will continue the way it was going.
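Here's one way those two command bytes can be packed. The signed -63..+63 speed range is my own convention for readability, with 64 and 192 as the stop values in the middle of each motor's range:

```cpp
// Keep a signed speed inside the range one command byte can express.
int clamp63(int speed) {
    if (speed >  63) return  63;
    if (speed < -63) return -63;
    return speed;
}

// Motor 1 takes 1-127 (64 = stop); motor 2 takes 128-255 (192 = stop).
unsigned char motor1Byte(int speed) { return (unsigned char)(64  + clamp63(speed)); }
unsigned char motor2Byte(int speed) { return (unsigned char)(192 + clamp63(speed)); }
```

On the Arduino you would then Serial.write() both bytes every update; as noted above, sending only one leaves the other motor doing whatever it was doing before.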
Step 5: The MOST important part
This is the one part that our robot would not have been possible without.
Yes, it's that important. Our robot's body was all metal, and even with the rubber pad we laid down on the bottom layer, there were still many opportunities to short out a connection (and it happened a few times), so almost every component, connection, and wire was wrapped in electrical tape. We used so much tape that we named the robot "Plym" (short for the Plymouth brand electrical tape we used).
Step 6: The Code
This was my main contribution to the robot. I spent my entire spring break programming! Except for the example code to handle the PING sensors (which I modified to control 7 PINGs), I wrote all of the code for this robot (someone else wrote the code to control the CMU-Cam on the BasicStamp). After brainstorming a basic flow chart of what the robot needed to do, I started building small portions of the code, one step at a time.
I started with the first logical step: I loaded up the PING example program that is built into the Arduino language and tested it out. I then modified it to read the 7 PINGs we were using. This became the PING test program below, which was used later to ensure proper connections and when developing some portions of the main code.
The next step was a simple test of the Sabertooth. This was the first time I had ever used one, so I wrote a small program that just drove forward, then back, then turned left, then right. It's a simple program, but again very useful. We ended up rebuilding and rewiring the robot several times, and after each time we would run this simple program to ensure we had connected the motors correctly.
The next program was written using an early mock-up of our robot made from a scrap piece of aluminum, a couple of servo motors, PINGs, and cardboard (I taped the PINGs to the cardboard to mount them). You can see a picture of our little junk bot below; it was thrown together in about an hour but served its purpose. I started by reading the right-side sensors. The robot would drive forward as long as the sensor values were the same (square to the wall), and it would drive one motor faster than the other if the values were different. This corrected any drift that would occur. This is very important: even if you drive 2 motors at the same speed, they don't always turn at exactly the same rate, especially if they are turning in opposite directions.
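That drift correction boils down to a simple proportional controller on the disagreement between the two right-side sensors. BASE_SPEED and GAIN below are placeholders, not our tuned values, and the speeds use the same signed range as the Sabertooth sketch earlier:

```cpp
const int BASE_SPEED = 40;  // placeholder cruise speed (-63..63 range)
const int GAIN = 3;         // placeholder correction per cm of error

int clampDrive(int s) {
    if (s >  63) return  63;
    if (s < -63) return -63;
    return s;
}

// If the front-right sensor reads farther than the rear-right one, the
// nose is drifting away from the wall, so the left side speeds up (and
// the right side slows down) to steer back toward it.
int leftSpeed(int frontRightCm, int rearRightCm) {
    return clampDrive(BASE_SPEED + GAIN * (frontRightCm - rearRightCm));
}

int rightSpeed(int frontRightCm, int rearRightCm) {
    return clampDrive(BASE_SPEED - GAIN * (frontRightCm - rearRightCm));
}
```

When both sensors agree, the error is zero and both sides run at the base speed; any disagreement steers the robot back until the readings match again.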
Once this code was working, I began to monitor the front right sensor to see if its reading ever rose dramatically; this would only occur if the robot reached the edge of a wall. After a 90-degree turn it would still be seeing a long distance to the side wall, which could be a problem; even if it did drive forward from there, it would become confused when the first sensor hit the next wall. So a special algorithm was written to use the other sensors to square up against another wall and drive forward until both side sensors read the wall again. I later added in some more exceptions to get around the entrance to each of the rooms, and a check of the front sensors for inside corners.
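Both of those checks reduce to comparisons against thresholds. The threshold values here are made up for illustration:

```cpp
// End of wall: the front-right sensor suddenly reads much farther
// than the rear-right one, which still sees the wall.
const int EDGE_JUMP_CM = 30;   // made-up threshold

bool atWallEdge(int frontRightCm, int rearRightCm) {
    return (frontRightCm - rearRightCm) > EDGE_JUMP_CM;
}

// Inside corner: both front-facing sensors see something close ahead.
const int STOP_DIST_CM = 10;   // made-up threshold

bool insideCorner(int frontACm, int frontBCm) {
    return frontACm < STOP_DIST_CM && frontBCm < STOP_DIST_CM;
}
```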
I'm sure that most people reading this are doing so because they want to learn how to make an obstacle-avoiding robot. This was a very purpose-built robot, and some parts of the code used during our competition would be of no use to most of you, so I decided to provide 2 stripped-down versions. I know how annoying it can be to find example code that does exactly what you need, then have to go through and erase the half you don't need. They are still functional as obstacle-avoiding robots, except that I removed the portions that communicate with the camera and the SpeakJet shield. This is actually the code that was used during our "Engineering Day" demo of the robot, so the 7-segment displays randomly generate new numbers whenever the robot reaches an inside corner. We built a small rectangular course for the event (our test course had been disassembled after the competition) and thought it would be nice for the display to update from time to time, so we set it to randomly generate numbers. In the bare minimum version, I also removed a lot of code that was used to keep track of each room and display to the 7-segment displays.
Step 7: Thank You!
This is my first Instructable, and I'd like to take a moment to thank everyone who made this possible.
All of my professors; without them we wouldn't have a clue where to begin on this.
Billy, Randy, Ron, Matthew, Steve, Jenna, Josh, Drew, and everyone else who helped with the robot.
My family, who always supported me.
Thank You everybody.
I'd appreciate any comments, and I'll try to answer any questions. Thank you for checking out my Instructable; I hope you enjoyed it.