I'm currently reading the rather excellent book "Why Does E = mc^2?" (the ^ sign denotes an exponent; I am writing this on my phone and it unfortunately doesn't do superscript).
I'm going through the chapter where the authors, Professor Brian Cox and Professor Jeff Forshaw, describe how to obtain a spacetime velocity vector.
While paddling through the mathematics like a confused gundog searching for a ninja duck in the weeds, my mind started wandering to that famous two-slit experiment, where it was found that the act of observing a photon's behaviour changed its behaviour.
In other words, if you look at a photon, you lose the ability to predict where it will go, and vice versa.
I think Terry Pratchett calls them inspirons, little particles of inspiration which fly around till they hit a receptive neuron.
One of those hit me and made me wonder.
This is probably utterly wrong.
We, humans, are three dimensional creatures with an imperfect perception of a fourth dimension, time. Don't believe me? Show me a hypercube. Not its shadow, I want to see the real deal.
It can't be done.
In the book mentioned above, Cox and Forshaw show that in order to absolutely describe an object's position, we use a vector which essentially contains representations of the coordinates of an object, as well as an added time descriptor.
That's an imperfect description, but it will do.
At the risk of sounding like I am anthropomorphising photons: if we are four dimensional massive objects with a limited perception of the time dimension, is it possible photons are three dimensional massless objects with a limited existence in four dimensions?
The energy of an object equals its mass times the speed of light squared, but photons are massless particles which travel at the velocity c.
Following from that (this is shaky for me), if an object has no mass, it exists as a point in spacetime, giving it three dimensions: an x dimension, a y dimension and a time dimension.
Not really much point to this post except to pour this out. If anyone spots any flaw in all this, feel free to correct me. I would prefer to be educated rather than wrong.
Monday 21 February 2011
Sunday 13 February 2011
I haven't forgotten about this place!
Yes, I know it's been a while since I've posted, but rest assured, gentle reader, I haven't forgotten about you. I have simply been snowed under with other stuff and haven't had a chance to write about what I've been doing with the robot.
What I've been doing is sort of not much, although there has been a little bit of action.
I was paid a week ago and to celebrate, I sent off to the good folk at www.Active-Robots.com for a Ping sensor and mounting kit. Amazingly, I ordered it at 3pm on the Friday and it arrived in the post the very next day. Damn good service from those guys.
The Ping sensor is an ultrasonic distance measurement sensor with a range of 2cm to 3m. It's a useful thing. The mounting kit comes with a standard servo motor, the workings of which I'm unfamiliar with - what I know about servos is written up on this blog - so I'm currently in the process of experimenting with it.
At the moment, all I've been able to come up with is the following code:
FOR counter = 1 TO 11
PULSOUT 14, 750
PAUSE 20
DEBUG CR, DEC counter
NEXT
which simply returns the servo to its central position. I should note, the majority of the code above was lifted from the information PDF I downloaded from the Parallax website. The original code took the servo back to its central position and held it there for about 5 seconds. What I've done is whittle the time down so I can use that code as a subroutine to return the servo to center from any position it may be in. 11 pulses is sufficient to return it to center from any point along its slightly more than 180 degrees of travel.
As I discovered, standard (limited turn) servos don't seem to need to be centered like continuous rotation servos. This simplifies things; however, I need to figure out how many pulses it will take to reach extreme left and extreme right. Simple enough to do - I think some code like the following will do the trick:
FOR counter = 750 TO 650
PULSOUT 14, counter
DEBUG CR, DEC counter
PAUSE 20
NEXT
Useful that PBASIC counts down automatically when the start value is higher than the end value. Obviously, the reverse code will do for the other extremity:
FOR counter = 750 TO 850
PULSOUT 14, counter
DEBUG CR, DEC counter
PAUSE 20
NEXT
If it doesn't find the extremity within 100 pulses, I'll be surprised, since it only takes a maximum of 11 pulses to return it to center.
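As a side note, the arithmetic for turning PULSOUT counts into real units is easy to sketch. This little Python snippet (Python purely because it's handy for checking sums; the robot itself runs PBASIC) assumes the BS2's 2-microsecond PULSOUT unit and the usual hobby-servo convention of roughly 1000-2000 microseconds covering the full travel - so the angle figure is an approximation, not a measured value.

```python
# Convert BS2 PULSOUT counts into microseconds and a rough servo angle.
# Assumes the BS2's 2 us PULSOUT unit and the common (approximate) hobby-servo
# convention that 1000-2000 us spans the travel, centred at 1500 us.

PULSOUT_UNIT_US = 2  # BS2 PULSOUT resolution in microseconds

def pulsout_to_us(count):
    """PULSOUT count -> pulse width in microseconds."""
    return count * PULSOUT_UNIT_US

def approx_angle(count, travel_degrees=180):
    """Rough angle from centre, assuming 1000-2000 us spans the travel."""
    us = pulsout_to_us(count)
    return (us - 1500) / 1000.0 * travel_degrees

print(pulsout_to_us(750))   # 1500 us - the 'stay centred' signal
print(approx_angle(750))    # 0.0 degrees from centre
print(approx_angle(650))    # roughly -36 degrees, under these assumptions
```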
More later.
Wednesday 9 February 2011
I'm not telling you what this is.
Call it a private joke
erk: C0 CE FE 84 C2 27 F7 5B D0 7A 7E B8 46 50 9F 93 B2 38 E7 70 DA CB 9F F4 A3 88 F8 12 48 2B E2 1B
riv: 47 EE 74 54 E4 77 4C C9 B8 96 0C 7B 59 F4 C1 4D
pub: C2 D4 AA F3 19 35 50 19 AF 99 D4 4E 2B 58 CA 29 25 2C 89 12 3D 11 D6 21 8F 40 B1 38 CA B2 9B 71 01 F3 AE B7 2A 97 50 19
R: 80 6E 07 8F A1 52 97 90 CE 1A AE 02 BA DD 6F AA A6 AF 74 17
n: E1 3A 7E BC 3A CC EB 1C B5 6C C8 60 FC AB DB 6A 04 8C 55 E1
K: BA 90 55 91 68 61 B9 77 ED CB ED 92 00 50 92 F6 6C 7A 3D 8D
Da: C5 B2 BF A1 A4 13 DD 16 F2 6D 31 C0 F2 ED 47 20 DC FB 06 70
<3 you Sony.
Saturday 22 January 2011
Exif Data in C# / Mono / .NET or keeping up with my friends
While the majority of this blog is based around robotics, I'm also going to be writing about some of the coding I do as I work my way towards my goals.
As I mention in my profile, I'm an Australian living in the UK and while I like being here, I do have friends and family back in Australia. Before I got this phone, the time I spent on my computer was limited. Reading through this blog, you can see I'm reasonably prolific when I sit down to write something, so if I spent all my time writing emails and so on, I would never have got anything done in the time I had on my computer.
Recently however, I acquired an HTC Desire smartphone, which is an object of wonder in my eyes, and the eyes of my 5 year old son.
I have had "smart" phones before - an HTC Universal, which I liked a lot, and a Nokia E61i, which was fabulous - but this HTC Desire leaves me speechless. I find it to be more useful by several orders of magnitude than either of the phones I just mentioned. A large part of that is down to the Android platform which runs the applications on the phone.
For those wondering, Android is built on top of a Linux base. Linux, or GNU/Linux (as Richard Stallman would ask us to say), is the operating system running the phone; Android is the bits you can see. It's more complex than that, but to completely define it would require words like "kernel" and "micro" and possibly a number of car analogies.
With this phone however, I'm discovering the joys of brief but frequent bursts of communication, and consequently I'm managing to get back in touch with a lot of the people I left back in Australia. Part of that happens on Facebook, part through writing emails on the phone to people.
I can honestly say that the smartphone has changed the way I communicate and has consequently improved my life. I know that's a huge statement, but in fairness, it's also true.
Also, I'm the kind of person who likes to use the full capabilities of a device, so I'm not just using the internet capabilities of the phone, I'm using the GPS, camera and the other sensors as much as I possibly can.
In this particular post, I'm working mainly with the data embedded in the photos taken as I stroll around. The specific data I want is:
- Date
- Time
- GPS Latitude
- GPS Longitude
- Altitude
EXIF (EXchangeable Image File format) is kind of a format within a format. The data is embedded in specific parts of an image, and extracting specific bits of data means knowing the ID number of each individual bit of data.
The IDs are in a hexadecimal format which is fairly readily converted to decimal. Once you have the ID though, you might think the race is won. Not so. The value associated with each ID number is stored as a byte array, which itself has been encoded using some fairly unlikely seeming formats. Some are stored as UTF-8, some as ASCII, some as completely different formats which must be interpreted. It's a messy system.
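To make that concrete, here's a tiny Python sketch (Python rather than C# purely for brevity) of the two fiddly steps: the hex-to-decimal tag IDs, and decoding a byte-array value. The tag IDs shown (0x010F for the camera make, 0x9003 for the original date/time) are standard EXIF IDs; the sample byte values are invented for illustration.

```python
# Two of the fiddly EXIF steps: hexadecimal tag IDs and their decimal forms,
# plus decoding an ASCII byte-array value. The tag IDs are standard EXIF;
# the sample byte strings below are made up.

EXIF_MAKE = 0x010F           # camera manufacturer; 271 in decimal
EXIF_DATETIME_ORIG = 0x9003  # original date/time; 36867 in decimal

def decode_ascii_value(raw):
    """EXIF ASCII values are NUL-terminated byte arrays."""
    return raw.rstrip(b"\x00").decode("ascii")

print(EXIF_MAKE)                          # 271
print(EXIF_DATETIME_ORIG)                 # 36867
print(decode_ascii_value(b"HTC\x00"))     # HTC  (invented sample value)
print(decode_ascii_value(b"2011:01:22 14:03:00\x00"))
```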
So, I spent most of yesterday learning all this stuff and went home thinking I wouldn't get it sorted in any reasonable amount of time. Came out today and had a quick look again, to find this article on CodeProject.
The guy there has done all the hard work - and I do mean hard work, some of this stuff is incredibly dense - and released a library into the wild which I can use. So now, I'm just tinkering with the library and figuring out how to output a nice neat text file containing an HTML table with space for each image and, next to each image, a table containing the relevant EXIF data.
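The output stage itself is simple enough to sketch. This hypothetical Python fragment (the names like make_row are my own, not from the CodeProject library, and the field values are placeholders) builds one HTML table row per image, photo on the left and its EXIF data alongside - the same shape of output I'm aiming for in C#.

```python
# Sketch of the HTML output: one row per image, the photo on the left and a
# nested table of EXIF name/value pairs on the right. Function and field
# names are invented for illustration.

def make_row(image_path, exif):
    pairs = "".join(
        "<tr><td>%s</td><td>%s</td></tr>" % (k, v) for k, v in exif.items()
    )
    return (
        "<tr><td><img src='%s'></td>"
        "<td><table>%s</table></td></tr>" % (image_path, pairs)
    )

def make_page(rows):
    return "<html><body><table>%s</table></body></html>" % "".join(rows)

row = make_row("walk1.jpg", {"Date": "2011:01:22", "Latitude": "51.5 N"})
page = make_page([row])
print("walk1.jpg" in page)  # True
```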
Another idea I had is to write a program which goes through my images folder and finds the location of each photo on Google Maps. I know it's probably been done before, but it's a great exercise in figuring out a new API, as well as getting some coding done.
The image shows the data and image from the first part of the project.
Amazing how much data can be generated by a single button click, isn't it?
Monday 17 January 2011
Left Wheel
Right, repetitive iterative testing is sort of completed on the left wheel, although I do have some questions that need answering about it.
Here is the data:
13, 752, 0.20
13, 746, 0.15
13, 753, 0.25
13, 745, 0.2
13, 754, 0.33
13, 744, 0.30
13, 755, 0.45
13, 743, 0.40
13, 756, 0.55
13, 742, 0.55
13, 757, 0.66
13, 741, 0.66
13, 758, 0.75
13, 740, 0.75
13, 759, 0.91
13, 739, 0.91
13, 760, 1.03
13, 738, 1.03
13, 765, 1.6
13, 733, 1.6
13, 770, 2.25
13, 728, 2.25
13, 775, 2.75
13, 723, 2.75
13, 780, 3.25
13, 718, 3.25
13, 785, 3.6
13, 713, 3.6
13, 790, 3.9
13, 790, 3.85
13, 790, 3.9
13, 790, 3.85
13, 708, 3.75
13, 795, 4.1
13, 703, 4.05
13, 800, 4.25
13, 698, 4.2
13, 805, 4.25
13, 693, 4.25
13, 810, 4.3
13, 688, 4.3
13, 820, 4.3
13, 683, 4.3
There is a slight difference in the way this servo reverses direction: it goes further forwards than it does in reverse. In order to get the same distance backwards as forwards in the file TestServoSpeed.bs2, it was necessary to change the value of the counter variable from 244 to 238; at 2 microseconds per unit, that is a difference of 12 microseconds.
If you think back to the previous post about the right wheel, the top speed that servo seemed capable of was 47.5 rpm.
As you can see, the top speed the left servo appears capable of is around 43 rpm (4.3 turns per six seconds), which is pretty significant really. It's enough to make a difference in dead reckoning navigation anyway.
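For anyone checking the arithmetic: each test run lasts roughly six seconds, so turns-per-run converts to rpm by a factor of ten. A quick Python sanity check:

```python
# Each test run lasts ~6 seconds, so turns-per-run * (60 s / 6 s) gives rpm.

def rpm(turns_per_six_seconds):
    return turns_per_six_seconds * 60 / 6.0

print(round(rpm(4.3), 1))   # 43.0 - the left servo's top speed
print(round(rpm(4.75), 1))  # 47.5 - the right servo's top speed
```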
Given that I tested the right wheel with absolutely fresh batteries and that I'm testing the left wheel with the same batteries as I used for the right wheel, I initially thought the difference may be down to the batteries not putting out as much power for the left wheel as for the right.
That seems to have some effect, but it isn't the whole answer. I tested the right wheel at pulse widths of both 820 and 679, which in the previous test achieved the top speed of 4.75 turns per six seconds (47.5 rpm); in this battery of tests, those widths only gave the wheel 4.6 turns (46 rpm). So, while the freshness of the batteries has some effect, it accounts for only about 1.5 rpm of the roughly 4.5 rpm gap.
I wonder if there is a way to tune servos?
Sunday 16 January 2011
Right Wheel Calibration
I'm thinking if someone happens to just stumble across this blog and start reading it without any knowledge of the subject, they would look at this post and think I was possibly the most boring person allowed to post content on the internet.
If you have found yourself in that situation, then I apologise, this isn't going to be the most entertaining post in the world. Interesting and useful if you have a bent towards the topic I'm writing about, but otherwise, you might want to back away.
In my previous post, I noted the difference fresh batteries made to the overall speed of the robot, adding 5cm/s after changing them. That's more than a 25% improvement on the original score, so I thought it might be best to repeat my iterative wheel speed tests in order to be able to precisely control not just the distance, but the velocity at which the robot covers said distance.
As I said in a previous post, if you send the servos a pulse train with an up width of 1.5 milliseconds, the servo takes that as an instruction to do nothing.
Finding out the exact signal width at which the servo will begin turning can only be a useful thing, and knowing how the speed of rotation increases as the pulse width moves above and below that magic figure of 1500 microseconds is equally useful.
So, some boring but necessary testing followed, with the results as follows. For the sake of my sanity, I won't be converting the PBASIC PULSOUT command syntax to microseconds, since I plan to do that in an Excel file at some point. For those interested, the code used to perform this testing follows:
' {$STAMP BS2}
' {$PBASIC 2.5}
counter VAR Word
pulseWidth VAR Word
pulseWidthComp VAR Word
FREQOUT 4, 125, 3000, 4000            ' start-up beep
DO
  DEBUG "Enter pulse width: "
  DEBUGIN DEC pulseWidth              ' width for this run, in 2 us units
  pulseWidthComp = 1500 - pulseWidth  ' mirror of the width about centre (750)
  FOR counter = 1 TO 244              ' roughly six seconds of pulses
    PULSOUT 12, pulseWidth            ' right wheel servo on pin 12
    'PULSOUT 13, pulseWidthComp      ' left wheel, disabled for this test
    PAUSE 20
  NEXT
LOOP
Again, I can't take credit for that code, although I have made my own refinements to part of it.
Looking at the interesting part of the code, within the FOR...NEXT loop there is a statement PULSOUT 12, pulseWidth.
That variable, pulseWidth, is set by the DEBUGIN command, which simply allows me to enter the width I want for each loop of the code. Since each loop takes six seconds, once I have done this testing, the mathematics involved in telling the robot to cover a specific distance at a specific speed are simplicity itself.
In any case, the command PULSOUT 12, 750 is telling the servos to stay still. To calculate how many microseconds that is, is simply a matter of multiplying by 2 microseconds. That sounds complicated, but in reality, isn't:
750 x 2 μs
= 750 x 0.000002 s
= 0.0015 s
= 1.5 ms
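The same arithmetic in Python, plus the loop timing that makes each test run about six seconds. The figure of roughly 24.6 ms per loop (the 1.5 ms pulse, the 20 ms PAUSE, and program overhead) is the commonly quoted one for this sort of BS2 loop, so treat it as approximate rather than measured.

```python
# PULSOUT counts are 2 us each. The test loop repeats 244 times with a 20 ms
# PAUSE; the commonly quoted per-loop time including overhead is ~24.6 ms.

def pulsout_ms(count):
    return count * 2 / 1000.0  # 2 us per unit, converted to milliseconds

print(pulsout_ms(750))       # 1.5 - the 'stand still' width in ms
print(244 * 24.6 / 1000.0)   # roughly six seconds per test run
```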
In any case, interesting results did come from the laboriousness.
At a pulse width of 752, the wheel started travelling backwards very slowly indeed. Looked at linearly, it probably moved less than a centimetre, and it did so in a stop-start, jerky fashion. Because of that, the first "real" movement in a rearwards direction starts at a pulse width of 753.
The first forwards movement starts at a width of 747.
In order to get exactly one complete turn of the wheel, no more, no less, going forward, requires a pulse width of 740. To go backwards, 759.
Top speed turned out to be around 4.75 turns every 6 seconds, in both directions. There were occasional flashes of a higher speed, including one occasion when the wheel came within a centimetre of completing 5 turns, but repeated testing at that pulse width showed that must have been a fluke of some sort. The data follows, in a not particularly elegant manner.
pin 12 = right
pin 13 = left
12, 755, 0.40
12, 745, 0.37
12, 760, 1.1
12, 740, 1 <-------one complete turn
12, 759, 1 <-------one complete turn
12, 750, 0
12, 753, SLOWEST BACKWARDS
12, 747, SLOWEST FORWARDS
12, 765, 1.75
12, 735, 1.66
12, 734, 1.75
12, 770, 2.45
12, 730, 2.25
12, 729, 2.4
12, 775, 3.05
12, 774, 2.95
12, 724, 3.0
12, 780, 3.5
12, 719, 3.5
12, 785, 3.9
12, 714, 3.9
12, 790, 4.2
12, 709, 4.2
12, 795, 4.4
12, 704, 4.4
12, 800, 4.5
12, 699, 4.5
12, 810, 4.6
12, 689, 4.6
12, 820, 4.75<----peak backwards speed
12, 679, 4.75<----peak forward speed
12, 830, 4.75
12, 669, 4.75
12, 840, 4.75+
12, 659, 4.75
12, 850, 4.75
12, 649, 4.75
12, 870, 4.9
12, 629, 4.75
12, 885, 4.85
12, 614, 4.75
12, 900, 4.85
12, 950, 4.85
12, 870, 4.85
12, 870, 4.75
Again, I need to repeat this with the other wheel. Oh joy.
Establishing and discovering performance
Ok, so I managed to get the base build of the bot completed a couple of days ago. Since then, I've been doing a fair bit of tinkering with it and repetitive testing of what it can do.
Found a couple of interesting results.
With my cheap, £1.99 for 20 AA batteries, the bot was managing to move at around 21cm/s with brand new fresh batteries.
While I'm reasonably happy with that, I'm a little disappointed that I have to repeat a lot of iterative testing. The reason I have to repeat it is that I did the original testing with the same batteries I used to build and sub-system test the robot, and with those, the speed test managed a magnificent 16cm/s. 5cm/s is a lot to lose, so I'll bang on with that iterative testing again and get it resolved.
Assisted Rangefinding.
I've adapted this idea from one found at this website: sites.google.com/site/todddanko/home/webcam_laser_ranger
To save you clicking the link, that page describes a method of making a do-it-yourself laser rangefinder using a webcam and a cheap laser pointing device.
The basic operation is as follows:
A laser is mounted a specific distance away from a camera, in such a way that both laser and camera are parallel to each other. Obviously, in the context of robotics, both devices are mounted on the robot doing the rangefinding.
In the link above, the laser is mounted above the camera, but in reality, it doesn't really matter where it's mounted, only that it is parallel to the camera.
The way it works is that the laser and the camera fire at the same time, and the resulting picture is then processed by the robot.
Specifically, the processing is looking for the brightest pixels on the image. Usually, that will be the pixels illuminated by the laser.
Since the laser is mounted parallel to the camera and we know exactly which is the centre pixel of the picture, if we find out how many pixels the laser dot is from the centre pixel, basic maths can convert that information into a distance from the camera.
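That "basic maths" deserves a sketch. On the page linked above, the calculation boils down to: the pixel offset from centre gives an angle (via a radians-per-pixel gain plus an offset, both found by calibration), and the distance is then the laser-camera separation divided by the tangent of that angle. The Python below uses made-up calibration constants purely to show the shape of the calculation.

```python
import math

# Distance from pixel offset, following the linked page's model:
# theta = gain * pixels_from_centre + offset, distance = h / tan(theta),
# where h is the laser-camera separation. GAIN and OFFSET here are INVENTED
# calibration constants, not measured ones.

H_CM = 6.0      # laser-to-camera separation in cm (assumed)
GAIN = 0.0025   # radians per pixel (assumed calibration value)
OFFSET = 0.01   # radian offset (assumed calibration value)

def distance_cm(pixels_from_centre):
    theta = GAIN * pixels_from_centre + OFFSET
    return H_CM / math.tan(theta)

# The nearer the object, the further the dot lands from the centre pixel:
print(distance_cm(20) > distance_cm(100))  # True
```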
The reason I'm rabbiting on about that is that I have come up with a modification of that scheme which could potentially a) increase the useful range, b) introduce some redundancy into the system and c) improve the accuracy of the system.
At the moment, it's still very much an idea - there is no hardware or even software yet - but it seems like a logical extension to me.
This is the idea. Instead of using just one laser, use three, set up in a definite configuration. Doesn't particularly matter what the configuration is, just that the configuration is definite. For example three lasers set up in a line, all pointing the same way (obviously), three inches apart.
The config I like the most, for no particular reason, is to have one laser two inches directly above the camera, one below and to the left, and one below and to the right, with each laser four inches from either of the other two.
The reason I like this config, is because the three lasers will make a specific pattern on whatever object they happen to be painting.
When the camera takes a shot, the image is processed, removing all colours except the red spectrum, or whatever colour the laser is.
Once that's done, the camera should be left with three dots in a triangle shape.
Because each of the lasers is a known distance from the center of the camera lens, we can then perform the calculation mentioned above on each of the red dots individually.
In addition, we can use the distance between each pair of dots as another metric, as well as working with the size of the shape formed by the three dots, which makes up yet another metric.
Needs more thinking about, but with that many metrics, we should be able to combine the measurements and come up with a statistically accurate figure.
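The combining step could be as simple as averaging the independent estimates; if some measurements are trusted more than others, inverse-variance weighting would be the textbook way to merge them. A rough Python sketch, where every number is a placeholder:

```python
# Merge several independent distance estimates. Plain averaging works if the
# estimates are equally trustworthy; inverse-variance weighting favours the
# more certain ones. All figures below are placeholders.

def weighted_estimate(estimates, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

dots = [101.0, 98.5, 100.2]  # per-dot distance estimates, in cm
vars_ = [4.0, 4.0, 1.0]      # assumed measurement variances

print(weighted_estimate(dots, vars_))  # pulled toward the low-variance reading
print(sum(dots) / len(dots))           # plain mean, for comparison
```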
Interesting!
To save you clicking the link, that page describes a method of making a do it yourself laser range finder, using a webcam and a cheap laser pointing device.
The basic operation is as follows:
A laser is mounted a specific distance away from a camera, in such a way that both vader and camera are parallel to each other. Obviously, in the context of robotics, both devices are mounted on the robot doing the rangefinding.
In the link above, the laser is mounted above the camera, but in reality, it doesn't really matter where its mounted, only that it is parallel to the camera.
The way it works, the laser and the camera fire at the same time, the resulting picture is then processed by the robot.
Specifically, the processing is looking for the brightest pixels on the image. Usually, that will be the pixels illuminated by the laser.
Since the laser was mounted parallel to the camera and we know exactly which is the centre pixel on the picture, if we find out how many pixels the the laser dot is from the centre pixel, basic maths can convert that information into a distance from the camera.
Reason I'm rabbiting on about that, is because I have come up with a modification of that scheme which could potentially a) increase the useful range, b) introduce some redundancy into the system and c) improve the accuracy of the system.
At the moment, its still very much an idea, there is no hardware or even software yet, but it seems like a logical extension to me.
This is the idea. Instead of using just one laser, use three, set up in a definite configuration. Doesn't particularly matter what the configuration is, just that the configuration is definite. For example three lasers set up in a line, all pointing the same way (obviously), three inches apart.
The config I like the most, for no particular reason, is to have one laser two inches directly above the camera, one robot below and to the left, one below and to the right, with wax laser four inches from either of the other two.
The reason I like this config, is because the three lasers will make a specific pattern on whatever object they happen to be painting.
When the camera takes a shot, the image is processed, removing all colours except the red spectrum, or whatever colour the laser is.
Once that's done, the image should be left with three dots in a triangle shape.
Because each of the lasers is a known distance from the center of the camera lens, we can then perform the calculation mentioned above on each of the red dots individually.
In addition, we can use the distance between each dot as another metric, as well as working with the size of the shape formed by the three dots, which makes up yet another metric.
Needs more thinking about, but with that many metrics, we should be able to combine the measurements and come up with a statistically accurate figure.
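A minimal sketch of what combining the measurements might look like, assuming we simply average the three per-dot distance estimates and use their spread as a sanity check (a real implementation would want something more sophisticated):

```python
def combine_estimates(distances):
    # Average the per-dot distance estimates, and report the spread between
    # the highest and lowest so wildly disagreeing readings (e.g. one laser
    # dot lost or misidentified) can be flagged for a retry.
    n = len(distances)
    mean = sum(distances) / n
    spread = max(distances) - min(distances)
    return mean, spread

# Three per-dot estimates in metres, from the triangulation step.
mean, spread = combine_estimates([0.52, 0.50, 0.51])
```

If the spread comes back suspiciously large, the robot could discard the reading rather than trust the average.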
Interesting!
Friday 14 January 2011
Major Assembly Completed
Apologies for the image heavy post, with luck I won't have to do that again.
Ok, just like the title says, the major part of the assembly has been completed. As you can see from the images, the robot itself is quite a neat little beast. I, on the other hand, am not famous for my tidiness, so the wires you see poking around all over the place are down to my cack-handedness in doing as the manual suggested and "tucking them under the chassis".
Still and all, it's done and together nicely, and the servos are tested and working. What I need to do now is write and test the program/circuit suggested by the manual, which will inform me whether something screwy is happening to the bot or whether it's behaving strangely because of low power.
It isn't behaving strangely at the moment of course, the circuit is intended to let me know if it does.
Turns out, all the code and exercises the manual has taken me through so far are simply an exercise in pre-installation testing, or sub-system testing. Not the first time I've heard of the concept, but the first time I've really thought about it outside an object oriented computer program, which this thing most certainly is not.
Next on the agenda is iterative testing to find the values which will make each servo turn at the same rate, forwards and backwards.
This is an exercise the manual takes you through, because regardless of the fact that you centered the servos before putting the bot together, the servos will still turn at slightly different rates.
Why, I hear you ask?
Servos seem to be set up so that a particular configuration of signal instructs the servo to do nothing. This seems strange, until you consider the fact that the servo is capable of turning at a variable speed. The speed of the servo is determined by the width of a modulated power signal which arrives at the servo's terminals. In addition, a servo is capable of travelling clockwise and counterclockwise, which is a good thing; a servo which went in only one direction would be useless after the first use.
So, in order to get the servo turning in one direction or the other, at one speed or another, we simply vary the width of the power signal which will arrive at the servo's terminals.
The center signal is the width of signal which means the servo doesn't turn at all. In short, it's an instruction to stop.
Now, while we are working with some amazing bits of kit, capable of measuring down to a couple of millionths of a second, we aren't working with top of the line, nano-engineered, nuclear precision equipment. That means, when we went through the somewhat fiddly exercise of centering the servos, what we actually did was the equivalent of putting a needle somewhere between two "on" switches.
When we send a signal to the servo, we do it using the command PULSOUT [pin number], [duration].
The [duration] part of the command is the number of time units (each two microseconds long on the BS2) we want power supplied to the servo for. That is what is meant when someone mentions the "width" of the "up" signal.
In addition, the "up" signal is followed by a command to the microcontroller to stop sending power to the servo for a given period of time.
In PBASIC, that command is given as PAUSE [duration], with the [duration] element given in milliseconds.
Most servos use an "up" (I'm going to stop surrounding that with quotes) signal with a width of 1.5 milliseconds as a signal to do nothing. If the width is any more than that, the servo will turn one way. Any less, it turns the other way.
Amazingly, it is possible to pass a signal with a width as small as two microseconds through the BS2. Possible, but impractical. However, knowing the smallest resolution means we can calculate exactly how to structure the PULSOUT command when we center the servos.
If we call two microseconds the smallest time unit the BS2 is capable of dealing with, it makes it easier.
One millisecond has 500 of these BS2 time units in it, and when we center our servos, we need a signal width of 1.5 milliseconds. So, since the PULSOUT command uses these time units for its duration parameter, we simply provide the command as follows:
DO
PULSOUT [pin number], 750
PAUSE 20
LOOP
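That conversion from a pulse width in milliseconds to BS2 time units is simple enough to sketch (the 1.3 ms and 1.7 ms widths below are just the commonly quoted full-speed signals for these continuous rotation servos, not something I've measured yet):

```python
def pulsout_units(width_ms):
    # The BS2's PULSOUT duration is counted in 2-microsecond units,
    # so one millisecond is 500 units and the 1.5 ms "stay still"
    # signal comes out as the 750 used in the centering code.
    return round(width_ms * 1000 / 2)

centre = pulsout_units(1.5)       # the do-nothing signal
full_one_way = pulsout_units(1.3)
full_other_way = pulsout_units(1.7)
```

Which is where the 750 in `PULSOUT [pin number], 750` comes from.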
While the command is running, we are getting busy with the screwdriver and adjusting the potentiometer until the servo stops moving.
So, with all that background, we can look at what probably happened when we centered the servos.
Looking at that diagram, we can see that while the perfect placement would put us on the 750, the likelihood is that the servo's "center" was placed somewhere between the two boxes, as shown by the two red needles on the diagram.
That diagram I made is pretty, but still only tells part of the story. We can see on the diagram that there is a nice neat spacing on both the CW side and the CCW side. Looking at the diagram, it seems that if we pass a signal with a width of 755 to the servo, it may well start turning very, very slowly in a clockwise direction. Likewise, if we pass it a signal with a width of 745, it will turn the opposite direction, again very, very slowly.
Not the case, unfortunately. It may be that a width of 753 starts the servo turning in a clockwise direction and a width of 746 starts it in a counterclockwise direction. I suspect every servo is slightly different.
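One hypothetical way to describe that dead band in code. The 753/746 thresholds here are invented for illustration, matching the guesses above; a real servo's values would have to be found by experiment:

```python
def servo_direction(pulsout_value, cw_threshold=753, ccw_threshold=746):
    # Anywhere between the two thresholds is the dead band: the servo
    # treats the signal as "stop". The thresholds are per-servo and
    # asymmetric around the nominal 750, which is the whole point above.
    if pulsout_value >= cw_threshold:
        return "clockwise"
    if pulsout_value <= ccw_threshold:
        return "counterclockwise"
    return "stopped"
```

Part of the later calibration exercise is presumably discovering these two thresholds for each servo.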
Didn't mean to go on for quite so long, but it was worth it.
In case you haven't guessed yet, I am using this blog as much as a way of cementing the knowledge in my own head as to inform.
So, next exercises are beckoning, till the next update!
Assembly preparations
Wednesday 12 January 2011
On the importance of centering the servos
LEDs and Resistors
Circuit Built
Servos connected.
The next exercise in the manual looks at using an actual circuit on the breadboard, programmatically making two Light Emitting Diodes (LEDs) flash on and off.
This may seem like fairly basic stuff to some of you, and indeed it is, but that's exactly what I was looking for really. I have lots of theoretical knowledge and oodles of ideas, but actual hands-on practical experience with electronics and circuits is pretty thin on the ground for me. If you are reading this after having dug through the archives of the blog to find out where it all started and you find this is a bit humbug for you, feel free to zip ahead. Those of you reading each post as I post it, well, you'll just have to put up with my learning pace.
The manual tells me I will need two red LEDs and two 470 ohm resistors for this test circuit, so digging into the box of bits I received, I found the little bag of electronics components and, using the excellent instructions within the manual, identified the parts I needed.
Before anyone thinks this manual is similar to a recipe book, which tells you to add ingredients without telling you why, let me put things right.
While I had a fair idea of the concept of resistors and LEDs and so on, the purpose of each of them was never really clear to me. I had no idea that an LED can be compared to a one way valve for electricity, and similarly, I had no clue what a resistor is used for. Some people might say the clue is in the name, and to a point they would be right.
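To sketch what the resistor is actually doing in this circuit, assuming a typical 2 V drop across a red LED and the board's 5 V supply (both assumed typical values here, not anything I've measured):

```python
def led_current_ma(supply_v=5.0, forward_v=2.0, resistor_ohms=470):
    # Ohm's law across the resistor: the resistor drops whatever voltage
    # the LED doesn't, and that drop divided by the resistance sets the
    # current flowing through both components.
    return (supply_v - forward_v) / resistor_ohms * 1000

current = led_current_ma()  # roughly 6.4 mA with the 470 ohm resistor
```

Without the resistor there is almost nothing to limit the current, which is why the manual insists on it.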
My point is, this manual is doing a fair job of giving me a practical grounding in the basics of electronics, which can only be a good thing.
Back to it.
The good news is, I didn't screw up the circuit. There is no bad news, this seems to be working well so far.
As you can see in the second picture, the LEDs are on, nothing seems to be visibly broken and I'm alive to type this. There were in fact a few different things to do in the exercise, mainly to do with making das blinkenflashen lights. It all worked, so that's good.
Next exercise is connecting the servo motors, which is going to be fun.
Simple enough to connect the servos, just had to change a jumper setting and connect the leads to the correct pins on the BOE. The next activity is centering the servos, as they will have left the factory in a workable but unknown state. Centering the servos puts them back to a known position, so circuits can use them with some precision.
Hmm, just shows what happens when you get cocky: part of centering the servos involves using a screwdriver on a potentiometer within the casing of the servo. I rather foolishly assumed it to be the handy Phillips head screw in the middle of the servo horn and spent a few minutes cursing the thing for doing nothing. Still, got them done, no damage done.
Here is the code used to center these particular servos. Whether it's the same for all servos I don't know yet. Interesting to find out, though.
' P13 for manual centering
' {$STAMP BS2}
' {$PBASIC 2.5}
DEBUG "Program running!"
DO
PULSOUT 13, 750
PAUSE 20
LOOP
So, so far so good. I'll stop for a bit now and when I come back, it looks like we are mostly looking at mathematics and getting comfortable with using PBASIC.
Time Control
Parallax Continuous Rotation Servos
The exercise I just finished is early in chapter 2, and gets across the concepts of control flow and timing using the BOE and the BASIC Stamp microcontroller. Here is the code:
As you can see, there isn't anything terribly magical about this. The programming language used is a specialised form of the BASIC language, known as PBASIC (Parallax BASIC; BASIC itself stands for Beginner's All-purpose Symbolic Instruction Code). Compared to Java, Python, Ruby and the .NET programming languages, BASIC and its variants operate at a significantly lower level.
' {$STAMP BS2}
' {$PBASIC 2.5}
DEBUG "Start Timer..."
PAUSE 1000
DEBUG CR, "One second elapsed"
PAUSE 2000
DEBUG CR, "Three seconds elapsed"
DEBUG CR, "Done"
END
The reason time control is important, is because when we are operating the servos (pictured) we are essentially sending digital signals to an analogue device. Because we are operating at such a low level, we can't simply give the device a goal and expect it to get on with it, we actually have to teach it how to use itself. When we want these servos to actually start moving, what we really need to be doing is telling the microcontroller how long to supply electricity to the servo in order to make it function. When electricity is applied to the servo, the control horn rotates.
In order to precisely control the servo, we need to have some fairly fine control over how long we apply power to it, right?
In the program listing above, the numbers are referring to milliseconds, or thousandths of a second. 1000 milliseconds = 1 second.
The word DEBUG is simply an instruction to the microcontroller to send the text in quotes back to the terminal, the CR simply represents a carriage return.
The statements with a single quote in front of them are compiler directives. All I know at this stage is, those two are necessary.
The program above, when run, produces the following output:
Start Timer...
One second elapsed
Three seconds elapsed
Done
Pretty much as expected, so far so good.
Gets more interesting when you consider that while a thousandth of a second seems like it passes pretty quickly, two millionths of a second is the base time granularity of the BASIC Stamp 2 (BS2).
Pretty well blew me away when I found that out.
I mean, we all know computers are quick, but have you ever stopped to consider exactly how quick? Divide a second by one thousand, you have a millisecond. Divide that millisecond by a thousand, you have a microsecond, also referred to as μs. That's not "us" with an unusual accent.
According to that bastion of incontrovertible truth, Wikipedia, it takes 5.4 microseconds for light to travel 1 mile in a vacuum. In addition, the length of a day increases by 15 microseconds due to the tidal pull of the moon, but 2.68 microseconds was subtracted from the length of the day as a result of the 2004 Indian Ocean earthquake.
Who measures these things?
In any case, two of those microseconds is theoretically the shortest period of time I can apply power to those servos. In reality, I don't think anything would move in 2 microseconds, but when I actually get to that part of the book, I'll have a go and let you know.
Tuesday 11 January 2011
Unboxing? Sorta kinda.
I was originally going to do a grand unboxing of the bot, but I'm actually so keen to get into it, I'm not going to bother.
Instead, I'm going to go through the exercises included in the impressive looking manual "Robotics with the Boe-bot" which accompanied the bot. It's hundreds of pages long; I've looked at the first few chapters and am already impressed with what this little bot can do. I fully expect that once I'm finished with these exercises, I'll be doing some interesting experiments.
In the picture, you can see the manual, the Board of Education experiment board (BOE), the BASIC Stamp 2 module, the rubber feet for the BOE and the stylish and really rather classy Parallax screwdriver.
Keep in mind here please, I'm going between posting these entries on my phone as well as posting them on my computer, and I'm using different software for each method. If there are any formatting inconsistencies on these early posts, that's why.
Update:
Having finished chapter 1 of the manual, I have to say I'm impressed. While I'm hardly any kind of expert in these things, the guys at Parallax have included enough FlashenBlinken lights on the board to let me know I'm making progress with the simple stuff.
The first chapter had me revisiting a few topics I already had some knowledge of, but I've resisted the urge to skip ahead and pushed through with the stuff on ASCII numbers and basic commands, like using DEBUG to output back to the computer rather than commands like cout or my "native"
System.out.println("Yes, I was taught java as a first language");
Probably a good idea to get the basics, no matter how impatient I am. Hopefully doing some more tonight, chapter 2 is about connecting up the servos!
Parallax Boe-bot kit.
Frosty piss!
That's slashdot for first post. Since I'm posting this from my mobile phone, it isn't going to be a huge post.
At the weekend, I ordered this kit from active-robots.co.uk, for the low price of £126 + p&p, all up around £135.
Included in the kit is the BASIC Stamp 2 microcontroller, Parallax Board of Education electronics development board and electronics components including passive sensors and servos required to build the Boe-bot.
I have a specific project in mind once I have completed the exercises included in this kit, but I'll get into that later. For the moment, I'll be blogging mainly about what I'm doing with this robot, with the occasional rant about politics or the popular media. I'll tag things, though, so you can avoid those bits.