rocco wrote:
The only suggestions I can offer require additional hardware, and aren't very practical:
1) Add GPS, which has its own accuracy issues and adds significant cost.
2) Add additional transceiver hardware that allows you to implement Wings' phase-shift technique.
3) Ship each system with a long tape measure.
rocco wrote:
I don't think the Freescale transceiver gives you enough low-level control to take the approach that Wings suggested, but I think that technique holds your best chance of success. I think the firmware turn-around time is not deterministic enough to extract a time period as small as 3 microseconds. Heck, the interrupt latency alone can be more than that.
Hi, BeNoM:
I don't understand why you say that the firmware is not deterministic enough if I am going to use the same firmware, the same transceivers, and the same message/frame at spots A and B.
I will take the interrupt and other effects into account as "parasitic time".
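A minimal sketch of that "parasitic time" idea, in C. The radio_* calls and timer_now() are hypothetical placeholders, not the real SMAC API, and the 25 MHz capture timer is an assumption borrowed from later in this thread:

#include <stdint.h>

#define TIMER_HZ   25000000UL   /* 25 MHz capture timer (assumed)       */
#define C_M_PER_S  300000000UL  /* speed of light, rounded              */

extern uint32_t timer_now(void);        /* hypothetical free-running timer */
extern void radio_send_ping(void);      /* hypothetical: transmit one frame */
extern void radio_wait_for_reply(void); /* hypothetical: block until echo   */

static uint32_t parasitic_ticks;  /* fixed firmware turnaround, in ticks */

/* With A and B at zero distance, the whole round trip is "parasitic". */
void calibrate_at_zero_distance(void)
{
    uint32_t t0 = timer_now();
    radio_send_ping();
    radio_wait_for_reply();
    parasitic_ticks = timer_now() - t0;
}

/* Later, at unknown distance, subtract the calibrated turnaround.     */
uint32_t estimate_distance_m(void)
{
    uint32_t t0 = timer_now();
    radio_send_ping();
    radio_wait_for_reply();
    uint32_t rtt = timer_now() - t0;

    if (rtt <= parasitic_ticks)
        return 0;                       /* lost in calibration noise    */

    uint32_t flight = rtt - parasitic_ticks;    /* two-way, in ticks    */
    /* one tick = C / TIMER_HZ = 12 m of light travel; halve for one way */
    return flight * (C_M_PER_S / TIMER_HZ) / 2U;
}

Of course, this only works if parasitic_ticks really is the same on every exchange, which is exactly the repeatability being questioned here.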
rocco wrote:
My concern, which may be unfounded, is that the latency may not be repeatable. I don't know about the firmware you are using, but the SMAC firmware that I experimented with had interrupts blocked for various amounts of time, typically greater than a few microseconds. I would suspect that your "measurement" packets would arrive asynchronously, and therefore the response could be delayed by an arbitrary amount of time.
It's just a concern; it may not be reality.
I am going to use the MC13211 (SiP) with the SMAC firmware.
It's not a problem if the interrupts can be blocked for more than a few microseconds,
but it will be a real problem if there are "unrepeatable" procedures.
Could you tell me more about SMAC and what problems you expect?
Moreover, I have to send a "stream" of data, with only one frame per "measurement".
Is that possible with SMAC?
Thanks,
BeNoM
Also, using a timer severely restricts the granularity of your distance measurements. With a 25 MHz timer clock, the resolution is already 12 meters. Plus there will be +/-12 meters of error just from having unsynchronized clocks at the transmitter and receiver. Then there's a potential deal-breaker coming from the frame protocol itself, which is bit stuffing. Each transmitted bit has a duration on the order of microseconds, which translates to a speed-of-light distance in the range of 300 meters! If your distance-measuring packet is exactly the same every time except for the recipient address, and your software can calculate the CRC and determine exactly how many stuff bits there are, then you could possibly account for this source of error. But you also need a guarantee that the channel is open before you attempt the measurement. All in all, I believe the measurement will be very crude and perhaps unusable. I think the very best you could accomplish would be a very rough estimate of distance like this (the arithmetic is sketched after the list):
0-24 meters
12-36 meters
24-48 meters
36-60 meters
48-72 meters
60-84 meters
72-96 meters
84-108 meters
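For what it's worth, here is the back-of-envelope arithmetic behind those buckets, as plain C (the 25 MHz timer and the rounded speed of light are taken from the post above):

#include <stdio.h>

int main(void)
{
    const double c = 3.0e8;           /* m/s, rounded                    */
    const double f_timer = 25.0e6;    /* Hz                              */
    const double res = c / f_timer;   /* 12 m of light travel per tick   */

    /* each tick bounds the distance to a 12 m step, and the +/-12 m of
       clock-offset error widens every bucket to 24 m:                   */
    for (int tick = 0; tick < 8; tick++)
        printf("%.0f-%.0f meters\n", tick * res, (tick + 2) * res);
    return 0;
}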
Does Zigbee even work at a range of 100 meters? I'm guessing that in reality, the results would be twice as bad as this. Anyway, I like Wings' method much better, but you can't do that with Zigbee equipment. You could do it cheaply with a pair of analog transceivers and a PLL: take a crystal oscillator, modulate it onto a carrier frequency of your choice, demodulate at the receiver, then remodulate and retransmit on a different frequency. Back at the source, compare the original crystal oscillator signal with the received signal using the phase discriminator from the PLL. Low-pass filter the output, then scale, digitize, and calibrate, and I believe you will have a much more accurate measurement.
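To make the phase idea concrete, a sketch of the final conversion step in C (the 1 MHz modulation tone is an assumption, not something fixed by the scheme):

#include <math.h>
#include <stdio.h>

/* Convert the (digitized) phase-discriminator output to distance.     */
double phase_to_distance_m(double delta_phi_rad)
{
    const double c = 3.0e8;       /* m/s                                */
    const double f_mod = 1.0e6;   /* Hz, assumed modulation tone        */
    const double lambda = c / f_mod;             /* 300 m wavelength    */

    /* the tone travels out and back, so one full cycle of phase shift
       corresponds to one wavelength of ROUND-TRIP path; halve it:      */
    return (delta_phi_rad / (2.0 * M_PI)) * lambda / 2.0;
}

int main(void)
{
    printf("90 deg of shift -> %.1f m\n", phase_to_distance_m(M_PI / 2));
    return 0;
}

Note that the phase wraps every 150 m of one-way distance with a 1 MHz tone; a lower tone extends the unambiguous range but coarsens the resolution.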
What about frequencies that are less affected by the water?
I mean the other ISM bands (below 2.4 GHz): 315 MHz, 433 MHz, 868 MHz, 915 MHz.
Thanks,
BeNoM
Hmmm...
I doubted this was going to work, given that you don't have direct control of the transmitter/receiver; you can only tell it "send this" and be told "I have this packet for you".
Now you tell us you are going to spin the antenna around in a 2 metre circle, invert it, push it 1 metre under the water, and then lift it 0.5 m out of the water, all at a couple of Hertz.
I don't ever like to throw cold water (pun intended) on a clever idea before it gets off the ground (or into the water), but this won't be easy (or accurate), and perhaps not even possible.
Regards David
'Flew' would be a better word since I haven't done it for years. I've been meaning to get back into it. I never flew anything fancy - just the Cessna 150, 172, and also the Piper Cherokee. I've never done anything but VFR flying, personally, but I've been up with other pilots on top of the clouds.
I completely agree that transmitter range would be an extremely unreliable method for measuring distance, especially when the receiver is moving and changing its orientation, and the permeability and permittivity of the surrounding medium vary because of the water. The antenna will never be perfectly omnidirectional either, so depending on the angle of the receiver relative to the transmitter, there will be a large amount of variability in perceived signal strength. Then you have reflections, which are sometimes additive and sometimes subtractive. If every part of the system were static, then maybe this method would be feasible, but that is very far from reality in this case.
Using sonar, why would you need to synchronize the clocks? The only advantage I can see to using this method would be that you would not need a transceiver on the swimmer - just a receiver. But if you implemented a sonar transponder on the swimmer, then it wouldn't matter if the clocks were synchronized - and the accuracy would be superb since the speed of sound is so slow. The question would be - what frequency to use? Low frequencies would be omni-directional, which would be highly favorable, but low frequencies take too much power to produce. Ultrasound would be more reasonable, but it is highly directional. If it's pointed the wrong way then you wouldn't be measuring the line-of-sight distance.
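A quick illustration of why sound is so forgiving timing-wise, in C (the ~1480 m/s speed of sound in water and the 25 MHz timer from earlier in the thread are both assumptions):

#include <stdio.h>

int main(void)
{
    const double v_sound = 1480.0;   /* m/s in water, approximate       */
    const double c = 3.0e8;          /* m/s, speed of light             */
    const double f_timer = 25.0e6;   /* Hz, same timer as the RF case   */

    printf("RF:    %f m per tick\n", c / f_timer);        /* 12 m       */
    printf("sonar: %f m per tick\n", v_sound / f_timer);  /* ~60 um     */
    return 0;
}

The same timer that could only bucket the RF round trip into 12 m steps resolves a sonar round trip to tens of micrometres.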
"why would you need to synchronize the clocks?"
I meant that you would not need them synchronized if you used the transponder method. If the mobile station had no transmitter then I agree that they would need to be synchronized. But with synchronized clocks, as you mentioned earlier, you would have to do a zero distance calibration every time, and the accuracy would constantly diminish with time as the clocks drifted. To me, that's pretty limiting.
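Rough numbers on that drift, assuming a garden-variety +/-20 ppm crystal offset between the two clocks (the 20 ppm figure is an assumption; real parts vary with grade and temperature):

#include <stdio.h>

int main(void)
{
    const double offset_ppm = 20e-6;  /* relative frequency error, assumed */
    const double c = 3.0e8;           /* m/s                               */

    /* every elapsed second the clocks slide 20 us further apart, which a
       one-way RF time-of-flight scheme misreads as extra distance:        */
    printf("error growth: %.0f m per second\n", offset_ppm * c); /* 6000 m */
    return 0;
}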
Hello,
The following considerations come to mind for the RF method -
Regards,
Mac