Amber G. wrote: The accuracy required (in both frequency stability and offset with respect to "standard" TAI) is about a MILLION times more than what is typically needed (or achieved) on a typical computer or a cell phone.

Yes. What I am trying to understand is WHY such accuracy is required.

(*** side question *** then how does GPS on your cell phone work, if it does not have such an accurate clock? I generally ask this question to test really top students, to see if they can reason it out without looking anything up.)

I was never a "top student" anywhere, but let me try it all the same.

Hmm. Normally I would have answered that it is "easy". We know that what the GPS receiver measures is the time delay of arrival of a signal. The satellite broadcasts a signal identifying a) itself, b) its location in space, and c) a timestamp of when the signal was sent out. The receiver reads the timestamp, notes the time at which it received the signal, and hence calculates how long the signal took to propagate from satellite to receiver; knowing the speed of radio waves, we calculate the distance from satellite to receiver.
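A minimal sketch of that one-way ranging idea, with a made-up travel time, assuming (as I did here, and as is questioned below) that the receiver's clock is perfectly synced to the satellite's:

```python
# One-way ranging: distance from signal travel time.
# Assumes the receiver's clock is perfectly synchronised with the
# satellite's -- exactly the assumption questioned later in this thread.

C = 299_792_458.0  # speed of radio waves in vacuum, m/s

def range_from_timestamps(t_sent, t_received):
    """Distance (m) from transmit and receive times (s)."""
    return C * (t_received - t_sent)

# Example: a GPS satellite is roughly 20,200 km up, so the one-way
# travel time is about 67.4 ms (numbers made up for illustration).
d = range_from_timestamps(0.0, 0.0673877)
print(round(d / 1000), "km")
```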

Now do the same from 3 satellites simultaneously: we will know the distance of the receiver from the 3 satellites, and since we know the positions of the satellites, the distances between the satellites themselves are known too. From solid geometry, with the 3 satellites and the receiver forming a tetrahedron whose edge lengths we all know, we can locate the receiver in 3D space (up to a mirror image through the satellites' plane, which knowing we are near the Earth's surface resolves).
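A sketch of that trilateration step, with made-up satellite positions and ranges and a tiny Gauss-Newton solver in pure Python (a real receiver would use a proper linear-algebra library). The initial guess near the Earth picks out the right one of the two mirror solutions.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Cramer's rule."""
    def det(m):
        return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
              - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
              + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))
    D = det(A)
    out = []
    for i in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][i] = b[r]
        out.append(det(Ai) / D)
    return out

def trilaterate(sats, ranges, guess, iters=20):
    """Gauss-Newton: find the point whose distances to sats match ranges."""
    x = list(guess)
    for _ in range(iters):
        J, r = [], []
        for s, rho in zip(sats, ranges):
            d = math.dist(x, s)
            r.append(d - rho)                              # range residual
            J.append([(x[k] - s[k]) / d for k in range(3)])  # unit line of sight
        dx = solve3(J, [-ri for ri in r])
        x = [x[k] + dx[k] for k in range(3)]
    return x

# Made-up geometry (units: km). True receiver at (1, 2, 3).
sats = [(15000.0, 10000.0, 20000.0),
        (-12000.0, 18000.0, 15000.0),
        (5000.0, -20000.0, 18000.0)]
truth = (1.0, 2.0, 3.0)
ranges = [math.dist(truth, s) for s in sats]
est = trilaterate(sats, ranges, guess=(0.0, 0.0, 0.0))
print([round(v, 3) for v in est])
```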

Actually, your Garmin or cellphone GPS clock is not synced with the satellite... in fact there is not even a very precise clock there.

And this is what made me realise that it wasn't as "easy" as I posted above. So I put pen to paper and thought of something like this.

The satellite clocks are synchronised, and the signals they transmit are continuous and time-synchronised as well. So, if the receiver is getting signals from 2 satellites, then depending on the distances, if the time of arrival of one is t, the other signal will lead or lag it a bit. What we can measure, by comparing the signals, is this lead/lag in phase.

If the two satellites are A & B and the receiver is at position C, then since we know the distance AB, and we know that both satellites must be visible above the horizon, we can solve numerically: the triangle ABC has sides (AC, AC + c·Δt, AB), and imposing the condition that angle ACB < 180 deg, we get the distances from the satellites A & B.
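One caveat with the triangle above: a single Δt only fixes the *difference* BC − AC = c·Δt, which constrains C to one branch of a hyperbola, so some extra constraint is needed to pin it down. A 2D toy sketch of the idea, with made-up positions, where the extra constraint is "the receiver sits on the Earth's surface", solved numerically by bisection (this assumes Δt is monotone in the receiver's position along the chosen arc, which it is for this geometry):

```python
import math

C = 299792.458  # speed of radio waves, km/s

def delta_t(recv, sat_a, sat_b):
    """Time-difference-of-arrival at recv between satellites a and b (s)."""
    return (math.dist(recv, sat_b) - math.dist(recv, sat_a)) / C

# 2D toy: "Earth" is a circle of radius R; receiver sits on it at angle theta.
R = 6371.0
def on_surface(theta):
    return (R * math.cos(theta), R * math.sin(theta))

sat_a = (20000.0, 10000.0)   # made-up satellite positions, km
sat_b = (-5000.0, 22000.0)
truth_theta = 0.7
measured = delta_t(on_surface(truth_theta), sat_a, sat_b)

# Bisection over the arc [0, pi/2]; delta_t decreases with theta here.
lo, hi = 0.0, math.pi / 2
for _ in range(60):
    mid = (lo + hi) / 2
    if delta_t(on_surface(mid), sat_a, sat_b) > measured:
        lo = mid
    else:
        hi = mid
print(round((lo + hi) / 2, 6))
```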

So let me post my question again - if you have not heard this question before --- how can the GPS on a cell phone get your position even though it does not have an atomic clock?

Hmm. What am I doing wrong here? I don't seem to need an atomic clock, or even a clock at all, in the cell phone, nor do I need to know the actual time at all! All I need to know is the speed of radio waves, which is a constant!
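For what it's worth, the standard textbook formulation makes the same point quantitatively: the receiver treats its own clock offset as a *fourth unknown* alongside (x, y, z), so with four satellites it can solve for position and its own clock error at once, and no precise local clock is needed. A pure-Python Gauss-Newton sketch with made-up geometry (a real receiver solves essentially this least-squares problem):

```python
import math

C = 299792.458  # km/s

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def solve_position(sats, pseudoranges, iters=20):
    """Unknowns: receiver (x, y, z) and clock bias (expressed in km, i.e. c*dt)."""
    x = [0.0, 0.0, 0.0, 0.0]
    for _ in range(iters):
        J, r = [], []
        for s, rho in zip(sats, pseudoranges):
            d = math.dist(x[:3], s)
            r.append(d + x[3] - rho)       # pseudorange residual
            J.append([(x[k] - s[k]) / d for k in range(3)] + [1.0])
        dx = gauss_solve(J, [-ri for ri in r])
        x = [xi + di for xi, di in zip(x, dx)]
    return x

# Made-up satellite positions (km) and a receiver whose clock is off by 1 ms.
sats = [(15600.0, 7540.0, 20140.0),
        (18760.0, 2750.0, 18610.0),
        (17610.0, 14630.0, 13480.0),
        (19170.0, 610.0, 18390.0)]
truth = (1111.0, 2222.0, 3333.0)
clock_bias_km = C * 1e-3
pseudoranges = [math.dist(truth, s) + clock_bias_km for s in sats]

x, y, z, bias = solve_position(sats, pseudoranges)
print(round(x), round(y), round(z), round(bias, 3))
```

The receiver's 1 ms clock error comes out of the solve along with the position, which is why a cheap quartz clock suffices on the phone.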

This is not exactly true: not only time but also the positions of the sats, calculated from the ephemeris (the orbit computed to very high precision), come into play. For that, one has to know the 'true time' to calculate the orbit.

But why would the satellite need atomic time to find its own position? Theoretically, the satellite could have a "GPS receiver" on board and get signals from whatever GPS ground stations are visible (a sort of inverse of how the GPS receiver on Earth gets them from the satellites, without needing any accurate clock, or possibly any clock at all), and it could very accurately determine its position in 3D space, just as the receiver does with respect to the satellite constellation.

All that it seems to require is that the clocks in the satellites are synchronised among themselves, and not necessarily in absolute time with the "master clock" / atomic time on Earth.
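A quick numeric check of that intuition, with toy numbers: shifting every satellite clock by the same constant offset shifts all pseudoranges equally, so the *differences* between them (which is what constrains the position) are unchanged; a common shift is simply absorbed into the receiver's solved-for clock bias.

```python
import math

C = 299792.458  # km/s

# Made-up satellite and receiver positions, km.
sats = [(15600.0, 7540.0, 20140.0),
        (18760.0, 2750.0, 18610.0),
        (17610.0, 14630.0, 13480.0)]
recv = (1111.0, 2222.0, 3333.0)

def pseudoranges(common_clock_offset_s):
    """Pseudoranges when every satellite clock shares the same offset."""
    return [math.dist(recv, s) + C * common_clock_offset_s for s in sats]

synced = pseudoranges(0.0)
shifted = pseudoranges(2.5e-3)  # all satellite clocks off by the same 2.5 ms

# Pairwise differences -- the quantity that constrains position -- match:
diff_a = [p - synced[0] for p in synced]
diff_b = [p - shifted[0] for p in shifted]
print(all(abs(a - b) < 1e-9 for a, b in zip(diff_a, diff_b)))  # → True
```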