How much time does it take to arrive?

Started by
7 comments, last by conquestor3 9 years, 2 months ago

The normal way of measuring how much time it takes for data to arrive at another computer is to measure how long it takes to go and come back, then divide that time in half. But this method is wrong if the time it takes to go is different from the time it takes to come back. So, is it possible to measure these two times individually?


Can't you just set both computers to the same NTP server and use one time-stamp at the start, one at arrival, and one on return?

t0 ------------> t1 -----------------> t2

time to get there = t1 - t0

time to come back = t2 - t1

total time = t2 - t0
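If the clocks really are synchronized, the arithmetic above is all there is to it. A minimal sketch (the function name and sample numbers are mine, not from the thread):

```python
def split_latencies(t0, t1, t2):
    """Split a round trip into its two legs, given:
    t0 - send time on machine 1's clock,
    t1 - arrival time on machine 2's clock,
    t2 - return-arrival time on machine 1's clock.
    Only valid if both clocks read the same time (e.g. synced via NTP)."""
    time_there = t1 - t0
    time_back = t2 - t1
    total = t2 - t0
    return time_there, time_back, total

# Example, times in milliseconds: outbound took 30 ms, return took 50 ms.
print(split_latencies(0, 30, 80))  # → (30, 50, 80)
```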

Or am I missing something?

I think, therefore I am. I think? - "George Carlin"
My Website: Indie Game Programming

My Twitter: https://twitter.com/indieprogram

My Book: http://amzn.com/1305076532

Sorry, I want to know if it is theoretically possible using only two computers sending each other messages.

If the second machine is responding anyway, have it send a time-stamped message.

Machine1: "Here's a message sent at t1."

Machine2: "Your message stamped t1 was received at t2. This message is sent at t3."

Machine1 receives machine2's message at t4.

Time Message 1 to 2: t2 - t1

Time Message 2 to 1: t4 - t3
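This four-timestamp exchange is essentially how NTP estimates delay. A sketch, assuming the two machines' clocks agree (function and variable names are mine):

```python
def one_way_times(t1, t2, t3, t4):
    """t1: machine 1 sends (its clock); t2: machine 2 receives (its clock);
    t3: machine 2 replies (its clock); t4: machine 1 receives (its clock).
    The per-leg times are only meaningful if the clocks are synchronized.
    The round trip (minus machine 2's processing time) is valid even with
    unsynchronized clocks, because a constant offset cancels out of it."""
    forward = t2 - t1
    backward = t4 - t3
    round_trip = (t4 - t1) - (t3 - t2)
    return forward, backward, round_trip

# Example: out in 40, machine 2 thinks for 10, back in 30.
print(one_way_times(0, 40, 50, 80))  # → (40, 30, 70)
```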

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.

Buckeye

Wrong...

In your example, "t" is a universal time. How was it set in the first place?

How can you set up a universal time between two machines without knowing the distance between them?

That problem simplifies to the same question: how much time does it take for data to arrive?


how can you set up a universal time between two machines without knowing the distance between them?

Do you mean physical distance? As in meters or feet? That has nothing to do with the problem you described.

If you ask a question without defining constraints ... phooey. You're trolling.

So, ignoring the effects of relativity, I assumed the machines' clocks are perfect and were synchronized before the experiment.


Well, some people could offer an easy solution that just uses the distance, so I am placing that constraint.

Others would come up with another easy solution that just uses a universal time, so I am putting another constraint there.

If I understand the constraints correctly, this can't be done. If you can synchronize their clocks (say, by using GPS on both sites), sending timestamps works. Otherwise neither side can tell which of the two legs took longer.
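The impossibility is easy to demonstrate: with an unknown clock offset, an asymmetric path with synced clocks and a symmetric path with offset clocks produce exactly the same observable timestamps. A toy illustration (all names and numbers are mine):

```python
def observed(d_fwd, d_back, offset):
    """What the two machines can actually measure when machine 2's clock
    runs 'offset' time units ahead of machine 1's clock."""
    t1 = 0                    # machine 1 sends (machine 1's clock)
    t2 = d_fwd + offset       # machine 2 receives (machine 2's clock)
    t3 = t2                   # machine 2 replies immediately (its clock)
    t4 = d_fwd + d_back       # machine 1 receives (machine 1's clock)
    return (t2 - t1, t4 - t3) # the only quantities either side can compute

# A symmetric 35/35 link with clocks 10 apart is indistinguishable from an
# asymmetric 45/25 link with perfectly synced clocks:
print(observed(35, 35, 10))  # → (45, 25)
print(observed(45, 25, 0))   # → (45, 25)
```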

You'd have to define your own time format, right? You can get the relative time between the two with a pretty good estimate, so as long as you're fine with creating a new time measurement, it's possible.

So machine 1 would generate time + set measurement.

It would pass that to machine 2, which would sync it.

Say, each unit of time is 2 minutes.

Machine 1 would start counting to the next increment and inform machine 2 of that.

Machine 2 would let machine 1 know it's starting.

Machine 1 would tell machine 2 the time remaining until the next unit-of-time increment.

Machine 2 could adjust to that.

Or is this something different from what I'm thinking?
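As I read it, the adjustment step in that handshake would look something like this (names are hypothetical). Note that machine 2 still needs an estimate of the message's transit time to correct the report, and the usual estimate is half the round trip, so the symmetric-delay assumption the thread is trying to avoid sneaks back in:

```python
TICK = 120  # one "unit of time" = 2 minutes, as in the post above

def machine2_next_tick(local_now, remaining_reported, transit_estimate):
    """Machine 2 schedules its next tick from machine 1's report of the
    seconds remaining until machine 1's next increment. The report aged by
    the (unknown) transit time while in flight; using half the measured
    round trip as transit_estimate assumes the two legs are symmetric."""
    return local_now + remaining_reported - transit_estimate

# Machine 2's clock reads 1000, machine 1 reported 30 s remaining, and we
# guess the message spent 5 s in transit:
print(machine2_next_tick(1000, 30, 5))  # → 1025
```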

This topic is closed to new replies.
