
How long does it take?

Name: Anonymous 2005-10-05 9:25

Let's say you're in a car, some distance away from your home. You change your speed so that when you're x miles away, you're traveling at x mph. How long will it take to get home, approximately? Consider two specific cases:

1) You change your speed continuously, so that at, say, 7.5 miles away, you're traveling at 7.5 miles per hour.

2) You drop your speed by one mile an hour for every mile traveled.  That is, at 7 miles away, you go to 7 mph, holding that speed for a mile, then going down to 6 mph at a distance of 6 miles away.

Does it matter how far away you are when you start?
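
A quick way to get a feel for case 1: with speed equal to distance, dx/dt = -x, so the distance decays like e^(-t) and you only ever get within some tolerance of home. A minimal Python sketch along those lines; the cutoff EPS is an assumption added here, since the exact trip has no endpoint:

import math

# Case 1: speed varies continuously, v = x (mph when x miles away).
# Then dx/dt = -x, so x(t) = x0 * exp(-t): x shrinks but never hits 0.
# EPS is an assumed "close enough" threshold, not part of the original puzzle.
EPS = 1e-6  # miles

def hours_to_within(x0: float, eps: float = EPS) -> float:
    # Solve x0 * exp(-t) = eps for t: the time to shrink from x0 to eps.
    return math.log(x0 / eps)

for x0 in (50.0, 100.0, 500.0):
    print(f"start {x0:5.0f} mi -> {hours_to_within(x0):5.2f} h to reach {EPS} mi")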

Name: Anonymous 2005-10-05 11:11

Yes, it matters how far away you are. It's a simple linear situation. The time required to go from x=7 to x=0 will always be the same no matter what your starting distance is.

Name: Anonymous 2005-10-05 12:01

>>2
Nope, not linear. Not at all. 1) should be answerable using just basic logic.

With a question like that at the end, the obvious answer is no, though it does matter if you care about differences of 1% or less.

Name: Anonymous 2005-10-05 13:13

Why does (1) seem so much like Zeno's paradox?

Name: Anonymous 2005-10-05 13:26

>>4
Because it is. And, accordingly, the answer is "never." You'd never reach your destination, though after a relatively short amount of time, you'd be immeasurably close.

That's what part two asks, actually. Instead of continuously varying your speed, you do it in small but finite steps. Incidentally, the amount of time it takes is mostly independent of the total distance you're traveling; it's a second-order effect. The first order depends only on your time metric.
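
In symbols (a sketch; x_0 here stands for the starting distance, a name the thread never gives it):

    \frac{dx}{dt} = -x \quad\Longrightarrow\quad x(t) = x_0 e^{-t},

which is positive for every finite t, so the continuous trip never ends. In the stepped version from >>1, the mile from x = n to x = n - 1 is driven at n mph and takes 1/n hours, so starting N miles out the total is

    T_N = \sum_{n=1}^{N} \frac{1}{n} = H_N \approx \ln N + \gamma,

the N-th harmonic number (gamma = 0.577... is the Euler-Mascheroni constant).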

Name: Anonymous 2005-10-06 0:52

>>1
The question is all over the place.
You don't state whether you're driving toward home or away from it.
In case 1 you say you might be driving 7.5 miles an hour at 7.5 miles away, but in case 2 you say you're dropping speed by 1 mph for every mile travelled.

Name: Anonymous 2005-10-06 8:06

>>6

It's two different cases.

The answer is that it takes ~1.7 times your time metric for case two.  (Again, you'll never get there in case 1, because doing the math means evaluating the logarithm of zero.)  If you're measuring in miles per hour, it'll take about 1.7 hours.  If you're measuring in meters per minute, it'll take about 1.7 minutes to get home. 

Does it matter where you start?  Barely.  If you start 100 miles away, or 50, or 500, it'll still take about the same amount of time, 1.7 hours.  The question is, why?  (If you're familiar with Taylor series, the answer is obvious.)
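
The case-2 totals can also be tallied directly, one mile at a time, straight from the scheme in >>1 (no assumptions beyond the one-mile steps):

# Case 2: the mile from n miles out to n-1 miles out is driven at n mph,
# so it takes 1/n hours; the whole trip from N miles out is the harmonic sum.
def trip_hours(n_miles: int) -> float:
    return sum(1.0 / n for n in range(1, n_miles + 1))

for start in (50, 100, 500):
    print(f"start {start:3d} mi -> {trip_hours(start):.2f} h")
    # prints roughly 4.50 h, 5.19 h, and 6.79 h respectively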

Name: Anonymous 2005-10-06 8:32

>>7
Oh, didn't realize it was multiple choice.

Name: Anonymous 2005-10-06 9:00

>>8

Not so much multiple choice as two different -- but related -- questions.
