Estimating the speed of the plane.
January 16, 2007 physics
I’m sometimes bored while flying, and I like looking out the window (though if I can, I usually pick aisle seats so I can exit more quickly).
I realized something rather amusing. I closed one eye and held two fingers about an inch apart, a foot away from my open eye. Then I timed how long it took an object on the ground to move from one finger to the other; it took about ten seconds.
Let d be the distance in feet from my eye to that point on the ground. By similar triangles, moving an inch when one foot away from my eye means moving d inches on the ground. The distance from my eye to the ground is (wild guess!) 60,000 feet, so the point on the ground actually moved 60,000 inches, or 5000 feet, about a mile. Moving a mile in ten seconds is moving six miles per minute, or 360 miles per hour.
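The estimate above can be sketched as a little function (the function and parameter names are mine, not anything from the post):

```python
FEET_PER_MILE = 5280

def ground_speed_mph(separation_in, eye_distance_ft, altitude_ft, elapsed_s):
    """Estimate ground speed from the two-finger parallax trick.

    separation_in:   gap between the fingers, in inches
    eye_distance_ft: distance from eye to fingers, in feet
    altitude_ft:     guessed distance from eye to the ground point, in feet
    elapsed_s:       time for a ground object to cross the gap, in seconds
    """
    # Similar triangles: an inch at one foot scales to altitude_ft inches
    # on the ground (more generally, scaled by altitude_ft / eye_distance_ft).
    ground_inches = separation_in * altitude_ft / eye_distance_ft
    ground_miles = ground_inches / 12 / FEET_PER_MILE
    hours = elapsed_s / 3600
    return ground_miles / hours

print(ground_speed_mph(1, 1, 60_000, 10))  # ≈ 341 mph
```

Done exactly, 5000 feet in ten seconds works out to about 341 mph; the 360 in the text comes from rounding 5000 feet up to a mile.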
I seem to recall that 450 mph is actually how fast a commercial jet might go, so at least I’m within an order of magnitude. Now 450 miles per hour would have been 39,600 feet per minute, or 6600 feet in ten seconds, or 79,200 inches in ten seconds, so maybe I should’ve estimated 80,000 feet to the ground. But there are so many other sources of error in this technique…
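Running the same triangle backwards gives the altitude check in the paragraph above (again, names are mine):

```python
def implied_altitude_ft(true_speed_mph, separation_in, eye_distance_ft, elapsed_s):
    """Altitude at which the two-finger trick would report true_speed_mph."""
    # Ground distance actually covered in elapsed_s, in inches.
    inches_travelled = true_speed_mph * 5280 * 12 * elapsed_s / 3600
    # Similar triangles: that distance subtends separation_in at eye_distance_ft.
    return inches_travelled * eye_distance_ft / separation_in

print(implied_altitude_ft(450, 1, 1, 10))  # → 79200.0
```

That reproduces the 79,200 feet (roughly 80,000) from the text.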
Are there other fun things to estimate when trapped on a plane?