Kyx wrote: @pinkteddyx64 @spotify95 @coleslaw
Imagine that a car travels 1 mile in 12 seconds on the first go. On the second go the engine breaks, so they have to push the car, which takes 3 hours.
For GWR the average speed would be (300 mph + 0.33 mph)/2 ≈ 150 mph
The actual average speed is 2 miles / (3 hours + 12 seconds) ≈ 0.67 mph
Huge difference
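To make that arithmetic concrete, here's a quick Python sketch of the two calculations (the 12-second run works out to exactly 300 mph, the 3-hour push to about a third of a mph):

```python
# Run 1: 1 mile in 12 seconds; Run 2: 1 mile pushed in 3 hours
run1_speed = 1 / (12 / 3600)   # 300 mph
run2_speed = 1 / 3             # ~0.33 mph

# GWR-style average: the mean of the two runs' speeds
gwr_average = (run1_speed + run2_speed) / 2

# True average: total distance divided by total elapsed time
true_average = 2 / (3 + 12 / 3600)

print(f"GWR-style average: {gwr_average:.2f} mph")   # ~150.17 mph
print(f"True average:      {true_average:.2f} mph")  # ~0.67 mph
```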
Indeed, I always knew that speed = distance / time; the first one, (u+v)/2, would only be relevant if an object starts at one speed and finishes at another under constant acceleration (going back to the Physics SUVAT equations) and completes the whole run like that.
So if a car started at rest and was at 60mph after traveling 0.5 miles (constant acceleration ofc) then you'd use s=[(v+u)/2]*t, with the bit in square brackets being how you calculate the speed.
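Plugging the numbers from that example into the formula (a quick hypothetical check, keeping everything in miles and hours): u = 0, v = 60 mph, s = 0.5 miles, so the bracketed average is 30 mph and the run takes 1/60 of an hour, i.e. 60 seconds.

```python
u = 0.0    # initial speed, mph (starts at rest)
v = 60.0   # final speed, mph
s = 0.5    # distance covered, miles

avg_speed = (u + v) / 2    # the bit in square brackets: 30 mph
t_hours = s / avg_speed    # rearranging s = [(u+v)/2]*t for t

print(avg_speed)           # 30.0 mph
print(t_hours * 3600)      # 60.0 seconds to cover the half mile
```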
If a car is traveling at a constant 60mph then use speed = dist / time. The same goes if it has to be pushed. Therefore, as you stated, if it takes 3 hours and 12 seconds (combined time) to complete a total distance of 2 miles (both runs) then the average speed cannot possibly be 150mph, or anywhere near that...
Hope I've not completely fucked up somewhere in my calculations! (Feel free to correct me if I have.)