I am trying to develop an Android app for a friend that uses GPS to tell you how many seconds you are ahead of or behind your target speed, given your actual speed.
For example, I could drive for 1 min at 10mph with '60mph' selected in the app. The app would then show me '-300sec', because I am 5 min behind where I would be if I had been going 60mph for that 1 min instead.
The formula I have is:
((actualSpeed - targetSpeed) / actualSpeed) * Time
so for the example:
((10mph - 60mph)/10mph) * 60sec = -300sec
I thought my formula was fine until I tried a different example:
I could drive for 1 min at 60mph with '30mph' selected in the app. The app should then show me '+60sec', because I am 1 min ahead of where I would be if I had been going 30mph for that 1 min instead.
so for the example:
((60mph - 30mph)/60mph) * 60sec = +30sec
With this example I get +30sec instead of the expected +60sec.
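For reference, here is a minimal Java sketch of the formula as I currently have it, reproducing both results above (the class and method names are just for illustration):

```java
public class PaceDelta {

    // Current formula: ((actualSpeed - targetSpeed) / actualSpeed) * time
    static double deltaSeconds(double actualMph, double targetMph, double elapsedSeconds) {
        return ((actualMph - targetMph) / actualMph) * elapsedSeconds;
    }

    public static void main(String[] args) {
        // Example 1: 1 min at 10mph with 60mph selected
        System.out.println(deltaSeconds(10, 60, 60)); // prints -300.0, as expected

        // Example 2: 1 min at 60mph with 30mph selected
        System.out.println(deltaSeconds(60, 30, 60)); // prints 30.0, but I expect +60
    }
}
```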
I gather that formulas like this are used a lot in racing to optimize time, but I can't come up with one that works for all inputs.
Thanks in advance.