Time will reset to zero - How do I compensate? - java

SystemClock.uptimeMillis()
This call from the Android SystemClock will reset back to zero before it 'maxes out'.
Right now I base animations, movement, etc. on it; below is an example of where a reset would essentially freeze my application:
if (currentTime > frameTime + sequenceTime)
{
    frameTime = currentTime;
}
Here lies the problem: if currentTime is, say, 50, then frameTime is set to 50, right? Ideally currentTime would keep increasing along with SystemClock.uptimeMillis(), but if it resets, currentTime suddenly becomes very small compared to frameTime. How would I go about fixing this, or resetting the currentTime for all objects?
This is just a small example; I have different objects facing a similar dilemma.

From my reading of the documentation, the "uptime" clock only gets reset when the device is rebooted. Unless your application ... somehow ... manages to keep running over a reboot, you shouldn't need to worry about the clock resetting.
(On the other hand, if your application does need to do animations that keep going after a reboot, then maybe you should use the 'currentTimeMillis' clock. The Android documentation for SystemClock describes the alternatives.)
The documentation says this "Note: This value may get reset occasionally (before it would otherwise wrap around).".
This doesn't make a lot of sense to me. The clock is a millisecond clock and is returned as a long, so you would not expect it to ever wrap around. (2^63 milliseconds is a very, very long time.) The only explanation I can think of is that some devices use a 32-bit hardware timer to implement this clock ... which is kind of lame.
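That said, if you want to be defensive about the documented "may get reset" case, one option is to work with elapsed deltas rather than absolute readings, and treat a backwards jump as a zero-length frame. A minimal sketch (frameTime and sequenceTime are from the question; lastTick is just an illustrative name):

long lastTick = SystemClock.uptimeMillis();   // previous reading of the clock
long frameTime = 0;                           // time accumulated in the current sequence

void tick() {
    long now = SystemClock.uptimeMillis();
    long delta = now - lastTick;
    if (delta < 0) {
        delta = 0;   // the clock jumped backwards (reset): skip this frame's delta
    }
    lastTick = now;

    frameTime += delta;
    if (frameTime > sequenceTime) {
        frameTime = 0;   // advance to the next animation frame
    }
}

Because every object only ever sees deltas, a reset costs you at most one frame instead of freezing everything.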


How can I know how much time has passed between two operations in my app? [duplicate]

This question already has answers here:
How do I measure time elapsed in Java? [duplicate]
(15 answers)
Closed 5 years ago.
I'm working on an Android app and I have this problem.
For example, if I delete a file (operation A) and I receive a new file (operation B), I want to know how much time has passed between the two operations. I want it to work also if, in between, the user changes the date of the system, turns off the internet or restarts the device.
I know that SystemClock.elapsedRealtime() exists, but its value restarts from zero if the user restarts the device.
I cannot use System.currentTimeMillis() because it changes if user changes the date.
I cannot get a date from the internet.
Thanks
Use System.currentTimeMillis(). It returns the time elapsed since the epoch (January 1st, 1970, UTC).
You need a global var:
long start;
on the first action:
start = System.currentTimeMillis();
Since it's the time from the epoch, restarting the device isn't going to change it (unlike System.nanoTime, which would be reset). However, as with most other methods, it isn't safe from changes to the device's time. If someone changes the time on the device back to the start of the epoch, you will experience some problems.
Note that there is no way to get the exact time since the event happened if the time is changed. I.e. if the user does operation A, waits a few hours, then sets the clock back a few hours, there's basically no offline way you can check that. If you use a server, you can get the time from that, but there's no way to get an accurate, unmodified time difference offline that's tamper-proof (where tampering means changing the time).
TL;DR: System.currentTimeMillis is an offline option, but it isn't safe from time changing. If you need it to show the right time difference independently of the user changing the time of the device, use a server.
EDIT:
If you can't use System.currentTimeMillis or get a time from the internet, you can't measure the time at all. AFAIK, every Java/Android API relies on System.currentTimeMillis (or gets the current time some other way). Example: the Date class can be converted to a long representing the current time in milliseconds. For long-term timing, you either have to use System.currentTimeMillis or a server. System.nanoTime has an arbitrary origin and is only meaningful within a single run of the process, and SystemClock.elapsedRealtime() restarts when the device reboots.
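One way to make the measurement survive a reboot is to persist the System.currentTimeMillis() value from operation A and read it back at operation B. A minimal sketch using SharedPreferences (the file name "timing" and key "operation_a_time" are just placeholders, and a Context is assumed to be available):

// At operation A: persist the wall-clock timestamp (survives a reboot, but not a date change).
SharedPreferences prefs = context.getSharedPreferences("timing", Context.MODE_PRIVATE);
prefs.edit().putLong("operation_a_time", System.currentTimeMillis()).apply();

// At operation B: read it back and compute the elapsed time.
long start = prefs.getLong("operation_a_time", -1L);
if (start != -1L) {
    long elapsedMillis = System.currentTimeMillis() - start;
    // Still wrong if the user changed the system date in between; only a server fixes that.
}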
You just need to grab the time from somewhere before and after the activities you want to time and subtract one from the other. This uses the system clock, but you could equally get the real time from some other source:
long startTime = System.currentTimeMillis();
// Your code
System.out.println("Operation took " + (System.currentTimeMillis() - startTime) + " milliseconds");

Is there a way to make Timer.ScheduleAtFixedRate more accurate?

I'm writing an Android app in Java, trying to make a simple rhythm game where you just tap a button on the beat. I was using a Timer with scheduleAtFixedRate to make the button flash, but then I discovered that the timing varies by a few milliseconds.
Obviously a rhythm game needs particular timing to come out right, so is it possible to make this more precise and accurate or am I barking up the wrong tree with using this method for precise timing?
I don't know for Android but here is what happens for "real" Java...
A Timer uses System.currentTimeMillis() to keep track of time; this method is sensitive to system time changes (say, the clock gets adjusted by NTP, for instance).
Which is why, if you want better precision, you use a ScheduledExecutorService; this relies on System.nanoTime(), which is a nanosecond-precision counter that keeps increasing for the life of the process, even if the system time changes.
So --> try a ScheduledExecutorService instead.
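As a rough sketch of what that could look like for a beat that fires twice a second (flashButton() is just a placeholder for your own UI update, which on Android you would post back to the main thread, e.g. with runOnUiThread()):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
// Fire on every beat, measured against the monotonic nanoTime clock.
scheduler.scheduleAtFixedRate(() -> flashButton(), 0, 500, TimeUnit.MILLISECONDS);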
I suggest you don't use ScheduleAtFixedRate!! Use a Looper and Handler.sendMessageDelayed()
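For reference, a self-rescheduling Handler on the main Looper might look roughly like this (using postDelayed() for brevity rather than sendMessageDelayed(); flashButton() is again a placeholder):

final Handler handler = new Handler(Looper.getMainLooper());
final long beatIntervalMs = 500;   // e.g. 120 BPM

handler.postDelayed(new Runnable() {
    @Override
    public void run() {
        flashButton();                               // your UI update
        handler.postDelayed(this, beatIntervalMs);   // schedule the next beat
    }
}, beatIntervalMs);

Since each beat is scheduled relative to when the previous run() fires, small errors can accumulate; scheduling against SystemClock.uptimeMillis() with postAtTime() avoids that drift.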

What should Timertask.scheduleAtFixedRate do if the clock changes?

We want to run a task every 1000 seconds (say).
So we have
timer.scheduleAtFixedRate(task, delay, interval);
Mostly, this works fine. However, this is an embedded system and the user can change the real time clock. If they set it to a time in the past after we set up the timer, it seems the timer doesn't execute until the original real-time date/time. So if they set it back 3 days, the timer doesn't execute for 3 days :(
Is this permissible behaviour, or a defect in the Java library? The Oracle javadocs don't seem to mention anything about the dependency or not on the underlying value of the system clock.
If it's permissible, how do we spot this clock change and reschedule our timers?
Looking at the source of Timer for Java 1.7, it appears that it uses System.currentTimeMillis() to determine the next execution of a task.
However, looking at the source of ScheduledThreadPoolExecutor, it uses System.nanoTime().
Which means you won't see that behaviour if you use one in place of a Timer. To create one, use, for instance, Executors.newScheduledThreadPool().
Why you wouldn't see this behaviour is because of what the doc for System.nanoTime() says:
This method can only be used to measure elapsed time and is not related to any other notion of system or wall-clock time. The value returned represents nanoseconds since some fixed but arbitrary origin time [emphasis mine].
As to whether this is a bug in Timer, maybe...
Note that unlike a ScheduledExecutorService, a Timer supports absolute time, and maybe this explains its use of System.currentTimeMillis(); also, Timer has been there since Java 1.3, while System.nanoTime() only appeared in 1.5.
But a consequence of using System.currentTimeMillis() is that Timer is sensitive to the system date/time... And that is not documented in the javadoc.
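In practice, the fix for the question above can be as small as swapping the Timer for a scheduled executor; TimerTask implements Runnable, so the existing task can be passed straight through. A sketch using the variables from the question:

ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
// Same fixed-rate semantics as timer.scheduleAtFixedRate(task, delay, interval),
// but measured with System.nanoTime(), so setting the wall clock back has no effect.
scheduler.scheduleAtFixedRate(task, delay, interval, TimeUnit.MILLISECONDS);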
It is reported here http://bugs.sun.com/view_bug.do?bug_id=4290274
Similarly, when the system clock is set to a later time, the task may be run multiple times without any delay to "catch up" the missed executions. Exactly this happens when the computer is set to standby/hibernate and the application is resumed (this is how I found out).
This behavior can also be seen in a Java debugger by suspending the timer thread and resuming it.

Java while loop timing

I'm making a Java game and I'm stuck on how to do timing easily. I know there's the delta time thing, but that would mean that I have to insert that delta variable in almost every movement/update function?
I was wondering if I can just time the main while loop of the game so it runs/loops every 50 milliseconds, rather than as fast as it can.
Changes in position are functions of time, so it makes sense to have to pass the time delta into most functions that update game state. It's the natural thing to do; I wouldn't be apprehensive of doing so.
While it is possible to insert delays every frame (and you almost certainly want to do so, as there's no point in having a game run at 1000 fps), it is usually detrimental to have your update logic assume that each frame will have taken a preset amount of time. Doing so results in choppiness when one frame happens to take longer than others.
To give your game a certain framerate, the general approach for each frame is as follows:
Record the current time as startTime.
Do the state updates.
Redraw.
Record the current time as endTime.
Sleep for (1 / framerate) - (endTime - startTime)
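Put together, a capped loop along those lines might look like this (a sketch: running, updateGameState() and render() stand in for your own code, and the enclosing method is assumed to declare throws InterruptedException):

final long frameNanos = 1_000_000_000L / 50;   // target: 50 frames per second (20 ms per frame)
long previous = System.nanoTime();

while (running) {
    long frameStart = System.nanoTime();
    double deltaSeconds = (frameStart - previous) / 1_000_000_000.0;
    previous = frameStart;

    updateGameState(deltaSeconds);   // movement etc. scaled by the real elapsed time
    render();

    long sleepNanos = frameNanos - (System.nanoTime() - frameStart);
    if (sleepNanos > 0) {
        Thread.sleep(sleepNanos / 1_000_000L);   // rough cap at ~50 fps
    }
}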

Android game loop timing discrepancy [duplicate]

When programming animations and little games I've come to know the incredible importance of Thread.sleep(n); I rely on this method to tell the operating system when my application won't need any CPU, and use it to make my program progress at a predictable speed.
My problem is that the JRE uses different implementations of this functionality on different operating systems. On UNIX-based (or influenced) OSes such as Ubuntu and OS X, the underlying JRE implementation uses a well-functioning and precise system for distributing CPU time to different applications, and so makes my 2D game smooth and lag-free. However, on Windows 7 and older Microsoft systems, the CPU-time distribution seems to work differently: you usually get your CPU time back after the given amount of sleep, varying by about 1-2 ms from the target, but you also get occasional bursts of an extra 10-20 ms of sleep time. This causes my game to lag once every few seconds when it happens. I've noticed this problem exists in most Java games I've tried on Windows, Minecraft being a notable example.
Now, I've been looking around on the Internet to find a solution to this problem. I've seen a lot of people using only Thread.yield(); instead of Thread.sleep(n);, which works flawlessly at the cost of the currently used CPU core getting full load, no matter how much CPU your game actually needs. This is not ideal for playing your game on laptops or high energy consumption workstations, and it's an unnecessary trade-off on Macs and Linux systems.
Looking around further I found a commonly used method of correcting sleep time inconsistencies called "spin-sleep", where you only order sleep for 1 ms at a time and check for consistency using the System.nanoTime(); method, which is very accurate even on Microsoft systems. This helps for the normal 1-2 ms of sleep inconsistency, but it won't help against the occasional bursts of +10-20 ms of sleep inconsistency, since this often results in more time spent than one cycle of my loop should take all together.
After tons of looking I found this cryptic article by Andy Malakov, which was very helpful in improving my loop: http://andy-malakov.blogspot.com/2010/06/alternative-to-threadsleep.html
Based on his article I wrote this sleep method:
// Variables for calculating the optimal sleep time. In nanoseconds (1 ms = 10^6 ns).
private long timeBefore = 0L;
private long timeSleepEnd, timeLeft;
// The estimated game update rate.
private double timeUpdateRate;
// The time one game loop cycle should take in order to reach the max FPS.
private long timeLoop;
// Tuning constants (values per the note below) and the FPS-cap flag.
private static final long SLEEP_PRECISION = 2000000L;       // ~2 ms
private static final long SPIN_YIELD_PRECISION = 10000L;    // ~10 000 ns
private boolean isUpdateRateLimited = true;

private void sleep() throws InterruptedException {
    // Skip the first game loop cycle.
    if (timeBefore != 0L) {
        // Calculate the optimal game loop sleep time.
        timeLeft = timeLoop - (System.nanoTime() - timeBefore);
        // If all necessary calculations took LESS time than one loop cycle, the max update rate was reached.
        if (timeLeft > 0 && isUpdateRateLimited) {
            // Determine when to stop sleeping.
            timeSleepEnd = System.nanoTime() + timeLeft;
            // Sleep, yield or keep the thread busy until there is no time left to sleep.
            do {
                if (timeLeft > SLEEP_PRECISION) {
                    Thread.sleep(1); // Sleep for approximately 1 millisecond.
                }
                else if (timeLeft > SPIN_YIELD_PRECISION) {
                    Thread.yield(); // Yield the thread.
                }
                if (Thread.interrupted()) {
                    throw new InterruptedException();
                }
                timeLeft = timeSleepEnd - System.nanoTime();
            } while (timeLeft > 0);
        }
        // Save the calculated update rate.
        timeUpdateRate = 1000000000D / (double) (System.nanoTime() - timeBefore);
    }
    // Starting point for time measurement.
    timeBefore = System.nanoTime();
}
SLEEP_PRECISION I usually put to about 2 ms, and SPIN_YIELD_PRECISION to about 10 000 ns for best performance on my Windows 7 machine.
After tons of hard work, this is the absolute best I can come up with. So, since I still care about improving the accuracy of this sleep method, and I'm still not satisfied with the performance, I would like to appeal to all of you java game hackers and animators out there for suggestions on a better solution for the Windows platform. Could I use a platform-specific way on Windows to make it better? I don't care about having a little platform specific code in my applications, as long as the majority of the code is OS independent.
I would also like to know if anyone knows about Microsoft and Oracle working out a better implementation of the Thread.sleep(n) method, or what Oracle's future plans are for improving their environment as the basis for applications requiring high timing accuracy, such as music software and games?
Thank you all for reading my lengthy question/article. I hope some people might find my research helpful!
You could use a cyclic timer associated with a mutex. This is IMHO the most efficient way of doing what you want. But then you should think about skipping frames in case the computer lags (you can do it with another non-blocking mutex in the timer code).
Edit: Some pseudo-code to clarify
Timer code:
    while (true):
        if acquireIfPossible(mutexSkipRender):
            release(mutexSkipRender)
            release(mutexRender)
Sleep code:
    acquire(mutexSkipRender)
    acquire(mutexRender)
    release(mutexSkipRender)
Starting values:
    mutexSkipRender = 1
    mutexRender = 0
Edit: corrected initialization values.
The following code works pretty well on Windows (it loops at exactly 50 fps with millisecond precision):
import java.util.Date;
import java.util.Timer;
import java.util.TimerTask;
import java.util.concurrent.Semaphore;

public class Main {
    public static void main(String[] args) throws InterruptedException {
        final Semaphore mutexRefresh = new Semaphore(0);
        final Semaphore mutexRefreshing = new Semaphore(1);
        int refresh = 0;

        Timer timRefresh = new Timer();
        timRefresh.scheduleAtFixedRate(new TimerTask() {
            @Override
            public void run() {
                if (mutexRefreshing.tryAcquire()) {
                    mutexRefreshing.release();
                    mutexRefresh.release();
                }
            }
        }, 0, 1000 / 50);

        // The timer is started and configured for 50 fps.
        Date startDate = new Date();

        while (true) { // Refreshing loop
            mutexRefresh.acquire();
            mutexRefreshing.acquire();

            // Refresh
            refresh += 1;
            if (refresh % 50 == 0) {
                Date endDate = new Date();
                System.out.println(String.valueOf(50.0 * 1000 / (endDate.getTime() - startDate.getTime())) + " fps.");
                startDate = new Date();
            }

            mutexRefreshing.release();
        }
    }
}
Your options are limited, and they depend on what exactly you want to do. Your code snippet mentions the max FPS, but the max FPS would require that you never sleep at all, so I'm not entirely sure what you intend with that. None of that sleep or yield checking is going to make any difference in most of the problem situations however - if some other app needs to run now and the OS doesn't want to switch back soon, it doesn't matter which one of those you call, you'll get control back when the OS decides to do so, which will almost certainly be more than 1ms in the future. However, the OS can certainly be coaxed into making switches more often - Win32 has the timeBeginPeriod call for precisely this purpose, which you may be able to use somehow. But there is a good reason for not switching too often - it's less efficient.
The best thing to do, although somewhat more complex, is usually to go for a game loop that doesn't require real-time updates, but instead performs logic updates at fixed intervals (eg. 20x a second) and renders whenever possible (perhaps with arbitrary short sleeps to free up CPU for other apps, if not running in full-screen). By buffering a past logic state as well as the current one you can interpolate between them to make the rendering appear as smooth as if you were doing logic updates each time. For more information on this approach, you can see the Fix Your Timestep article.
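The skeleton of that approach looks roughly like this (a sketch in the spirit of the Fix Your Timestep article; GameState, initialState(), updateLogic(), interpolate(), render() and running are placeholders for your own code):

final long stepNanos = 1_000_000_000L / 20;   // logic runs at a fixed 20 updates per second
long previous = System.nanoTime();
long accumulator = 0;
GameState previousState = initialState();
GameState currentState = previousState;

while (running) {
    long now = System.nanoTime();
    accumulator += now - previous;
    previous = now;

    while (accumulator >= stepNanos) {
        previousState = currentState;               // keep the old state for interpolation
        currentState = updateLogic(currentState);   // one fixed-size logic step
        accumulator -= stepNanos;
    }

    // Render a blend of the two states so motion looks smooth between logic steps.
    double alpha = (double) accumulator / stepNanos;
    render(interpolate(previousState, currentState, alpha));
}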
I would also like to know if anyone knows about Microsoft and Oracle working out a better implementation of the Thread.sleep(n) method, or what Oracle's future plans are for improving their environment as the basis for applications requiring high timing accuracy, such as music software and games?
No, this won't be happening. Remember, sleep is just a method saying how long you want your program to be asleep for. It is not a specification for when it will or should wake up, and never will be. By definition, any system with sleep and yield functionality is a multitasking system, where the requirements of other tasks have to be considered, and the operating system always gets the final call on the scheduling of this. The alternative wouldn't work reliably, because if a program could somehow demand to be reactivated at a precise time of its choosing it could starve other processes of CPU power. (eg. A program that spawned a background thread and had both threads performing 1ms of work and calling sleep(1) at the end could take turns to hog a CPU core.) Thus, for a user-space program, sleep (and functionality like it) will always be a lower bound, never an upper bound. To do better than that requires the OS itself to allow certain apps to pretty much own the scheduling, and this is not a desirable feature in operating systems for consumer hardware (while being a common and useful feature for industrial applications).
Thread.sleep says your app needs no more time. This means that in a worst-case scenario you'll have to wait for an entire thread slice (40 ms or so).
Now in bad cases, when a driver or something takes up more time, you may have to wait for 120 ms (3 × 40 ms), so Thread.sleep is not the way to go. Go another way, like registering a 1 ms callback and starting the draw code every X callbacks.
(This is on Windows; I'd use the multimedia timer APIs to get those 1 ms resolution callbacks.)
Timing stuff is notoriously bad on Windows. This article is a good place to start. Not sure if you care, but also note that there can be worse problems (especially with System.nanoTime) on virtual systems as well (when Windows is the guest operating system).
Thread.sleep is inaccurate and makes the animation jittery most of the time.
If you replace it completely with Thread.yield you'll get solid FPS without lag or jitter, but the CPU usage increases greatly. I moved to Thread.yield a long time ago.
This problem has been discussed on Java Game Development forums for years.
