I have a requirement to run a while loop for 5 minutes.
I looked at the timer API but I could not find a way to do this.
Can anyone provide a code snippet for this?
Thanks
The easiest way is to just check how much time has elapsed on each iteration. Example:
final long NANOSEC_PER_SEC = 1000L * 1000 * 1000;
long startTime = System.nanoTime();
while ((System.nanoTime() - startTime) < 5 * 60 * NANOSEC_PER_SEC) {
    // do stuff
}
This will run the loop until more than 5 minutes have elapsed.
Notes:
The current loop iteration will always complete, so in practice it will always run for a bit more than 5 minutes.
For this application System.nanoTime() is more suitable than System.currentTimeMillis() because the latter will change if the computer's system clock is adjusted, thus throwing off the calculation. Thanks to Shloim for pointing this out.
This loop will run for 5 minutes. It will not be affected by changes made to the computer's date/time (whether by the user or by NTP).
// requires: import java.util.concurrent.TimeUnit;
long endTime = System.nanoTime() + TimeUnit.NANOSECONDS.convert(5L, TimeUnit.MINUTES);
while (System.nanoTime() < endTime) {
    // do whatever
}
Other methods like System.currentTimeMillis() should be avoided, because they rely on the computer's date/time.
Because you are talking about the timer API, I guess what you are after is a delay rather than a "loop running for 5 min". If this is the case, you could use something like Thread.sleep(..), which lets the CPU do more useful things than busy-waiting. Or at least save some energy, and the planet.
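For example, a minimal sketch of that approach, assuming a plain 5-minute pause of the current thread is all that is needed (and that java.util.concurrent.TimeUnit is imported):

try {
    Thread.sleep(TimeUnit.MINUTES.toMillis(5)); // block the current thread for 5 minutes
} catch (InterruptedException e) {
    Thread.currentThread().interrupt(); // restore the interrupt flag if we are woken early
}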
Currently this is how I get the elapsed time for a process to complete.
long start = System.currentTimeMillis();
process();
long end = System.currentTimeMillis();
long duration = end - start;
System.out.println(duration);
The problem I am facing is that process() gets interrupted (paused) by my PC's hibernation whenever there is no power (my work environment is outdoors, so I have to run on battery), and it continues whenever I restart the PC once it is plugged in again. I am fully aware that I would get a wrong duration if, for example, I am not able to restart for about 2 hours after the process was paused (during hibernation), since those extra 2 hours will be counted by the CPU clock (kept running via the CMOS battery backup) by the time the line
long end = System.currentTimeMillis();
is reached. How would it be possible to get the exact duration, irrespective of the process being paused several times?
So your question is "how much wall clock time was taken by the process, without time spent in hibernation". Well, wall clock time is easy to get, but you don't really have a chance to know whether hibernation happened or not.
However...
If your process is a set of discrete steps, you could do something like the following
// requires: import java.time.Duration; import java.time.Instant;
//           import java.util.ArrayList; import java.util.List;
List<Duration> durations = new ArrayList<>();
for (Step step : steps) {
    Instant stepStart = Instant.now();
    process(step);
    durations.add(Duration.between(stepStart, Instant.now()));
}

long totalMillis = durations.stream()
        .mapToLong(Duration::toMillis)
        .filter(ms -> ms < 1000) // cut-off limit, to disregard hibernated steps
        .sum();
This times each step separately, and if the time for a step takes more than 1 second, it's not taken into account in the total. You could also use an "average" time for those steps, so the end result would be a bit more realistic (of course this depends on the number of steps, the assumed runtime of a single step, etc.).
This only works if there is a good limit to what is "too much" time, and it provides a less accurate result. If you're doing something with BigInteger, it's likely that steps with larger values take more time, so a single cutoff value would not work (although you could consider some kind of dynamic cutoff value, based on the input).
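As a rough illustration of that dynamic cut-off idea, one hypothetical option is to derive the threshold from the measured durations themselves, e.g. discarding anything slower than ten times the median step time. The factor of ten and the variable names below are arbitrary assumptions, the snippet reuses the durations list from above, and it also needs java.util.stream.Collectors:

// Hypothetical dynamic cut-off: treat any step slower than 10x the median as "hibernated".
List<Long> sortedMillis = durations.stream()
        .map(Duration::toMillis)
        .sorted()
        .collect(Collectors.toList());
long medianMillis = sortedMillis.get(sortedMillis.size() / 2); // assumes at least one step ran

long adjustedTotalMillis = durations.stream()
        .mapToLong(Duration::toMillis)
        .filter(ms -> ms < 10 * medianMillis) // arbitrary cut-off factor
        .sum();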
Cheapest, easiest and best solution: run the code on a server.
Is there a library in Android that can provide me the total time a user spends in my application, without using my own time counting?
I believe that the Android OS counts each application's usage time, the same way it counts battery use, network use, etc.
If my assumption is right, what I need is this system count of my application's usage time.
You can use Fabric (https://fabric.io/); it has a lot of useful tools, and it can also track Daily Active Users, Crash Reporting, etc.
They also provide easy integration steps; it only takes a few minutes to integrate your app with Fabric.
The general approach to this is to:
Get the time at the start of your benchmark, say at the start of main().
Run your code.
Get the time at the end of your benchmark, say at the end of main().
Subtract the start time from the end time and convert into appropriate units.
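As a rough sketch of those steps (the runBenchmark method is just a placeholder for your own code, and the unit conversion uses java.util.concurrent.TimeUnit):

import java.util.concurrent.TimeUnit;

public class BenchmarkSketch {
    public static void main(String[] args) {
        long start = System.nanoTime();   // 1. time at the start of the benchmark
        runBenchmark();                   // 2. run your code
        long end = System.nanoTime();     // 3. time at the end of the benchmark

        // 4. subtract and convert into appropriate units
        long elapsedMillis = TimeUnit.NANOSECONDS.toMillis(end - start);
        System.out.println("Took " + elapsedMillis + " ms");
    }

    private static void runBenchmark() {
        // placeholder for the code being measured
    }
}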
A simpler way is this.
long startTime = System.nanoTime();
// ... your program ...
long endTime = System.nanoTime();
long totalTime = endTime - startTime;
System.out.println(totalTime);
I am trying to make a timer that starts counting from a date, so that every time I launch the app, the timer is always up to date.
For example, if I start the timer at 20:00 on 22/11/18, tomorrow at 21:00 it will show 25:00:00.
I have only found how to do a CountDownTimer, or just a simple timer.
You can get the current time when you start the timer with:
long timerStart = System.currentTimeMillis();
And then when you want to show what the timer is at, calculate it by doing:
long timePassed = System.currentTimeMillis() - timerStart;
That will give you the number of milliseconds since you started the timer. To format it the way you want, you can pass it into this function:
public static String convertMillisToHMmSs(long millis) {
    long seconds = millis / 1000;
    long s = seconds % 60;
    long m = (seconds / 60) % 60;
    long h = seconds / (60 * 60);
    return String.format("%d:%02d:%02d", h, m, s);
}
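For example, reusing the timePassed value from above:

String display = convertMillisToHMmSs(timePassed); // e.g. "25:00:00" after 25 hours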
Edit: As mentioned by the other answers, you will need to store timerStart somewhere to keep track of it after the app is closed and reopened. I would recommend something like SharedPreferences; you can look at this question to figure out how to do that.
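A minimal sketch of storing the start time in SharedPreferences; the file name "timer_prefs", the key "timerStart" and the context variable are just illustrative assumptions:

// Hypothetical names: "timer_prefs" and "timerStart" are placeholders.
SharedPreferences prefs = context.getSharedPreferences("timer_prefs", Context.MODE_PRIVATE);

// On the very first start, remember when the timer began.
if (!prefs.contains("timerStart")) {
    prefs.edit().putLong("timerStart", System.currentTimeMillis()).apply();
}

// On every launch, read it back and compute how much time has passed.
long timerStart = prefs.getLong("timerStart", System.currentTimeMillis());
long timePassed = System.currentTimeMillis() - timerStart;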
Unless you are willing to create an app that runs in the background the whole time for a few days (which would be highly unoptimized for an app of this complexity), I think the best solution is to store your start date (start timestamp) somewhere, either in Room or in SharedPreferences, and not to program your app to increase or decrease a counter by one every second, but rather to calculate the difference between the start timestamp and the current timestamp every second.
There are obviously questions about performance, but judging by your question I guess you are not concerned with that, and it will be good practice for you to optimize this solution to be faster and more precise.
I agree with Quinn above; however, you need to store the start time somewhere. Otherwise, every time the app restarts, the variable timerStart will reset.
So you need to persist 'timerStart' so that every time the app starts, it updates from the stored value.
Timer timer = new Timer(true);
timer.scheduleAtFixedRate(timerTask, 0, 1); // 1 = 1ms delay between each iteration
Each time it triggers, it runs a super quick operation which essentially takes no time at all: basically it takes the current elapsed-milliseconds value and increments it while doing a quick lookup in a Map where the millis value is the key.
Do you think 1ms delay would be too fast? Is this going to bog down the system? Are there any dangers in trying to use this super fast timer?
Maybe. A lot can happen in a millisecond on contemporary computers, so it depends on lots of things. You should probably figure out the slowest rate that is acceptable and then pick something reasonable between 1 ms and that number. This will execute around 86,400,000 times a day. Does that make sense for what you are trying to accomplish?
EDIT: As some of the comments to the question note, there might be a fundamental flaw in this approach if you assume that the timer will always succeed to execute at the rate you have provided. You can never make this assumption regardless of the rate. It's hard to tell because there are very few details but I have a sense you should look into using queues instead of a Map.
That depends.
Ask yourself these questions:
how often do I absolutely need that value to be checked?
how quick would be acceptable?
how much of your system's CPU time are you willing to sacrifice on this task?
One ms can be long (high end gaming PC) or very, very short (older gen smartphone) and depending on your CPU architecture you'll end up stuffing one of many cores or the one and only core with calculating time differences.
As for your data structure: you'll probably need something like a sorted map with the start time as the key and the duration as one field of the value. You'd fetch the closest key less than your time and check whether the value stored there is still valid... or similar.
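As a rough sketch of that lookup, assuming keys are start times in millis and values are durations in millis (using java.util.TreeMap and java.util.Map, with illustrative values):

// Illustrative entry: starts "now" and is valid for 1500 ms.
long startMillis = System.currentTimeMillis();
long durationMillis = 1500;

// Assumed layout: key = start time in millis, value = duration in millis.
TreeMap<Long, Long> entries = new TreeMap<>();
entries.put(startMillis, durationMillis);

// Fetch the closest key less than or equal to the queried time...
long queryMillis = System.currentTimeMillis();
Map.Entry<Long, Long> closest = entries.floorEntry(queryMillis);

// ...and check whether that entry is still valid (the query falls within its duration).
boolean stillValid = closest != null && queryMillis < closest.getKey() + closest.getValue();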
"Do you think 1ms delay would be too fast?" - Yes, and it will not be able to start task each millisecond (especially on-loading), you can test it yourself (with small example below).
"Is this going to bog down the system?" - actually not very much (I mean mostly CPU time). If you run your operation for example in a loop and will not make any pause, you will get much more load. Your system will spend big amount of time on context switch to start your task, officials it is inevitable overhead, but if you execute very simple task this overhead will be comparable with your useful workload.
"Are there any dangers in trying to use this super fast timer?" - as you wrote above " quick lookup in a Map where the millis is the key" you use millis as a key but you will not run each millisecond so your logic if count to be corrupted.
import java.util.Timer;
import java.util.TimerTask;

static int inc = 0;

public static void main(String[] args) throws InterruptedException {
    TimerTask timerTask = new TimerTask() {
        public void run() {
            inc++;
        }
    };
    Timer timer = new Timer(true);
    timer.scheduleAtFixedRate(timerTask, 0, 1);
    Thread.sleep(500000);
    timer.cancel();
    timer.purge();
    System.out.println(inc); // how many times the task actually ran in ~500 s
}
I have a simple java program, and I want to know the time difference between some set of operations. For this question the details are not important, but let us take the following scenario.
long beginTime = System.currentTimeMillis();
//Some operations. Let us asssume some database operations etc. which are time consuming.
//
long endTime = System.currentTimeMillis();
long difference = endTime - beginTime;
When the code is run on a machine, how reliable will the difference be?
Let us say, that the processor starts executing some instructions from my code, then gives context to another process, which executes for some time, and then comes back to execute instructions related to this java process.
So the time difference should depend on the current state of my machine, i.e. how many processes are running, etc.? So, when profiling the time it takes for some operations to run, is this mechanism not reliable?
The granularity of System.currentTimeMillis() depends on the implementation and on the operating system, and is usually around 10 ms.
Instead use the System.nanoTime() which returns the current value of the most precise available system timer, in nanoseconds. Note that you can only use this to calculate elapsed time, you cannot use its value as an absolute time.
Example:
long startTime = System.nanoTime();
// do something you want to measure
long elapsedTimeNs = System.nanoTime() - startTime;
I think your code will do just fine, as the same approach is used in several projects. The fact that your code calls some other database process etc. does not have any effect on how the timing works; it should work fine in that case too.
Ex.
Main Process Started
long beginTime = System.currentTimeMillis();
.
.
.
Called another process (DB or command etc) -> (Another Process start)
.
<Now in this time main process is waiting> .
.
.
Returned from process <- (Another Process end)
.
.
long endTime = System.currentTimeMillis();
long difference = endTime - beginTime;
Now this difference will be the total time taken by the main process, including the time taken by the other process. This timing is in reference to the machine running the main process, and that is fine.
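If instead you wanted only the CPU time your own thread actually consumed (excluding time spent waiting on the other process or preempted by the OS), a rough sketch using ThreadMXBean is possible, assuming your JVM supports thread CPU-time measurement:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

ThreadMXBean threadBean = ManagementFactory.getThreadMXBean();
if (threadBean.isCurrentThreadCpuTimeSupported()) {
    long cpuStart = threadBean.getCurrentThreadCpuTime(); // nanoseconds of CPU time, not wall-clock time
    // ... the operations being measured ...
    long cpuElapsedNs = threadBean.getCurrentThreadCpuTime() - cpuStart;
    System.out.println("CPU time used: " + cpuElapsedNs + " ns");
}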