I’m starting to work on a music/metronome application in Java and I’m running into some problems with the timing and speed.
For testing purposes I’m trying to play two sine wave tones at the same time at regular intervals, but instead they play in sync for a few beats and then slightly out of sync for a few beats and then back in sync again for a few beats.
From researching good metronome programming, I found that Thread.sleep() is horrible for timing, so I completely avoided that and went with checking System.nanoTime() to determine when the sounds should play.
I’m using AudioSystem’s SourceDataLine for my audio player and I’m using a thread for each tone that constantly polls System.nanoTime() in order to determine when the sound should play. I create a new SourceDataLine and delete the previous one each time a sound plays, because the volume fluctuates if I leave the line open and keep playing sounds on the same line. I create the player before polling nanoTime() so that the player is already created and all it has to do is play the sound when it is time.
In theory this seemed like a good method for getting each sound to play on time, but it's not working correctly. I'm not sure whether the timing problems come from running different threads, from deleting and recreating the SourceDataLine, from the sound playback itself, or something else entirely...
At the moment this is just a simple test in Java, but my goal is to create the app for mobile devices (Android, iOS, Windows Phone, etc.). However, my current method isn't even keeping perfect time on a PC, so I'm worried that certain mobile devices with limited resources will have even more timing problems. I will also be adding more sounds to create more complex rhythms, so it needs to handle multiple sounds playing simultaneously without lagging.
Another problem I’m having is that the max tempo is controlled by the length of the tone since the tones don’t overlap each other. I tried adding additional threads so that every tone that played would get its own thread...but that really screwed up the timing, so I took it out. I would like to have a way to overlap the previous sound to allow for much higher tempos.
Any help getting these timing and speed issues straightened out would be greatly appreciated!
Thanks.
SoundTest.java:
import java.awt.*;
import java.awt.event.*;
import javax.swing.*;
import javax.swing.event.*;
import java.io.*;
import javax.sound.sampled.*;
public class SoundTest implements ActionListener {
static SoundTest soundTest;
// ENABLE/DISABLE SOUNDS
boolean playSound1 = true;
boolean playSound2 = true;
JFrame mainFrame;
JPanel mainContent;
JPanel center;
JButton buttonPlay;
int sampleRate = 44100;
long startTime;
SourceDataLine line = null;
int tickLength;
boolean playing = false;
SoundElement sound01;
SoundElement sound02;
public static void main (String[] args) {
soundTest = new SoundTest();
SwingUtilities.invokeLater(new Runnable() { public void run() {
soundTest.gui_CreateAndShow();
}});
}
public void gui_CreateAndShow() {
gui_FrameAndContentPanel();
gui_AddContent();
}
public void gui_FrameAndContentPanel() {
mainContent = new JPanel();
mainContent.setLayout(new BorderLayout());
mainContent.setPreferredSize(new Dimension(500,500));
mainContent.setOpaque(true);
mainFrame = new JFrame("Sound Test");
mainFrame.setContentPane(mainContent);
mainFrame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
mainFrame.pack();
mainFrame.setVisible(true);
}
public void gui_AddContent() {
JPanel center = new JPanel();
center.setOpaque(true);
buttonPlay = new JButton("PLAY / STOP");
buttonPlay.setActionCommand("play");
buttonPlay.addActionListener(this);
buttonPlay.setPreferredSize(new Dimension(200, 50));
center.add(buttonPlay);
mainContent.add(center, BorderLayout.CENTER);
}
public void actionPerformed(ActionEvent e) {
if (!playing) {
playing = true;
if (playSound1)
sound01 = new SoundElement(this, 800, 1);
if (playSound2)
sound02 = new SoundElement(this, 1200, 1);
startTime = System.nanoTime();
if (playSound1)
new Thread(sound01).start();
if (playSound2)
new Thread(sound02).start();
}
else {
playing = false;
}
}
}
SoundElement.java:
import java.io.*;
import javax.sound.sampled.*;
public class SoundElement implements Runnable {
SoundTest soundTest;
// TEMPO CHANGE
// 750000000=80bpm | 300000000=200bpm | 200000000=300bpm
long nsDelay = 750000000;
int clickLength = 4100;
byte[] audioFile;
double clickFrequency;
double subdivision;
SourceDataLine line = null;
long audioFilePlay;
public SoundElement(SoundTest soundTestIn, double clickFrequencyIn, double subdivisionIn){
soundTest = soundTestIn;
clickFrequency = clickFrequencyIn;
subdivision = subdivisionIn;
generateAudioFile();
}
public void generateAudioFile(){
audioFile = new byte[clickLength * 2];
double temp;
short maxSample;
int p=0;
for (int i = 0; i < audioFile.length;){
temp = Math.sin(2 * Math.PI * p++ / (soundTest.sampleRate/clickFrequency));
maxSample = (short) (temp * Short.MAX_VALUE);
audioFile[i++] = (byte) (maxSample & 0x00ff);
audioFile[i++] = (byte) ((maxSample & 0xff00) >>> 8);
}
}
public void run() {
createPlayer();
audioFilePlay = soundTest.startTime + nsDelay;
while (soundTest.playing){
if (System.nanoTime() >= audioFilePlay){
play();
destroyPlayer();
createPlayer();
audioFilePlay += nsDelay;
}
}
try { destroyPlayer(); } catch (Exception e) { }
}
public void createPlayer(){
AudioFormat af = new AudioFormat(soundTest.sampleRate, 16, 1, true, false);
try {
line = AudioSystem.getSourceDataLine(af);
line.open(af);
line.start();
}
catch (Exception ex) { ex.printStackTrace(); }
}
public void play(){
line.write(audioFile, 0, audioFile.length);
}
public void destroyPlayer(){
line.drain();
line.close();
}
}
This sort of thing is difficult to get right. What you have to realise is that in order to even play a sound, it has to be loaded into an audio driver (and possibly a sound card). This takes time, and you have to account for that. There are basically two options for you:
Rather than counting down a delay between every beat, count delays from the start, when the metronome activates. As an example, say for instance that you want a beat every second. Because of the ~20 ms delay, with your old method you'd get beats at 20 ms, 1040, 2060, 3080, etc. If you count from the start and place beats at 1000, 2000, 3000, etc., they will play at 20 ms, 1020, 2020, 3020, etc. There will still be some variance, since the delay itself varies a bit, but there will be about 1000 ms between beats and it will not go out of sync (or at least the problem will not get worse over time and likely can't be heard).
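As a sketch of that first option (playing, line and click stand in for the corresponding fields in your code; the spin-wait is just the simplest way to show the idea):
long periodNs = 750_000_000L;                      // 80 bpm, as in the question
long start = System.nanoTime();
int beat = 0;
while (playing) {
    long target = start + (++beat) * periodNs;     // absolute time of this beat, from the start
    while (System.nanoTime() < target) {
        // spin (or sleep most of the interval and spin only the last millisecond)
    }
    line.write(click, 0, click.length);            // the ~20 ms device latency now delays every beat equally
}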
The better option, and the one that most such programs use, is to generate larger pieces of audio. For instance, buffer 20 seconds ahead and play that. The timing should be perfect within those 20 seconds. When those 20 seconds are almost over, you must generate some new sound. If you can find out how to do it, append the new waveform to the old one and have it play continuously. Otherwise, just generate a new 20-second clip and accept the small delay between them.
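A rough sketch of that, assuming your 16-bit mono 44100 Hz format and reusing a click array like the one generateAudioFile() builds (the tempo and the 20-second block size are arbitrary):
int sampleRate = 44100;
double bpm = 120;                                                   // assumed tempo
int bytesPerBeat = 2 * (int) Math.round(sampleRate * 60.0 / bpm);   // 2 bytes per 16-bit sample
byte[] block = new byte[sampleRate * 20 * 2];                       // 20 seconds of silence
for (int offset = 0; offset + click.length <= block.length; offset += bytesPerBeat) {
    System.arraycopy(click, 0, block, offset, click.length);        // place each click at its exact sample offset
}
line.write(block, 0, block.length);                                 // blocks, but the spacing inside is sample-accurate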
Now as for your problem with the sounds not being able to overlap... I'm no expert and I don't really know an answer, but this I do know: Something has to mix the sounds if you need them to overlap. Either you can do that yourself in software by combining the waveform bytes (I think it's an addition in some logarithmic space), or you need to send the different overlapping sounds to different 'channels', in which case the audio driver or sound card does it for you. I don't know how this works in Java though, or I forgot, but I learned this through trial-and-error and working with .mod files.
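In the simplest case, mixing PCM in software is just adding the samples and clamping. A sketch, assuming two equal-length 16-bit little-endian byte arrays a and b (hypothetical names) like the ones generateAudioFile() produces:
byte[] mixed = new byte[a.length];
for (int i = 0; i < a.length; i += 2) {
    int s1 = (short) ((a[i] & 0xff) | (a[i + 1] << 8));   // decode one sample from each source
    int s2 = (short) ((b[i] & 0xff) | (b[i + 1] << 8));
    int sum = s1 + s2;
    if (sum > Short.MAX_VALUE) sum = Short.MAX_VALUE;     // clamp instead of letting the sum wrap around
    if (sum < Short.MIN_VALUE) sum = Short.MIN_VALUE;
    mixed[i] = (byte) (sum & 0xff);                       // re-encode little-endian
    mixed[i + 1] = (byte) ((sum >> 8) & 0xff);
}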
I am trying to develop a video player application with java - vlcj player. I need to give random start time for bunch of videos. However some of videos are shorter than my random start time. I need to skip these videos and play next ones.
Here is my code:
for(int i=0;i<videoCount;i++){
int delay = playTime*1000; //milliseconds
ActionListener taskPerformer = new ActionListener() {
public void actionPerformed(ActionEvent evt) {
videoPath = randomvVideoPath(directory);
//*** times[0] is start time, times[1] is stop time***//
String[] times = startStop();
/*** I DONT KNOW IF THE START TIME IS SMALLER THAN THE VIDEO DURATION******/
mediaPlayerComponent.getMediaPlayer().playMedia(videoPath,":start-time="+times[0], ":stop-time="+times[1]);
}
};
new Timer(delay, taskPerformer).start();
}
How can I check this? getLength() and getTime() of mediaPlayerComponent.getMediaPlayer() do not give the correct time.
Thanks in advance.
getLength() and getTime() do return the correct time, but sometimes this information is not available until after the media is playing. That's just how VLC works.
I would suggest instead looking at Java bindings for the native MediaInfo command-line utility and using that to get your video length.
I've been programming sound with simple Swing graphics for a little while, but my frame rates are choppy for some reason.
Generally I'm doing something like the following on a background thread:
for(;;) {
// do some drawing
aPanel.updateABufferedImage();
// ask for asynchronous repaint
aPanel.repaint();
// write the sound
aSourceDataLine.write(bytes, 0, bytes.length);
}
Through debugging, I think I've already traced the problem to the blocking behavior of SourceDataLine#write. Its doc states the following:
If the caller attempts to write more data than can currently be written [...], this method blocks until the requested amount of data has been written.
So, what this seems to mean is SourceDataLine actually has its own buffer that it is filling when we pass our buffer to write. It only blocks when its own buffer is full. This seems to be the holdup: getting it to block predictably.
To demonstrate the issue, here's a minimal example which:
writes 0's to a SourceDataLine (no audible sound) and times it.
draws an arbitrary graphic (flips each pixel color) and times the repaint cycle.
import javax.swing.*;
import java.awt.*;
import java.awt.event.*;
import java.awt.image.*;
import javax.sound.sampled.*;
class FrameRateWithSound implements Runnable {
public static void main(String[] args) {
SwingUtilities.invokeLater(new FrameRateWithSound());
}
volatile boolean soundOn = true;
PaintPanel panel;
@Override
public void run() {
JFrame frame = new JFrame();
JPanel content = new JPanel(new BorderLayout());
final JCheckBox soundCheck = new JCheckBox("Sound", soundOn);
soundCheck.addActionListener(new ActionListener() {
@Override
public void actionPerformed(ActionEvent e) {
soundOn = soundCheck.isSelected();
}
});
panel = new PaintPanel();
content.add(soundCheck, BorderLayout.NORTH);
content.add(panel, BorderLayout.CENTER);
frame.setContentPane(content);
frame.pack();
frame.setLocationRelativeTo(null);
frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
frame.setVisible(true);
new Thread(new Worker()).start();
}
class Worker implements Runnable {
@Override
public void run() {
AudioFormat fmt = new AudioFormat(
AudioFormat.Encoding.PCM_SIGNED,
44100f, 8, 1, 1, 44100f, true
);
// just 0's
byte[] buffer = new byte[1000];
SourceDataLine line = null;
try {
line = AudioSystem.getSourceDataLine(fmt);
line.open(fmt);
line.start();
for(;;) {
panel.drawNextPixel();
panel.repaint();
if(soundOn) {
// time the write
long t = System.currentTimeMillis();
line.write(buffer, 0, buffer.length);
t = ( System.currentTimeMillis() - t );
System.out.println("sound:\t" + t);
}
// just so it doesn't fly off the handle
Thread.sleep(2);
}
} catch(Exception e) {
// lazy...
throw new RuntimeException(e);
} finally {
if(line != null) {
line.close();
}
}
}
}
class PaintPanel extends JPanel {
Dimension size = new Dimension(200, 100);
BufferedImage img = new BufferedImage(
size.width, size.height, BufferedImage.TYPE_INT_RGB);
int x, y;
int repaints;
long begin, prev;
String fps = "0";
PaintPanel() {
setPreferredSize(size);
setOpaque(false);
Graphics2D g = img.createGraphics();
g.setColor(Color.LIGHT_GRAY);
g.fillRect(0, 0, size.width, size.height);
g.dispose();
}
synchronized void drawNextPixel() {
img.setRGB(x, y, img.getRGB(x, y) ^ 0xFFFFFF); // flip
if( ( ++x ) == size.width ) {
x = 0;
if( ( ++y ) == size.height ) {
y = 0;
}
}
}
@Override
protected synchronized void paintComponent(Graphics g) {
super.paintComponent(g);
g.drawImage(img, 0, 0, size.width, size.height, null);
long curr = System.currentTimeMillis();
// time this cycle
long cycle = ( curr - prev );
System.out.println("paint:\t" + cycle);
++repaints;
// time FPS every 1 second
if(curr - begin >= 1000) {
begin = curr;
fps = String.valueOf(repaints);
repaints = 0;
}
prev = curr;
g.setColor(Color.RED);
g.drawString(fps, 12, size.height - 12);
}
}
}
I recommend actually running the example if you are curious about this.
A typical System.out feed during "playback" is something like the following:
sound: 0
paint: 2
sound: 0
paint: 2
sound: 0
paint: 3
sound: 0
paint: 2
paint: 2
sound: 325 // <- 'write' seems to be blocking here
sound: 0
paint: 328
sound: 0
paint: 2
This shows the behavior of write pretty clearly: it spins a majority of the time, then blocks for an extended period, at which point the repaints chug as well. The FPS meter typically displays ~45 during playback, but the animation is obviously choppy.
When sound is turned off, FPS climbs and the animation is smooth.
So is there a way to fix it? What am I doing wrong? How can I get write to block at a regular interval?
This behavior is apparent both on Windows and OSX environments.
One thing I've tried is using Thread.sleep to regulate it, but it's not very good. It's still choppy.
The solution seems to be to use open(AudioFormat, int) to open the line with a specified buffer size.
line.open(fmt, buffer.length);
Timing it again, we can see that write blocks much more consistently:
sound: 22
paint: 24
sound: 21
paint: 24
sound: 20
paint: 22
sound: 21
paint: 23
sound: 20
paint: 23
And the animation is smooth.
I seriously doubt that the sound playback is the culprit here. Please see my comment on the main question. The blocking that occurs in the audio write() method pertains to the rate at which the audio is presented to the playback system. Since audio processing is usually an order of magnitude faster than the audio system can play back (limited to 44100 fps), a majority of the time is spent blocking, for BOTH SourceDataLine and Clip. During this form of blocking, the CPU is FREE to do other things. It does not hang.
I'm more suspicious of your use of synchronization for the images, and of the editing being done to the image. I'm pretty sure editing it at the bit level will wipe out the default graphics acceleration for that image.
You might check out this link on Graphics2D optimizations at Java-Gaming.org
http://www.java-gaming.org/topics/java2d-clearing-up-the-air/27506/msg/0/view/topicseen.html#new
I found it very helpful for optimizing my 2D graphics.
I'm not sure why you are getting coalescence in your particular case. The few times it has been an issue for me is when the looping code for the frame and the component were in the same class. By just putting the "game loop" code and the component in separate classes, the problem always went away for me, so I never bothered to think about it further. As a consequence I don't have a clear understanding of why that worked, or if that action was even a factor.
[EDIT: just looked more closely at your audio code. I think there is room for optimization. There are calculations being redone needlessly that could be consuming CPU. For example, since you have fixed values in your inner loop, why recalculate that part every iteration? Take the constant part, calculate it into a value once, and only calculate the unknowns in the inner loop. I recommend refactoring to avoid all the synchronizing, optimizing the generation of the audio data, and then seeing if there is still a problem.]
I'm working on an application that records the users screen, webcam and microphone whilst he/she is performing certain activities. It will be used for research purposes. The application has been successfully tested on Windows, but on Mac OS X (Maverick with Java 7.0.45) the application becomes slow and unresponsive when recording is started.
This is why I find this difficult to comprehend:
The recording is done in a separate thread, so how could it influence the responsiveness of another thread? Especially as after each run either Thread.yield() or Thread.sleep(...) is called.
Logs show that whilst attempting to record at 15 FPS, the resulting frame rate was 2 FPS. So it seems the code that does the capturing of a single frame might be too slow. But why then does it work fine on Windows?
Just a quick note: the application was successfully tested by tons of users on Windows, but I only got to test it on a single Mac. However, that one was just formatted and got a clean install of OS X Maverick, Java (and Netbeans).
Below you will find the code that records the screen and writes it to a video using Xuggler. The code for recording the webcam is similar, and I doubt recording the audio has anything to do with it. My question is:
What might be the cause of the application becoming unresponsive?, and
How could the code be made more efficient and so improve FPS?
IMediaWriter writer = ToolFactory.makeWriter(file.getAbsolutePath());
Dimension size = Globals.sessionFrame.getBounds().getSize();
Rectangle screenRect;
BufferedImage capture;
BufferedImage mousePointImg;
writer.addVideoStream(0, 0, ICodec.ID.CODEC_ID_H264, size.width, size.height);
int i = 0;
while (stop == false) {
// Get mouse cursor to draw over screen image.
PointerInfo mousePointer = MouseInfo.getPointerInfo();
Point mousePoint = mousePointer.getLocation();
Point screenPoint = new Point((int) (mousePoint.getX() -
Globals.sessionFrame.getBounds().getX()), (int) (mousePoint.getY() -
Globals.sessionFrame.getBounds().getY()));
// Get the screen image.
try {
screenRect = new Rectangle(Globals.sessionFrame.getBounds());
capture = new Robot().createScreenCapture(screenRect);
} catch ( ... ) { ... }
// Convert and resize the screen image.
BufferedImage image = ConverterFactory.convertToType(capture,
BufferedImage.TYPE_3BYTE_BGR);
IConverter converter = ConverterFactory.createConverter(image,
IPixelFormat.Type.YUV420P);
// Draw the mouse cursor if necessary.
if (mouseWithinScreen()) {
Graphics g = image.getGraphics();
g.drawImage(mousePointImg, (int) screenPoint.getX(),
(int) screenPoint.getY(), null);
}
// Prepare the frame.
IVideoPicture frame = converter.toPicture(image, (System.currentTimeMillis() -
startTimeMillis()) * 1000);
frame.setKeyFrame(i % (getDesiredFPS() * getDesiredKeyframeSec()) == 0);
// Write to the video
writer.encodeVideo(0, frame);
// Delay the next capture if we are at the desired FPS.
try {
if (atDesiredFPS()) {
Thread.yield();
} else {
Thread.sleep(1000 / getDesiredFPS());
}
} catch ( ... ) { ... }
i++;
}
writer.close();
There are several architectural issues that I can see in your code:
First, if you want to execute something at a fixed rate, use the ScheduledThreadPoolExecutor.scheduleAtFixedRate(...) method. It makes your entire delay code obsolete and keeps OS timing quirks from interfering with your scheduling.
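For example (captureRunnable and the 15 FPS target here are placeholders, not names from your code):
ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
// run the screen-capture Runnable every 1000/15 ms, i.e. at roughly 15 FPS
scheduler.scheduleAtFixedRate(captureRunnable, 0, 1000 / 15, TimeUnit.MILLISECONDS);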
Then, to make things faster, you need to take your code apart a bit. As far as I can see you have three tasks: the capture, the mouse drawing/conversion, and the stream writing. If you put the capture part in a scheduled Runnable, run the conversions in parallel as Callables on an Executor, and have a third thread take the results from a queue and write them into the stream, you can fully utilize multiple cores.
Pseudocode:
Global declarations (or hand them over to the various classes):
final static ExecutorService converterExecutor = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
final static LinkedBlockingQueue<Future<IVideoPicture>> imageQueue = new LinkedBlockingQueue<>();
// ...
Capture Runnable (scheduled at fixed rate):
capture = captureScreen();
final Converter converter = new Converter(capture);
final Future<IVideoPicture> conversionResult = converterExecutor.submit(converter);
imageQueue.offer(conversionResult); // returns false if queue is full
Conversion Callable:
class Converter implements Callable<IVideoPicture> {
// ... variables and constructor
public IVideoPicture call() {
return convert(this.image);
}
}
Writer Runnable:
IVideoPicture frame;
while (this.done == false) {
frame = imageQueue.take().get(); // take the next Future and block until its conversion is done
writer.encodeVideo(0, frame);
}
You can ensure that the imageQueue does not overflow with images to render if the CPU is too slow by limiting the size of this queue, see the constructor of LinkedBlockingQueue.
My project is using Processing core jar and the GSVideo library on a OSX 10.8.5 using Eclipse.
I cannot get GSVideo jump(int frame) or jump(float time) to actually redraw the next frames. The image displayed toggles back and forth between frames when I repeatedly press RIGHT to advance the frame in the example program below. Because the example below works with *.mov but not *.mpg video, I want to ask whether there are any known problems with gstreamer advancing frames in MPEG2 video. Or perhaps something's up with either java-gstreamer or GSVideo?
I'm working with video in MPEG2 format. There is no problem just playing and pausing the MPEG2; it just seems that the movie.jump(frameNum or time) functions are not working. I've started looking for an example of frame stepping using playbin2's seek method.
Here is info about the video I'm trying to jump.
stream 0: type: CODEC_TYPE_VIDEO; codec: CODEC_ID_MPEG2VIDEO; duration: 7717710; start time: 433367; timebase: 1/90000; coder tb: 1001/60000;
width: 1920; height: 1080; format: YUV420P; frame-rate: 29.97;
The example code.
import processing.core.*;
import codeanticode.gsvideo.*;
public class FramesTest extends PApplet {
GSPlayer player;
GSMovie movie;
int newFrame = 0;
PFont font;
public void setup() {
size(320, 240);
background(0);
//movie = new GSMovie(this, "station.mov"); // sample works
movie = new GSMovie(this, "myMovie.mpg"); // mpg does not
movie.play();
movie.goToBeginning();
movie.pause();
textSize(24);
}
public void movieEvent(GSMovie movie) {
System.out.println("movie"+ movie.frame());
movie.read();
}
public void draw() {
image(movie, 0, 0, width, height);
fill(240, 20, 30);
text(movie.frame() + " / " + (movie.length() - 1), 10, 30);
}
public void keyPressed() {
if (movie.isSeeking()) return;
if (key == CODED) {
if (keyCode == LEFT) {
if (0 < newFrame) newFrame--;
} else if (keyCode == RIGHT) {
if (newFrame < movie.length() - 1) newFrame++;
}
}
movie.play();
movie.jump(newFrame);
movie.pause();
if (movie.available()){
System.out.println(movie.frame());
movie.read();
}
System.out.println(newFrame);
}
public static void main(String[] args) {
// TODO Auto-generated method stub
PApplet.main(new String[] { FramesTest.class.getName() }); //
}
}
The example code was pulled from here...
http://gsvideo.sourceforge.net/examples/Movie/Frames/Frames.pde
I've searched the internet for a few days and also attempted contact on the GSVideo forum...
https://sourceforge.net/projects/gsvideo/forums
This post seems similar, but my problem is not playing (that's fine); I cannot jump to a specific frame: GStreamer: Play mpeg2
Many thanks to the SO community for any help I might receive.
Update:
To work around the MPEG2 compression issue (described by v.k. below) I am trying to create a gstreamer pipeline to do on-the-fly transcoding to mp4, using either a GSVideo Pipeline or java-gstreamer. The command below works in Ubuntu.
gst-launch-0.10 filesrc location=myMpeg2Video.mpg ! mpegdemux name=demux demux.video_00 ! ffdec_mpeg2video ! queue ! x264enc ! ffdec_h264 ! xvimagesink
But the following GSVideo Pipeline displays an empty gray window :(
pipeline = new GSPipeline(this, "filesrc location=file:/path/movie.mpg ! mpegdemux name=demux demux.video_00 ! ffdec_mpeg2video");
pipeline.play();
as v.k. pointed out, seeking is in general not accurate.
One important thing to note is that development on gsvideo has basically stopped. The main elements of it were ported to the built-in video library in Processing 2.0. I did some work in built-in video to try to improve seeking, and the example Frames in Libraries|video|Movie shows how (to try) to jump to specific frames by indicating a time value. Maybe this helps in your case?
Also, if you find a more accurate way of doing seeking, as you suggest in your last post, I could include that in the video library.
This is a rather tricky question, as I have found no information online. Basically, I wish to know how to check if a computer is idle in Java. I want a program to run only while the computer is in active use, and not while it is idle.
The only way I can think of doing this is hooking into the mouse/keyboard and having a timer.
MSN Messenger has that "away" feature, I wish for something similar to this.
Java has no way of interacting with the keyboard or mouse at the system level outside of your application.
That being said, there are several ways to do it on Windows. The easiest is probably to set up JNI and poll
GetLastInputInfo
for keyboard and mouse activity.
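If you'd rather not write the JNI by hand, JNA can reach the same call. Here is a rough sketch (JNA 5 is my substitution, and the interface/struct declarations are assumptions based on the Win32 API, so treat it as a starting point rather than tested code):
import com.sun.jna.Native;
import com.sun.jna.Structure;
import com.sun.jna.win32.StdCallLibrary;

public class WindowsIdleTime {

    public interface User32 extends StdCallLibrary {
        User32 INSTANCE = Native.load("user32", User32.class);

        @Structure.FieldOrder({"cbSize", "dwTime"})
        class LASTINPUTINFO extends Structure {
            public int cbSize = 8;  // size of the struct in bytes
            public int dwTime;      // tick count of the last input event
        }

        boolean GetLastInputInfo(LASTINPUTINFO info);
    }

    public interface Kernel32 extends StdCallLibrary {
        Kernel32 INSTANCE = Native.load("kernel32", Kernel32.class);
        int GetTickCount();
    }

    /** Milliseconds since the last keyboard or mouse input (Windows only). */
    public static int idleMillis() {
        User32.LASTINPUTINFO info = new User32.LASTINPUTINFO();
        User32.INSTANCE.GetLastInputInfo(info);
        return Kernel32.INSTANCE.GetTickCount() - info.dwTime;
    }

    public static void main(String[] args) {
        System.out.println("Idle for " + idleMillis() + " ms");
    }
}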
I'm not a professional, but I have an idea:
You can use Java's MouseInfo class to check the mouse position at certain intervals, like this:
import java.awt.MouseInfo;
public class Mouse {
public static void main(String[] args) throws InterruptedException{
while(true){
Thread.sleep(100);
System.out.println("("+MouseInfo.getPointerInfo().getLocation().x+", "+MouseInfo.getPointerInfo().getLocation().y+")");
}
}
}
Replace the print statement with your logic: if for some interval, say 1 minute, the past position of the mouse is the same as the new position (you can simply compare only the x-coordinates), that means the system is idle, and you can proceed with your action as you want (hopefully it is a legal activity that you want to implement :-)). A small sketch of this follows below.
Be sure to implement this in a new thread; otherwise your main program will hang while it checks the idle state.
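Something like this, as a rough sketch (the 5-second polling interval and the 1-minute threshold are arbitrary choices):
import java.awt.MouseInfo;
import java.awt.Point;

public class MouseIdleCheck {
    public static void main(String[] args) throws InterruptedException {
        Point last = MouseInfo.getPointerInfo().getLocation();
        long lastMove = System.currentTimeMillis();
        while (true) {
            Thread.sleep(5000);                                   // poll every 5 seconds
            Point now = MouseInfo.getPointerInfo().getLocation();
            if (!now.equals(last)) {                              // mouse moved since the last poll
                last = now;
                lastMove = System.currentTimeMillis();
            }
            boolean idle = System.currentTimeMillis() - lastMove >= 60_000;
            System.out.println(idle ? "idle" : "active");
        }
    }
}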
You can solve this with the help of Java's Robot class.
Use the Robot class to take a screenshot, then wait for, let's say, 60 seconds and take another screenshot. Compare the screenshots with each other to see if any changes have happened, but don't just compare them pixel by pixel. Check the percentage of pixels that have changed. The reason is that you don't want small differences like the Windows clock to interfere with the result. If the percentage is less than 0.005% (or whatever), then the computer is probably idling.
import java.awt.AWTException;
import java.awt.DisplayMode;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
public class CheckIdle extends Thread {
private Robot robot;
private double threshHold = 0.05;
private int activeTime;
private int idleTime;
private boolean idle;
private Rectangle screenDimensions;
public CheckIdle(int activeTime, int idleTime) {
this.activeTime = activeTime;
this.idleTime = idleTime;
// Get the screen dimensions
// MultiMonitor support.
int screenWidth = 0;
int screenHeight = 0;
GraphicsEnvironment graphicsEnv = GraphicsEnvironment
.getLocalGraphicsEnvironment();
GraphicsDevice[] graphicsDevices = graphicsEnv.getScreenDevices();
for (GraphicsDevice screens : graphicsDevices) {
DisplayMode mode = screens.getDisplayMode();
screenWidth += mode.getWidth();
if (mode.getHeight() > screenHeight) {
screenHeight = mode.getHeight();
}
}
screenDimensions = new Rectangle(0, 0, screenWidth, screenHeight);
// setup the robot.
robot = null;
try {
robot = new Robot();
} catch (AWTException e1) {
e1.printStackTrace();
}
idle = false;
}
public void run() {
while (true) {
BufferedImage screenShot = robot
.createScreenCapture(screenDimensions);
try {
Thread.sleep(idle ? idleTime : activeTime);
} catch (InterruptedException e) {
e.printStackTrace();
}
BufferedImage screenShot2 = robot
.createScreenCapture(screenDimensions);
if (compareScreens(screenShot, screenShot2) < threshHold) {
idle = true;
System.out.println("idle");
} else {
idle = false;
System.out.println("active");
}
}
}
private double compareScreens(BufferedImage screen1, BufferedImage screen2) {
int counter = 0;
boolean changed = false;
// Count the amount of change.
for (int i = 0; i < screen1.getWidth() && !changed; i++) {
for (int j = 0; j < screen1.getHeight(); j++) {
if (screen1.getRGB(i, j) != screen2.getRGB(i, j)) {
counter++;
}
}
}
return (double) counter
/ (double) (screen1.getHeight() * screen1.getWidth()) * 100;
}
public static void main(String[] args) {
CheckIdle idleChecker = new CheckIdle(20000, 1000);
idleChecker.start(); // start() runs the check on its own thread; run() would block the caller
}
}
Nothing in the platform-independent JRE will answer this question. You might be able to guess by measuring clock time for a calculation, but it wouldn't be reliable. On specific platforms, there might be vendor APIs that might help you.
1) Make a new thread.
2) Give it a really low priority (the lowest you can).
3) Every second or two, have the thread do some simple task. If it finishes quickly, at least one CPU is probably idle. If it runs slowly, then at least one core is probably not idle. (See the sketch below.)
Or
Just run your program at a low priority. That will let the OS take care of letting other programs run ahead of yours.
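A rough sketch of the threaded suggestion above (the amount of work and the 2-second interval are arbitrary assumptions; this only hints at load, it does not prove the machine is idle):
public class IdleProbe {
    public static void main(String[] args) {
        Thread probe = new Thread(new Runnable() {
            public void run() {
                while (true) {
                    long t0 = System.nanoTime();
                    long acc = 0;
                    for (int i = 0; i < 5_000_000; i++) {    // small, fixed amount of work
                        acc += i;
                    }
                    long elapsedMs = (System.nanoTime() - t0) / 1_000_000;
                    System.out.println("probe took " + elapsedMs + " ms (acc=" + acc + ")");
                    try {
                        Thread.sleep(2000);                   // repeat every couple of seconds
                    } catch (InterruptedException e) {
                        return;
                    }
                }
            }
        });
        probe.setPriority(Thread.MIN_PRIORITY);               // as low as Java allows
        probe.start();
    }
}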