Refresh BufferedImage periodically - java

I have a BufferedImage displayed in a JFrame that I want to refresh periodically with raw R, G, B data I receive through a Socket (byte[] buffer). The sequence of actions should look something like this:
1. Receive a byte[1280 * 480] array of pure RGB data (one byte per component) -> this part works flawlessly
2. Iterate through the byte array and call BufferedImage.setRGB(x, y, RGB) for every pixel
I have no problem receiving and displaying one frame, but when I wrap the code that does steps 1 and 2 in a loop, I receive data regularly yet not a single frame is ever shown. My guess was that receiving data is significantly faster than displaying images; in other words, my received image somehow gets overwritten by the next one, etc.
My next idea was to hand the image buffer over to a background thread and block the main thread, which does the network communication, until the background thread has finished displaying the image.
Then I read that this can easily be done with SwingWorker, as in: Can a progress bar be used in a class outside main? But it behaves exactly the same as if I were still doing everything on one thread: no image is ever shown. Here is my code:
public class ConnectionManager {
public static final String serverIp = "192.168.1.10";
public static final int tcpPort = 7;
public static final int bufferSize = 1280;
private Socket client;
private BufferedInputStream networkReader;
private PrintStream printStream;
byte[] buffer;
public ConnectionManager(){}
public void connect() throws IOException{
int dataRead;
while(true){
client = new Socket(serverIp, tcpPort);
printStream = new PrintStream(client.getOutputStream());
networkReader = new BufferedInputStream(client.getInputStream());
dataRead = 0;
buffer = new byte[1280 * 480];
printStream.println(""); // CR is code to server to send data
while(dataRead < (1280 * 480)){
dataRead += networkReader.read(buffer, dataRead, (1280 * 480) - dataRead);
}
DrawBack d = new DrawBack();
d.execute();
try {
d.get(); // here im trying to block main thread purposely
} catch (InterruptedException e) {
e.printStackTrace();
} catch (ExecutionException e) {
e.printStackTrace();
}
}
}
private class DrawBack extends SwingWorker<Void, Void>{
@Override
protected Void doInBackground() throws Exception {
byte Y, U, V;
int R, G, B, RGB, Yi, Ui, Vi;
boolean alternate = false;
for(int i = 0; i < 480; ++i){
for(int j = 1; j < 1280; j += 2){
if(alternate){
Y = buffer[i * 1280 + j];
V = buffer[i * 1280 + j -1];
U = buffer[i * 1280 + j -3];
} else {
Y = buffer[i * 1280 + j];
U = buffer[i * 1280 + j -1];
V = buffer[i * 1280 + j +1];
}
Yi = Y & 0xFF;
Ui = U & 0xFF;
Vi = V & 0xFF;
R = (int)(Yi + 1.402 * (Vi - 128));
G = (int)(Yi - 0.34414 * (Ui - 128) - 0.71414 * (Vi - 128));
B = (int)(Yi + 1.772 * (Ui - 128));
RGB = R;
RGB = (RGB << 8) + G;
RGB = (RGB << 8) + B;
alternate = !alternate;
Masapp.window.getImage().setRGB(j/2, i, RGB);// reference to buffered image on JFrame
if((i == 100) && (j == 479)){
System.out.println(Yi + " " + Ui + " " + Vi);
}
}
}
return null;
}
}
}
I even tried to wait for completion with a while loop:
DrawBack d = new DrawBack(); // DrawBack extends SwingWorker<Void, Void>
d.execute();
while(!d.isDone());
but it makes no difference. I also tried calling BufferedImage.flush() and JFrame.invalidate() after setting all the pixels.
My question basically is: how do I refresh and display a BufferedImage periodically?

Your implementation is incorrectly synchronized in that it updates the GUI from the worker's background thread rather than the event dispatch thread. The resulting behavior is unpredictable. Instead, define a SwingWorker<BufferedImage, BufferedImage> and publish() the image for later rendering in your implementation of process(). For improved liveness, publish portions of the image as they are ready, e.g. publish() a BufferedImage containing one row at a time. Compare the example cited with this related example to see the approach.
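A minimal sketch of that approach, assuming a 640x480 output frame (matching the setRGB(j/2, i, ...) indexing in the question) and a hypothetical JLabel named imageLabel that displays it; convertRow() stands in for the YUV-to-RGB loop above:
import java.awt.image.BufferedImage;
import java.util.List;
import javax.swing.ImageIcon;
import javax.swing.JLabel;
import javax.swing.SwingWorker;

class FrameWorker extends SwingWorker<Void, BufferedImage> {

    private final byte[] buffer;     // raw frame data received from the socket
    private final JLabel imageLabel; // hypothetical Swing component that shows the frame

    FrameWorker(byte[] buffer, JLabel imageLabel) {
        this.buffer = buffer;
        this.imageLabel = imageLabel;
    }

    @Override
    protected Void doInBackground() {
        // 640 x 480 because the 1280 * 480 buffer holds two bytes per pixel (see the loop above)
        BufferedImage frame = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 480; y++) {
            convertRow(buffer, frame, y); // fill one row of 'frame' from 'buffer'
        }
        publish(frame); // hand the finished frame to the EDT; publish partial images for better liveness
        return null;
    }

    @Override
    protected void process(List<BufferedImage> frames) {
        // Runs on the event dispatch thread, so it is safe to touch Swing components here.
        BufferedImage latest = frames.get(frames.size() - 1);
        imageLabel.setIcon(new ImageIcon(latest));
    }

    private void convertRow(byte[] src, BufferedImage dst, int row) {
        // Placeholder for the per-pixel YUV -> RGB conversion shown in the question.
    }
}
The network loop would then create one FrameWorker per received frame; crucially, nothing but process() ever touches the Swing components.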

Related

Create simple 2D image preview from panoramic HDR

Is there some really simple and basic code for making a preview of an HDR image (i.e. getting a 2D BufferedImage output or something similar)?
I am using this HDR image.
I tried this (it uses TwelveMonkeys), but with no success at all (it simply gets stuck/frozen at ImageReader reader = readers.next();).
I edited it a bit to suit my needs like this, testing where it gets broken/stuck/frozen... and it always happens after TEST 1; that is, TEST 2 is never reached, though no IllegalArgumentException is thrown. If I remove the if() section, then TEST 3 is never reached (I am using NetBeans IDE v12.4, Win7 x64):
public BufferedImage hdrToBufferedImage(File hdrFile) throws IOException {
BufferedImage bi = null;
// Create input stream
// I WROTE DOWN THE STRING FOR THIS EXAMPLE, normally it is taken from the hdrFile
// HDR image size is 23.7MB if it matters at all?
ImageInputStream input = ImageIO.createImageInputStream(new File("Z:/HDR/spiaggia_di_mondello_4k.hdr"));
try {
// Get the reader
Iterator<ImageReader> readers = ImageIO.getImageReaders(input);
System.err.println("=====>>> TEST 1");
if (!readers.hasNext()) {
throw new IllegalArgumentException("No reader for: " + hdrFile);
}
System.err.println("=====>>> TEST 2");
ImageReader reader = readers.next();
System.err.println("=====>>> TEST 3");
try {
reader.setInput(input);
// Disable default tone mapping
HDRImageReadParam param = (HDRImageReadParam) reader.getDefaultReadParam();
param.setToneMapper(new NullToneMapper());
// Read the image, using settings from param
bi = reader.read(0, param);
} finally {
// Dispose reader in finally block to avoid memory leaks
reader.dispose();
}
} finally {
// Close stream in finally block to avoid resource leaks
input.close();
}
// Get float data
float[] rgb = ((DataBufferFloat) bi.getRaster().getDataBuffer()).getData();
// Convert the image to something easily displayable
BufferedImage converted = new ColorConvertOp(null).filter(bi, new BufferedImage(bi.getWidth(), bi.getHeight(), BufferedImage.TYPE_INT_RGB));
return converted;
}
Well, if you don't mind occasional extreme, hallucinogenic oversaturation of some colors here and there (I was unable to solve that issue; if anyone knows how, please feel free to update my code), you can try this (it uses JavaHDR). I also added a bit of brightness and contrast to it, as all the HDR images I tested looked too dark for a preview, so if you do not like that you can remove that part from the code:
public int rgbToInteger(int r, int g, int b) {
int rgb = r;
rgb = (rgb << 8) + g;
rgb = (rgb << 8) + b;
return rgb;
}
public BufferedImage hdrToBufferedImage(File hdrFile) throws IOException {
HDRImage hdr = HDREncoder.readHDR(hdrFile, true);
int width = hdr.getWidth();
int height = hdr.getHeight();
BufferedImage bi = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
for (int x = 0; x < width; x++) {
for (int y = 0; y < height; y++) {
int r = (int) (hdr.getPixelValue(x, y, 0) * 255);
int g = (int) (hdr.getPixelValue(x, y, 1) * 255);
int b = (int) (hdr.getPixelValue(x, y, 2) * 255);
bi.setRGB(x, y, rgbToInteger(r, g, b));
}
}
//***** YOU CAN REMOVE THIS SMALL SECTION IF YOU FEEL THE IMAGE IS TOO BRIGHT FOR YOU
float brightness = 2f;
float contrast = 20f;
RescaleOp rescaleOp = new RescaleOp(brightness, contrast, null);
rescaleOp.filter(bi, bi);
//*****
return bi;
}
I can compile and run the code you posted (changing the path, obviously) without problems on my two macOS machines, testing on all the LTS Java versions (8, 11 and 17). In addition, I run code similar to this as part of the CI/CD pipeline of my project, which tests on Windows and Linux as well. I think there is something wrong with the setup of your IDE or of Java on your computer. I am not able to reproduce the "freeze" situation you describe...
Here is the output of running the program (I also printed the resulting BufferedImage for verification):
=====>>> TEST 1
=====>>> TEST 2
=====>>> TEST 3
image = BufferedImage#5a42bbf4: type = 1 DirectColorModel: rmask=ff0000 gmask=ff00 bmask=ff amask=0 IntegerInterleavedRaster: width = 1024 height = 512 #Bands = 3 xOff = 0 yOff = 0 dataOffset[0] 0
Running with the code as-is (with the NullToneMapper and no post-processing), the image looks like this, due to unnormalized values:
Running with the default/built-in tone mapper, or simply reading the image with ImageIO.read(hdrFile) as suggested in the comments, the image will look like this:
Finally, playing a bit with the code and using a custom global tone mapper, param.setToneMapper(new DefaultToneMapper(0.75f)), I get a result like this:
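For reference, a minimal sketch of that change against the code from the question, reusing the HDRImageReadParam and DefaultToneMapper types from the TwelveMonkeys HDR plugin mentioned above:
HDRImageReadParam param = (HDRImageReadParam) reader.getDefaultReadParam();
// replace the NullToneMapper from the original code with a global tone mapper
param.setToneMapper(new DefaultToneMapper(0.75f));
BufferedImage bi = reader.read(0, param);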
After a long discussion with @HaraldK and his code addition, I am posting the final, correct code for this problem. It is in fact a mix of @qraqatit's code, updated a bit with @HaraldK's addition that corrects the wrong color tone mapping; here it is:
public int rgbToInteger(int r, int g, int b) {
int rgb = r;
rgb = (rgb << 8) + g;
rgb = (rgb << 8) + b;
return rgb;
}
public BufferedImage hdrToBufferedImage(File hdrFile) throws IOException {
HDRImage hdr = HDREncoder.readHDR(hdrFile, true);
int width = hdr.getWidth();
int height = hdr.getHeight();
BufferedImage bi = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
float colorToneCorrection = 0.75f;
for (int x = 0; x < width; x++) {
for (int y = 0; y < height; y++) {
float r = hdr.getPixelValue(x, y, 0);
int red = (int) ((r / (colorToneCorrection + r)) * 255);
float g = hdr.getPixelValue(x, y, 1);
int green = (int) ((g / (colorToneCorrection + g)) * 255);
float b = hdr.getPixelValue(x, y, 2);
int blue = (int) ((b / (colorToneCorrection + b)) * 255);
bi.setRGB(x, y, rgbToInteger(red, green, blue));
}
}
//MAKE THE RESULTING IMAGE A BIT BRIGHTER
float brightness = 1.35f;
float contrast = 0f;
RescaleOp rescaleOp = new RescaleOp(brightness, contrast, null);
rescaleOp.filter(bi, bi);
return bi;
}

Different code (.java file) for different platforms?

I have code where image data is passed from a Bitmap to an FFmpeg frame recorder and converted to a video. But I need to make small changes when running it on the LG G3 (armv7) versus the Asus Zenfone 5 (x86).
Following are the class variables that create the issue (declared in the MainActivity class):
inputWidth = 1024;
inputHeight = 650;
Following is the method where the issue occurs:
byte [] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {
int [] argb = new int[inputWidth * inputHeight];
bitmap.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
byte [] yuv = new byte[inputWidth*inputHeight*3/2];
encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
return yuv;
}
void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
final int frameSize = width * height;
int yIndex = 0;
int uvIndex = frameSize;
int a, R, G, B, Y, U, V;
int index = 0;
for (int j = 0; j < height; j++) {
for (int i = 0; i < width; i++) {
a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
R = (argb[index] & 0xff0000) >> 16;
G = (argb[index] & 0xff00) >> 8;
B = (argb[index] & 0xff) >> 0;
// well known RGB to YUV algorithm
Y = ( ( 66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
U = ( ( -38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
V = ( ( 112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
// NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2
// meaning for every 4 Y pixels there are 1 V and 1 U. Note the sampling is every other
// pixel AND every other scanline.
yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
if (j % 2 == 0 && index % 2 == 0) {
yuv420sp[uvIndex++] = (byte)((V<0) ? 0 : ((V > 255) ? 255 : V));
yuv420sp[uvIndex++] = (byte)((U<0) ? 0 : ((U > 255) ? 255 : U));
}
index ++;
}
}
}
Working code:
LG G3: I can use the above variables anywhere in the code and get the required output.
Bitmap size returned = 2734200
Asus Zenfone 5: Except when creating the bitmap, I have to use bitmap.getHeight() and bitmap.getWidth() everywhere else to get the required output.
Surprisingly, here the bitmap size returned = 725760 (so it is not sized according to the set bitmap parameters?)
Incorrect code:
LG G3: If I use bitmap.getHeight() and bitmap.getWidth(), I get java.lang.ArrayIndexOutOfBoundsException: length = 102354, index = 102354 in the getNV21 method.
Asus Zenfone 5: If I use inputWidth and inputHeight, I get
java.lang.IllegalArgumentException: x + width must be <= bitmap.width() in the getNV21 method.
How can I generalize the above code for both phones?
In cases like this you can use a Strategy pattern.
The Strategy pattern allows you to change algorithms at runtime based on your environment. Basically, you define an interface for your strategy, something like this:
interface MyStrategy {
byte[] getNV21(int inputWidth, int inputHeight, Bitmap bitmap);
}
Then you make multiple implementations of your interface, one for LG, one for Asus and, for example, one for all other devices (device neutral):
class MyStrategyForLG implements MyStrategy {
public byte[] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {
// ...
}
}
class MyStrategyForAsus implements MyStrategy {
public byte[] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {
// ...
}
}
class DefaultMyStrategy implements MyStrategy {
public byte[] getNV21(int inputWidth, int inputHeight, Bitmap bitmap) {
// ...
}
}
You can create a factory for MyStrategy so you can avoid use of if-else in your MainActivity. Something like this:
class MyStrategyFactory {
public MyStrategy createMyStrategy() {
// ...
if ( deviceIsAsus ) {
return new MyStrategyForAsus();
}
if ( deviceIsLg ) {
return new MyStrategyForLG();
}
return new DefaultMyStrategy();
}
}
In your MainActivity you can invoke your strategy like this:
// ...
MyStrategy strategy = new MyStrategyFactory().createMyStrategy();
byte[] bytes = strategy.getNV21(width, height, image);
// ...
The advantage of this approach is that you do not need to modify the calling site when you add another device, for example when you notice that Samsung is also a bit weird. Instead, you implement MyStrategyForSamsung and change the factory to return it when the code is executed on a Samsung device.
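A minimal sketch of how the factory might pick a strategy at runtime, assuming detection via android.os.Build (the matching strings are illustrative guesses and would need checking on the actual devices):
import android.os.Build;

class MyStrategyFactory {
    public MyStrategy createMyStrategy() {
        // Build.MANUFACTURER identifies the device vendor at runtime
        String manufacturer = Build.MANUFACTURER == null ? "" : Build.MANUFACTURER.toLowerCase();
        if (manufacturer.contains("asus")) {
            return new MyStrategyForAsus();
        }
        if (manufacturer.contains("lg")) {
            return new MyStrategyForLG();
        }
        return new DefaultMyStrategy();
    }
}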

Plotting an audio signal using jfreechart (Amplitude vs time)

I have inherited a code snippet that draws the audio waveform of a given file. But the waveform is a simple image built using Java vector graphics, without any labeling, axis information, etc. I would like to port it to JFreeChart to increase its informative value. My problem is that the code is cryptic, to say the least.
public class Plotter {
AudioInputStream audioInputStream;
Vector<Line2D.Double> lines = new Vector<Line2D.Double>();
String errStr;
Capture capture = new Capture();
double duration, seconds;
//File file;
String fileName = "out.png";
SamplingGraph samplingGraph;
String waveformFilename;
Color imageBackgroundColor = new Color(20,20,20);
public Plotter(URL url, String waveformFilename) throws Exception {
if (url != null) {
try {
errStr = null;
this.fileName = waveformFilename;
audioInputStream = AudioSystem.getAudioInputStream(url);
long milliseconds = (long)((audioInputStream.getFrameLength() * 1000) / audioInputStream.getFormat().getFrameRate());
duration = milliseconds / 1000.0;
samplingGraph = new SamplingGraph();
samplingGraph.createWaveForm(null);
} catch (Exception ex) {
reportStatus(ex.toString());
throw ex;
}
} else {
reportStatus("Audio file required.");
}
}
/**
* Render a WaveForm.
*/
class SamplingGraph implements Runnable {
private Thread thread;
private Font font10 = new Font("serif", Font.PLAIN, 10);
private Font font12 = new Font("serif", Font.PLAIN, 12);
Color jfcBlue = new Color(000, 000, 255);
Color pink = new Color(255, 175, 175);
public SamplingGraph() {
}
public void createWaveForm(byte[] audioBytes) {
lines.removeAllElements(); // clear the old vector
AudioFormat format = audioInputStream.getFormat();
if (audioBytes == null) {
try {
audioBytes = new byte[
(int) (audioInputStream.getFrameLength()
* format.getFrameSize())];
audioInputStream.read(audioBytes);
} catch (Exception ex) {
reportStatus(ex.getMessage());
return;
}
}
int w = 500;
int h = 200;
int[] audioData = null;
if (format.getSampleSizeInBits() == 16) {
int nlengthInSamples = audioBytes.length / 2;
audioData = new int[nlengthInSamples];
if (format.isBigEndian()) {
for (int i = 0; i < nlengthInSamples; i++) {
/* First byte is MSB (high order) */
int MSB = (int) audioBytes[2*i];
/* Second byte is LSB (low order) */
int LSB = (int) audioBytes[2*i+1];
audioData[i] = MSB << 8 | (255 & LSB);
}
} else {
for (int i = 0; i < nlengthInSamples; i++) {
/* First byte is LSB (low order) */
int LSB = (int) audioBytes[2*i];
/* Second byte is MSB (high order) */
int MSB = (int) audioBytes[2*i+1];
audioData[i] = MSB << 8 | (255 & LSB);
}
}
} else if (format.getSampleSizeInBits() == 8) {
int nlengthInSamples = audioBytes.length;
audioData = new int[nlengthInSamples];
if (format.getEncoding().toString().startsWith("PCM_SIGN")) {
for (int i = 0; i < audioBytes.length; i++) {
audioData[i] = audioBytes[i];
}
} else {
for (int i = 0; i < audioBytes.length; i++) {
audioData[i] = audioBytes[i] - 128;
}
}
}
int frames_per_pixel = audioBytes.length / format.getFrameSize()/w;
byte my_byte = 0;
double y_last = 0;
int numChannels = format.getChannels();
for (double x = 0; x < w && audioData != null; x++) {
int idx = (int) (frames_per_pixel * numChannels * x);
if (format.getSampleSizeInBits() == 8) {
my_byte = (byte) audioData[idx];
} else {
my_byte = (byte) (128 * audioData[idx] / 32768 );
}
double y_new = (double) (h * (128 - my_byte) / 256);
lines.add(new Line2D.Double(x, y_last, x, y_new));
y_last = y_new;
}
saveToFile();
}
public void saveToFile() {
int w = 500;
int h = 200;
int INFOPAD = 15;
BufferedImage bufferedImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
Graphics2D g2 = bufferedImage.createGraphics();
createSampleOnGraphicsContext(w, h, INFOPAD, g2);
g2.dispose();
// Write generated image to a file
try {
// Save as PNG
File file = new File(fileName);
System.out.println(file.getAbsolutePath());
ImageIO.write(bufferedImage, "png", file);
JOptionPane.showMessageDialog(null,
new JLabel(new ImageIcon(fileName)));
} catch (IOException e) {
}
}
private void createSampleOnGraphicsContext(int w, int h, int INFOPAD, Graphics2D g2) {
g2.setBackground(imageBackgroundColor);
g2.clearRect(0, 0, w, h);
g2.setColor(Color.white);
g2.fillRect(0, h-INFOPAD, w, INFOPAD);
if (errStr != null) {
g2.setColor(jfcBlue);
g2.setFont(new Font("serif", Font.BOLD, 18));
g2.drawString("ERROR", 5, 20);
AttributedString as = new AttributedString(errStr);
as.addAttribute(TextAttribute.FONT, font12, 0, errStr.length());
AttributedCharacterIterator aci = as.getIterator();
FontRenderContext frc = g2.getFontRenderContext();
LineBreakMeasurer lbm = new LineBreakMeasurer(aci, frc);
float x = 5, y = 25;
lbm.setPosition(0);
while (lbm.getPosition() < errStr.length()) {
TextLayout tl = lbm.nextLayout(w-x-5);
if (!tl.isLeftToRight()) {
x = w - tl.getAdvance();
}
tl.draw(g2, x, y += tl.getAscent());
y += tl.getDescent() + tl.getLeading();
}
} else if (capture.thread != null) {
g2.setColor(Color.black);
g2.setFont(font12);
//g2.drawString("Length: " + String.valueOf(seconds), 3, h-4);
} else {
g2.setColor(Color.black);
g2.setFont(font12);
//g2.drawString("File: " + fileName + " Length: " + String.valueOf(duration) + " Position: " + String.valueOf(seconds), 3, h-4);
if (audioInputStream != null) {
// .. render sampling graph ..
g2.setColor(jfcBlue);
for (int i = 1; i < lines.size(); i++) {
g2.draw((Line2D) lines.get(i));
}
// .. draw current position ..
if (seconds != 0) {
double loc = seconds/duration*w;
g2.setColor(pink);
g2.setStroke(new BasicStroke(3));
g2.draw(new Line2D.Double(loc, 0, loc, h-INFOPAD-2));
}
}
}
}
public void start() {
thread = new Thread(this);
thread.setName("SamplingGraph");
thread.start();
seconds = 0;
}
public void stop() {
if (thread != null) {
thread.interrupt();
}
thread = null;
}
public void run() {
seconds = 0;
while (thread != null) {
if ( (capture.line != null) && (capture.line.isActive()) ) {
long milliseconds = (long)(capture.line.getMicrosecondPosition() / 1000);
seconds = milliseconds / 1000.0;
}
try { thread.sleep(100); } catch (Exception e) { break; }
while ((capture.line != null && !capture.line.isActive()))
{
try { thread.sleep(10); } catch (Exception e) { break; }
}
}
seconds = 0;
}
} // End class SamplingGraph
/**
* Reads data from the input channel and writes to the output stream
*/
class Capture implements Runnable {
TargetDataLine line;
Thread thread;
public void start() {
errStr = null;
thread = new Thread(this);
thread.setName("Capture");
thread.start();
}
public void stop() {
thread = null;
}
private void shutDown(String message) {
if ((errStr = message) != null && thread != null) {
thread = null;
samplingGraph.stop();
System.err.println(errStr);
}
}
public void run() {
duration = 0;
audioInputStream = null;
// define the required attributes for our line,
// and make sure a compatible line is supported.
AudioFormat format = audioInputStream.getFormat();
DataLine.Info info = new DataLine.Info(TargetDataLine.class,
format);
if (!AudioSystem.isLineSupported(info)) {
shutDown("Line matching " + info + " not supported.");
return;
}
// get and open the target data line for capture.
try {
line = (TargetDataLine) AudioSystem.getLine(info);
line.open(format, line.getBufferSize());
} catch (LineUnavailableException ex) {
shutDown("Unable to open the line: " + ex);
return;
} catch (SecurityException ex) {
shutDown(ex.toString());
//JavaSound.showInfoDialog();
return;
} catch (Exception ex) {
shutDown(ex.toString());
return;
}
// play back the captured audio data
ByteArrayOutputStream out = new ByteArrayOutputStream();
int frameSizeInBytes = format.getFrameSize();
int bufferLengthInFrames = line.getBufferSize() / 8;
int bufferLengthInBytes = bufferLengthInFrames * frameSizeInBytes;
byte[] data = new byte[bufferLengthInBytes];
int numBytesRead;
line.start();
while (thread != null) {
if((numBytesRead = line.read(data, 0, bufferLengthInBytes)) == -1) {
break;
}
out.write(data, 0, numBytesRead);
}
// we reached the end of the stream. stop and close the line.
line.stop();
line.close();
line = null;
// stop and close the output stream
try {
out.flush();
out.close();
} catch (IOException ex) {
ex.printStackTrace();
}
// load bytes into the audio input stream for playback
byte audioBytes[] = out.toByteArray();
ByteArrayInputStream bais = new ByteArrayInputStream(audioBytes);
audioInputStream = new AudioInputStream(bais, format, audioBytes.length / frameSizeInBytes);
long milliseconds = (long)((audioInputStream.getFrameLength() * 1000) / format.getFrameRate());
duration = milliseconds / 1000.0;
try {
audioInputStream.reset();
} catch (Exception ex) {
ex.printStackTrace();
return;
}
samplingGraph.createWaveForm(audioBytes);
}
} // End class Capture
}
I have gone through it several times and know that the part below is where the audio values are calculated, but my problem is that I have no idea how to retrieve the time information at that point, i.e. which time interval a given value belongs to.
int frames_per_pixel = audioBytes.length / format.getFrameSize()/w;
byte my_byte = 0;
double y_last = 0;
int numChannels = format.getChannels();
for (double x = 0; x < w && audioData != null; x++) {
int idx = (int) (frames_per_pixel * numChannels * x);
if (format.getSampleSizeInBits() == 8) {
my_byte = (byte) audioData[idx];
} else {
my_byte = (byte) (128 * audioData[idx] / 32768 );
}
double y_new = (double) (h * (128 - my_byte) / 256);
lines.add(new Line2D.Double(x, y_last, x, y_new));
y_last = y_new;
}
I would like to plot it using JFreeChart's XYSeries plot, but I am having trouble calculating the required values of x (time) and y (this is amplitude, but is it y_new in this code)?
I understand this is probably a very easy thing, but I am new to this whole audio stuff; I understand the theory behind audio files, but this seems to be a simple problem with a tough solution.
The key thing to realize is that, in the provided code, the plot is expected to be at a much lower resolution than the actual audio data. For example, consider the following waveform:
The plotting code then represents the data as the blue boxes in the graph:
When the boxes are 1 pixel wide, this corresponds to the lines with endpoints (x, y_last) and (x, y_new). As you can see, when the waveform is sufficiently smooth, the range of amplitudes from y_last to y_new is a fair approximation of the samples within the box.
Now this representation can be convenient when trying to render the waveform in a pixel-by-pixel fashion (raster display). However, for XYPlot graphs (as found in JFreeChart) you only need to specify a sequence of (x, y) points and the XYPlot takes care of drawing segments between those points. This corresponds to the green line in the following graph:
In theory, you could just provide every single sample as-is to the XYPlot. However, unless you have few samples, this tends to be quite heavy to plot. So, typically one would downsample the data first. If the waveform is sufficiently smooth, the downsampling process reduces to a decimation (i.e. taking 1 out of every N samples). The decimation factor N then controls the tradeoff between rendering performance and waveform approximation accuracy. Note that if the decimation factor frames_per_pixel used in the provided code generates a good raster display (i.e. one where the waveform features that you would like to see are not hidden by the blocky pixel look, and that does not show aliasing artifacts), the same factor should still be sufficient for the XYPlot (in fact, you may be able to downsample a bit more).
As far as mapping the samples to time/amplitude axes goes, I would not use the x and y parameters as they are defined in the plotting code provided: they are just pixel indices applicable to a raster-type display (as is the blue box representation above).
Rather, I would map the sample index (idx in the provided code) directly to the time axis by dividing by the sampling rate (which you can get from format.getFrameRate()).
Similarly, I would map the full-scale sample values to the [-1, +1] range by dividing the audioData[idx] samples by 128 for 8-bits-per-sample data, or by 32768 for 16-bits-per-sample data.
The w and h parameters' main purpose would remain to configure the plotting area size, but would no longer be directly required to compute the XYPlot input (the XYPlot itself takes care of mapping time/amplitude values to pixel coordinates). The w parameter on the other hand also served the additional purpose of determining the number of points to draw. Now you may want to control the number of points based on how much decimation the waveform can sustain without showing too much distortion, or you could keep it as-is to display the waveform at the maximum available plot resolution (with some performance cost).
Note however that you may have to convert frames_per_pixel to a floating point value if you are expecting to display waveforms with fewer than w samples.
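A rough sketch of that mapping, assuming the audioData, format and frames_per_pixel values computed in createWaveForm() above and the stock JFreeChart XYSeries API (the normalization shown is for the 16-bit case; divide by 128 instead for 8-bit data):
import javax.sound.sampled.AudioFormat;
import org.jfree.chart.ChartFactory;
import org.jfree.chart.JFreeChart;
import org.jfree.chart.plot.PlotOrientation;
import org.jfree.data.xy.XYSeries;
import org.jfree.data.xy.XYSeriesCollection;

class WaveformChart {
    static JFreeChart build(int[] audioData, AudioFormat format, int framesPerPixel) {
        XYSeries series = new XYSeries("waveform");
        int channels = format.getChannels();
        float frameRate = format.getFrameRate();
        int decimation = Math.max(1, framesPerPixel); // one plotted point per "pixel" worth of frames

        for (int frame = 0; frame * channels < audioData.length; frame += decimation) {
            int idx = frame * channels;                  // sample index of this frame's first channel
            double time = frame / (double) frameRate;    // x: seconds from the start of the file
            double amplitude = audioData[idx] / 32768.0; // y: 16-bit full scale mapped to [-1, +1]
            series.add(time, amplitude);
        }

        XYSeriesCollection dataset = new XYSeriesCollection(series);
        return ChartFactory.createXYLineChart("Waveform", "Time (s)", "Amplitude",
                dataset, PlotOrientation.VERTICAL, false, false, false);
    }
}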

Why isn't Java multithreading speeding up a graphics program?

I am trying to draw a triangle using multiple threads; each thread draws an independent piece of the triangle. But it runs a lot slower than using just one thread. What is the problem?
Here is the code:
(...)
int nCores = Runtime.getRuntime().availableProcessors();
Thread[] threads = new Thread[nCores];
int width = box[1][0] - box[0][0];
int incr = width / nCores;
int x = box[0][0];
for (int i = 0; i < nCores; i++) {
threads[i] = new Thread(new TriFiller(x, x + incr, z - nx * incr
* i));
threads[i].start();
x += incr;
}
try {
for (int i = 0; i < nCores; i++)
threads[i].join();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
And the Runnable:
public class TriFiller implements Runnable {
int xi, xf;
double z;
public TriFiller(int xi, int xf, double z) {
super();
this.xi = xi;
this.xf = xf;
this.z = z;
}
@Override
public void run() {
boolean inOut = false;
double z0 = z;
int rgbColor = shade.getRGB();
BufferedImage image = wd.getImage();
for (int i = xi; i < xf; i++) {
for (int j = box[0][1]; j < box[1][1]; j++) {
if (isOnSet(i, j, polyNormals, intBuffer)
&& z < zBuffer[i][j] && z > zd) {
image.setRGB(i, j, rgbColor);
zBuffer[i][j] = z;
inOut = true;
} else {
if (inOut) {
break;
}
}
z += -ny;
}
z0 += -nx;
z = z0;
inOut = false;
}
}
}
The reason you're having trouble is that Swing painting doesn't work with multithreading. Read this extract from another forum (jfree.org):
"I think the reason that you are not seeing any performance improvement is that you are not introducing any parallelism by spinning off another thread.
The way updating the screen works in Swing is essentially:
1) As soon as the component decides that it should be repainted on the screen, JComponent.repaint() is called. This results in an asynchronous repaint request being sent to the RepaintManager, which uses invokeLater() to queue a Runnable on the EDT.
2) When the Runnable executes, it invokes the RepaintManager, which invokes paintImmediately() on the Component. The component then sets the clip rectangle and calls paint() which ends up calling paintComponent() which you have overridden. Remember that the screen is locked and will remain locked until the component has entirely repainted the dirty rectangle.
There is no point in spinning off a thread to generate the image buffer, because the RepaintManager HAS TO block until the buffer is ready so it can finish updating the dirty rectangle before releasing the lock on the screen.
All the toolkits that swing supports (windows, linux, mac) are single threaded by design. It is not possible to concurrently update more than one region of the screen."
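For illustration, a minimal sketch of the hand-off this implies, assuming a hypothetical canvas component whose paintComponent() draws the shared BufferedImage: the worker threads can fill the off-screen image, but the screen update itself is a single repaint request that ends up on the EDT no matter how many threads produced the pixels:
import java.awt.image.BufferedImage;
import javax.swing.JComponent;
import javax.swing.SwingUtilities;

class TriangleRenderer {
    // Fill the off-screen image on worker threads, then ask Swing to repaint once.
    static void renderAndShow(BufferedImage image, JComponent canvas, Thread[] workers)
            throws InterruptedException {
        for (Thread t : workers) {
            t.start(); // each worker writes a disjoint region of 'image'
        }
        for (Thread t : workers) {
            t.join();  // wait until the off-screen image is complete
        }
        // The on-screen update is serialized on the event dispatch thread.
        SwingUtilities.invokeLater(canvas::repaint);
    }
}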

Need to understand some Java code

I'm fairly new to Android programming, but I am a quick learner.
So I found an interesting piece of code here: http://code.google.com/p/camdroiduni/source/browse/trunk/code/eclipse_workspace/camdroid/src/de/aes/camdroid/CameraView.java
It's about live streaming from your device's camera to your browser.
But I want to know how the code actually works.
These are the things I want to understand:
1) How do they stream to the web browser? I understand that they send an index.html file to the IP address of the device (on Wi-Fi) and that this file reloads the page every second. But how do they send the index.html file to the desired IP address with sockets?
2) http://code.google.com/p/camdroiduni/wiki/Status#save_pictures_frequently - here they mention they are using video, but I am still convinced they take pictures and send them, as I don't see a MediaRecorder anywhere.
Now my question is how they keep sending AND saving those images to the SD folder (I think). I think it's done with the code below, but how does it work? With c.takePicture() it takes too long to save and start previewing again, so that's not an option for live streaming.
public synchronized byte[] getPicture() {
try {
while (!isPreviewOn) wait();
isDecoding = true;
mCamera.setOneShotPreviewCallback(this);
while (isDecoding) wait();
} catch (Exception e) {
return null;
}
return mCurrentFrame;
}
private LayoutParams calcResolution (int origWidth, int origHeight, int aimWidth, int aimHeight) {
double origRatio = (double)origWidth/(double)origHeight;
double aimRatio = (double)aimWidth/(double)aimHeight;
if (aimRatio>origRatio)
return new LayoutParams(origWidth,(int)(origWidth/aimRatio));
else
return new LayoutParams((int)(origHeight*aimRatio),origHeight);
}
private void raw2jpg(int[] rgb, byte[] raw, int width, int height) {
final int frameSize = width * height;
for (int j = 0, yp = 0; j < height; j++) {
int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
for (int i = 0; i < width; i++, yp++) {
int y=0;
if(yp < raw.length) {
y = (0xff & ((int) raw[yp])) - 16;
}
if (y < 0) y = 0;
if ((i & 1) == 0) {
if(uvp<raw.length) {
v = (0xff & raw[uvp++]) - 128;
u = (0xff & raw[uvp++]) - 128;
}
}
int y1192 = 1192 * y;
int r = (y1192 + 1634 * v);
int g = (y1192 - 833 * v - 400 * u);
int b = (y1192 + 2066 * u);
if (r < 0) r = 0; else if (r > 262143) r = 262143;
if (g < 0) g = 0; else if (g > 262143) g = 262143;
if (b < 0) b = 0; else if (b > 262143) b = 262143;
rgb[yp] = 0xff000000 | ((r << 6) &
0xff0000) | ((g >> 2) &
0xff00) | ((b >> 10) &
0xff);
}
}
}
@Override
public synchronized void onPreviewFrame(byte[] data, Camera camera) {
int width = mSettings.PictureW() ;
int height = mSettings.PictureH();
// API 8 and above
// YuvImage yuvi = new YuvImage(data, ImageFormat.NV21 , width, height, null);
// Rect rect = new Rect(0,0,yuvi.getWidth() ,yuvi.getHeight() );
// OutputStream out = new ByteArrayOutputStream();
// yuvi.compressToJpeg(rect, 10, out);
// byte[] ref = ((ByteArrayOutputStream)out).toByteArray();
// API 7
int[] temp = new int[width*height];
OutputStream out = new ByteArrayOutputStream();
// byte[] ref = null;
Bitmap bm = null;
raw2jpg(temp, data, width, height);
bm = Bitmap.createBitmap(temp, width, height, Bitmap.Config.RGB_565);
bm.compress(CompressFormat.JPEG, mSettings.PictureQ(), out);
/*ref*/mCurrentFrame = ((ByteArrayOutputStream)out).toByteArray();
// mCurrentFrame = new byte[ref.length];
// System.arraycopy(ref, 0, mCurrentFrame, 0, ref.length);
isDecoding = false;
notify();
}
I really hope someone can explain these things as well as possible. That would be very much appreciated.
OK, if anyone is interested, I have the answer.
The code repeatedly takes a snapshot from the camera preview, using setOneShotPreviewCallback() to call onPreviewFrame(). The frame is delivered in YUV format, so raw2jpg() converts it into 32-bit ARGB for the JPEG encoder. NV21 is a YUV planar format, as described here.
getPicture() is called, presumably by the application, produces the JPEG data for the image in the private byte array mCurrentFrame, and returns that array. What happens to it afterwards is not in that code fragment. Note that getPicture() does a couple of wait()s. This is because the image acquisition code is running in a separate thread from that of the application.
In the main activity, the public static byte CurrentJPEG gets this via cameraFrame.getPicture() in public void run(). In the web service it is sent with a socket to the desired IP.
Correct me if I'm wrong.
Now I still wonder how the image is displayed in the browser as a picture, because you send byte data to it, right? Please check this out: http://code.google.com/p/camdroiduni/source/browse/trunk/code/eclipse_workspace/camdroid/src/de/aes/camdroid/WebServer.java
Nothing in that code is sending any data to any URL. The getPicture() method is returning a byte array, probably being used as an output stream in some other method/class that is then funneling it to a web service through some sort of protocol (UDP likely).
