I'm trying to convert a BufferedImage into a ByteBuffer, but I get this exception:
java.awt.image.DataBufferInt cannot be cast to java.awt.image.DataBufferByte
Can someone please help me out and suggest a good method of conversion?
Source:
public static ByteBuffer convertImageData(BufferedImage bi) {
    byte[] pixelData = ((DataBufferByte) bi.getRaster().getDataBuffer()).getData();
    // return ByteBuffer.wrap(pixelData);
    ByteBuffer buf = ByteBuffer.allocateDirect(pixelData.length);
    buf.order(ByteOrder.nativeOrder());
    buf.put(pixelData);
    buf.flip();
    return buf;
}
And this is how I call it:
ByteBuffer buf = convertImageData(image);
You can't just cast an arbitrary DataBuffer to DataBufferByte; you need to make sure it actually is the right type:
ByteBuffer byteBuffer;
DataBuffer dataBuffer = bi.getRaster().getDataBuffer();

if (dataBuffer instanceof DataBufferByte) {
    byte[] pixelData = ((DataBufferByte) dataBuffer).getData();
    byteBuffer = ByteBuffer.wrap(pixelData);
}
else if (dataBuffer instanceof DataBufferUShort) {
    short[] pixelData = ((DataBufferUShort) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 2);
    byteBuffer.asShortBuffer().put(ShortBuffer.wrap(pixelData));
}
else if (dataBuffer instanceof DataBufferShort) {
    short[] pixelData = ((DataBufferShort) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 2);
    byteBuffer.asShortBuffer().put(ShortBuffer.wrap(pixelData));
}
else if (dataBuffer instanceof DataBufferInt) {
    int[] pixelData = ((DataBufferInt) dataBuffer).getData();
    byteBuffer = ByteBuffer.allocate(pixelData.length * 4);
    byteBuffer.asIntBuffer().put(IntBuffer.wrap(pixelData));
}
else {
    throw new IllegalArgumentException("Not implemented for data buffer type: " + dataBuffer.getClass());
}
If your BufferedImage is one of the standard types (BufferedImage.TYPE_* other than TYPE_CUSTOM), the above should work.
Note that special DataBuffer subclasses may exist, and they may store pixels in multiple banks, use a different byte order, be channel interleaved (rather than the standard pixel interleaved), etc. So the above code is still not completely general.
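If you need something that works for any BufferedImage, including TYPE_CUSTOM, a slower but fully general fallback is to let getRGB() convert the pixels to packed ARGB for you; a minimal sketch:

// Fully general but slower: getRGB() converts every pixel to packed int ARGB,
// regardless of how the image stores its pixels internally.
int[] argb = bi.getRGB(0, 0, bi.getWidth(), bi.getHeight(), null, 0, bi.getWidth());
ByteBuffer generalBuffer = ByteBuffer.allocate(argb.length * 4);
generalBuffer.asIntBuffer().put(argb);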
If you are going to pass these ByteBuffers to native code, using allocateDirect(..) and copying the pixels over might be faster; otherwise, I think using wrap(..) makes for both simpler code and better efficiency.
If you want to encode the image data, you may want to use ImageIO here. Something like this:
public static ByteBuffer convertImageData(BufferedImage bi) {
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    try {
        ImageIO.write(bi, "png", out);
        return ByteBuffer.wrap(out.toByteArray());
    } catch (IOException ex) {
        // Writing to a ByteArrayOutputStream shouldn't fail, but ImageIO.write declares IOException
        throw new UncheckedIOException(ex);
    }
}
Here is the list of supported formats.
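If you're not sure which format names are available at runtime, you can also query ImageIO directly:

// Lists the informal format names ImageIO can write on this JRE, e.g. "png", "jpg", "gif"
for (String formatName : ImageIO.getWriterFormatNames()) {
    System.out.println(formatName);
}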
Related
I'm currently working on a dynamic animation loader for characters in a game, and I need to detect whether the current frame is completely blank in order to stop loading more sprites.
This is what I'm currently using to find out if the current image is blank:
public static boolean isBlankImage(BufferedImage b) {
    byte[] pixels1 = getPixels(b);
    byte[] pixels2 = getPixels(getBlankImage(b.getWidth(), b.getHeight()));
    return Arrays.equals(pixels1, pixels2);
}

private static BufferedImage getBlankImage(int width, int height) {
    return new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
}

private static byte[] getPixels(BufferedImage b) {
    return ((DataBufferByte) b.getRaster().getDataBuffer()).getData();
}
However, as soon as I run it, I get this annoying error:
Exception in thread "Thread-0" java.lang.ClassCastException:
java.awt.image.DataBufferInt cannot be cast to java.awt.image.DataBufferByte
I've tried switching the casting type but all I get in return is:
Exception in thread "Thread-0" java.lang.ClassCastException:
java.awt.image.DataBufferByte cannot be cast to java.awt.image.DataBufferInt
I've searched all over the place for an answer to no avail, so here's my question: is there a better, functional way to check whether an image is fully transparent?
Any help will be greatly appreciated.
The method must return a byte array, but you are trying to cast a DataBuffer to a DataBufferByte, which is not necessarily its actual type.
I have renamed getPixels to getByteArray, since it no longer does the same thing.
Try this:
private static byte[] getByteArray(BufferedImage img) {
    byte[] imageInByte = null;
    String format = "jpeg"; // needs an image format name
    try {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, format, baos);
        baos.flush();
        imageInByte = baos.toByteArray();
        baos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return imageInByte;
}
The blank image's DataBuffer is indeed an instance of DataBufferInt while your original image has a buffer of type DataBufferByte.
You should create your empty image based on the type of the image to compare with:
private static BufferedImage getBlankImage(int width, int height, int type) {
    return new BufferedImage(width, height, type);
}
and call it this way:
getBlankImage(b.getWidth(), b.getHeight(), b.getType())
Notice that in terms of performance and memory usage it may be better to create the empty image only once (or once for each image type that can occur).
Probably the image type and size are constant and known wherever the actual images are created.
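As a rough sketch of that idea (the cache map and its use are my additions, and it assumes the width and height are constant, as noted above), you could lazily create one blank image per encountered type:

// Hypothetical cache: one blank reference image per image type.
// Assumes all frames share the same width and height.
private static final Map<Integer, BufferedImage> BLANK_CACHE = new HashMap<>();

private static BufferedImage getBlankImage(int width, int height, int type) {
    // computeIfAbsent creates the blank image only the first time a type is seen
    return BLANK_CACHE.computeIfAbsent(type, t -> new BufferedImage(width, height, t));
}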
Now you have a proper empty image and may test its equality as in Is there a simple way to compare BufferedImage instances?:
public static boolean compareImages(BufferedImage imgA, BufferedImage imgB) {
    int width = imgA.getWidth();
    int height = imgA.getHeight();
    // Loop over every pixel.
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            // Compare the pixels for equality.
            if (imgA.getRGB(x, y) != imgB.getRGB(x, y)) {
                return false;
            }
        }
    }
    return true;
}
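Alternatively, if all you really need to know is whether a frame is completely transparent, you can skip the blank reference image entirely and check the alpha channel directly; a minimal sketch (the method name is mine):

public static boolean isFullyTransparent(BufferedImage img) {
    for (int y = 0; y < img.getHeight(); y++) {
        for (int x = 0; x < img.getWidth(); x++) {
            // getRGB() returns packed ARGB; the top 8 bits are the alpha channel
            if ((img.getRGB(x, y) >>> 24) != 0) {
                return false; // found a non-transparent pixel
            }
        }
    }
    return true;
}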
How to convert a Mat (OpenCV) to an Image (JavaFX)?
I don't think this is the best method:
MatOfByte byteMat = new MatOfByte();
Highgui.imencode(".bmp", mat, byteMat);
return new Image(new ByteArrayInputStream(byteMat.toArray()));
P.S.
Image - import javafx.scene.image.Image;
One way to do it would be this. I do not remember the source where I got it from:
private Image mat2Image(Mat frame) {
    int type = BufferedImage.TYPE_BYTE_GRAY;
    if (frame.channels() > 1) {
        type = BufferedImage.TYPE_3BYTE_BGR;
    }
    int bufferSize = frame.channels() * frame.cols() * frame.rows();
    byte[] b = new byte[bufferSize];
    frame.get(0, 0, b); // get all the pixels
    BufferedImage image = new BufferedImage(frame.cols(), frame.rows(), type);
    final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    System.arraycopy(b, 0, targetPixels, 0, b.length);
    return SwingFXUtils.toFXImage(image, null);
}
There may be a way to make it neater using the Converters class from OpenCV along with JDK 8. I will update this if I find any such thing.
Paritosh, the issue with your method is that it only applies to Mats of type CvType.CV_8U or CvType.CV_8S, since only those Mats can be copied into a byte array. If the type of the Mat is, let's say, CvType.CV_32F, you would need a float array to hold the data, and a float array cannot be System.arraycopy'd into the byte[] targetPixels array.
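If you do need to handle such Mats, one option is to convert them to 8-bit first and then reuse the byte-based path above; a sketch, assuming the float values are normalized to the 0..1 range:

// Convert a CV_32F Mat to 8-bit before extracting bytes.
// The 255.0 scale factor assumes the float data is normalized to [0, 1].
Mat byteMat = new Mat();
frame.convertTo(byteMat, CvType.CV_8U, 255.0);
// byteMat can now go through the byte[]-based conversion shown above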
I am trying to convert an image to byte[] using this code:
public static byte[] extractBytes(String imageName) throws IOException {
    // open image
    File imgPath = new File(imageName);
    BufferedImage bufferedImage = ImageIO.read(imgPath);

    // get DataBufferBytes from Raster
    WritableRaster raster = bufferedImage.getRaster();
    DataBufferByte data = (DataBufferByte) raster.getDataBuffer();

    return data.getData();
}
When I test it using this code:
public static void main(String[] args) throws IOException {
    String filepath = "image_old.jpg";
    byte[] data = extractBytes(filepath);
    System.out.println(data.length);
    BufferedImage img = ImageIO.read(new ByteArrayInputStream(data));
    File outputfile = new File("image_new.jpg");
    ImageIO.write(img, "jpeg", outputfile);
}
I am getting data.length = 4665600 and this error:
Exception in thread "main" java.lang.IllegalArgumentException: image == null!
at javax.imageio.ImageTypeSpecifier.createFromRenderedImage(ImageTypeSpecifier.java:925)
at javax.imageio.ImageIO.getWriter(ImageIO.java:1591)
at javax.imageio.ImageIO.write(ImageIO.java:1520)
at com.medianet.hello.HbaseUtil.main(HbaseUtil.java:138)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
But when I change my extractBytes code to
public static byte[] extractBytes(String imageName) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    BufferedImage img = ImageIO.read(new File(imageName));
    ImageIO.write(img, "jpg", baos);
    baos.flush();
    return baos.toByteArray();
}
I am getting data.length = 120905 and it succeeds (image_new.jpg is created in the desired location).
The thing is, the first version of extractBytes reads an image, and just returns the image's pixels as an array of bytes (assuming it uses DataBufferByte). These bytes are not in a file format, and are useless without extra information, such as width, height, color space etc. ImageIO can't read these bytes back, and because of this, null is returned (and assigned to img, later causing an IllegalArgumentException from ImageIO.write(...)).
The second version decodes the image, then encodes it again in JPEG format. This is a format ImageIO will be able to read, and you get an image (assigned to img) as you expect.
However, your code seems like a very, very CPU-expensive way of copying images (you decode an image, then encode it, then decode it again, before finally encoding it once more)... For JPEG files this decode/encode cycle will also degrade the image quality. If you are not planning to use the image data for anything and just want to copy an image from one place to another, don't use ImageIO and BufferedImages; these types are intended for image manipulation.
Here's a modified version of your main method:
public static void main(String[] args) throws IOException {
    byte[] buffer = new byte[1024];
    File inFile = new File("image_old.jpg");
    File outFile = new File("image_new.jpg");

    InputStream in = new FileInputStream(inFile);
    try {
        OutputStream out = new FileOutputStream(outFile);
        try {
            int len;
            while ((len = in.read(buffer)) > 0) {
                out.write(buffer, 0, len);
            }
        } finally {
            out.close();
        }
    } finally {
        in.close();
    }
}
(It's possible to write this more elegantly using try-with-resources or NIO.2's Files.copy, both available since Java 7, but I leave that to you. :-) )
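For reference, a minimal sketch of the Files.copy variant, using the same files as above:

// Files and StandardCopyOption come from java.nio.file.
// Copies the file byte for byte, without decoding or re-encoding the image.
Files.copy(new File("image_old.jpg").toPath(),
        new File("image_new.jpg").toPath(),
        StandardCopyOption.REPLACE_EXISTING);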
Hello, I was curious about how data can be downloaded in Java, so I looked through a few methods and decided to use BufferedInputStream.
When I download, I download the file in 1024-byte bursts, and every time 1 KB has been downloaded I concatenate the temporary array onto the main data array.
I use this concat method:
public static byte[] concat(byte[] data, byte[] bytes) {
    byte[] result = Arrays.copyOf(data, data.length + bytes.length);
    System.arraycopy(bytes, 0, result, data.length, bytes.length);
    return result;
}
This is my download process:
URL target = new URL("http://freshhdwall.com/wp-content/uploads/2014/08/Image-Art-Gallery.jpg");
URLConnection t = target.openConnection();
t.setRequestProperty("User-Agent", "NING/1.0");
t.connect();

BufferedInputStream stream = new BufferedInputStream(t.getInputStream());
final byte[] bytes = new byte[1024];
int count;
byte[] data = new byte[0];
while ((count = stream.read(bytes, 0, bytes.length)) != -1) {
    System.out.println(count);
    data = concat(data, bytes);
}
Now after downloading, I convert the bytes array to BufferedImage using ByteArrayInputStream:
InputStream s = new ByteArrayInputStream(data);
BufferedImage m = ImageIO.read(s);
And then I display the result:
JFrame j = new JFrame();
j.add(new JLabel(new ImageIcon(m)));
j.pack();
j.setVisible(true);
Now the result image looks like this:
[screenshot of the corrupted result] (source: gyazo.com)
As you can see, the image looks broken, as if bytes were lost during the download.
This is the real image: http://freshhdwall.com/wp-content/uploads/2014/08/Image-Art-Gallery.jpg
What did I do wrong that it displays the image like that?
On each loop iteration, you potentially read less than bytes.length bytes. As such, you can't use the full length of the array. You need to use exactly that part that was actually read.
One solution is to use
while ((count = stream.read(bytes, 0, bytes.length)) != -1) {
    System.out.println(count); // the varying counts printed here hint at the problem
    data = concat(data, bytes, count); // use the count
}
and
public static byte[] concat(byte[] data, byte[] bytes, int count) {
    byte[] result = Arrays.copyOf(data, data.length + count);
    System.arraycopy(bytes, 0, result, data.length, count);
    return result;
}
so as to only copy over the bytes you've actually received.
Consider using some of the solutions here. They are probably more efficient or, at least, more readable.
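One of the simpler alternatives along those lines is to collect the bytes in a ByteArrayOutputStream instead of re-allocating the array on every read; a minimal sketch using the same stream as above:

ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] bytes = new byte[1024];
int count;
while ((count = stream.read(bytes, 0, bytes.length)) != -1) {
    out.write(bytes, 0, count); // write only the bytes actually read
}
byte[] data = out.toByteArray();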
I am looking for the fastest way to download an image from the GPU to a file, for later loading in the same application (so not necessarily PNG e.g.).
I noticed, however, that when I use a DeflaterStream directly, it is considerably slower than ImageIO with a DeflaterStream (i.e. passing the DeflaterOutputStream to ImageIO.write(...)).
Am I doing something wrong? Or is ImageIO just heavily optimized/better than fastest GZIP compression?
glBindTexture(GL_TEXTURE_2D, textureId);
int bpp = 4; // Assuming a 32-bit display with a byte each for red, green, blue, and alpha.
ByteBuffer buffer = BufferUtils.createByteBuffer(SAVE_WIDTH * SAVE_HEIGHT * bpp);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA8, GL_UNSIGNED_BYTE, buffer);

// File file = new File("saves/s-" + layer + ".png"); // The file to save to.
String format = "png"; // Example: "PNG" or "JPG"
// BufferedImage image = new BufferedImage(SAVE_WIDTH, SAVE_HEIGHT, BufferedImage.TYPE_INT_ARGB);

try {
    FileOutputStream bOut = new FileOutputStream("saves/p-" + layer + ".gz");
    DeflaterOutputStream gzipOut = new DeflaterOutputStream(bOut, new Deflater(Deflater.BEST_SPEED));
    // buffer.flip();
    System.out.println("Bytes remaining " + buffer.remaining());
    while (buffer.hasRemaining()) {
        gzipOut.write(buffer.get());
    }
    gzipOut.close();
    bOut.close();
} catch (IOException ex) {
    Logger.getLogger(SaveStateManager.class.getName()).log(Level.SEVERE, null, ex);
}
Compression is always expensive, but the bigger problem here is that you write to the DeflaterOutputStream one byte at a time; each write(int) call goes through the deflater individually, which is very slow. Writing in larger chunks should help:
OutputStream bOut = new BufferedOutputStream(new FileOutputStream("saves/p-" + layer + ".gz"));
DeflaterOutputStream defOut = new DeflaterOutputStream(bOut, new Deflater(Deflater.BEST_SPEED));
// buffer.flip();
byte[] bytes = new byte[1024];
while (buffer.hasRemaining()) {
    int len = Math.min(buffer.remaining(), bytes.length);
    buffer.get(bytes, 0, len);
    defOut.write(bytes, 0, len);
}
defOut.close();
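Another option (just a sketch, not benchmarked) is to hand the ByteBuffer to a channel wrapped around the deflater stream, which saves you from writing the chunking loop by hand:

// java.nio.channels.Channels wraps the OutputStream as a WritableByteChannel,
// so the ByteBuffer can be written directly without a manual byte[] copy loop
WritableByteChannel channel = Channels.newChannel(defOut);
while (buffer.hasRemaining()) {
    channel.write(buffer);
}
channel.close();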