I am trying to threshold my images, but some of them have a problem with casting to DataBufferByte. I post two types of pictures here: the first (test1.jpg) is fine and can be cast to DataBufferByte; the second (test2.jpg) throws an exception.
Images:
Here is my code:
public static void main(String[] args) throws IOException {
System.loadLibrary( Core.NATIVE_LIBRARY_NAME );
BufferedImage bufferedImage = ImageIO.read(new File("/test/test2.jpg"));
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "jpg", byteArrayOutputStream);
byte[] input = byteArrayOutputStream.toByteArray();
InputStream is = new ByteArrayInputStream(input);
bufferedImage = ImageIO.read(is);
byte[] data = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer()).getData();
Mat mat = new Mat(bufferedImage.getHeight(), bufferedImage.getWidth(), CvType.CV_8UC3);
mat.put(0, 0, data);
Mat mat1 = new Mat(bufferedImage.getHeight(),bufferedImage.getWidth(),CvType.CV_8UC1);
Imgproc.cvtColor(mat, mat1, Imgproc.COLOR_RGB2GRAY);
byte[] data1 = new byte[mat1.rows() * mat1.cols() * (int)(mat1.elemSize())];
mat1.get(0, 0, data1);
Mat dst = new Mat();
Imgproc.adaptiveThreshold(mat1, dst, 255, Imgproc.ADAPTIVE_THRESH_MEAN_C, Imgproc.THRESH_BINARY, 11, 15);
MatOfByte matOfByte = new MatOfByte();
Imgcodecs.imencode(".jpg", dst, matOfByte);
ByteArrayInputStream inputStream = new ByteArrayInputStream(matOfByte.toArray());
bufferedImage = ImageIO.read(inputStream);
ImageIO.write(bufferedImage, "jpg", new File("/test_output/test2.jpg"));
}
The output for the test1.jpg image is correct:
But the second image throws an exception:
Exception in thread "main" java.lang.ClassCastException:
java.awt.image.DataBufferInt cannot be cast to
java.awt.image.DataBufferByte
at the line:
byte[] data = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer()).getData();
What is the problem? Why is one .jpg image converted without problems while the second throws an exception?
The two files are not the same format. One uses 3×8-bit color stored as bytes, the other 4×8-bit RGBA (so 4 bytes per pixel, with an alpha channel), which ends up backed by a DataBufferInt rather than a DataBufferByte.
Inspecting the two files gives:
image1 : JPEG image data, JFIF standard 1.01, aspect ratio, density 1x1, segment length 16, baseline, precision 8, 449x361, components 3
image2 : PNG image data, 617 x 353, 8-bit/color RGBA, non-interlaced
You can use the getDataType() method on the DataBuffer to find out how the pixel data is stored before casting.
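If you need the byte-backed cast to work for both files, one option (a minimal sketch, not part of the original answer) is to check getDataType() and, when the buffer is not a 3-band byte buffer, redraw the image into a TYPE_3BYTE_BGR BufferedImage first:
BufferedImage img = ImageIO.read(new File("/test/test2.jpg"));
// If the raster is not byte-backed (e.g. DataBufferInt for an RGBA image),
// redraw it into a 3-byte BGR image so the DataBufferByte cast is safe.
if (img.getRaster().getDataBuffer().getDataType() != DataBuffer.TYPE_BYTE
        || img.getRaster().getNumBands() != 3) {
    BufferedImage converted = new BufferedImage(img.getWidth(), img.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    Graphics2D g = converted.createGraphics();
    g.drawImage(img, 0, 0, null);   // drops any alpha channel
    g.dispose();
    img = converted;
}
byte[] data = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();
// Note: the bytes are now in BGR order, so use Imgproc.COLOR_BGR2GRAY when converting to grayscale.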
Related
I'm using the JMRTD library (https://github.com/E3V3A/JMRTD/tree/master/wsq_imageio) to encode a JPG to WSQ. I build the Bitmap manually instead of decoding it from a WSQ file.
BufferedImage bufferedImage = ImageIO.read(fileInput.getInputStream());
WritableRaster raster = bufferedImage.getRaster();
DataBufferByte data = (DataBufferByte) raster.getDataBuffer();
Bitmap bitmap = new Bitmap(data.getData(), width, height, ppi, depth, lossyflag);
OutputStream outputStream = new FileOutputStream("c.wsq");
String commentText = "";
WSQEncoder.encode(outputStream, bitmap, bitrate, commentText);
Here is my original JPG picture:
And below is my resulting WSQ file:
How can I fix it? Many thanks!
I resolved this problem. Here is my code for converting JPG or PNG to the WSQ format:
// 1. Read the file into a BufferedImage to get its width and height
BufferedImage bufferedImage = ImageIO.read(fileInput.getInputStream());
// 2. Convert the bit depth to 8-bit grayscale (this is what I had to do to solve the problem)
BufferedImage img = new BufferedImage(bufferedImage.getWidth(), bufferedImage.getHeight(), BufferedImage.TYPE_BYTE_GRAY);
Graphics g = img.getGraphics();
g.drawImage(bufferedImage, 0, 0, null);
g.dispose();
// 3. Get the raw pixel bytes and wrap them in a Bitmap
WritableRaster raster = img.getRaster();
DataBufferByte data = (DataBufferByte) raster.getDataBuffer();
Bitmap bitmap = new Bitmap(data.getData(), bufferedImage.getWidth(), bufferedImage.getHeight(), 500, 8, 1);
// 4. Create the WSQ output file
OutputStream outputStream = new FileOutputStream("c.wsq");
double bitrate = 0.75f;
String commentText = "";
// 5. Encode the image into the WSQ file
WSQEncoder.encode(outputStream, bitmap, bitrate, commentText);
outputStream.close();
Hope this helps @Dan Ortega
I have a byte array of YUV NV21 data.
How do I convert it and save it as a JPG file?
I use the following code:
byte[] mYUVData = YUV_420_888toNV21(planeY, planeU, planeV);
Mat mYuv = new Mat(1080 + 1080/2, 1040, CvType.CV_8UC1);
mYuv.put(0, 0, mYUVData);
Mat mRgba = new Mat();
Imgproc.cvtColor( mYuv, mRgba, Imgproc.COLOR_YUV420sp2RGBA );
// Read image as before
MatOfByte mob=new MatOfByte();
Imgcodecs.imencode(".jpg", mYuv, mob);
byte ba[]=mob.toArray();
BufferedImage bi= ImageIO.read(new ByteArrayInputStream(ba));
ImageIO.write(bi, "jpg", new File("C://Users/it/Downloads/image.jpg"));
But I get a wrong JPG image.
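For reference, the code above passes the raw NV21 buffer (mYuv) to imencode rather than the converted image. A sketch of the encode step using a 3-channel BGR target (JPEG cannot store an alpha channel), assuming the 1080x1040 frame size from the question is correct, would be:
Mat mYuv = new Mat(1080 + 1080 / 2, 1040, CvType.CV_8UC1);
mYuv.put(0, 0, mYUVData);
// Convert NV21 to BGR (OpenCV's JPEG encoder expects BGR channel order)
Mat mBgr = new Mat();
Imgproc.cvtColor(mYuv, mBgr, Imgproc.COLOR_YUV420sp2BGR);
MatOfByte mob = new MatOfByte();
Imgcodecs.imencode(".jpg", mBgr, mob);   // encode the converted mat, not mYuv
byte[] ba = mob.toArray();
BufferedImage bi = ImageIO.read(new ByteArrayInputStream(ba));
ImageIO.write(bi, "jpg", new File("C://Users/it/Downloads/image.jpg"));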
I am trying to get a BufferedImage back from an array of bytes, but I am getting an error saying the BufferedImage is null. I tried several approaches and they all ended the same way. Here is my code:
1)
byte[] arr = Base64.decode(base64String);
BufferedImage bImageFromConvert = ImageIO.read(new ByteArrayInputStream(arr));
2)
InputStream in = new ByteArrayInputStream(arr);
BufferedImage bImageFromConvert = ImageIO.read(in);
I am pretty sure my byte array contains data, and I think ImageIO.read() is where my code goes wrong.
The error is in your BufferedImage-to-Base64 encoding method, as you posted in the comments.
You are never writing the BufferedImage to the ByteArrayOutputStream. Therefore the Base64 string is empty, and reading the empty string produces a null BufferedImage.
You should use this code to encode your image:
BufferedImage originalImage = ImageIO.read(new File("G:\\a.jpg"));
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write( originalImage, "jpg", baos );
String base64String=Base64.encode(baos.toByteArray());
To decode the image use this code:
byte[] arr = Base64.decode(base64String);
BufferedImage bImageFromConvert = ImageIO.read(new ByteArrayInputStream(arr));
System.out.println(bImageFromConvert.getWidth());
Try this code. Maybe it works; it worked for me.
byte[] aByteArray = {};
int width = ;
int height = ;
DataBuffer buffer = new DataBufferByte(aByteArray, aByteArray.length);
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, 3 * width, 3, new int[] {0, 1, 2}, (Point)null);
ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(), false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(cm, raster, true, null);
Just fill in the bytes, width, and height, and customize the code as needed.
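For illustration only (the 2x2 size and pixel values below are made up), filling in the placeholders and writing the result to a PNG could look like this:
// Hypothetical values: a 2x2 image, 3 bytes (R, G, B) per pixel
byte[] aByteArray = {
        (byte) 255, 0, 0,              0, (byte) 255, 0,              // red, green
        0, 0, (byte) 255,   (byte) 255, (byte) 255, (byte) 255        // blue, white
};
int width = 2;
int height = 2;
DataBuffer buffer = new DataBufferByte(aByteArray, aByteArray.length);
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, 3 * width, 3, new int[] {0, 1, 2}, (Point) null);
ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(), false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(cm, raster, true, null);
ImageIO.write(image, "png", new File("example.png"));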
I am trying to convert an 8-bit grayscale image byte array to the JPG image format in Java.
static byte[] bytes = new byte[]{126, 126, 127, -128};
public static void main(String[] args) throws IOException {
ByteArrayInputStream bis = new ByteArrayInputStream(bytes);
Iterator<?> readers = ImageIO.getImageReadersByFormatName("jpg");
ImageReader reader = (ImageReader) readers.next();
Object source = bis;
ImageInputStream iis = ImageIO.createImageInputStream(source);
reader.setInput(iis, true);
ImageReadParam param = reader.getDefaultReadParam();
Image image = reader.read(0, param);
BufferedImage bufferedImage = new BufferedImage(image.getWidth(null), image.getHeight(null), BufferedImage.TYPE_BYTE_GRAY);
Graphics2D g2 = bufferedImage.createGraphics();
g2.drawImage(image, null, null);
File imageFile = new File("C:\\newrose3.jpg");
ImageIO.write(bufferedImage, "jpg", imageFile);
System.out.println(imageFile.getPath());
}
I have bytes containing image data, and I want to convert them into a readable image format in Java.
Assuming bytes is supposed to be pixel data, you should create an image from those bytes, then write it out as JPEG.
Something like:
// Create an image type grayscale
BufferedImage image = new BufferedImage(2, 2, BufferedImage.TYPE_BYTE_GRAY);
// Get the backing pixels, and copy into it
byte[] data = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
System.arraycopy(bytes, 0, data, 0, bytes.length);
// Write it out:
ImageIO.write(image, "jpg", new File("yourPathHere"));
I have seen some source code, but I do not understand it...
I use Java 7.
How can I convert an RGB (red, green, blue) byte array (or something similar) to a .PNG file?
Example of an array that could represent an RGB pixel:
byte[] aByteArray={0xa,0x2,0xf};
Important aspect:
I am trying to generate a .PNG file from a byte[] only, not from a pre-existing file.
Is it possible with an existing API? ;)
Here is my first attempt:
byte[] aByteArray={0xa,0x2,0xf};
ByteArrayInputStream bais = new ByteArrayInputStream(aByteArray);
File outputfile = new File("image.png");
ImageIO.write(bais, "png", outputfile);
Error: no suitable method found
Here is another version, modified from Jeremy's, but it looks similar:
byte[] aByteArray={0xa,0x2,0xf};
ByteArrayInputStream bais = new ByteArrayInputStream(aByteArray);
final BufferedImage bufferedImage = ImageIO.read(new ByteArrayInputStream(aByteArray));
ImageIO.write(bufferedImage, "png", new File("image.png"));
Multiple errors: image == null! Note: I am not trying to read from an existing source file.
The Image I/O API deals with images, so you need to make an image from your byte array first before you write it out.
byte[] aByteArray = {0xa,0x2,0xf,(byte)0xff,(byte)0xff,(byte)0xff};
int width = 1;
int height = 2;
DataBuffer buffer = new DataBufferByte(aByteArray, aByteArray.length);
//3 bytes per pixel: red, green, blue
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, 3 * width, 3, new int[] {0, 1, 2}, (Point)null);
ColorModel cm = new ComponentColorModel(ColorModel.getRGBdefault().getColorSpace(), false, true, Transparency.OPAQUE, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(cm, raster, true, null);
ImageIO.write(image, "png", new File("image.png"));
This assumes the byte array has three bytes per pixel (red, green then blue) and the range of values is 0-255.
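A quick way to sanity-check the result (just an illustrative snippet, not part of the answer) is to read the file back and inspect its size and first pixel:
// Read the generated PNG back and print its dimensions and first pixel (as ARGB)
BufferedImage check = ImageIO.read(new File("image.png"));
System.out.printf("%dx%d, pixel(0,0) = 0x%08X%n", check.getWidth(), check.getHeight(), check.getRGB(0, 0));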