Android bitmap pixels byte array without alpha channel - java

I'm trying to get a byte array of pixels. I'm using ARGB_8888 with the decodeByteArray function. getPixels() and copyPixelsToBuffer() return an array in R G B A form. Is it possible to get only R G B from them, without creating a new array and copying the bytes I don't need? I know there is RGB_565, but it is not optimal for my case, where I need one byte per color.
Thanks.

Use color = bitmap.getPixel(x, y) to obtain a Color integer at the specified location. Next, use the red(color), green(color) and blue(color) methods from the Color class, which return each color value in the [0..255] range.
With regard to the alpha channel, you can multiply its ratio into every other channel.
Here is an example implementation:
int width = bitmap.getWidth();
int height = bitmap.getHeight();
ByteBuffer b = ByteBuffer.allocate(width * height * 3);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        int index = (y * width + x) * 3;
        int color = bitmap.getPixel(x, y);
        float alpha = (float) Color.alpha(color) / 255;
        b.put(index, (byte) Math.round(alpha * Color.red(color)));
        b.put(index + 1, (byte) Math.round(alpha * Color.green(color)));
        b.put(index + 2, (byte) Math.round(alpha * Color.blue(color)));
    }
}
byte[] pixelArray = b.array();
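The same premultiply step can be sketched off-device in plain Java, operating on packed ARGB ints (the format getPixel returns) instead of an actual Bitmap — a minimal sketch of the logic, not the Android API itself:

```java
public class PremultiplyRgb {
    // Extract R, G, B from a packed 0xAARRGGBB int, scaling each channel
    // by the alpha ratio and returning three bytes (no alpha stored).
    static byte[] toPremultipliedRgb(int argb) {
        float alpha = ((argb >>> 24) & 0xff) / 255f;
        byte r = (byte) Math.round(alpha * ((argb >> 16) & 0xff));
        byte g = (byte) Math.round(alpha * ((argb >> 8) & 0xff));
        byte b = (byte) Math.round(alpha * (argb & 0xff));
        return new byte[] { r, g, b };
    }

    public static void main(String[] args) {
        // 50% alpha over pure red: R is roughly halved, G and B stay 0
        byte[] rgb = toPremultipliedRgb(0x80FF0000);
        System.out.printf("%d %d %d%n", rgb[0] & 0xff, rgb[1] & 0xff, rgb[2] & 0xff);
    }
}
```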

Related

How to do arithmetic operations on pixels in Java

I have to add some constant value to all pixels in my image - for grayscale and colored images. But I don't know how I can do that. I read the image with BufferedImage, and I'm trying to get a 2D array of pixels.
I found BufferedImage.getRGB(), but it returns weird values (negative and huge). How do I add some value to my BufferedImage?
You can use:
byte[] pixels = ((DataBufferByte) bufferedImage.getRaster().getDataBuffer()).getData();
to get a byte[] of all pixels in the image, and then loop over the byte[], adding your constant to each element (remember that Java bytes are signed).
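A minimal sketch of that loop, assuming a single-byte-per-pixel image such as TYPE_BYTE_GRAY (for other image types the raster layout differs), with clamping so the addition doesn't wrap around:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public class AddConstant {
    // Add a constant to every byte of the raster, clamping to [0, 255].
    static void addToPixels(BufferedImage img, int constant) {
        byte[] pixels = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();
        for (int i = 0; i < pixels.length; i++) {
            int v = (pixels[i] & 0xff) + constant; // & 0xff undoes Java's signed bytes
            pixels[i] = (byte) Math.max(0, Math.min(255, v));
        }
    }

    // Small demo: two gray pixels, 100 and 200, brightened by 100.
    static int[] demo() {
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_BYTE_GRAY);
        img.getRaster().setSample(0, 0, 0, 100);
        img.getRaster().setSample(1, 0, 0, 200);
        addToPixels(img, 100);
        return new int[] { img.getRaster().getSample(0, 0, 0),
                           img.getRaster().getSample(1, 0, 0) };
    }

    public static void main(String[] args) {
        int[] result = demo();
        System.out.println(result[0] + " " + result[1]); // second value clamps at 255
    }
}
```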
If you want the bytes converted to a 2-dimensional byte[], I found an example that does just that (Get Two Dimensional Pixel Array) .
In summary the code looks like:
private static int[][] convertToArrayLocation(BufferedImage inputImage) {
    // Get the pixel values as a single array from the buffered image
    final byte[] pixels = ((DataBufferByte) inputImage.getRaster().getDataBuffer()).getData();
    final int width = inputImage.getWidth();
    final int height = inputImage.getHeight();
    // Copy the pixel values into a two-dimensional array
    int[][] result = new int[height][width];
    for (int pixel = 0, row = 0, col = 0; pixel < pixels.length; pixel++) {
        int argb = pixels[pixel] & 0xff; // mask off the sign of Java's signed byte
        result[row][col] = argb;
        col++;
        if (col == width) {
            col = 0;
            row++;
        }
    }
    return result; // the pixels as a two-dimensional array
}
To add a constant value to all pixels, you can use RescaleOp. Your constant is the offset for each channel. Leave the scale factor at 1.0; the hints may be null.
// A positive offset makes the image brighter, negative values make it darker
int offset = 100; // ...or whatever your constant value is
BufferedImage brighter = new RescaleOp(1, offset, null).filter(image, null);
To change the current image in place, instead of creating a new one, you may use:
new RescaleOp(1, offset, null).filter(image, image);
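A small self-contained check of the RescaleOp behaviour described above, on a 1x1 TYPE_INT_RGB image (the brighten helper is just for illustration):

```java
import java.awt.image.BufferedImage;
import java.awt.image.RescaleOp;

public class RescaleDemo {
    // Apply RescaleOp with scale 1.0 and the given offset to a single pixel,
    // returning the resulting 24-bit RGB value.
    static int brighten(int rgb, int offset) {
        BufferedImage img = new BufferedImage(1, 1, BufferedImage.TYPE_INT_RGB);
        img.setRGB(0, 0, rgb);
        BufferedImage out = new RescaleOp(1f, offset, null).filter(img, null);
        return out.getRGB(0, 0) & 0xffffff;
    }

    public static void main(String[] args) {
        // mid gray (100,100,100) + offset 100 -> (200,200,200)
        System.out.printf("%06x%n", brighten(0x646464, 100));
    }
}
```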

How to display HDR picture in Android?

I'm writing an Android app to decompress, decode and display HDR pictures.
These HDR pictures use 2 bytes per component (A, R, G, B), so one pixel is represented by an 8-byte value that only fits in the long type.
I'm using Android's Bitmap to display the picture, as it has a factory method that supports HDR via Bitmap.Config.RGBA_F16:
int width = 1;
int height = 1;
Bitmap image = Bitmap.createBitmap(width, height, Bitmap.Config.RGBA_F16);
Unfortunately I can't find any way to fill a pixel of the Bitmap. I use the recommended formula, but it cannot be passed to the setPixel(x, y, color) method of Bitmap because color has to be an int:
long color = (A & 0xffff) << 48 | (B & 0xffff) << 32 | (G & 0xffff) << 16 | (R & 0xffff);
image.setPixel(0,0,color); //Argument type error
I have also tried Color (which has an HDR-compatible method), Paint and Canvas, but no Bitmap method accepts them to set a single pixel.
Thanks for any help!
If you need to open an HDR file, such as a 16-bit PNG, and then display it, you can use ImageDecoder.setTargetColorSpace to get a bitmap decoded with the format Bitmap.Config.RGBA_F16, like so:
File file = new File(...);
ImageDecoder.Source source = ImageDecoder.createSource(file);
Drawable drawable = ImageDecoder.decodeDrawable(source, (decoder, info, src) -> {
    decoder.setTargetColorSpace(ColorSpace.get(ColorSpace.Named.EXTENDED_SRGB));
});
If you need to display an HDR image that is stored in memory, you can use Bitmap.copyPixelsFromBuffer, as this method sets the pixels of the bitmap without the color-space conversion that Bitmap.setPixel performs. In this case you pack the 4 channels, represented as Half values, into a long for each pixel, write these long values into a Buffer, and finally copy the pixels from the buffer to the bitmap.
LongBuffer buffer = LongBuffer.allocate(width * height);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        // fill in the pixel values as needed
        float r = (float) y / height;
        float g = (float) y / height;
        float b = (float) y / height;
        float a = 1f;
        // mask to 16 bits so the sign extension of short can't corrupt the packing
        long rBits = Half.halfToShortBits(Half.toHalf(r)) & 0xffff;
        long gBits = Half.halfToShortBits(Half.toHalf(g)) & 0xffff;
        long bBits = Half.halfToShortBits(Half.toHalf(b)) & 0xffff;
        long aBits = Half.halfToShortBits(Half.toHalf(a)) & 0xffff;
        long color = aBits << 48 | bBits << 32 | gBits << 16 | rBits;
        buffer.put(color);
    }
}
buffer.rewind();
bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.RGBA_F16);
bitmap.copyPixelsFromBuffer(buffer);
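The long layout used above (A in the top 16 bits, then B, G, R) can be sketched in plain Java without android.util.Half, by packing the 16-bit half-float bit patterns directly; 0x3C00 is the half-float bit pattern for 1.0:

```java
public class Rgba16Pack {
    // Pack four 16-bit half-float bit patterns into the RGBA_F16 long layout:
    // A in bits 63..48, B in 47..32, G in 31..16, R in 15..0.
    static long packRgbaF16(int rBits, int gBits, int bBits, int aBits) {
        return ((long) (aBits & 0xffff) << 48)
             | ((long) (bBits & 0xffff) << 32)
             | ((long) (gBits & 0xffff) << 16)
             |  (long) (rBits & 0xffff);
    }

    public static void main(String[] args) {
        // opaque pure red: r = 1.0 (0x3C00), g = b = 0.0, a = 1.0 (0x3C00)
        long opaqueRed = packRgbaF16(0x3C00, 0, 0, 0x3C00);
        System.out.printf("%016x%n", opaqueRed);
    }
}
```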

Android parse base64Binary pgm to Bitmap

I need to convert a base64Binary string in PGM format to a Bitmap in Android, so I don't have a usual Base64-encoded bitmap.
The base64Binary string comes from an XML file:
<ReferenceImage Type="base64Binary" Format="pgm" WidthPX="309" HeightPX="233" BytesPerPixel="1" >
NDY4Ojo9QEFDRUVHRklLTE9OUFFTU1VWV1hZWltZWVlZWlpbW1xdXmBgYmJjZGNlZWRkZGRlZmZnZ2ZnaWpqa21ub29ubm9vb3BwcHBxcHFyc3FzcnJzcnJydH[...]VlaW1xbWltcXFxcXFxd.
Pattern.compile("<ReferenceImage .*>((?s).*)<\\/ReferenceImage>");
...
String sub = r; // the base64Binary string, pattern-matched from the XML file
byte[] decodedString = Base64.decode(sub.getBytes(), Base64.NO_WRAP); // probably the wrong decoding (needs to be ASCII to binary?)
Bitmap decodedByte = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length); // always null due to the wrong byte array
I think I understand that PGM images are typically stored as ASCII (like in my XML) or binary (0..255). I also think Base64.decode needs the binary variant, not the ASCII that I have.
However, BitmapFactory.decodeByteArray doesn't understand the decoded byte array and returns null.
So how can I convert my base64Binary PGM ASCII string into a valid byte array in order to create a valid bitmap?
I think your Base64 decoding is fine, but Android's BitmapFactory probably has no direct support for the PGM format. I'm not sure how to add support to it, but it seems you could create a Bitmap using one of the createBitmap(...) factory methods quite easily.
See the PGM spec for details on how to parse the header, or see my implementation for Java SE (if you look around, you'll also find a class that supports ASCII reading, if needed).
It could also be that there's no header, and that you get the height/width from the XML. In that case, dataOffset will be 0 below.
Once the header is parsed, you know the width, the height and where the image data starts:
int width, height; // from header
int dataOffset;    // == end of header
// Create the pixel array, and expand 8-bit gray to ARGB_8888
int[] pixels = new int[width * height];
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        int gray = decodedString[dataOffset + i] & 0xff;
        pixels[i] = 0xff000000 | gray << 16 | gray << 8 | gray;
    }
}
Bitmap pgm = Bitmap.createBitmap(metrics, pixels, width, height, Bitmap.Config.ARGB_8888);
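If the header does need parsing, a plain-Java sketch for the binary (P5) variant might look like the following — a deliberately simplified reading that assumes no comment lines, single whitespace separators and maxval <= 255 (see the PGM spec for the general case):

```java
import java.nio.charset.StandardCharsets;

public class PgmHeader {
    // Returns {width, height, dataOffset} for a binary "P5" PGM.
    // Simplified: assumes no comment lines, single whitespace separators,
    // and maxval <= 255 (so one byte per pixel follows the header).
    static int[] parseP5Header(byte[] data) {
        String head = new String(data, 0, Math.min(data.length, 64), StandardCharsets.US_ASCII);
        if (!head.startsWith("P5")) throw new IllegalArgumentException("not a P5 PGM");
        // Tokens: magic, width, height, maxval; limit 5 keeps binary data in one trailing token
        String[] tokens = head.split("\\s+", 5);
        int width = Integer.parseInt(tokens[1]);
        int height = Integer.parseInt(tokens[2]);
        // Offset = lengths of the first four tokens + the 4 single-byte separators
        int offset = tokens[0].length() + tokens[1].length()
                   + tokens[2].length() + tokens[3].length() + 4;
        return new int[] { width, height, offset };
    }

    public static void main(String[] args) {
        byte[] pgm = "P5\n3 2\n255\nabcdef".getBytes(StandardCharsets.US_ASCII);
        int[] h = parseP5Header(pgm);
        System.out.println(h[0] + "x" + h[1] + ", data at offset " + h[2]);
    }
}
```

The returned width, height and dataOffset plug straight into the gray-expansion loop above.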
Thanks for your answer!
You saved my day!
I found a little error in your code: the index i is never initialized or incremented.
I corrected your code and tested it; here is my version:
private static Bitmap getBitmapFromPgm(byte[] decodedString, int width, int height, int dataOffset) {
    // Create the pixel array, and expand 8-bit gray to ARGB_8888
    int[] pixels = new int[width * height];
    int i = 0;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int gray = decodedString[dataOffset + i] & 0xff;
            pixels[i] = 0xff000000 | gray << 16 | gray << 8 | gray;
            i++;
        }
    }
    return Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
}

how to display picture in java with giving R,G,B

The value of each of R, G and B is stored in an int from 0 to 255.
I already have the RGB values of every pixel of the picture, and I want to display the picture based on the R, G, B values I already know.
BufferedImage imgnew = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        // I have access to the RGB of every pixel via R[x][y], G[x][y], B[x][y]
        // How do I calculate the rgb value of every pixel?
        imgnew.setRGB(x, y, rgb);
    }
}
JFrame frame = new JFrame();
JLabel labelnew = new JLabel(new ImageIcon(imgnew));
frame.getContentPane().add(labelnew, BorderLayout.CENTER);
frame.pack();
frame.setVisible(true);
My question is how to calculate the right value for every pixel. As the RGB components are stored as ints, should I convert them to bytes? If so, how do I do that; and if not, is there another way to calculate the pixel value?
I know someone uses
int rgb = 0xff000000 | ((R[x][y] & 0xff) << 16) | ((G[x][y] & 0xff) << 8) | (B[x][y] & 0xff); // the way I calculate the pixel is wrong, which leads to wrong colors
to calculate rgb, but there R[x][y], G[x][y] and B[x][y] are stored as bytes.
The BufferedImage class returns one pixel or a list of pixels via the getRGB() method. Note that you don't get it as a 2-dimensional array like int[width][height]; for example, if you request the pixels from (0,0) to (10,20), you get them as a 200-length int[] array.
You then need to break up each int value into the 4 bytes that represent each pixel (ARGB), which you can do with the ByteBuffer class.
Here is a simple example:
int imgWidth = 1920, imgHeight = 1080;
int[] row = new int[imgWidth]; // for storing one line of pixels
for (int i = 0; i < imgHeight; i++) {
    row = img.getRGB(0, i, imgWidth, 1, null, 0, imgWidth); // get the pixels of the current row
    for (int k = 0; k < row.length; k++) {
        byte[] argb = ByteBuffer.allocate(4).putInt(row[k]).array(); // break up the int (color) into 4 bytes (argb)
        // do some business with the pixel...
    }
    // setting the processed pixel
    ////////////////////////////////////////// UPDATED!
    // Preparing each pixel with the ByteBuffer class: make an int (pixel) from a 4-length byte array
    int rgb = ByteBuffer.wrap(new byte[]{(byte) 0xff, (byte) R[x][y], (byte) G[x][y], (byte) B[x][y]}).getInt();
    imgnew.setRGB(x, y, rgb); // it is better to buffer some pixels and then set them on the image, instead of setting them one by one
    //////////////////////////////////////////
    // img.setRGB(0, i, imgWidth, 1, row, 0, imgWidth)
}
Also check this example.
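If the goal is simply to build an image from the three channel arrays, the masking-and-shifting from the question works directly in plain Java, with no ByteBuffer needed — a sketch assuming R, G and B hold values in 0..255 and are indexed as [x][y]:

```java
import java.awt.image.BufferedImage;

public class RgbToImage {
    // Pack 0..255 channel values into one ARGB int: 0xAARRGGBB
    static int packArgb(int r, int g, int b) {
        return 0xff000000 | ((r & 0xff) << 16) | ((g & 0xff) << 8) | (b & 0xff);
    }

    // Build an image from per-channel arrays indexed as [x][y]
    static BufferedImage fromChannels(int[][] R, int[][] G, int[][] B) {
        int width = R.length, height = R[0].length;
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                img.setRGB(x, y, packArgb(R[x][y], G[x][y], B[x][y]));
        return img;
    }

    public static void main(String[] args) {
        int[][] R = {{255}}, G = {{128}}, B = {{0}};
        BufferedImage img = fromChannels(R, G, B);
        System.out.printf("%06x%n", img.getRGB(0, 0) & 0xffffff); // orange pixel
    }
}
```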

Image Processing in Java

I want to extract the pixel values of a JPEG image using the Java language, and I need to store them in an array for further manipulation. How can I extract the pixel values from the JPEG image format?
Have a look at BufferedImage.getRGB().
Here is a stripped-down instructional example of how to pull apart an image to do a conditional check/modify on the pixels. Add error/exception handling as necessary.
public static BufferedImage exampleForSO(BufferedImage image) {
    BufferedImage imageIn = image;
    BufferedImage imageOut =
            new BufferedImage(imageIn.getWidth(), imageIn.getHeight(), BufferedImage.TYPE_4BYTE_ABGR);
    int width = imageIn.getWidth();
    int height = imageIn.getHeight();
    int[] imageInPixels = imageIn.getRGB(0, 0, width, height, null, 0, width);
    int[] imageOutPixels = new int[imageInPixels.length];
    for (int i = 0; i < imageInPixels.length; i++) {
        int inR = (imageInPixels[i] & 0x00FF0000) >> 16;
        int inG = (imageInPixels[i] & 0x0000FF00) >> 8;
        int inB = (imageInPixels[i] & 0x000000FF);
        if (conditionChecker_inRinGinB) { // placeholder for your per-pixel condition
            // modify
        } else {
            // don't modify
        }
    }
    imageOut.setRGB(0, 0, width, height, imageOutPixels, 0, width);
    return imageOut;
}
The easiest way to get a JPEG into a java-readable object is the following:
BufferedImage image = ImageIO.read(new File("MyJPEG.jpg"));
BufferedImage provides methods for getting RGB values at exact pixel locations in the image (X-Y integer coordinates), so it'd be up to you to figure out how you want to store that in a single-dimensional array, but that's the gist of it.
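For the single-dimensional storage, the usual choice is row-major order, where pixel (x, y) lives at index y * width + x — a small sketch of the round trip through getRGB:

```java
import java.awt.image.BufferedImage;

public class PixelIndex {
    // Row-major mapping between (x, y) and a 1-D pixel array of the given width
    static int index(int x, int y, int width) {
        return y * width + x;
    }

    // Set one pixel, flatten the image with getRGB, and read the pixel
    // back through the row-major index (fixed 4x3 image for the demo).
    static int roundTrip(int x, int y) {
        BufferedImage img = new BufferedImage(4, 3, BufferedImage.TYPE_INT_RGB);
        img.setRGB(x, y, 0x123456);
        int[] pixels = img.getRGB(0, 0, 4, 3, null, 0, 4); // one int per pixel, row by row
        return pixels[index(x, y, 4)] & 0xffffff;
    }

    public static void main(String[] args) {
        System.out.printf("%06x%n", roundTrip(2, 1)); // recovers the value we set
    }
}
```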
There is a way of taking a buffered image and converting it into an integer array, where each integer in the array represents the RGB value of a pixel in the image.
int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
The interesting thing is, when an element in the integer array is edited, the corresponding pixel in the image is as well.
In order to find a pixel in the array from a set of x and y coordinates, you would use this method.
public void setPixel(int x, int y, int rgb) {
    pixels[y * image.getWidth() + x] = rgb;
}
Even with the multiplication and addition of coordinates, it is still faster than using the setRGB() method in the BufferedImage class.
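A quick plain-Java check of that behaviour — writing the backing array returned by the DataBuffer and reading the change back through getRGB (the helper method is just for illustration):

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class DirectWrite {
    // Write one pixel through the backing int[] of a 2x2 TYPE_INT_RGB image,
    // then read it back through the image itself.
    static int writeAndRead(int x, int y, int rgb) {
        BufferedImage img = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        int[] pixels = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();
        pixels[y * img.getWidth() + x] = rgb; // writing the array changes the image
        return img.getRGB(x, y) & 0xffffff;
    }

    public static void main(String[] args) {
        System.out.printf("%06x%n", writeAndRead(0, 1, 0x00ff00)); // green comes back
    }
}
```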
EDIT:
Also keep in mind that the image's type needs to be TYPE_INT_RGB, which it isn't by default. You can convert it by creating a new image of the same dimensions with the type TYPE_INT_RGB, and then using the graphics object of the new image to draw the original image onto it.
public BufferedImage toIntRGB(BufferedImage image) {
    if (image.getType() == BufferedImage.TYPE_INT_RGB)
        return image;
    BufferedImage newImage = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_INT_RGB);
    newImage.getGraphics().drawImage(image, 0, 0, null);
    return newImage;
}
