Removing a number of pixels of an image - java

Pre: Receives a buffered image and a number of pixels to remove
Post: creates and returns a copy of the received image with the given number of the images remaining pixels removed
I am having trouble with this method because I need to remove random pixels. So far I have only made a copy of the image, but I need to change it so that the given number of pixels is removed. Can anyone help?
public static BufferedImage removePixels(BufferedImage img, int numToRemove)
{
    // so far what I have gotten
    BufferedImage copy = new BufferedImage(img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_ARGB);
    copy.getGraphics().drawImage(img, 0, 0, null);
    return copy;
}

bufferedImage.setRGB(int x, int y, int rgb) and bufferedImage.getRGB(int x, int y) might be what you're looking for.
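If it helps, here is a minimal sketch of one way to finish the method using those calls. It assumes "removing" a pixel means making it fully transparent (the copy is TYPE_INT_ARGB, so it has an alpha channel), and it assumes numToRemove is no larger than the number of opaque pixels, otherwise the loop below would never terminate. The class name is just for illustration:

```java
import java.awt.image.BufferedImage;
import java.util.Random;

public class PixelRemover {
    // Sketch: copies img and clears numToRemove randomly chosen, distinct
    // pixels by setting their alpha to 0 (treating "removed" as transparent).
    public static BufferedImage removePixels(BufferedImage img, int numToRemove) {
        BufferedImage copy = new BufferedImage(
                img.getWidth(), img.getHeight(), BufferedImage.TYPE_INT_ARGB);
        copy.getGraphics().drawImage(img, 0, 0, null);

        Random rand = new Random();
        int removed = 0;
        while (removed < numToRemove) {
            int x = rand.nextInt(copy.getWidth());
            int y = rand.nextInt(copy.getHeight());
            // Only count pixels that are not already transparent, so that
            // exactly numToRemove distinct pixels end up removed.
            if ((copy.getRGB(x, y) >>> 24) != 0) {
                copy.setRGB(x, y, 0x00000000); // alpha = 0: fully transparent
                removed++;
            }
        }
        return copy;
    }
}
```

If "removed" should mean something else (e.g. painted black or white), only the value passed to setRGB changes.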

Related

How can I get white colored pixel value from gray scale image and replace it with another color?

I am trying to get the value of a white pixel from a grayscale image and replace it with another color, but when I run my code the whole grayscale image is changed to another color. Can anyone please tell me where the fault in the code is, or how I can get my desired results?
This is the code...
public class gray {
    public static void main(String args[]) throws IOException {
        int width;
        int height;
        BufferedImage myImage = null;
        File f = new File("E:\\eclipse\\workspace\\Graphs\\src\\ColorToGray\\1.png");
        myImage = ImageIO.read(f);
        width = myImage.getWidth();
        height = myImage.getHeight();
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        int pixels[];
        pixels = new int[width * height];
        myImage.getRGB(0, 0, width, height, pixels, 0, width);
        for (int i = 0; i < pixels.length; i++) {
            if (pixels[i] == 0xFFFFFF) {
                pixels[i] = 0x000000FF;
            }
        }
        File f2 = new File("E:\\eclipse\\workspace\\Graphs\\src\\ColorToGray\\out 1.png");
        image.setRGB(0, 0, width, height, pixels, 0, width);
        ImageIO.write(image, "jpg", f2);
    }
}
Image before: [image]
Image after: [image]
I looked into it, and found a bunch of problems.
First of all, when specifying the filename to save, you supply a ".png" extension, but when you call ImageIO.write(), you specify the file type "jpg". That tends not to work very well: if you try to open the resulting file, most programs will give you a "this is not a valid .PNG file" error. Windows Explorer tries to be smart and re-interprets the .PNG as a .JPG, which hid the mistake from you.
This takes care of the strange redness problem.
However, if you specify "png" in ImageIO.write(), you still don't get the right image. One would expect an image that looks mostly like the original, with just a few patches of blue there where bright white used to be, but instead what we get is an overall brighter version of the original image.
I do not have enough time to look into your original image to find out what is really wrong with it, but I suspect that it is actually a bright image with an alpha mask that makes it look less bright, AND there is something wrong with the way the image gets saved that strips away alpha information, thus the apparent added brightness.
So, I tried your code with another image that I know has no tricks in it, and still your code did not appear to do anything. It turns out that the ARGB-format int values you get from myImage.getRGB() have 255 for "A", which means that you need to be checking for 0xFFFFFFFF, not 0x00FFFFFF.
And of course when you replace a value, you must replace it with 0xFF0000FF, specifying a full alpha value. Replacing a pixel with 0x000000FF has no visible effect, because regardless of the high blue value, alpha is zero, so the pixel would be rendered transparent.
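Putting those fixes together, the replacement loop and the write call would look something like this sketch (the class and method names are just for illustration):

```java
import java.awt.image.BufferedImage;

public class WhiteToBlue {
    // Replaces every opaque-white pixel with opaque blue. Note the full ARGB
    // comparison value (0xFFFFFFFF, alpha included) and the full-alpha
    // replacement value (0xFF0000FF), as explained above.
    public static BufferedImage whiteToBlue(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        int[] pixels = new int[w * h];
        src.getRGB(0, 0, w, h, pixels, 0, w);
        for (int i = 0; i < pixels.length; i++) {
            if (pixels[i] == 0xFFFFFFFF) { // opaque white
                pixels[i] = 0xFF0000FF;    // opaque blue
            }
        }
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        out.setRGB(0, 0, w, h, pixels, 0, w);
        return out;
    }
}
```

When saving, the format string should then match the extension, e.g. ImageIO.write(out, "png", f2) for a .png file.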

Array Index out of Bound in converting an array of pixels into a Image in Java

I have this method that I found on stackoverflow:
public static Image getImageFromArray(int[] pixels, int width, int height) {
    BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
    WritableRaster raster = (WritableRaster) image.getData();
    raster.setPixels(0, 0, width, height, pixels);
    return image;
}
using it like this:
image = getImageFromArray(dstpixels,img.getWidth(this),img.getHeight(this));
In order to debug I printed out Width, Height and length of dstpixels, here are the results:
700 389
272300
still I get this error
Exception in thread "AWT-EventQueue-0" java.lang.ArrayIndexOutOfBoundsException: 272300
on this exact line
raster.setPixels(0,0,width,height,pixels);
What am I missing?
It looks like Raster doesn't treat the pixels array as one in which each element represents a single pixel. It treats it as an array in which each element holds a single sample (one channel value) of a pixel.
So for an ARGB image, the pixel array will contain the first pixel's data in its first four elements (indexes [0, 1, 2, 3]), where
R is stored at position [0],
G at position [1],
B at position [2],
and A (alpha) at position [3].
Info about the second pixel is placed at indexes [4, 5, 6, 7], the third at [8, 9, 10, 11], and so on.
So the main problem from your question can be solved by allocating an int[] pixel array 4 times larger than the number of pixels for an ARGB image (3 times larger for RGB).
Another problem in your code is that image.getData()
Returns the image as one large tile. The Raster returned is a copy of the image data and is not updated if the image is changed.
(emphasis mine)
so manipulating data from that raster will not affect the image. To update the image with data from the raster, you need to add image.setData(raster); in your getImageFromArray method, like
public static Image getImageFromArray(int[] pixels, int w, int h) {
    BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
    WritableRaster raster = (WritableRaster) image.getData();
    raster.setPixels(0, 0, w, h, pixels);
    image.setData(raster); // <-- add this line
    return image;
}
OR don't use image.getData() at all; instead manipulate the raster used by the image, which you can get via image.getRaster().
Demo:
public static void main(String[] args) {
    int width = 200, height = 300;
    // array needs to be 4 times larger than the amount of pixels
    int[] pixels = new int[4 * width * height];
    for (int i = 0; i < pixels.length; i++) {
        //if (i % 4 == 0) { pixels[i] = 255; } // R, default 0
        //if (i % 4 == 1) { pixels[i] = 255; } // G, default 0
        if (i % 4 == 2) { pixels[i] = 255; }   // B, default 0
        // Alpha
        if (i % 4 == 3) {
            pixels[i] = (int) (255 * (i / 4 % width) / (double) width);
        }
    }
    Image image = getImageFromArray(pixels, width, height);
    showImage(image);
}

public static void showImage(Image img) {
    JFrame frame = new JFrame();
    frame.setLayout(new FlowLayout());
    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
    JLabel lab = new JLabel(new ImageIcon(img));
    frame.add(lab);
    frame.pack();
    frame.setVisible(true);
}

public static Image getImageFromArray(int[] pixels, int w, int h) {
    BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
    WritableRaster raster = image.getRaster();
    raster.setPixels(0, 0, w, h, pixels);
    return image;
}
In order to debug I printed out Width, Height and length of dstpixels, here are the results: 700 389 272300
And
still I get this error Exception in thread "AWT-EventQueue-0" java.lang.ArrayIndexOutOfBoundsException: 272300
If the array size is N, then the index of the first element is 0 and the index of the last element is N-1 (in your case, 0 and 272299).
It seems that one of your parameters should be 272300-1!
The exception is telling you that something accessed index 272300, which won't work if that is the SIZE of the array; the last valid index is, as said, 272300-1.
In other words: always read the exception message carefully; it tells you all you need to know!
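As a sketch of the sample layout described in the first answer: if you already have packed ARGB ints from getRGB() (one int per pixel), you can expand them into the one-sample-per-element array that setPixels() expects for a TYPE_INT_ARGB image. The class and method names here are made up for illustration:

```java
import java.awt.image.BufferedImage;
import java.awt.image.WritableRaster;

public class PackedToSamples {
    // Expands packed ARGB ints (the layout BufferedImage.getRGB produces)
    // into the one-sample-per-element layout Raster.setPixels expects for a
    // TYPE_INT_ARGB image: R, G, B, A per pixel, so 4 * width * height ints.
    public static BufferedImage imageFromPackedArgb(int[] packed, int w, int h) {
        int[] samples = new int[4 * w * h];
        for (int i = 0; i < packed.length; i++) {
            samples[4 * i]     = (packed[i] >> 16) & 0xFF; // R
            samples[4 * i + 1] = (packed[i] >> 8)  & 0xFF; // G
            samples[4 * i + 2] =  packed[i]        & 0xFF; // B
            samples[4 * i + 3] =  packed[i] >>> 24;        // A
        }
        BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        WritableRaster raster = image.getRaster(); // live raster, not a copy
        raster.setPixels(0, 0, w, h, samples);
        return image;
    }
}
```

For packed ints it is usually simpler to skip the raster entirely and call image.setRGB(0, 0, w, h, packed, 0, w).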

Manipulating the pixels within a BufferedImage through an Array

I'm currently following a series on Java game development from scratch. I understand most java and oop concepts but have very little experience when dealing with graphics and hardware acceleration.
The lines of code that I am questioning are:
private BufferedImage image = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);
private int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
The BufferedImage "image" variable is always the variable being drawn to the screen through a render method. Usually something like:
public void render() {
    BufferStrategy bs = this.getBufferStrategy();
    if (bs == null) {
        this.createBufferStrategy(3);
        return;
    }
    Graphics g = bs.getDrawGraphics();
    g.drawImage(image, 0, 0, WIDTH, HEIGHT, null);
    g.dispose();
    bs.show();
}
I understand that the array of pixels contains every pixel within the BufferedImage; however, it seems that every time that array is filled with values it directly affects the contents of the "image" variable. There is never a method call that copies the pixel array's values into the image.
Are these variables actually linked in such a way? Does the use of:
private int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
create an automatic link between the image and the array being created in the code above? Maybe I am going crazy and just missed something, but I have reviewed the code several times and not once is the "image" variable manipulated after its initial creation (besides being rendered to the screen, of course). It's always the "pixels" array being filled with different values that causes the change in the rendered image.
Some insight on this would be wonderful. Thank you in advance!
Why don't you call
image.getData()
instead of
private int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
image.getData() returns a copy of the Raster; getRaster() returns a WritableRaster with the ability to modify pixels. I'm guessing, but getRaster() probably returns a child of the image's Raster and is therefore writable if you modify the array. Try image.getData() to see if it works; if not, post back here and I'll take a closer look.
I looked into this further. The source code that comes with the JDK shows that image.getRaster().getDataBuffer().getData() returns the source data array, while image.getData() indeed returns a copy. If the image is modified, the data from getData() will not be modified.
You can call getPixels on the returned Raster:
public int[] getPixels(int x, int y, int w, int h, int[] iArray)
Returns an int array containing all samples for a rectangle of pixels, one sample per array element. An ArrayIndexOutOfBoundsException may be thrown if the coordinates are not in bounds. However, explicit bounds checking is not guaranteed.
Use int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData(); instead of image.getData(). image.getData() just returns a copy of the image data, whereas image.getRaster() gives access to the original pixel array, allowing you to actually write pixels to it.
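To see that link concretely, here is a small sketch: writing into the array returned by getDataBuffer().getData() immediately changes what the image reports, because both refer to the same storage. (One general caveat worth knowing: grabbing the data buffer this way may cause Java 2D to stop using managed-image acceleration for that image.)

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferInt;

public class RasterAliasDemo {
    public static void main(String[] args) {
        BufferedImage image = new BufferedImage(2, 2, BufferedImage.TYPE_INT_RGB);
        // getData() would hand back a copy; getDataBuffer().getData() hands
        // back the image's own backing array, so writes to it ARE writes to
        // the image.
        int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
        pixels[0] = 0xFF0000; // top-left pixel, packed RGB red
        System.out.println(image.getRGB(0, 0) == 0xFFFF0000); // prints "true"
    }
}
```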

Why ImageIO.write() method modifies pixel values?

I am trying to run a simple Java program that does the following: extract pixel data from a given image, then use this data to create a new image of the same type. The problem is that when I read back the pixel data of the created image, the pixel values differ from the ones I wrote into it. This happens not only for .jpg images but also for some .png images (so it's not even restricted to one image type).
Here is my code:
package com.alex;

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class Test {
    public static void main(String[] args) {
        try {
            // Read source image
            BufferedImage img = ImageIO.read(new File("D:/field.png"));
            int width = img.getWidth();
            int height = img.getHeight();
            int[] imagePixels = new int[width * height];
            img.getRGB(0, 0, width, height, imagePixels, 0, width);

            // Create copy image
            BufferedImage destImg = new BufferedImage(img.getWidth(), img.getHeight(), img.getType());
            destImg.setRGB(0, 0, img.getWidth(), img.getHeight(), imagePixels, 0, img.getWidth());
            File out = new File("D:/test.png");
            ImageIO.write(destImg, "png", out);

            // Extract copy image pixels
            BufferedImage copy = ImageIO.read(new File("D:/test.png"));
            int width1 = copy.getWidth();
            int height1 = copy.getHeight();
            int[] extractedPixels = new int[width1 * height1];
            copy.getRGB(0, 0, width1, height1, extractedPixels, 0, width1);
            System.out.println("The 2 dimensions are " + imagePixels.length + " " + extractedPixels.length);

            // Compare the pixels from the 2 images
            int k = 0;
            for (int i = 0; i < imagePixels.length; i++) {
                if (imagePixels[i] != extractedPixels[i]) {
                    k++;
                }
            }
            System.out.println("Number of different pixels was: " + k);
        } catch (IOException e) {
            System.out.println("Exception was thrown during reading of image: " + e.getMessage());
        }
    }
}
Unfortunately, quite often and unpredictably, the two images' pixel data differ. Could someone please help me find a method so that, at least for one image type, the values don't get modified?
Edit Here is an image that fails in the above process
Make sure you are using the correct color model for reading and writing.
According to the BufferedImage.getRGB() documentation,
Returns an array of integer pixels in the default RGB color model (TYPE_INT_ARGB) and default sRGB color space, from a portion of the image data. Color conversion takes place if the default model does not match the image ColorModel. There are only 8-bits of precision for each color component in the returned data when using this method. With a specified coordinate (x, y) in the image, the ARGB pixel can be accessed in this way:
pixel = rgbArray[offset + (y-startY)*scansize + (x-startX)];
[Edit]
You need to use the constructor BufferedImage(width, height, type, ColorModel), as indicated in the Javadoc for your image type (TYPE_BYTE_BINARY):
When this type is used as the imageType argument to the BufferedImage constructor that takes an imageType argument but no ColorModel argument, a 1-bit image is created with an IndexColorModel with two colors in the default sRGB ColorSpace: {0, 0, 0} and {255, 255, 255}.
Images with 2 or 4 bits per pixel may be constructed via the BufferedImage constructor that takes a ColorModel argument by supplying a ColorModel with an appropriate map size.
(emphasis mine)
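One way to get a stable round trip, as a sketch: copy the source into a TYPE_INT_ARGB image before saving as PNG. Both that in-memory type and PNG can store 8-bit ARGB exactly, so getRGB() needs no color-model conversion on either side. (The class name is just for illustration, and JPEG will still differ after a round trip, since it is lossy by design.)

```java
import java.awt.image.BufferedImage;

public class ArgbCopy {
    // Copies any BufferedImage into a TYPE_INT_ARGB image. Because getRGB
    // already converts to the default ARGB model, the copy's getRGB values
    // match what was written, and a PNG round trip preserves them.
    public static BufferedImage toArgb(BufferedImage src) {
        int w = src.getWidth(), h = src.getHeight();
        int[] pixels = new int[w * h];
        src.getRGB(0, 0, w, h, pixels, 0, w);
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
        dst.setRGB(0, 0, w, h, pixels, 0, w);
        return dst;
    }
}
```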
Try this:
First, print the value at the same index in the two arrays.
If the results are not the same, the problem is in your color model; but if they are the same, your comparison is not working.

How do I create a BufferedImage from array containing pixels?

I get the pixels from a BufferedImage using the method getRGB(). The pixels are stored in an array called data[]. After some manipulation of the data array, I need to create a BufferedImage again from this data array, so that I can pass it to a module which will display the modified image, but I am stuck with it.
I get the pixels from the BufferedImage using the method getRGB(). The
pixels are stored in an array called data[].
Note that this can possibly be terribly slow. If your BufferedImage supports it, you may want to instead access the underlying int[] and directly copy/read the pixels from there.
For example, to quickly copy your data[] into the underlying int[] of a new BufferedImage:
BufferedImage bi = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
final int[] a = ((DataBufferInt) bi.getRaster().getDataBuffer()).getData();
System.arraycopy(data, 0, a, 0, data.length);
Of course you want to make sure that your data[] contains pixels in the same representation as your BufferedImage (ARGB in this example).
BufferedImage bufferedImage = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
Then set the pixels again.
bufferedImage.setRGB(x, y, your_value);
PS: as stated in the comments, please use the answer from @TacticalCoder
You can set the RGB (int) values for the pixels in the new image using the setRGB methods.
