I'm trying to implement Matlab's rgb2gray in Java according to http://www.mathworks.com/help/toolbox/images/ref/rgb2gray.html . I have the following code:
public BufferedImage convert(BufferedImage bi){
    int heightLimit = bi.getHeight();
    int widthLimit = bi.getWidth();
    BufferedImage converted = new BufferedImage(widthLimit, heightLimit,
            BufferedImage.TYPE_BYTE_GRAY);
    for(int height = 0; height < heightLimit; height++){
        for(int width = 0; width < widthLimit; width++){
            // Remove the alpha component
            Color c = new Color(bi.getRGB(width, height) & 0x00ffffff);
            // Normalize
            int newRed = (int) 0.2989f * c.getRed();
            int newGreen = (int) 0.5870f * c.getGreen();
            int newBlue = (int) 0.1140f * c.getBlue();
            int roOffset = newRed + newGreen + newBlue;
            converted.setRGB(width, height, roOffset);
        }
    }
    return converted;
}
Now, I do get a grayscale image, but it is too dark compared to what I get from Matlab. AFAIK, the easiest way to turn an image to grayscale is to create a BufferedImage of type TYPE_BYTE_GRAY and copy the pixels of a BufferedImage of TYPE_INT_(A)RGB into it. But even this method gives an image that is darker than Matlab's, though it is a decent grayscale. I've also looked into using RescaleOp, but I don't see any way in RescaleOp to set the grayness per pixel.
As an added test, I printed out the image matrices produced by Java and by Matlab. In Java, I get figures like 6316128 6250335 6118749 6118749 6250335 6447714, while in Matlab I only get something like 116 117 119 120 119 115 (the first six values of each matrix).
How do I get an output similar to Matlab's?
Operator precedence in Java makes a cast bind more tightly than multiplication, so you're casting each floating-point constant to an int, which truncates it to 0, before multiplying. I don't understand how you're getting a grayscale result at all. It's easy to fix:
int newRed = (int) (0.2989f * c.getRed());
int newGreen = (int) (0.5870f * c.getGreen());
int newBlue = (int) (0.1140f * c.getBlue());
I would also replace 0.2989 with 0.2990 as it appears to be a typo in the documentation.
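For reference, here is a minimal corrected sketch of the whole method with the parentheses in place. One extra assumption on my part: setRGB expects a packed RGB int, so I put the gray value into all three channels before writing it (passing the bare value, as in the question, would land it in the blue channel only):
import java.awt.Color;
import java.awt.image.BufferedImage;

public class GrayConverter {
    public static BufferedImage convert(BufferedImage bi) {
        int heightLimit = bi.getHeight();
        int widthLimit = bi.getWidth();
        BufferedImage converted = new BufferedImage(widthLimit, heightLimit,
                BufferedImage.TYPE_BYTE_GRAY);
        for (int y = 0; y < heightLimit; y++) {
            for (int x = 0; x < widthLimit; x++) {
                // Drop the alpha component, as in the question
                Color c = new Color(bi.getRGB(x, y) & 0x00ffffff);
                // Parentheses make the cast apply to the product, not to the constant
                int gray = (int) (0.2989f * c.getRed()
                        + 0.5870f * c.getGreen()
                        + 0.1140f * c.getBlue());
                // Pack the gray value into all three channels before writing
                converted.setRGB(x, y, (gray << 16) | (gray << 8) | gray);
            }
        }
        return converted;
    }
}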
When I tested my code with JUnit, the following error occurred:
java.lang.IllegalArgumentException: Color parameter outside of expected range: Red Green Blue
Honestly, I don't know why. My code is not very long, so I'll post it here in the hope it helps:
BufferedImage img = ImageIO.read(f);
for (int w = 0; w < img.getWidth(); w++) {
    for (int h = 0; h < img.getHeight(); h++) {
        Color color = new Color(img.getRGB(w, h));
        float greyscale = (0.299f * color.getRed())
                + (0.587f * color.getGreen())
                + (0.144f * color.getBlue());
        Color grey = new Color(greyscale, greyscale, greyscale);
        img.setRGB(w, h, grey.getRGB());
    }
}
When I run the JUnit test, Eclipse flags this line:
Color grey = new Color(greyscale, greyscale, greyscale);
So I suppose the problem might be that I'm working with floating-point numbers when, as you can see, I recalculate the red, green and blue content of the image.
Could anyone help me to solve that problem?
You are calling the Color constructor with three float parameters, so the values must be between 0.0 and 1.0.
But color.getRed() (and getGreen(), getBlue()) can return values up to 255, so you can end up with the following:
float greyscale = ((0.299f *255) + (0.587f * 255) + (0.144f * 255));
System.out.println(greyscale); //262.65
That is far too high for 1.0f, and even higher than the 255 that the Color(int, int, int) constructor allows. So scale your factors like dasblinkenlight said, and cast the greyscale value to an int, or else you will call the wrong constructor of Color.
new Color((int)greyscale,(int)greyscale,(int)greyscale);
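A minimal sketch of the corrected loop, assuming the same reading code and File f as in the question. I've also used the standard 0.114 blue coefficient here so the weighted sum can never exceed 255:
BufferedImage img = ImageIO.read(f);
for (int w = 0; w < img.getWidth(); w++) {
    for (int h = 0; h < img.getHeight(); h++) {
        Color color = new Color(img.getRGB(w, h));
        // 0.299 + 0.587 + 0.114 = 1.0, so the result stays in 0..255
        int greyscale = (int) (0.299f * color.getRed()
                + 0.587f * color.getGreen()
                + 0.114f * color.getBlue());
        // int arguments select the Color(int, int, int) constructor
        Color grey = new Color(greyscale, greyscale, greyscale);
        img.setRGB(w, h, grey.getRGB());
    }
}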
I want to generate 64x64 pixel art for any image provided by the user. I'm currently lost on how to do it. OpenCV does not seem to provide any such functionality. Any pointers in the right direction would be greatly appreciated.
Edit
Pixel art is a pixelated version of a given image. The screenshot below shows what I've achieved so far.
Basically what I've done is as follows.
Scale the image, with the aspect ratio maintained, so that it fits within a 64x64 grid.
Image sourceImage;
Rectangle imageBound = sourceImage.getBounds();
double sx = (double) 64 / (double) imageBound.width;
double sy = (double) 64 / (double) imageBound.height;
double s = Math.min(sx, sy);
int dx = (int) (s * imageBound.width);
int dy = (int) (s * imageBound.height);
Image scaledImage = new Image(d, sourceImage.getImageData().scaledTo(dx, dy));
Read each pixel from the scaled image and output it on the screen.
The following shows how I extract the color from each pixel. I use the SWT framework.
Display d;
ImageData data = scaledImage.getImageData();
for (int i = 0; i < data.width; i++) {
    for (int j = 0; j < data.height; j++) {
        int pixel = data.getPixel(i, j);
        int red = (pixel & 0x00ff0000) >> 16;
        int green = (pixel & 0x0000ff00) >> 8;
        int blue = pixel & 0x000000ff;
        pixelGrid[i][j].setBackground(new Color(d, red, green, blue));
    }
}
But as you can see, there is a major difference in the colors after resizing the image. What I want to know is whether I'm on the correct path to achieve this and, if so, how I can retain the actual colors while scaling.
You probably have an image in RGBA format and are extracting the wrong values from it (as if it were ARGB). It looks very blue at the moment, and I fail to see any reds; that alone should point you in the right direction. Try:
int red = (pixel>>24) & 0xff;
int green = (pixel>>16) & 0xff;
int blue = (pixel>>8) & 0xff;
int alpha = pixel & 0xff; // always 0xff for images without transparency
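If you'd rather not guess the byte order at all, SWT's ImageData carries a PaletteData that knows the image's actual channel masks. A minimal sketch, assuming a direct-color (non-indexed) image and the same pixelGrid and Display d as in the question (needs org.eclipse.swt.graphics.PaletteData and org.eclipse.swt.graphics.RGB):
ImageData data = scaledImage.getImageData();
PaletteData palette = data.palette; // knows the image's real masks and shifts
for (int i = 0; i < data.width; i++) {
    for (int j = 0; j < data.height; j++) {
        // Decode the packed pixel through the palette instead of hard-coding shifts
        RGB rgb = palette.getRGB(data.getPixel(i, j));
        pixelGrid[i][j].setBackground(new Color(d, rgb.red, rgb.green, rgb.blue));
    }
}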
I'm looking to use a very crude heightmap I've created in Photoshop to define a tiled isometric grid for me:
Map:
http://i.imgur.com/jKM7AgI.png
I'm aiming to loop through every pixel in the image and convert the colour of that pixel to a scale of my choosing, for example 0-100.
At the moment I'm using the following code:
try
{
    final File file = new File("D:\\clouds.png");
    final BufferedImage image = ImageIO.read(file);
    for (int x = 0; x < image.getWidth(); x++)
    {
        for (int y = 0; y < image.getHeight(); y++)
        {
            int clr = image.getRGB(x, y) / 99999;
            if (clr <= 0)
                clr = -clr;
            System.out.println(clr);
        }
    }
}
catch (IOException ex)
{
    // Deal with exception
}
This works to an extent; the black pixel at position 0 is 167 and the white pixel at position 999 is 0. However, when I insert certain pixels into the image I get slightly odd results; for example, a gray pixel that's very close to white returns over 100, when I would expect it to be in single digits.
Is there an alternate solution I could use that would yield more reliable results?
Many thanks.
Since it's a grayscale map, the RGB parts will all be the same value (with range 0 - 255), so just take one out of the packed integer and find out what percent of 255 it is:
int clr = (int) ((image.getRGB(x, y) & 0xFF) / 255.0 * 100);
System.out.println(clr);
getRGB returns all channels packed into one int so you shouldn't do arithmetic with it. Maybe use the norm of the RGB-vector instead?
for (int x = 0; x < image.getWidth(); ++x) {
    for (int y = 0; y < image.getHeight(); ++y) {
        final int rgb = image.getRGB(x, y);
        final int red = ((rgb & 0xFF0000) >> 16);
        final int green = ((rgb & 0x00FF00) >> 8);
        final int blue = ((rgb & 0x0000FF) >> 0);
        // Norm of RGB vector mapped to the unit interval.
        final double intensity =
                Math.sqrt(red * red + green * green + blue * blue)
                / Math.sqrt(3 * 255 * 255);
    }
}
Note that there is also the java.awt.Color class that can be instantiated with the int returned by getRGB and provides getRed, getGreen and getBlue methods if you don't want to do the bit manipulations yourself.
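For example, a quick sketch of the same intensity calculation using java.awt.Color instead of manual bit masking:
for (int x = 0; x < image.getWidth(); ++x) {
    for (int y = 0; y < image.getHeight(); ++y) {
        final java.awt.Color c = new java.awt.Color(image.getRGB(x, y));
        // getRed()/getGreen()/getBlue() return the same 0-255 values as the masks above
        final double intensity =
                Math.sqrt(c.getRed() * c.getRed()
                        + c.getGreen() * c.getGreen()
                        + c.getBlue() * c.getBlue())
                / Math.sqrt(3 * 255 * 255);
    }
}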
I can't manage to convert my RGB image to a grayscale image correctly.
My final image is too dark, and that affects my work.
I use this code:
public static BufferedImage rgb2gray(BufferedImage bi)//converter
{
    int heightLimit = bi.getHeight();
    int widthLimit = bi.getTileWidth();
    BufferedImage converted = new BufferedImage(widthLimit, heightLimit, BufferedImage.TYPE_BYTE_GRAY);
    for (int height = 0; height < heightLimit; height++) {
        for (int width = 0; width < widthLimit; width++) {
            Color c = new Color(bi.getRGB(width, height) & 0x00fffff);
            int newRed = (int) ((0.2989f * c.getRed()) * 1.45);//0.2989f
            int newGreen = (int) ((0.5870f * c.getGreen()) * 1.45);//0.5870f
            int newBlue = (int) ((0.1140f * c.getBlue()) * 1.45);
            int roOffset = newRed + newGreen + newBlue;
            converted.setRGB(width, height, roOffset);
        }
    }
    return converted;
}
What is wrong?
In Matlab the result is perfect, but with this code in Java it is poor.
I am not sure why this works in Matlab and not in Java, except that perhaps color representation is not handled the same way on the two platforms (but that's speculation). However, there is a very good way to do what you want to do. It doesn't answer your specific question ("what's wrong?"), but it should get you back to doing what you were hoping to do:
How do I desaturate a BufferedImage in Java?
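For reference, the approach usually suggested in that question is a ColorConvertOp through the built-in grayscale color space. A minimal sketch (not your original per-pixel method):
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ColorConvertOp;

public static BufferedImage desaturate(BufferedImage src) {
    // Converts through CS_GRAY; no per-pixel loop needed
    ColorConvertOp op = new ColorConvertOp(
            ColorSpace.getInstance(ColorSpace.CS_GRAY), null);
    return op.filter(src, null);
}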
Your mistake seems to be multiplying the new red, green and blue values by a mysterious 1.45.
Just remove that and make your code:
int newRed = (int) (0.2989f * c.getRed());
int newGreen = (int) (0.5870f * c.getGreen());
int newBlue = (int) (0.1140f * c.getBlue());
Matlab's documentation for its rgb2gray function has these coefficients.
I have a list of values from 0-1. I want to convert this list to an image by using a gradient that converts these floating point values to RGB values. Are there any tools in Java that provide you with this functionality?
0 should be mapped to 0 and 1 should be mapped to 255; keep in mind that you need three of those channel values to make a color. So multiply the floating-point number by 255 and cast it to an int.
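A minimal sketch of that idea, mapping a list of 0-1 floats to gray pixels in a one-row image (the toGrayImage helper name is just made up for illustration):
import java.awt.image.BufferedImage;
import java.util.List;

BufferedImage toGrayImage(List<Float> values) {
    BufferedImage img = new BufferedImage(values.size(), 1, BufferedImage.TYPE_INT_RGB);
    for (int x = 0; x < values.size(); x++) {
        int v = Math.round(values.get(x) * 255);    // 0.0 -> 0, 1.0 -> 255
        img.setRGB(x, 0, (v << 16) | (v << 8) | v); // same value in R, G and B
    }
    return img;
}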
Perhaps GradientPaint can do what you want. It's unclear how you want a list of floating point values to be converted into a gradient. Normally a gradient consists of two colors and some mechanism that interpolates between those colors. GradientPaint implements a linear gradient.
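If a two-color linear gradient is in fact what you're after, a minimal GradientPaint sketch looks like this (the colors and sizes are arbitrary):
import java.awt.Color;
import java.awt.GradientPaint;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

BufferedImage img = new BufferedImage(256, 50, BufferedImage.TYPE_INT_RGB);
Graphics2D g = img.createGraphics();
// Interpolates from blue at x=0 to red at x=256 across the image
g.setPaint(new GradientPaint(0, 0, Color.BLUE, 256, 0, Color.RED));
g.fillRect(0, 0, 256, 50);
g.dispose();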
Say you have an array made of 64 000 triples corresponding to RGB values, like this:
final Random rand = new Random();
final float[] f = new float[320*200*3];
for (int i = 0; i < f.length; i++) {
    f[i] = rand.nextFloat(); // <-- generates a float in [0.0, 1.0)
}
And say you have a BufferedImage that has a size of 320x200 (64 000 pixels) of type INT_ARGB (8 bits per value + 8 bits for the alpha level):
final BufferedImage bi = new BufferedImage( 320, 200, BufferedImage.TYPE_INT_ARGB );
Then you can convert your float array to RGB values and fill the image like this:
for (int x = 0; x < 320; x++) {
    for (int y = 0; y < 200; y++) {
        // Each pixel's RGB triple starts at (y * width + x) * 3 in the flat array
        final int base = (y * 320 + x) * 3;
        final int r = (int) (f[base] * 255.0);
        final int g = (int) (f[base + 1] * 255.0);
        final int b = (int) (f[base + 2] * 255.0);
        bi.setRGB(x, y, 0xFF000000 | (r << 16) | (g << 8) | b);
    }
}
Note that if you were to display this image it would appear gray, but if you zoom in you'll see it's actually made of perfectly random colorful pixels. It's just that the random number generator is so good that it all looks gray on screen :)