I am trying to obtain RGB values from an ImagePlus object. I am getting an exception when I attempt to do this:
import ij.IJ;
import ij.ImagePlus;
import ij.plugin.filter.PlugInFilter;
import ij.process.ColorProcessor;
import ij.process.ImageProcessor;
import java.awt.image.IndexColorModel;
public class ImageHelper implements PlugInFilter {
public int setup(String arg, ImagePlus img) {
return DOES_8G + NO_CHANGES;
}
public void run(ImageProcessor ip) {
final int r = 0;
final int g = 1;
final int b = 2;
int w = ip.getWidth();
int h = ip.getHeight();
ImagePlus ips = new ImagePlus("C:\\Lena.jpg");
int width = ips.getWidth();
int height = ips.getHeight();
System.out.println("width of image: " + width + " pixels");
System.out.println("height of image: " + height + " pixels");
// retrieve the lookup tables (maps) for R,G,B
IndexColorModel icm = (IndexColorModel) ip.getColorModel();
int mapSize = icm.getMapSize();
byte[] Rmap = new byte[mapSize];
icm.getReds(Rmap);
byte[] Gmap = new byte[mapSize];
icm.getGreens(Gmap);
byte[] Bmap = new byte[mapSize];
icm.getBlues(Bmap);
// create new 24-bit RGB image
ColorProcessor cp = new ColorProcessor(w, h);
int[] RGB = new int[3];
for (int v = 0; v < h; v++) {
for (int u = 0; u < w; u++) {
int idx = ip.getPixel(u, v);
RGB[r] = Rmap[idx];
RGB[g] = Gmap[idx];
RGB[b] = Bmap[idx];
cp.putPixel(u, v, RGB);
}
}
ImagePlus cwin = new ImagePlus("RGB Image", cp);
cwin.show();
}
}
The exception is coming from this line:
IndexColorModel icm = (IndexColorModel) ip.getColorModel();
Exception:
Exception in thread "main" java.lang.ClassCastException:
java.awt.image.DirectColorModel cannot be cast to
java.awt.image.IndexColorModel
...Any ideas? ^_^
The error occurs because ip.getColorModel() does not necessarily return an IndexColorModel; as the stack trace shows, here it returns a DirectColorModel, which cannot be cast to IndexColorModel.
To get an IndexColorModel, you can use the following code instead:
IndexColorModel icm = (IndexColorModel) ip.getDefaultColorModel();
That should give you the default grayscale IndexColorModel, according to the ImageJ API.
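For instance, a defensive variant of that spot in run() could look like this (just a sketch; the extra cast assumes getDefaultColorModel() is declared to return ColorModel even though it builds a grayscale IndexColorModel):
java.awt.image.ColorModel cm = ip.getColorModel();
IndexColorModel icm;
if (cm instanceof IndexColorModel) {
    icm = (IndexColorModel) cm;                        // the image really is indexed
} else {
    icm = (IndexColorModel) ip.getDefaultColorModel(); // fall back to the default grayscale LUT
}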
ColorProcessor contains a getChannel() method to get the red, green, or blue channel.
To get a ColorProcessor you can cast your processor to ColorProcessor.
ColorProcessor cp = (ColorProcessor) ip;
It would throw a ClassCastException if the image were grayscale, though.
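For example, something along these lines (a sketch that assumes the plugin was set up with DOES_RGB so the cast succeeds; channel numbering follows the ColorProcessor docs, 1 = red, 2 = green, 3 = blue):
if (ip instanceof ColorProcessor) {
    ColorProcessor cp = (ColorProcessor) ip;
    byte[] reds   = cp.getChannel(1);
    byte[] greens = cp.getChannel(2);
    byte[] blues  = cp.getChannel(3);
    int r0 = reds[0] & 0xFF; // mask to recover the unsigned 0-255 value
}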
Related
I need to be able to print out a bitmap QR Code using my Brother QL-720NW.
As of right now, I'm able to generate a QR code bitmap and display it properly in an ImageView. On a button press, the user needs to be able to print that QR code bitmap from the Brother label printer.
I am able to make a connection to the printer, but I can only print out blank labels that do not show the QR code. How can I fix this so that the bitmap appears on the printed label properly?
Method for printing bitmap:
void printImage(Bitmap bitmap) {
// Specify printer
final Printer printer = new Printer();
PrinterInfo settings = printer.getPrinterInfo();
settings.ipAddress = "192.168.2.149";
settings.workPath = "/storage/emulated/0/Download";
settings.printerModel = PrinterInfo.Model.QL_720NW;
settings.port = PrinterInfo.Port.NET;
settings.orientation = PrinterInfo.Orientation.LANDSCAPE;
//settings.paperSize = PrinterInfo.PaperSize.CUSTOM;
settings.align = PrinterInfo.Align.CENTER;
settings.valign = PrinterInfo.VAlign.MIDDLE;
settings.printMode = PrinterInfo.PrintMode.ORIGINAL;
settings.numberOfCopies = 1;
settings.labelNameIndex = LabelInfo.QL700.W62RB.ordinal();
settings.isAutoCut = true;
settings.isCutAtEnd = false;
printer.setPrinterInfo(settings);
// Connect, then print
new Thread(new Runnable() {
@Override
public void run() {
if (printer.startCommunication()) {
Log.e("Tag: ", "Connection made.");
PrinterStatus result = printer.printImage(bitmap);
Log.e("Tag: ", "Printing!");
if (result.errorCode != PrinterInfo.ErrorCode.ERROR_NONE) {
Log.d("TAG", "ERROR - " + result.errorCode);
}
printer.endCommunication();
}
else {
Log.e("Tag: ", "Cannot make a connection.");
}
}
}).start();
}
Generating bitmap:
Bitmap encodeAsBitmap(String str) throws WriterException {
QRCodeWriter writer = new QRCodeWriter();
BitMatrix bitMatrix = writer.encode(str, BarcodeFormat.QR_CODE, 100, 100);
int w = bitMatrix.getWidth();
int h = bitMatrix.getHeight();
int[] pixels = new int[w * h];
for (int y = 0; y < h; y++) {
for (int x = 0; x < w; x++) {
pixels[y * w + x] = bitMatrix.get(x, y) ? Color.BLACK : Color.WHITE;
}
}
Bitmap bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, w, 0, 0, w, h);
return bitmap;
}
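For context, the button handler that ties these two methods together is essentially this (the button and QR text variable names are placeholders):
printButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        try {
            printImage(encodeAsBitmap(qrText)); // qrText is whatever string the QR code encodes
        } catch (WriterException e) {
            Log.e("Tag: ", "Could not encode QR code", e);
        }
    }
});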
Solved it: I was using LabelInfo.QL700.W62RB.ordinal() for the labelNameIndex when I should have been using LabelInfo.QL700.W62.ordinal().
Works perfectly now!
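In other words, the only change needed in printImage() above was this line (W62 being the plain 62 mm continuous tape entry, whereas W62RB is presumably the red/black variant):
settings.labelNameIndex = LabelInfo.QL700.W62.ordinal();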
I am attempting to pull an image from a URL using URLImage.createToStorage. However, I want that picture to appear rounded, so I add a mask to the image. When I run it, the label only shows the placeholder image, not the URL image. When I comment out the code that adds the rounded mask, the image displays. Is there something wrong with my rounded image code? I used Display.getInstance().callSerially().
//Where I display the image.
public void setUpProfile(Form f) {
Label imageLabel = findMyImage(f);
Image img = getImageFromRes("myprofile.png");
Image scaled = img.scaledWidth(f.getWidth() / 2);
EncodedImage enc = EncodedImage.createFromImage(scaled, false);
Display.getInstance().callSerially(new Runnable() {
@Override
public void run() {
imageLabel.setIcon(getRoundedImage(URLImage.createToStorage(enc,
"profileImage8", me.getPicture(), URLImage.RESIZE_SCALE_TO_FILL)));
f.revalidate();
}
});
findProfNameLabel(f).setText(me.getName());
findProfAgeLabel(f).setText(me.getAge() + " Years old");
findProfPrefLabel(f).setText("Interested in " + me.getPref());
}
public Image getRoundedImage(Image img) {
int w = img.getWidth();
int h = img.getHeight();
Image maskImage = Image.createImage(w, h);
Graphics g = maskImage.getGraphics();
g.setColor(0xffffff);
g.fillArc(0, 0, w, h, 0, 360);
Object mask = maskImage.createMask();
Image ret = img.applyMask(mask);
return ret;
}
The setUpProfile() method is called in the beforeShow of the Form.
EDIT: I edited in the working setUpProfile() method, which uses URLImage.createMaskAdapter and achieves a rounded image.
public void setUpProfile(Form f) {
Label imageLabel = findMyImage(f);
Image mask = getImageFromRes("rounded-mask.png");
Image placeholder = getImageFromRes("myprofile.png").scaled(mask.getWidth(), mask.getHeight());
EncodedImage enc = EncodedImage.createFromImage(placeholder.applyMask(mask.createMask()),
false);
System.out.println("SetUpProfile picture " + me.getPicture());
imageLabel.setIcon(URLImage.createToStorage(enc, "profileImage8",
me.getPicture(), URLImage.createMaskAdapter(mask)));
findProfNameLabel(f).setText(me.getName());
findProfAgeLabel(f).setText(me.getAge() + " Years old");
findProfPrefLabel(f).setText("Interested in " + me.getPref());
}
You can achieve this by creating a custom ImageAdapter that generates a round-mask automatically for you while downloading the image.
public static final URLImage.ImageAdapter RESIZE_SCALE_WITH_ROUND_MASK = new URLImage.ImageAdapter() {
@Override
public EncodedImage adaptImage(EncodedImage downloadedImage, EncodedImage placeholderImage) {
Image tmp = downloadedImage.scaledLargerRatio(placeholderImage.getWidth(), placeholderImage.getHeight());
if (tmp.getWidth() > placeholderImage.getWidth()) {
int diff = tmp.getWidth() - placeholderImage.getWidth();
int x = diff / 2;
tmp = tmp.subImage(x, 0, placeholderImage.getWidth(), placeholderImage.getHeight(), true);
} else if (tmp.getHeight() > placeholderImage.getHeight()) {
int diff = tmp.getHeight() - placeholderImage.getHeight();
int y = diff / 2;
tmp = tmp.subImage(0, y, Math.min(placeholderImage.getWidth(), tmp.getWidth()),
Math.min(placeholderImage.getHeight(), tmp.getHeight()), true);
}
Image roundMask = Image.createImage(tmp.getWidth(), tmp.getHeight(), 0xff000000);
Graphics gr = roundMask.getGraphics();
gr.setColor(0xffffff);
gr.fillArc(0, 0, tmp.getWidth(), tmp.getHeight(), 0, 360);
Object mask = roundMask.createMask();
tmp = tmp.applyMask(mask);
return EncodedImage.createFromImage(tmp, false);
}
@Override
public boolean isAsyncAdapter() {
return true;
}
};
Then apply it this way:
public void setUpProfile(Form f) {
Label imageLabel = findMyImage(f);
Image img = getImageFromRes("myprofile.png");
Image scaled = img.scaledWidth(f.getWidth() / 2);
EncodedImage enc = EncodedImage.createFromImage(scaled, false);
Display.getInstance().callSerially(new Runnable() {
@Override
public void run() {
imageLabel.setIcon(URLImage.createToStorage(enc,
"profileImage8", me.getPicture(), RESIZE_SCALE_WITH_ROUND_MASK));
f.revalidate();
}
});
findProfNameLabel(f).setText(me.getName());
findProfAgeLabel(f).setText(me.getAge() + " Years old");
findProfPrefLabel(f).setText("Interested in " + me.getPref());
}
I have a sample image (sorry for the type of image) which I am feeding into JAI code to get a compressed image.
The output image I am getting is monochrome. I don't know why the output is abnormal; other images are processed just fine.
The sample original and processed images are -
Original Image -
Processed Image -
The JAI code to process the image -
private static final String JAI_STREAM_ACTION = "stream";
private static final String JAI_SUBSAMPLE_AVERAGE_ACTION = "SubsampleAverage";
private static final String JAI_ENCODE_FORMAT_JPEG = "JPEG";
private static final String JAI_ENCODE_ACTION = "encode";
private static final String JPEG_CONTENT_TYPE = "image/jpeg";
private int mMaxWidth = 800;
//private int mMaxWidthThumbnail = 150;
private byte[] resizeImageAsJPG(byte[] pImageData, int pMaxWidth) throws IOException {
InputStream imageInputStream = new ByteArrayInputStream(pImageData);
SeekableStream seekableImageStream = SeekableStream.wrapInputStream(imageInputStream, true);
RenderedOp originalImage = JAI.create(JAI_STREAM_ACTION, seekableImageStream);
((OpImage) originalImage.getRendering()).setTileCache(null);
int origImageWidth = originalImage.getWidth();
double scale = 1.0;
/*
if (pMaxWidth > 0 && origImageWidth > pMaxWidth) {
scale = (double) pMaxWidth / originalImage.getWidth();
} */
ParameterBlock paramBlock = new ParameterBlock();
paramBlock.addSource(originalImage); // The source image
paramBlock.add(scale); // The xScale
paramBlock.add(scale); // The yScale
paramBlock.add(0.0); // The x translation
paramBlock.add(0.0); // The y translation
RenderingHints qualityHints = new RenderingHints(RenderingHints.KEY_RENDERING,
RenderingHints.VALUE_RENDER_QUALITY);
RenderedOp resizedImage = JAI.create(JAI_SUBSAMPLE_AVERAGE_ACTION, paramBlock, qualityHints);
BufferedImage scaledImage = null ;
ByteArrayOutputStream encoderOutputStream = new ByteArrayOutputStream();
JAI.create(JAI_ENCODE_ACTION, resizedImage, encoderOutputStream, JAI_ENCODE_FORMAT_JPEG, null);
//byte[] resizedImageByteArray = encoderOutputStream.toByteArray();
System.out.println("This is from exiting JAI");
return encoderOutputStream.toByteArray();
}
With imgScalr I am also getting the same output.
In OpenJDK with JAI I get this error:
javax.media.jai.util.ImagingException: All factories fail for the operation "encode"
With imgScalr on OpenJDK:
javax.imageio.IIOException: Invalid argument to native writeImage
Is there any other library I can use in Java to get the desired result?
Regards
The solution is to convert the image to TYPE_3BYTE_BGR:
private synchronized BufferedImage getScaledImage1(BufferedImage image)
{
    BufferedImage scaledImage = null;
    int type = 0;
    try
    {
        type = image.getType();
        // TYPE_CUSTOM (0) and TYPE_4BYTE_ABGR (6) trip up the JPEG writer,
        // so redraw those into a plain 3-byte BGR buffer first.
        if (type == BufferedImage.TYPE_CUSTOM || type == BufferedImage.TYPE_4BYTE_ABGR)
        {
            int w = image.getWidth();
            int h = image.getHeight();
            BufferedImage newImage = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
            Graphics2D g = newImage.createGraphics();
            g.drawImage(image, 0, 0, null);
            g.dispose();
            image = newImage;
        }
        scaledImage = Scalr.resize(image, Scalr.Method.ULTRA_QUALITY, 2000, Scalr.OP_ANTIALIAS);
    }
    catch (Exception e)
    {
        e.printStackTrace(); // don't swallow the failure silently
    }
    return scaledImage;
}
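For completeness, here is a minimal sketch of how this could replace the JAI path in resizeImageAsJPG() above, assuming plain ImageIO is acceptable for decoding and encoding (the pMaxWidth handling is left out, just as it was commented out in the original):
private byte[] resizeImageAsJPG(byte[] pImageData, int pMaxWidth) throws IOException {
    // Decode with ImageIO instead of the JAI "stream" operator.
    BufferedImage original = ImageIO.read(new ByteArrayInputStream(pImageData));
    // Convert to TYPE_3BYTE_BGR if needed and resize, as shown above.
    BufferedImage scaled = getScaledImage1(original);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    ImageIO.write(scaled, "jpg", out); // TYPE_3BYTE_BGR encodes cleanly to JPEG
    return out.toByteArray();
}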
I am trying to delay my live MJPEG video feed by 10 seconds.
I am trying to modify this code, but I am unable to incorporate the MJPEG URL.
It keeps saying 'The constructor CaptureMJPEG(String, int, int, int) is undefined' when I try to put the URL in.
The original line said:
capture = new CaptureMJPEG(this, capture_xsize, capture_ysize, capture_frames);
I changed it to:
capture = new CaptureMJPEG ("http:/url.com/feed.mjpg", capture_xsize, capture_ysize, capture_frames);
import processing.video.*;
import it.lilik.capturemjpeg.*;
Capture myCapture;
CaptureMJPEG capture;
VideoBuffer monBuff;
int display_xsize = 800; // display size
int display_ysize = 600;
int capture_xsize = 320; // capture size
int capture_ysize = 240;
int delay_time = 10; // delay in seconds
int capture_frames = 20; // capture frames per second
void setup() {
size(display_xsize,display_ysize, P3D);
// Warning: VideoBuffer must be initiated BEFORE capture- or movie-events start
monBuff = new VideoBuffer(delay_time*capture_frames, capture_xsize,capture_ysize);
capture = new CaptureMJPEG ("http:/url.com/feed.mjpg", capture_xsize, capture_ysize, capture_frames);
}
void captureEvent(Capture capture) {
capture.read();
monBuff.addFrame( capture );
}
void draw() {
PImage bufimg = monBuff.getFrame();
PImage tmpimg = createImage(bufimg.width,bufimg.height,RGB);
tmpimg.copy(bufimg,0,0,bufimg.width,bufimg.height,0,0,bufimg.width,bufimg.height);
tmpimg.resize(display_xsize,display_ysize);
image( tmpimg, 0, 0 );
}
class VideoBuffer
{
PImage[] buffer;
int inputFrame = 0;
int outputFrame = 0;
int frameWidth = 0;
int frameHeight = 0;
/*
parameters:
frames - the number of frames in the buffer (fps * duration)
width - the width of the video
height - the height of the video
*/
VideoBuffer( int frames, int width, int height )
{
buffer = new PImage[frames];
for(int i = 0; i < frames; i++)
{
this.buffer[i] = new PImage(width, height);
}
this.inputFrame = frames - 1;
this.outputFrame = 0;
this.frameWidth = width;
this.frameHeight = height;
}
// return the current "playback" frame.
PImage getFrame()
{
int frr;
if(this.outputFrame>=this.buffer.length)
frr = 0;
else
frr = this.outputFrame;
return this.buffer[frr];
}
// Add a new frame to the buffer.
void addFrame( PImage frame )
{
// copy the new frame into the buffer.
System.arraycopy(frame.pixels, 0, this.buffer[this.inputFrame].pixels, 0, this.frameWidth * this.frameHeight);
// advance the input and output indexes
this.inputFrame++;
this.outputFrame++;
// wrap the values..
if(this.inputFrame >= this.buffer.length)
{
this.inputFrame = 0;
}
if(this.outputFrame >= this.buffer.length)
{
this.outputFrame = 0;
}
}
}
Reading the reference docs:
https://bytebucket.org/nolith/capturemjpeg/wiki/api/index.html
These are the only two constructors:
CaptureMJPEG(PApplet parent, String url)
Creates a CaptureMJPEG without HTTP Auth credential
CaptureMJPEG(PApplet parent, String url, String username, String password)
Creates a CaptureMJPEG with HTTP Auth credential
So the first argument must always point to your Processing applet instance:
capture = new CaptureMJPEG (this, "http:/url.com/feed.mjpg", capture_xsize, capture_ysize, capture_frames);
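If the compiler still complains about the extra size and frame-rate arguments, the two documented constructors quoted above suggest the minimal call would be (a guess based only on that constructor list):
capture = new CaptureMJPEG(this, "http:/url.com/feed.mjpg");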
I'm trying to implement animated textures into an OpenGL game seamlessly. I made a generic ImageDecoder class to translate any BufferedImage into a ByteBuffer. It works perfectly for now, though it doesn't load animated images.
I'm not trying to load an animated image as an ImageIcon. I need the BufferedImage to get an OpenGL-compliant ByteBuffer.
How can I load every frame of an animated image as a BufferedImage array?
On a similar note, how can I get the animation rate / period?
Does Java handle APNG?
The following code is an adaptation of my own implementation to accommodate the "into array" part.
The problem with GIFs is that there are different disposal methods which have to be considered if you want this to work with all of them. The code below tries to compensate for that. For example, there is a special implementation for the "doNotDispose" mode, which takes all frames from start to N and paints them on top of each other into a BufferedImage.
The advantage of this method over the one posted by chubbsondubs is that it does not have to wait for the GIF animation delays, but can be done basically instantly.
private BufferedImage[] readGifFrames(byte[] data) throws IOException
{
    BufferedImage[] array = null;
    ImageInputStream imageInputStream = ImageIO.createImageInputStream(new ByteArrayInputStream(data)); // or any other source stream
    Iterator<ImageReader> imageReaders = ImageIO.getImageReaders(imageInputStream);
    while (imageReaders.hasNext())
    {
        ImageReader reader = imageReaders.next();
        try
        {
            reader.setInput(imageInputStream);
            int frames = reader.getNumImages(true);
            array = new BufferedImage[frames];
            for (int frameId = 0; frameId < frames; frameId++)
            {
                int w = reader.getWidth(0);
                int h = reader.getHeight(0);
                int fw = reader.getWidth(frameId);
                int fh = reader.getHeight(frameId);
                BufferedImage image;
                if (h != fh || w != fw)
                {
                    // The frame is smaller than the first frame, so disposal matters.
                    GifMeta gm = getGifMeta(reader.getImageMetadata(frameId));
                    // disposalMethodNames: "none", "doNotDispose", "restoreToBackgroundColor", "restoreToPrevious"
                    if ("doNotDispose".equals(gm.disposalMethod))
                    {
                        // Compose frames 0..frameId on top of each other at their declared offsets.
                        image = new BufferedImage(w, h, BufferedImage.TYPE_INT_ARGB);
                        Graphics2D g = (Graphics2D) image.getGraphics();
                        for (int f = 0; f <= frameId; f++)
                        {
                            gm = getGifMeta(reader.getImageMetadata(f));
                            if ("doNotDispose".equals(gm.disposalMethod))
                            {
                                g.drawImage(reader.read(f), null, gm.imageLeftPosition, gm.imageTopPosition);
                            }
                            else
                            {
                                // XXX "Unimplemented disposalMethod (" + getName() + "): " + gm.disposalMethod
                            }
                        }
                        g.dispose();
                    }
                    else
                    {
                        image = reader.read(frameId);
                        // XXX "Unimplemented disposalMethod (" + getName() + "): " + gm.disposalMethod
                    }
                }
                else
                {
                    image = reader.read(frameId);
                }
                if (image == null)
                {
                    throw new NullPointerException();
                }
                array[frameId] = image;
            }
        }
        finally
        {
            reader.dispose();
        }
    }
    return array;
}
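Used roughly like this (readGifFrames is just the wrapper name the adapted snippet above was placed in):
byte[] data = Files.readAllBytes(Paths.get("animation.gif")); // any animated GIF
BufferedImage[] frames = readGifFrames(data);
System.out.println("Decoded " + frames.length + " frames");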
private final static class GifMeta
{
String disposalMethod = "none";
int imageLeftPosition = 0;
int imageTopPosition = 0;
int delayTime = 0;
}
private GifMeta getGifMeta(IIOMetadata meta)
{
GifMeta gm = new GifMeta();
final IIOMetadataNode gifMeta = (IIOMetadataNode) meta.getAsTree("javax_imageio_gif_image_1.0");
NodeList childNodes = gifMeta.getChildNodes();
for (int i = 0; i < childNodes.getLength(); ++i)
{
IIOMetadataNode subnode = (IIOMetadataNode) childNodes.item(i);
if (subnode.getNodeName().equals("GraphicControlExtension"))
{
gm.disposalMethod = subnode.getAttribute("disposalMethod");
gm.delayTime = Integer.parseInt(subnode.getAttribute("delayTime"));
}
else if (subnode.getNodeName().equals("ImageDescriptor"))
{
gm.imageLeftPosition = Integer.parseInt(subnode.getAttribute("imageLeftPosition"));
gm.imageTopPosition = Integer.parseInt(subnode.getAttribute("imageTopPosition"));
}
}
return gm;
}
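To answer the animation rate part of the question: the delayTime that getGifMeta() reads comes from the GIF GraphicControlExtension and is expressed in hundredths of a second, so a per-frame period can be derived roughly like this:
GifMeta gm = getGifMeta(reader.getImageMetadata(frameId)); // as in the loop above
// GIF stores the delay in 1/100 s; many renderers clamp 0 or very small values to ~100 ms.
long framePeriodMillis = gm.delayTime * 10L;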
I don't think Java supports APNG by default, but you can use a 3rd-party library to parse it:
http://code.google.com/p/javapng/source/browse/trunk/javapng2/src/apng/com/sixlegs/png/AnimatedPngImage.java?r=300
That might be your easiest method. As for getting the frames from an animated GIF, you have to register an ImageObserver:
new ImageIcon( url ).setImageObserver( new ImageObserver() {
    public boolean imageUpdate( Image img, int infoFlags, int x, int y, int width, int height ) {
        if( (infoFlags & ImageObserver.FRAMEBITS) == ImageObserver.FRAMEBITS ) {
            // another frame was loaded, do something with it
        }
        return true; // keep observing so later frames are delivered
    }
});
This loads asynchronously on another thread so imageUpdate() won't be called immediately. But it will be called for each frame as it parses it.
http://docs.oracle.com/javase/1.4.2/docs/api/java/awt/image/ImageObserver.html
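Building on that, here is a sketch of collecting the frames as they arrive (it assumes the observer keeps firing while the GIF is decoded, as described above, and that the icon's dimensions are already available):
final List<BufferedImage> frames = Collections.synchronizedList(new ArrayList<BufferedImage>());
final ImageIcon icon = new ImageIcon(url);
icon.setImageObserver(new ImageObserver() {
    public boolean imageUpdate(Image img, int infoFlags, int x, int y, int width, int height) {
        if ((infoFlags & ImageObserver.FRAMEBITS) == ImageObserver.FRAMEBITS) {
            // Copy the current frame into its own BufferedImage so it is not
            // overwritten when the next frame arrives.
            BufferedImage copy = new BufferedImage(icon.getIconWidth(), icon.getIconHeight(),
                    BufferedImage.TYPE_INT_ARGB);
            Graphics2D g = copy.createGraphics();
            g.drawImage(img, 0, 0, null);
            g.dispose();
            frames.add(copy);
        }
        return true; // keep observing so later frames are delivered too
    }
});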