How to convert java.awt.Image to Bitmap in Android - java

I am trying to convert a BufferedImage into a Bitmap so I can set it on an ImageView, but I am getting an error. Is there any way to set a java.awt.Image on an ImageView?
BufferedImage bImageFromConvert = null;
InputStream in = new ByteArrayInputStream(byt);
try {
    bImageFromConvert = ImageIO.read(in);
    Image ima = bImageFromConvert;
    Bitmap bmp = Bitmap.;
    img.setImageBitmap(bmp);
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}

Is there any way to set a java.awt.Image on an ImageView?
Short answer: No.
You can't use Image, BufferedImage (java.awt package) or ImageIO (javax.imageio package) from Android. So I'm guessing the error you see is related to that.
If you describe the goal you are trying to achieve, rather than this specific implementation issue, we may be able to help you further. :-)
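On Android the AWT classes simply are not available, so the usual route is to decode the encoded bytes directly with android.graphics.BitmapFactory. A minimal sketch, assuming byt holds PNG/JPEG-encoded image data and img is the ImageView from the question:

    // Decode the encoded bytes straight into an Android Bitmap (no java.awt involved)
    Bitmap bmp = BitmapFactory.decodeByteArray(byt, 0, byt.length);
    if (bmp != null) { // decodeByteArray returns null if the data cannot be decoded
        img.setImageBitmap(bmp);
    }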

Related

How to increase the quality of the Java Netbeans Icons

How can I enhance the sharpness of an image that has been scaled down from 100px to 25px using an image scaler, as it appears to be heavily pixelated?
The rescale code I am using:
public ImageIcon scaleImage(String location, int size) {
    BufferedImage img = null;
    try {
        img = ImageIO.read(new File(location));
    } catch (IOException e) {
        e.printStackTrace();
    }
    Image dimg = img.getScaledInstance(size, size, Image.SCALE_SMOOTH);
    ImageIcon p = new ImageIcon(dimg);
    return p;
}
Using the non-resized picture makes the navigation bar larger and the image does not fit the label.
Pictures scaled down using other applications look the same.
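Image.getScaledInstance with SCALE_SMOOTH often looks soft or blocky for large downscales such as 100px to 25px. A common alternative, sketched here under the assumption that the same location and size parameters are used, is to render into a new BufferedImage with bicubic interpolation enabled on the Graphics2D:

    public ImageIcon scaleImageSmooth(String location, int size) {
        BufferedImage src = null;
        try {
            src = ImageIO.read(new File(location));
        } catch (IOException e) {
            e.printStackTrace();
        }
        // Draw the source at the target size with high-quality interpolation hints
        BufferedImage scaled = new BufferedImage(size, size, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g2 = scaled.createGraphics();
        g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_BICUBIC);
        g2.setRenderingHint(RenderingHints.KEY_RENDERING, RenderingHints.VALUE_RENDER_QUALITY);
        g2.drawImage(src, 0, 0, size, size, null);
        g2.dispose();
        return new ImageIcon(scaled);
    }

For very large reductions, repeating the draw in successive halving steps tends to preserve detail better than a single pass.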

Android MLKit face detection not detecting faces when using Bitmap

I have an XR app whose display shows the rear camera feed, so capturing the screen is pretty much the same as capturing the camera feed. I therefore take screenshots (Bitmaps) and then try to detect faces within them using Google's ML Kit.
I'm following the official guide to detect faces.
To do this, I first init my face detector:
FaceDetector detector;

public MyFaceDetector() {
    FaceDetectorOptions realTimeOpts =
            new FaceDetectorOptions.Builder()
                    .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
                    .build();
    detector = FaceDetection.getClient(realTimeOpts);
}
I then have a function that takes a Bitmap. I first convert the Bitmap to a byte array, because InputImage.fromBitmap is very slow and ML Kit actually tells me that I should use a byte array:
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 85, byteArrayOutputStream);
byte[] byteArray = byteArrayOutputStream.toByteArray();
Next I make a mutable copy of the Bitmap (so that I can draw onto it) and set up a Canvas object, along with a color that will be used when drawing onto the Bitmap:
BitmapFactory.Options options = new BitmapFactory.Options();
options.inMutable = true;
Bitmap bmp = BitmapFactory.decodeByteArray(byteArray, 0, byteArray.length, options);
Canvas canvas = new Canvas(bmp);
Paint p = new Paint();
p.setColor(Color.RED);
After all is set up, I create an InputImage (used by the FaceDetector), using the byte array:
InputImage image = InputImage.fromByteArray(byteArray, bmp.getWidth(), bmp.getHeight(), 0, InputImage.IMAGE_FORMAT_NV21);
Note the image format... There is an InputImage.IMAGE_FORMAT_BITMAP, but using it throws an IllegalArgumentException. Anyway, I next try to process the image, detect faces, fill each detected face with the color defined earlier, and then save the Bitmap to disk:
Task<List<Face>> result = detector.process(image)
        .addOnSuccessListener(new OnSuccessListener<List<Face>>() {
            @Override
            public void onSuccess(List<Face> faces) {
                Log.e("FACE DETECTION APP", "NUMBER OF FACES: " + faces.size());
                Thread processor = new Thread(new Runnable() {
                    @Override
                    public void run() {
                        for (Face face : faces) {
                            Rect destinationRect = face.getBoundingBox();
                            canvas.drawRect(destinationRect, p);
                            canvas.save();
                            Log.e("FACE DETECTION APP", "WE GOT SOME FACCES!!!");
                        }
                        File file = new File(someFilePath);
                        try {
                            FileOutputStream fOut = new FileOutputStream(file);
                            bmp.compress(Bitmap.CompressFormat.JPEG, 85, fOut);
                            fOut.flush();
                            fOut.close();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                });
                processor.start();
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                // Task failed with an exception
                // ...
            }
        });
}
While this code runs (i.e. no exceptions) and the bitmap is correctly written to disk, no faces are ever detected (faces.size() is always 0). I've tried rotating the image. I've tried changing the quality of the Bitmap. I've tried with and without the thread to process any detected faces. I've tried everything I can think of.
Anyone have any ideas?
ML Kit's InputImage.fromByteArray only supports the YV12 and NV21 formats. You will need to convert the bitmap to one of these formats for the ML Kit pipeline to process it. Also, if the original image you have is a Bitmap, you can probably just use InputImage.fromBitmap to construct an InputImage; it shouldn't be slower than your current approach.
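In the posted code the byte array holds JPEG-compressed data (from bitmap.compress), not raw NV21 pixels, which would explain why no faces are found. A minimal sketch of one way to build a real NV21 buffer from an ARGB Bitmap (assuming even width and height; the integer RGB-to-YUV approximation below is the one commonly used in Android samples), which could then be handed to InputImage.fromByteArray:

    // Sketch: NV21 = full-resolution Y plane, followed by interleaved V/U at 2x2 subsampling
    byte[] bitmapToNv21(Bitmap bitmap) {
        int width = bitmap.getWidth();
        int height = bitmap.getHeight();
        int[] argb = new int[width * height];
        bitmap.getPixels(argb, 0, width, 0, 0, width, height);

        byte[] nv21 = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uvIndex = width * height;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int pixel = argb[j * width + i];
                int r = (pixel >> 16) & 0xFF;
                int g = (pixel >> 8) & 0xFF;
                int b = pixel & 0xFF;
                int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
                int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
                int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
                nv21[yIndex++] = (byte) Math.max(0, Math.min(255, y));
                if (j % 2 == 0 && i % 2 == 0) { // one V/U pair per 2x2 block
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, v));
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                }
            }
        }
        return nv21;
    }

That said, since a Bitmap is already in hand here, InputImage.fromBitmap(bitmap, 0) is the simpler route, as the answer suggests.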
I was having the same issue; use InputImage.fromMediaImage(..., ...):
override fun analyze(image: ImageProxy) {
    val mediaImage: Image = image.image.takeIf { it != null } ?: run {
        image.close()
        return
    }
    val inputImage = InputImage.fromMediaImage(mediaImage, image.imageInfo.rotationDegrees)
    // TODO: Your ML Code
}
Check here for more details
https://developers.google.com/ml-kit/vision/image-labeling/android

How to dynamically update an image in a JLabel

I am trying to create a simple application which can take an image from a webcam and display it in a JLabel, but it is not working and I can't understand the reason. My complete project is uploaded here.
I use this library to take the image; the following code does it:
// get default webcam and open it
Webcam webcam = Webcam.getDefault();
webcam.open();
// get image
BufferedImage image = webcam.getImage();
try {
    // save image to PNG file
    ImageIO.write(image, "PNG", new File("test.png"));
} catch (IOException ex) {
    Logger.getLogger(TestFrame.class.getName()).log(Level.SEVERE, null, ex);
}
webcam.close();
After taking the image, I wrote the following code to display it in the JLabel:
String path = "test.png";
imageLbl.setIcon(null);
imageLbl.setIcon(new ImageIcon(path));
imageLbl.revalidate();
imageLbl.repaint();
imageLbl.update(imageLbl.getGraphics());
If there is an image already, it is displayed in the JLabel, but the most recently taken image is not shown. It's hard to explain the situation; I would appreciate it if you could download and check my project here.
You can use the code below to dynamically update the image in the JLabel.
String path = "test.png";
imageLbl.setIcon(null);
try {
    BufferedImage img = ImageIO.read(new File(path));
    imageLbl.setIcon(new ImageIcon(img));
    imageLbl.revalidate();
    imageLbl.repaint();
    imageLbl.update(imageLbl.getGraphics());
} catch (IOException ex) {
    ex.printStackTrace(); // report the failure instead of swallowing it silently
}
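Reading the file back through ImageIO and wrapping the BufferedImage sidesteps the image caching that ImageIcon(String) (which goes through Toolkit.getImage) performs for path-based icons. One further point worth checking: if the webcam capture runs on a background thread, the label update should happen on the Event Dispatch Thread. A minimal sketch, reusing imageLbl and img from the answer above:

    SwingUtilities.invokeLater(new Runnable() {
        @Override
        public void run() {
            // Update the label on the EDT, then let Swing lay out and repaint it
            imageLbl.setIcon(new ImageIcon(img));
            imageLbl.revalidate();
            imageLbl.repaint();
        }
    });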

How to rescale and save a BufferedImage

I have a BufferedImage and want to rescale it before saving it as a jpg/png.
I got the following code:
private BufferedImage rescaleTo(BufferedImage img, int minWidth, int minHeight) {
    BufferedImage buf = toBufferedImage(img.getScaledInstance(minWidth, minHeight, Image.SCALE_DEFAULT));
    BufferedImage ret = new BufferedImage(buf.getWidth(null), buf.getHeight(null), BufferedImage.TYPE_INT_ARGB);
    return ret;
}

public BufferedImage toBufferedImage(Image img) {
    BufferedImage ret = new BufferedImage(img.getWidth(null), img.getHeight(null), BufferedImage.TYPE_INT_RGB);
    Graphics2D g2 = ret.createGraphics();
    g2.drawImage(img, 0, 0, null);
    return ret;
}

public String saveTo(BufferedImage image, String URI) throws UtilityException {
    try {
        if (image == null)
            System.out.println("dododod");
        ImageIO.write(image, _type, new File(URI));
    } catch (IOException e) {
        throw new UtilityException(e.getLocalizedMessage());
    }
    return URI;
}
But as a result I just get a black picture. It must have something to do with the rescaling, because when I skip it I can save the expected picture.
As a test, set _type = "png" and also use the file extension .png when you make the call to ImageIO.write(image, _type, new File(URI)). I had issues like the ones you describe, started writing type PNG, and everything works fine. Unfortunately, I never went back to debug why I could not write type JPG, GIF, etc.
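Another thing worth noting: the posted rescaleTo allocates a fresh BufferedImage (ret) and returns it without ever drawing the scaled image into it, which by itself would produce a blank result. A minimal sketch of a version that renders the pixels into the target before returning, keeping the same signature and swapping getScaledInstance for a direct scaled drawImage:

    private BufferedImage rescaleTo(BufferedImage img, int minWidth, int minHeight) {
        // TYPE_INT_RGB also sidesteps known quirks when ARGB images are written as JPEG
        BufferedImage ret = new BufferedImage(minWidth, minHeight, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = ret.createGraphics();
        g2.drawImage(img, 0, 0, minWidth, minHeight, null); // render the scaled pixels into ret
        g2.dispose();
        return ret;
    }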

How do I properly load a BufferedImage in java?

Okay, so I've been trying to load a BufferedImage using this code:
URL url = this.getClass().getResource("test.png");
BufferedImage img = (BufferedImage) Toolkit.getDefaultToolkit().getImage(url);
This gives me a type cast error when I run it though, so how do I properly load a BufferedImage?
Use ImageIO.read() instead:
BufferedImage img = ImageIO.read(url);
BufferedImage img = null;
try {
    img = ImageIO.read(new File("D:\\work\\files\\logo.jpg"));
} catch (IOException e) {
    // TODO Auto-generated catch block
    e.printStackTrace();
}
