Generally, I use the code below to take a screenshot and attach it to an Allure report:
@Attachment(value = "Page Screenshot", type = "image/png")
public static byte[] saveScreenshotPNG(WebDriver driver) {
    return ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES);
}
But now I already have some screenshots on my desktop and want to attach one of them to the Allure report. Is that possible?
You can read the existing image file and convert it to a byte[]. getScreenshotAs() also Base64-decodes the screenshot string, so you might need to do that step yourself.
Java
@Attachment(value = "Page Screenshot", type = "image/png")
public static byte[] saveScreenshotPNG(String path) {
    byte[] image = null;
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
        // read the existing file from disk and re-encode it as PNG bytes
        BufferedImage bufferedImage = ImageIO.read(new File(path));
        ImageIO.write(bufferedImage, "png", bos);
        image = bos.toByteArray();
    } catch (Exception e) {
        e.printStackTrace();
    }
    // if the file is a plain PNG and no Base64 decoding is necessary, just return image
    return image != null ? Base64.getMimeDecoder().decode(image) : null;
}
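Calling it from a test is then just a matter of passing the path to the file on disk; the path below is only a placeholder for wherever your screenshot lives:

saveScreenshotPNG("C:/Users/me/Desktop/screenshot.png"); // hypothetical path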
Python
import allure
from allure_commons.types import AttachmentType

with open(path, 'rb') as image:
    file = image.read()
    byte_array = bytearray(file)

allure.attach(byte_array, name="Screenshot", attachment_type=AttachmentType.PNG)
Related
I'm trying to generate a QR code using QRGen, encode it in Base64, and insert it as an image in an HTML string. Later, the HTML string is decoded to be displayed in a JEditorPane (and then sent to a printer). To this end, the ImageView class is extended and a custom View factory is used. This all works fine... sometimes. It completely depends on the input string: some strings work without issue, while others cause the decode process to fail with the error java.lang.IllegalArgumentException: Input byte array has wrong 4-byte ending unit.
Here is the encode process:
public BufferedImage generateQRCodeImage(String barcodeText) throws Exception {
    ByteArrayOutputStream stream = QRCode.from(barcodeText).to(ImageType.PNG).stream();
    ByteArrayInputStream bis = new ByteArrayInputStream(stream.toByteArray());
    return ImageIO.read(bis);
}

public static String encodeToString(BufferedImage image, String type) {
    String imageString = null;
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try {
        ImageIO.write(image, type, bos);
        byte[] imageBytes = bos.toByteArray();
        Base64.Encoder encoder = Base64.getEncoder();
        imageString = encoder.encodeToString(imageBytes);
        bos.close();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return imageString;
}
and the decode process:
private Image loadImage() {
    String b64 = getBASE64Image();
    BufferedImage newImage = null;
    ByteArrayInputStream bais = null;
    try {
        bais = new ByteArrayInputStream(Base64.getDecoder().decode(b64.getBytes())); // fails here
        newImage = ImageIO.read(bais);
    } catch (Throwable ex) {
        ex.printStackTrace();
    }
    return newImage;
}
@Override
public URL getImageURL() {
    String src = (String) getElement().getAttributes().getAttribute(HTML.Attribute.SRC);
    if (isBase64Encoded(src)) {
        this.url = BASE64ImageView.class.getProtectionDomain()
                .getCodeSource().getLocation();
        return this.url;
    }
    return super.getImageURL();
}

private boolean isBase64Encoded(String src) {
    return src != null && src.contains("base64,");
}

private String getBASE64Image() {
    String src = (String) getElement().getAttributes().getAttribute(HTML.Attribute.SRC);
    if (!isBase64Encoded(src)) {
        return null;
    }
    return src.substring(src.indexOf("base64,") + 7, src.length() - 1);
}
And here is the QR code in question that fails to decode.
<img width='30' height='30' src='data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAH0AAAB9AQAAAACn+1GIAAAApklEQVR4Xu2UMQ4EMQgD/QP+/0vK6zjsvayUMmavWxQpMAUBkwS12wcveAAkgNSCD3rR5Lkgoai3GUCMgWqbAEYR3HxAkZlzU/0MyBisYRsgI1ERFfcpBpA+ze6k56Cj7KTdXNigFWZvSOpsgqLfd18i2aAukXh9TXBNmdWt5gzA/oqzWkkN8HtA7G8CNOwYAiZt3wZixUfkA32OHNQq7Bxs9oI/gC/9fV8AVCkPjQAAAABJRU5ErkJggg=='/>
I did open the above QR in a browser (Chrome) and it does work, which definitely points to something being wrong in the decode process and not the encode process.
Found the issue. In getBASE64Image(), I have
private String getBASE64Image() {
    String src = (String) getElement().getAttributes().getAttribute(HTML.Attribute.SRC);
    if (!isBase64Encoded(src)) {
        return null;
    }
    return src.substring(src.indexOf("base64,") + 7, src.length() - 1);
}
The "-1" in the substring call was the cause of my problems. Not sure why this would work only sometimes, but removing seems to have fixed the problem.
So, I am downloading the profile picture from the Google Sign-in API and saving it to a hidden file. The problem is that when I try to retrieve it, it throws: D/skia: --- Failed to create image decoder with message 'unimplemented'. However, when I retrieve an image from FirebaseStorage and save that one to the hidden file, I can retrieve it without any problems.
I tried BitmapFactory.decodeByteArray(), but then I had a message telling me skia wasn't able to decode the file and it returned null.
The method I use to retrieve the profile picture and call the method that saves the file:
private void getUsersPic() {
    Bitmap profilePic;
    try {
        InputStream in = new URL(AppData.getUser().getPicture()).openConnection().getInputStream();
        profilePic = BitmapFactory.decodeStream(in);
        int size = profilePic.getRowBytes() * profilePic.getHeight();
        ByteBuffer b = ByteBuffer.allocate(size);
        byte[] bytes = new byte[size];
        profilePic.copyPixelsToBuffer(b);
        b.position(0);
        b.get(bytes, 0, bytes.length);
        SaveBitmapToFile.saveBitmap(bytes, AppData.getUser().getName() + AppData.getUser().getLastName());
    } catch (Exception e) {
        System.out.println("Get profile pic: " + e.toString());
    }
}
Save the file
public static void saveBitmap(byte[] bitmap, String key) {
    String path = AppData.getAppContext().getFilesDir() + "/.image" + "/";
    File fileDir = new File(path);
    if (!fileDir.isDirectory())
        fileDir.mkdirs();
    try {
        File bitmapDir = new File(fileDir + "/" + key);
        bitmapDir.createNewFile();
        FileOutputStream stream = new FileOutputStream(bitmapDir);
        stream.write(bitmap);
        stream.close();
    } catch (IOException e) {
        System.out.println("Problem creating file " + e.toString() + " Directory: " + fileDir);
    }
}
Retrieve and return a bitmap
public static Bitmap getBitmap(String key) {
    File file = new File(AppData.getAppContext().getFilesDir() + "/.image/" + key);
    try {
        BufferedInputStream buf = new BufferedInputStream(new FileInputStream(file));
        return BitmapFactory.decodeStream(buf); // BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    } catch (Exception e) {
        System.out.println("Exception getting bitmap: " + e.toString());
        return null;
    }
}
The last method should return a Bitmap, and it does; it just doesn't work when the image comes from the Google Sign-in API.
As pskink said in the comments, I had to use compress() instead of copyPixelsToBuffer(): copyPixelsToBuffer() dumps the raw pixel data, which BitmapFactory cannot decode back from the file, whereas compress() writes an actual PNG stream. Here is my updated method:
private void getUsersPic() {
    Bitmap profilePic;
    try {
        InputStream in = new URL(AppData.getUser().getPicture()).openConnection().getInputStream();
        profilePic = BitmapFactory.decodeStream(in);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        profilePic.compress(Bitmap.CompressFormat.PNG, 100, stream);
        SaveBitmapToFile.saveBitmap(stream.toByteArray(), AppData.getUser().getName() + AppData.getUser().getLastName());
    } catch (Exception e) {
        System.out.println("Get profile pic: " + e.toString());
    }
}
I have a server and I want to compress images on it. When I write the image to disk, it goes from 23 MB down to 650 kB, which is fine. But when I read it back to send it to my client app, the size is back to 23 MB.
public static BufferedImage getProfilePicture(String username) throws IOException {
    File input = new File(profilePicturePath + File.separatorChar + username + ".png");
    BufferedImage image = ImageIO.read(input);
    rewriteImage(image);
    return image;
}

public static String getProfilePictureBase64(String username) throws IOException {
    BufferedImage img = getProfilePicture(username);
    final ByteArrayOutputStream os = new ByteArrayOutputStream();
    ImageIO.write(img, "png", os);
    return Base64.getEncoder().encodeToString(os.toByteArray());
}
So my question is: how can I keep the compressed size when sending the image to my client?
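One likely cause is that ImageIO.read decodes the stored file into an uncompressed BufferedImage and ImageIO.write then re-encodes it with default settings, so the compressed bytes on disk never reach the client. A minimal sketch, assuming the already compressed file on disk is exactly what you want to transmit, is to Base64-encode the raw file bytes without decoding them:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public static String getProfilePictureBase64(String username) throws IOException {
    // Read the bytes already stored on disk; skipping the decode/re-encode
    // step preserves the on-disk (compressed) size. Base64 itself still adds ~33%.
    byte[] raw = Files.readAllBytes(Paths.get(profilePicturePath, username + ".png"));
    return Base64.getEncoder().encodeToString(raw);
}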
I'm sending a photo via MQTT and trying to save a copy of it.
This is what the receiver does:
connectMQTTAsReceiver("Controller", "tcp://m2m.eclipse.org:1883", "robot");
byte[] photoByte = receiveMQTT.getBytes();
File photo = new File("image2.jpg");
photo.createNewFile();
FileOutputStream outputStream = new FileOutputStream(photo);
outputStream.write(photoByte);
outputStream.close();
With this, the photo is saved but it's completely white.
Then I tried doing this:
connectMQTTAsReceiver("Controller", "tcp://m2m.eclipse.org:1883", "robot");
byte[] photoByte = Base64.getDecoder().decode(receiveMQTT().getBytes(StandardCharsets.UTF_8));
InputStream in = new ByteArrayInputStream(photoByte);
System.out.println("Received");
BufferedImage bufferedImage;
try {
    bufferedImage = ImageIO.read(in);
    File outputfile = new File("image3.jpg");
    ImageIO.write(bufferedImage, "jpg", outputfile);
} catch (IOException e) {
    e.printStackTrace();
}
and it gives me this error:
java.lang.IllegalArgumentException: Illegal base64 character -1e
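For what it's worth, Base64.getDecoder() rejects any byte outside the Base64 alphabet, while Base64.getMimeDecoder() skips line separators and other non-alphabet characters, so if the sender really publishes Base64 text the MIME decoder is the more forgiving choice; if the sender publishes raw JPEG bytes instead, the first approach (writing the payload straight to a file) is the right one. A minimal sketch of the more tolerant decode, assuming receiveMQTT() returns the Base64 string:

String payload = receiveMQTT();
// getMimeDecoder() ignores characters such as \r\n that make getDecoder()
// throw IllegalArgumentException
byte[] photoByte = Base64.getMimeDecoder().decode(payload.getBytes(StandardCharsets.UTF_8));
try (FileOutputStream out = new FileOutputStream("image3.jpg")) {
    out.write(photoByte);
}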
I generated a QR code using the ZXing library.
QRCode qrcode = QRCode.from("Encoding string").withSize(17,17).to(ImageType.PNG);
ByteArrayOutputStream out = QRCode.from(output.toString()).withSize(10, 10).to(ImageType.PNG).stream();
FileOutputStream fout = new FileOutputStream(new File("D:\\QR_Code.JPG"));
fout.write(out.toByteArray());
fout.flush();
fout.close();
It works fine, but now I want to decode the generated QR code. Is it possible to decode a QR code from an image with ZXing? If so, can you give me a hint how to do it? I haven't found an appropriate class or method. Thanks in advance.
Here is what you can do:
You'll need an instance of QRCodeReader to decode QR code data from a BinaryBitmap.
You need to instantiate a HybridBinarizer and pass it as a constructor argument to create your BinaryBitmap.
The HybridBinarizer needs an instance of LuminanceSource.
Take a look at BufferedImageLuminanceSource.
Here is an example that would decode data from a buffered image:
public static String qrDecodeFromImage(BufferedImage img) {
    if (img != null) {
        LuminanceSource bfImgLuminanceSource = new BufferedImageLuminanceSource(img);
        BinaryBitmap binaryBmp = new BinaryBitmap(new HybridBinarizer(bfImgLuminanceSource));
        QRCodeReader qrReader = new QRCodeReader();
        Result result;
        try {
            result = qrReader.decode(binaryBmp);
            return result.getText();
        } catch (NotFoundException | ChecksumException | FormatException e) {
            // no QR code found or it could not be decoded; fall through and return null
        }
    }
    return null;
}
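To run it against the file generated in the question, load the image first (the path is the one from the question; adjust as needed):

BufferedImage img = ImageIO.read(new File("D:\\QR_Code.JPG"));
System.out.println(qrDecodeFromImage(img));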
You'll need to have the ZXing project included in your project (source, etc.).
Then you can use all sorts of ZXing classes to perform decoding/encoding, etc.
Look into these classes: BinaryBitmap, QRCodeReader, ParsedResult, ResultParser, and give this a try:
Bitmap b = ...; // TODO: create a bitmap from your source...
// The core RGBLuminanceSource takes the width, height and an ARGB pixel array,
// so extract the pixels from the Bitmap first.
int[] pixels = new int[b.getWidth() * b.getHeight()];
b.getPixels(pixels, 0, b.getWidth(), 0, 0, b.getWidth(), b.getHeight());
BinaryBitmap bitmap = new BinaryBitmap(
        new HybridBinarizer(new RGBLuminanceSource(b.getWidth(), b.getHeight(), pixels)));
Result result = null;
QRCodeReader reader = new QRCodeReader();
try {
    result = reader.decode(bitmap);
    ParsedResult parsedResult = ResultParser.parseResult(result);
    // TODO: use parsedResult
} catch (OutOfMemoryError e) {
    // image too large to decode
} catch (Exception e) {
    // no QR code found or decoding failed
}