I'm sending a photo via MQTT and trying to save a copy of it on the receiving side.
This is what the receiver does:
connectMQTTAsReceiver("Controller", "tcp://m2m.eclipse.org:1883", "robot");
byte[] photoByte = receiveMQTT().getBytes();
File photo = new File("image2.jpg");
photo.createNewFile();
FileOutputStream outputStream = new FileOutputStream(photo);
outputStream.write(photoByte);
outputStream.close();
With this, the photo is saved but it's completely white.
Then I tried doing this:
connectMQTTAsReceiver("Controller", "tcp://m2m.eclipse.org:1883", "robot");
byte[] photoByte = Base64.getDecoder().decode(receiveMQTT().getBytes(StandardCharsets.UTF_8));
InputStream in = new ByteArrayInputStream(photoByte);
System.out.println("Received");
BufferedImage bufferedImage;
try {
    bufferedImage = ImageIO.read(in);
    File outputfile = new File("image3.jpg");
    ImageIO.write(bufferedImage, "jpg", outputfile);
} catch (IOException e) {
    e.printStackTrace();
}
and it gives me this error:
java.lang.IllegalArgumentException: Illegal base64 character -1e
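For illustration only: if the publisher sends the raw JPEG bytes instead of a Base64 string, the payload can be written straight to disk with no String round-trip. This is a minimal sketch assuming the Eclipse Paho Java client (org.eclipse.paho.client.mqttv3) is used directly rather than the custom connectMQTTAsReceiver()/receiveMQTT() helpers:
// Sketch: assumes the Eclipse Paho client and a publisher that sends raw JPEG bytes,
// not a Base64 string. Not the code behind the helpers used above.
import org.eclipse.paho.client.mqttv3.*;
import java.io.FileOutputStream;

public class RawImageReceiver {
    public static void main(String[] args) throws Exception {
        MqttClient client = new MqttClient("tcp://m2m.eclipse.org:1883", "Controller");
        client.setCallback(new MqttCallback() {
            @Override
            public void connectionLost(Throwable cause) { }

            @Override
            public void messageArrived(String topic, MqttMessage message) throws Exception {
                // getPayload() already returns the raw byte[]; no String round-trip, no Base64
                try (FileOutputStream out = new FileOutputStream("image2.jpg")) {
                    out.write(message.getPayload());
                }
            }

            @Override
            public void deliveryComplete(IMqttDeliveryToken token) { }
        });
        client.connect();
        client.subscribe("robot");
    }
}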
Generally, I am using the code below to take a screenshot and attach it to an Allure report:
@Attachment(value = "Page Screenshot", type = "image/png")
public static byte[] saveScreenshotPNG(WebDriver driver) {
    return ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES);
}
But now my need is different: I already have some screenshots on my desktop and want to attach one of them to an Allure report. Is that possible?
You can take the existing image and convert it to byte[]. getScreenshotAs() decodes the screenshot string, so you might need to do that as well.
Java
@Attachment(value = "Page Screenshot", type = "image/png")
public static byte[] saveScreenshotPNG(String path) {
    byte[] image = null;
    try (ByteArrayOutputStream bos = new ByteArrayOutputStream()) {
        BufferedImage bufferedImage = ImageIO.read(new File(path));
        ImageIO.write(bufferedImage, "png", bos);
        image = bos.toByteArray();
    } catch (Exception e) {
        e.printStackTrace();
    }
    // if decoding is not necessary just return image
    return image != null ? Base64.getMimeDecoder().decode(image) : null;
}
Python
with open(path, 'rb') as image:
    file = image.read()
    byte_array = bytearray(file)
allure.attach(byte_array, name="Screenshot", attachment_type=AttachmentType.PNG)
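If the file on disk is already a plain PNG (no Base64 encoding involved), a simpler Java variant is to attach its raw bytes directly. A sketch, reusing the same @Attachment annotation as above; the method name is just an example:
// Sketch: attach an existing PNG from disk without re-encoding or decoding it.
@Attachment(value = "Page Screenshot", type = "image/png")
public static byte[] attachExistingScreenshot(String path) {
    try {
        return java.nio.file.Files.readAllBytes(java.nio.file.Paths.get(path));
    } catch (java.io.IOException e) {
        e.printStackTrace();
        return null;
    }
}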
I want to save some images in internal storage. Here is my code:
Bitmap bitmap = ((BitmapDrawable) iv_add.getDrawable()).getBitmap();
File file = getApplicationContext().getDir("Images", MODE_PRIVATE);
file = new File(file, "UniqueFileName" + ".jpg");
try {
    OutputStream stream = new FileOutputStream(file);
    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
    stream.flush();
    stream.close();
} catch (IOException e) {
    e.printStackTrace();
}
As I understand it, the picture should end up under internal_storage/android/data/project_name/file. When I choose a picture from my gallery and click the button to save it, nothing happens and the app starts lagging. What can I do?
This line is your problem:
ContextWrapper wrapper = new ContextWrapper(getApplicationContext());
You are not supposed to create the context wrapper yourself; just use the application context directly:
File file = getApplicationContext().getDir("Images", MODE_PRIVATE);
file = new File(file, "UniqueFileName" + ".jpg");
try {
    OutputStream stream = new FileOutputStream(file);
    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
    stream.flush();
    stream.close();
} catch (IOException e) {
    e.printStackTrace();
}
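To check the result, the saved file can be read back from the same app-private directory. A short sketch, assuming the same "Images" directory and file name, and that iv_add is the ImageView from the question:
// Sketch: read the saved JPEG back and show it in the same ImageView.
File dir = getApplicationContext().getDir("Images", MODE_PRIVATE);
File saved = new File(dir, "UniqueFileName.jpg");
// decodeFile() returns null if the file is missing or not a valid image
Bitmap restored = BitmapFactory.decodeFile(saved.getAbsolutePath());
if (restored != null) {
    iv_add.setImageBitmap(restored);
}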
So, I am downloading the profile picture from the Google Sign-in API and saving it to a hidden file. The problem is that when I try to retrieve it, it throws: D/skia: --- Failed to create image decoder with message 'unimplemented'. However, when I retrieve an image from Firebase Storage and save that one to the hidden file, I can retrieve it without any problems.
I tried BitmapFactory.decodeByteArray(), but then I had a message telling me skia wasn't able to decode the file and it returned null.
The method I use to retrieve the profile picture and call the method that will save the file:
private void getUsersPic() {
    Bitmap profilePic;
    try {
        InputStream in = new URL(AppData.getUser().getPicture()).openConnection().getInputStream();
        profilePic = BitmapFactory.decodeStream(in);
        int size = profilePic.getRowBytes() * profilePic.getHeight();
        ByteBuffer b = ByteBuffer.allocate(size);
        byte[] bytes = new byte[size];
        profilePic.copyPixelsToBuffer(b);
        b.position(0);
        b.get(bytes, 0, bytes.length);
        SaveBitmapToFile.saveBitmap(bytes, AppData.getUser().getName() + AppData.getUser().getLastName());
    } catch (Exception e) {
        System.out.println("Get profile pic: " + e.toString());
    }
}
Save the file
public static void saveBitmap(byte[] bitmap, String key) {
    String path = AppData.getAppContext().getFilesDir() + "/.image" + "/";
    File fileDir = new File(path);
    if (!fileDir.isDirectory())
        fileDir.mkdirs();
    try {
        File bitmapDir = new File(fileDir + "/" + key);
        bitmapDir.createNewFile();
        FileOutputStream stream = new FileOutputStream(bitmapDir);
        stream.write(bitmap);
        stream.close();
    } catch (IOException e) {
        System.out.println("Problem creating file " + e.toString() + " Directory: " + fileDir);
    }
}
Retrieve and return a bitmap
public static Bitmap getBitmap(String key) {
    File file = new File(AppData.getAppContext().getFilesDir() + "/.image/" + key);
    try {
        BufferedInputStream buf = new BufferedInputStream(new FileInputStream(file));
        return BitmapFactory.decodeStream(buf); // BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
    } catch (Exception e) {
        System.out.println("Exception getting bitmap: " + e.toString());
        return null;
    }
}
The last method should return a Bitmap, and it does. It just doesn't work when the image comes from the Google Sign-in API.
As pskink said in a comment on the post, I had to use compress() instead of copyPixelsToBuffer(): compress() writes an encoded PNG stream that BitmapFactory.decodeStream() can parse later, whereas copyPixelsToBuffer() dumps raw pixel data with no image header, which is why skia refused to decode the file. Here is my updated method:
private void getUsersPic() {
    Bitmap profilePic;
    try {
        InputStream in = new URL(AppData.getUser().getPicture()).openConnection().getInputStream();
        profilePic = BitmapFactory.decodeStream(in);
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        profilePic.compress(Bitmap.CompressFormat.PNG, 100, stream);
        SaveBitmapToFile.saveBitmap(stream.toByteArray(), AppData.getUser().getName() + AppData.getUser().getLastName());
    } catch (Exception e) {
        System.out.println("Get profile pic: " + e.toString());
    }
}
I'm accessing an external API and expecting to get an image (byte[]) as the response. My method that connects to this endpoint looks like this:
private byte[] retrieveImage(String uri) {
    byte[] imageBytes = null;
    try {
        URL url = new URL(uri);
        BufferedImage bufferedImage = ImageIO.read(url);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(bufferedImage, "png", baos);
        imageBytes = baos.toByteArray();
    } catch (Exception ex) {
        throw new ImageNotReadException(ex.getLocalizedMessage());
    }
    return imageBytes;
}
It turned out that if I pass the wrong parameters to the target endpoint, I get this error message:
I would like to propagate that same error, but I would also like to throw an ImageNotReadException (java.lang.IllegalArgumentException: image == null!) when the program fails to read the image (byte[]). In other words, my method private byte[] retrieveImage(String uri) would have to surface both my image-read exception and the endpoint's error response.
Any tips?
Appreciate the help!
As I commented, see the options below.
private Response retrieveImage(String uri) {
    byte[] imageBytes = null;
    Response r = new Response();
    try {
        URL url = new URL(uri);
        BufferedImage bufferedImage = ImageIO.read(url);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(bufferedImage, "png", baos);
        imageBytes = baos.toByteArray();
        r.setImage(imageBytes);
        r.setStatus(1);
    } catch (Exception ex) {
        r.setStatus(0);
    }
    return r;
}
Response:
class Response {
    int status;
    byte[] image;
    // getters and setters
}
Or:
private byte[] retrieveImage(String uri) throws CustomException {
    byte[] imageBytes = null;
    try {
        URL url = new URL(uri);
        BufferedImage bufferedImage = ImageIO.read(url);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(bufferedImage, "png", baos);
        imageBytes = baos.toByteArray();
    } catch (Exception ex) {
        throw new CustomException(ex.getLocalizedMessage());
    }
    return imageBytes;
}
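CustomException is not defined in the answer; a minimal placeholder (an assumption, any checked exception carrying a message would do) could be:
// Hypothetical CustomException referenced above; not part of the original answer.
class CustomException extends Exception {
    public CustomException(String message) {
        super(message);
    }
}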
I'm uploading images (one at a time) from an Android application (using Retrofit 2 and OkHttp 3) to a Spring Boot server. I'm using Base64 to convert the image file to a string. I got it working for one image on my phone, but it won't work for any other images.
Android code:
try {
    Bitmap bm = BitmapFactory.decodeFile(imageLocation);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    bm.compress(Bitmap.CompressFormat.JPEG, 100, baos);
    byte[] b = baos.toByteArray();
    String base64Image = Base64.encodeToString(b, Base64.DEFAULT);
    baos.flush();
    baos.close();
    Log.d(TAG, base64Image);
    beconnect.uploadPhoto(base64Image, "imagename");
} catch (FileNotFoundException e) {
    System.out.println("Image not found" + e);
} catch (IOException ioe) {
    System.out.println("Exception while reading the Image " + ioe);
}
@POST("/uploadImage")
@FormUrlEncoded
Observable<retrofit2.Response<String>> uploadPhoto(@Field("image") String image, @Field("name") String name); // @Part("desc") RequestBody desc, @Part MultipartBody.Part image
This is the original POST request for the image that works:
POST http://192.168.1.11:8080/uploadImage http/1.1 (393608-byte body)
This is the POST request that fails:
POST http://192.168.1.11:8080/uploadImage http/1.1 (11115982-byte body)
Error:
HTTP FAILED: java.net.SocketException: Broken pipe at
okhttp3.internal.http1.Http1Codec$FixedLengthSink.write(Http1Codec.java:286) at
okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.java:63)
....
I'm aware that the byte sizes differ between the two images. Is this the cause of the issue, and how do I fix it?
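For reference, the multipart variant hinted at by the commented-out parameters avoids Base64 entirely and sends the file as-is. This is only a sketch using standard Retrofit 2/OkHttp types; the UploadService name is an assumption, not the author's tested code:
// Sketch: multipart upload instead of a Base64-encoded form field.
// Assumes Retrofit 2 with an RxJava call adapter, matching the Observable return type above.
public interface UploadService {
    @Multipart
    @POST("/uploadImage")
    Observable<retrofit2.Response<String>> uploadPhoto(@Part("name") RequestBody name,
                                                       @Part MultipartBody.Part image);
}

// Building the request from the file on disk (imageLocation is the path used in the question);
// the two parts would then be passed to uploadPhoto().
File imageFile = new File(imageLocation);
RequestBody fileBody = RequestBody.create(MediaType.parse("image/jpeg"), imageFile);
MultipartBody.Part imagePart = MultipartBody.Part.createFormData("image", imageFile.getName(), fileBody);
RequestBody nameBody = RequestBody.create(MediaType.parse("text/plain"), "imagename");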