I'm trying to write a method that accepts an image (Bitmap) and returns a byte[] array. Finally, I try to write this byte[] array to a folder so I can see the difference, but my byte[] array cannot be displayed, and in addition, it is not scaled down! This is my method:
private byte[] changeSize(Bitmap image) {
byte[] picture;
int width = image.getWidth();
int height = image.getHeight();
int newHeight = 0, newWidth = 0;
if (width > 250 || height > 250) {
if (width > height) { //landscape-mode
newHeight = 200;
newWidth = (newHeight * width) / height;
} else { //portrait-mode
newWidth = 200;
newHeight = (newWidth * height) / width;
}
} else {
Toast.makeText(this, "Something wrong!", Toast.LENGTH_LONG).show();
}
Bitmap sizeChanged = Bitmap.createScaledBitmap(image, newWidth, newHeight, true);
//Convert bitmap to a byte array
int bytes = sizeChanged.getByteCount();
ByteBuffer bb = ByteBuffer.allocate(bytes);
sizeChanged.copyPixelsFromBuffer(bb);
picture = bb.array();
//Write to a hd
picturePath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
String fileName = edFile.getText().toString() + "_downscaled" + ".jpg";
File file = new File(picturePath, fileName);
FileOutputStream fos = null;
try {
fos = new FileOutputStream(file);
fos.write(picture);
fos.close();
} catch (Exception e) {
e.printStackTrace();
}
return image;
}
I tried for several hours to get my byte[] array to display, but I simply could not manage it. Any help or hints showing me where I go off the rails are very much appreciated.
This was working for me
public static Bitmap byteArraytoBitmap(byte[] bytes) {
return (BitmapFactory.decodeByteArray(bytes, 0, bytes.length));
}
public static byte[] bitmaptoByteArray(Bitmap bmp) {
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream); //PNG format is lossless and will ignore the quality setting!
byte[] byteArray = stream.toByteArray();
return byteArray;
}
public static Bitmap bitmapFromFile(File file) {
//returns null if could not decode
return BitmapFactory.decodeFile(file.getPath());
}
public static boolean saveImage(Bitmap image, String filePath) {
LogInfo(TAG, "Saving image to: " + filePath);
File file = new File(filePath);
File fileDirectory = new File(file.getParent());
LogInfo(TAG, fileDirectory.getPath());
if (!fileDirectory.exists()) {
if (!fileDirectory.mkdirs()) {
Log.e(TAG, "ERROR CREATING DIRECTORIES");
return false;
}
}
try {
file.createNewFile();
FileOutputStream fo = new FileOutputStream(file);
fo.write(bitmaptoByteArray(image));
fo.flush();
fo.close();
return true;
}
catch (Exception e) {
e.printStackTrace();
return false;
}
}
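To tie this back to the original changeSize() method: copyPixelsFromBuffer() copies pixels from the buffer into the bitmap (the opposite direction of what was intended), and even the raw pixel data obtained with copyPixelsToBuffer() is not a decodable image file, so a viewer cannot open it. Encoding with Bitmap.compress(), as the helpers above do, is what makes the saved bytes displayable. Below is a minimal sketch of how those helpers could be combined with the sizing logic from the question; it is illustrative only, edFile and the 200 px target come from the original post, and the 250 px guard is omitted for brevity.

private byte[] changeSize(Bitmap image) {
    int width = image.getWidth();
    int height = image.getHeight();
    int newWidth;
    int newHeight;
    if (width > height) {               // landscape
        newHeight = 200;
        newWidth = (newHeight * width) / height;
    } else {                            // portrait (or square)
        newWidth = 200;
        newHeight = (newWidth * height) / width;
    }
    Bitmap scaled = Bitmap.createScaledBitmap(image, newWidth, newHeight, true);

    // Encode to a real image format instead of dumping raw pixel data.
    byte[] picture = bitmaptoByteArray(scaled);

    // Reuse saveImage() from above to write a viewable file; PNG matches the helper's format.
    File dir = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES);
    String fileName = edFile.getText().toString() + "_downscaled.png";
    saveImage(scaled, new File(dir, fileName).getAbsolutePath());

    return picture;
}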
I'm getting an image from the gallery into a layout, then I'm getting the bitmap of that layout by using getBitmap(), and after that I'm saving the image to device storage by using saveImage().
After getting and saving the bitmap, its quality and resolution are reduced far too much, as shown in the pictures (saved image vs. original image).
Here is the code for getting and saving the bitmap:
private Bitmap getBitmap(View v) {
v.clearFocus();
v.setPressed(false);
boolean willNotCache = v.willNotCacheDrawing();
v.setWillNotCacheDrawing(false);
int color = v.getDrawingCacheBackgroundColor();
v.setDrawingCacheBackgroundColor(0);
if (color != 0) {
v.destroyDrawingCache();
}
v.buildDrawingCache();
Bitmap cacheBitmap = v.getDrawingCache();
if (cacheBitmap == null) {
Toast.makeText(this, "Something Wrong", Toast.LENGTH_SHORT).show();
return null;
}
Bitmap lastimage = Bitmap.createBitmap(cacheBitmap);
v.destroyDrawingCache();
v.setWillNotCacheDrawing(willNotCache);
v.setDrawingCacheBackgroundColor(color);
//2048x2048 resolution
int newWidth = 2048;
int newHeight = 2048;
int width = lastimage.getWidth();
int height = lastimage.getHeight();
float scaleWidth = ((float) newWidth) / width;
float scaleHeight = ((float) newHeight) / height;
Matrix matrix = new Matrix();
matrix.postScale(scaleWidth, scaleHeight);
Bitmap resizedBitmap = Bitmap.createBitmap(lastimage, 0, 0, width, height, matrix, true);
lastimage.recycle();
saveImage(resizedBitmap);
return resizedBitmap;
}
For Saving:
private void saveImage (Bitmap finalBitmap) {
String root = Environment.getExternalStorageDirectory().toString();
File myDir = new File(root + "/NewFolder");
myDir.mkdirs();
String fname = "Image-.png";
File file = new File(myDir, fname);
try {
FileOutputStream out = new FileOutputStream(file);
finalBitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
out.flush();
out.close();
} catch (Exception e) {
e.printStackTrace();
}
}
Thanks for anyone's help.
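One thing that stands out in getBitmap() above is that the capture is stretched to a fixed 2048x2048, which ignores the layout's aspect ratio; that alone can make the saved picture look badly degraded. This is only a guess at the cause, but here is a minimal sketch of scaling so the result fits inside 2048x2048 while keeping the original proportions:

// Illustrative sketch: scale so the longest side is at most maxSize, preserving the aspect ratio.
private Bitmap scaleToFit(Bitmap source, int maxSize) {
    int width = source.getWidth();
    int height = source.getHeight();
    if (width <= maxSize && height <= maxSize) {
        return source; // already small enough, no resampling needed
    }
    float scale = Math.min((float) maxSize / width, (float) maxSize / height);
    return Bitmap.createScaledBitmap(source, Math.round(width * scale), Math.round(height * scale), true);
}

Calling saveImage(scaleToFit(lastimage, 2048)) instead of building the stretch Matrix by hand keeps the proportions; the existing saveImage() already writes a lossless PNG, so the save step itself should not cost quality.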
I'm encountering an issue with BitmapFactory.
I've got a method that reduces and rotates an image to show a preview in an ImageView, but I would like to save the image with this new size.
I keep going around in circles with FileInputStream and FileOutputStream but can't manage to save it.
Does anybody know a clear way to write my bitmap to a FileOutputStream?
Thanks a lot.
Here's my code:
@Override
protected void onResume() {
super.onResume();
File[] fileArray;
final File root;
File chemin = Environment.getExternalStorageDirectory();
String filepath = chemin + "/SmartCollecte/PARC/OUT/" + fichano + "_" + conteneur_s+"_"+cpt+".jpg";
try {
decodeFile(filepath);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}}
public void decodeFile(String filePath) {
// Decode image size
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
BitmapFactory.decodeFile(filePath, o);
// The new size we want to scale to
final int REQUIRED_SIZE = 1024;
// Find the correct scale value. It should be the power of 2.
int width_tmp = o.outWidth, height_tmp = o.outHeight;
int scale = 1;
while (true) {
if (width_tmp < REQUIRED_SIZE && height_tmp < REQUIRED_SIZE)
break;
width_tmp /= 2;
height_tmp /= 2;
scale *= 2;
}
// Decode with inSampleSize
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
Bitmap b1 = BitmapFactory.decodeFile(filePath, o2);
Bitmap b = ExifUtils.rotateBitmap(filePath, b1);
FileOutputStream fos = new FileOutputStream(filePath);
b.compress(Bitmap.CompressFormat.PNG,100,fos);
fos.close();
showImg.setImageBitmap(b);
}
Have you tried doing it like this?
Assuming bitmap is the Bitmap you want to save.
Also, take a look at some of the existing system directories.
try (FileOutputStream fos = new FileOutputStream(new File(filepath + "_scaled.jpg"))) {
    bitmap.compress(Bitmap.CompressFormat.JPEG, 90, fos);
} catch (IOException e) {
    // handle the exception
}
The first parameter of Bitmap.compress() is your desired output format (see CompressFormat) and the second parameter is the compression quality.
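For example, to write a lossless PNG instead of a 90%-quality JPEG, the call would look roughly like this (the quality argument is ignored for PNG):

// PNG is lossless; the quality parameter (100 here) has no effect for this format.
bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);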
OK, I found out what was missing.
I had to go through a byte array to convert my bitmap to a file:
String filepathcomp = Environment.getExternalStorageDirectory()+"/SmartCollecte/PARC/OUT/"+ fichano + "_" + conteneur_s+"_"+cpt+".jpg";
File f = new File(filepathcomp);
Bitmap newbitmap = b;
ByteArrayOutputStream bos = new ByteArrayOutputStream();
newbitmap.compress(Bitmap.CompressFormat.JPEG,80,bos);
byte[] bitmapdata = bos.toByteArray();
FileOutputStream fos = new FileOutputStream(f);
fos.write(bitmapdata);
fos.flush();
fos.close();
I'm getting preview frames using OnImageAvailableListener:
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
byte[] data = new byte[buffer.capacity()];
buffer.get(data);
//data.length=332803; width=3264; height=2448
Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
//TODO data processing
} catch (Exception e) {
e.printStackTrace();
}
if (image != null) {
image.close();
}
}
Each time, the length of data is different, but the image width and height are the same.
Main problem: data.length is too small for a resolution such as 3264x2448.
The size of the data array should be 3264 * 2448 = 7,990,272, not 300,000 - 600,000.
What is wrong?
imageReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 5);
I solved this problem by using YUV_420_888 image format and converting it to JPEG image format manually.
imageReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
ImageFormat.YUV_420_888, 5);
imageReader.setOnImageAvailableListener(this, null);
Surface imageSurface = imageReader.getSurface();
List<Surface> surfaceList = new ArrayList<>();
//...add other surfaces
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(imageSurface);
surfaceList.add(imageSurface);
cameraDevice.createCaptureSession(surfaceList,
new CameraCaptureSession.StateCallback() {
//...implement onConfigured, onConfigureFailed for StateCallback
}, null);
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
if (image != null) {
//converting to JPEG
byte[] jpegData = ImageUtil.imageToByteArray(image);
//write to file (for example ..some_path/frame.jpg)
FileManager.writeFrame(FILE_NAME, jpegData);
image.close();
}
}
public final class ImageUtil {
public static byte[] imageToByteArray(Image image) {
byte[] data = null;
if (image.getFormat() == ImageFormat.JPEG) {
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
data = new byte[buffer.capacity()];
buffer.get(data);
return data;
} else if (image.getFormat() == ImageFormat.YUV_420_888) {
data = NV21toJPEG(
YUV_420_888toNV21(image),
image.getWidth(), image.getHeight());
}
return data;
}
private static byte[] YUV_420_888toNV21(Image image) {
// Note: this shortcut assumes the chroma planes are interleaved (pixelStride 2)
// with no row padding, which holds on most devices but is not guaranteed.
byte[] nv21;
ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();
int ySize = yBuffer.remaining();
int vuSize = vuBuffer.remaining();
nv21 = new byte[ySize + vuSize];
yBuffer.get(nv21, 0, ySize);
vuBuffer.get(nv21, ySize, vuSize);
return nv21;
}
private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
return out.toByteArray();
}
}
public final class FileManager {
public static void writeFrame(String fileName, byte[] data) {
try {
BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(fileName));
bos.write(data);
bos.flush();
bos.close();
// Log.e(TAG, "" + data.length + " bytes have been written to " + filesDir + fileName + ".jpg");
} catch (IOException e) {
e.printStackTrace();
}
}
}
I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance part).
In my case, I usually transform my image to a byte[] in this way:
Image m_img;
Log.v(LOG_TAG,"Format -> "+m_img.getFormat());
Image.Plane Y = m_img.getPlanes()[0];
Image.Plane U = m_img.getPlanes()[1];
Image.Plane V = m_img.getPlanes()[2];
int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();
byte[] data = new byte[Yb + Ub + Vb];
//your data length should be this byte array length.
Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb+ Ub, Vb);
final int width = m_img.getWidth();
final int height = m_img.getHeight();
And I use this byte buffer to transform to RGB.
Hope this helps.
Cheers.
Unai.
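As a rough illustration of the "transform to RGB" step Unai mentions: one convenient (though not particularly fast) route is to reuse the YUV_420_888toNV21() helper from the answer above and let YuvImage plus BitmapFactory do the conversion. This is only a sketch; it assumes the method lives in the same class as that (private) helper and it inherits that helper's assumption about interleaved, unpadded chroma planes.

// Illustrative: YUV_420_888 Image -> NV21 -> JPEG -> RGB Bitmap.
public static Bitmap yuvImageToBitmap(Image image) {
    byte[] nv21 = YUV_420_888toNV21(image);
    YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, image.getWidth(), image.getHeight(), null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, image.getWidth(), image.getHeight()), 100, out);
    byte[] jpeg = out.toByteArray();
    return BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
}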
Your code is requesting JPEG-format images, which are compressed. They'll change in size for every frame, and they'll be much smaller than the uncompressed image. If you want to do nothing besides save JPEG images, you can just save what you have in the byte[] data to disk and you're done.
If you want to actually do something with the JPEG, you can use BitmapFactory.decodeByteArray() to convert it to a Bitmap, for example, though that's pretty inefficient.
Or you can switch to YUV, which is more efficient, but you need to do more work to get a Bitmap out of it.
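For the JPEG path, here is a minimal sketch of turning the byte[] data from onImageAvailable() into a Bitmap; inSampleSize is optional, and the factor 4 is just an example for when only a downscaled preview is needed:

// Illustrative: decode the JPEG bytes delivered by the ImageReader.
BitmapFactory.Options options = new BitmapFactory.Options();
options.inSampleSize = 4; // decode at 1/4 of the width and height (optional)
Bitmap frame = BitmapFactory.decodeByteArray(data, 0, data.length, options);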
I was trying to copy a picture from a URI to a file path. Then I read the picture from the path, but the picture I got was rotated by 90 degrees. Below is my function. Can anybody help with this?
public boolean copyPicture(Context context, Uri source, String dest) {
boolean result = false;
int bytesum = 0;
int byteread = 0;
File destFile = new File(dest);
String scheme = source.getScheme();
if (ContentResolver.SCHEME_CONTENT.equals(scheme)
|| ContentResolver.SCHEME_FILE.equals(scheme)) {
InputStream inStream = null;
try {
inStream = context.getContentResolver().openInputStream(source);
if (!destFile.exists()) {
result = destFile.createNewFile();
}
if (result) {
FileOutputStream fs = new FileOutputStream(dest);
byte[] buffer = new byte[1024];
while ((byteread = inStream.read(buffer)) != -1) {
bytesum += byteread; // running byte count (file size)
System.out.println(bytesum);
fs.write(buffer, 0, byteread);
}
inStream.close();
fs.flush();
fs.close();
}
} catch (Exception e) {
e.printStackTrace();
result = false;
}
}
return result;
}
ExifInterface is the answer. It allows you to read specific attributes from an image file.
Bitmap bitmap = BitmapFactory.decodeFile(path, options);
try {
ExifInterface exif = new ExifInterface(path);
int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);
int angle = 0;
if (orientation == ExifInterface.ORIENTATION_ROTATE_90) {
angle = 90;
}
else if (orientation == ExifInterface.ORIENTATION_ROTATE_180) {
angle = 180;
}
else if (orientation == ExifInterface.ORIENTATION_ROTATE_270) {
angle = 270;
}
Matrix mat = new Matrix();
mat.postRotate(angle);
Bitmap correctBmp = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), mat, true);
bitmap=correctBmp;
}
catch (Exception e) {
e.printStackTrace();
}
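If the goal is that the copied file itself opens upright afterwards, one option (a sketch, not the only approach; copying the EXIF orientation tag to the new file would also work) is to re-encode the rotated bitmap over the destination path instead of only rotating it in memory. Here dest is the destination path from copyPicture() above, and the 90% JPEG quality is arbitrary:

// Illustrative: persist the rotated bitmap so later reads of 'dest' are upright.
FileOutputStream out = null;
try {
    out = new FileOutputStream(dest);
    correctBmp.compress(Bitmap.CompressFormat.JPEG, 90, out);
} catch (IOException e) {
    e.printStackTrace();
} finally {
    if (out != null) {
        try { out.close(); } catch (IOException ignored) { }
    }
}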
I have this code where I get an InputStream and create an image:
Part file;
// more code
try {
InputStream is = file.getInputStream();
File f = new File("C:\\ImagenesAlmacen\\QR\\olaKeAse.jpg");
OutputStream os = new FileOutputStream(f);
byte[] buf = new byte[1024];
int len;
while ((len = is.read(buf)) > 0) {
os.write(buf, 0, len);
}
os.close();
is.close();
} catch (IOException e) {
System.out.println("Error");
}
The problem is that I have to resize that image before I create it from the InputStream.
So how do I resize what I get from the InputStream and then create the resized image? I want to set the largest side of the image to 180 px and resize the other side in proportion.
Example:
Image = 289px * 206px
Resized image = 180px * 128px
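A small sketch of the proportional calculation, assuming the image has been read with ImageIO.read() into a variable named image as in the code that follows; the result can then be passed to a resize routine such as the createResizedCopy() shown below:

// Illustrative: make the largest side 180 px and scale the other side proportionally.
int maxSide = 180;
int srcWidth = image.getWidth(null);
int srcHeight = image.getHeight(null);
double scale = (double) maxSide / Math.max(srcWidth, srcHeight);
int targetWidth = (int) Math.round(srcWidth * scale);   // 289 -> 180
int targetHeight = (int) Math.round(srcHeight * scale); // 206 -> 128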
I did this:
try {
InputStream is = file.getInputStream();
Image image = ImageIO.read(is);
BufferedImage bi = this.createResizedCopy(image, 180, 180, false); // no alpha needed for JPEG output
ImageIO.write(bi, "jpg", new File("C:\\ImagenesAlmacen\\QR\\olaKeAse.jpg"));
} catch (IOException e) {
System.out.println("Error");
}
BufferedImage createResizedCopy(Image originalImage, int scaledWidth, int scaledHeight, boolean preserveAlpha) {
int imageType = preserveAlpha ? BufferedImage.TYPE_INT_ARGB : BufferedImage.TYPE_INT_RGB;
BufferedImage scaledBI = new BufferedImage(scaledWidth, scaledHeight, imageType);
Graphics2D g = scaledBI.createGraphics();
if (preserveAlpha) {
g.setComposite(AlphaComposite.Src);
}
g.drawImage(originalImage, 0, 0, scaledWidth, scaledHeight, null);
g.dispose();
return scaledBI;
}
And I did not use the other code.
Hope it helps someone!