Why is the Bitmap always null when decoded from an image byte array? - java

I have a problem in my application that I cannot solve. The application performs operations on images such as PNGs: the image is converted into a byte array, bitwise operations are performed on a portion of that array, and a new bitmap is then built from the modified bytes. The problem is that this new bitmap is always null; I don't understand why, and I don't know how to fix this bug.
// GetByte method from Image
private byte[] getByteImageData(String filePath) {
    /*
    Bitmap bitmap = BitmapFactory.decodeFile(filePath);
    Bitmap mutable = bitmap.copy(Bitmap.Config.RGB_565, true);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    mutable.compress(Bitmap.CompressFormat.PNG, 100, baos);
    return baos.toByteArray();
    */
    byte[] _imagebytedata = new byte[1024];
    InputStream _input = null;
    try {
        if (filePath != null && (filePath.length() > 0)) {
            // Create a file for the image
            File _fileimage = new File(filePath);
            if (_fileimage.exists()) {
                // Get the bytes from the image file
                _input = new BufferedInputStream(new FileInputStream(_fileimage));
                _imagebytedata = new byte[(int) _fileimage.length()];
                _input.read(_imagebytedata, 0, (int) _fileimage.length());
                _input.close();
            }
        }
    } catch (Exception e) {
    }
    return _imagebytedata;
}
// Bitwise operations
private byte[] Text(byte[] imagedata, byte[] textmess, int offset) {
    for (int i = 0; i < textmess.length; ++i) {
        int add = textmess[i];
        for (int bit = 7; bit >= 0; --bit, ++offset) {
            int b = (add >>> bit) & 1;
            imagedata[offset] = (byte) ((imagedata[offset] & 0xFE) | b);
        }
    }
    return imagedata;
}
// Save image from new byte array
private boolean saveImage(String pathFile, byte[] encodedimage) {
    OutputStream _output = null;
    File _newFileImage = new File(pathFile);
    byte[] _encodedimage = encodedimage;
    //Bitmap _imagebitmap = BitmapFactory.decodeByteArray(encodedimage, 0, encodedimage.length);
    if (_newFileImage.exists()) {
        try {
            _output = new BufferedOutputStream(new FileOutputStream(_newFileImage));
            _output.write(_encodedimage, 0, _encodedimage.length);
            _output.flush();
            _output.close();
            return true;
        } catch (Exception e) {
        }
    } // _newFileImage.exists()
    return false;
}
public boolean encodeTextInFile(String filepath, String text) {
    byte[] _newimagebytedata;
    byte[] _imagebytedata = getByteImageData(filepath);
    byte[] _textbytedata = text.getBytes();
    byte[] _lengthbytedata = byteConversion(text.length());
    Bitmap _bitmapunu = BitmapFactory.decodeByteArray(_imagebytedata, 0, _imagebytedata.length);
    _newimagebytedata = Text(_imagebytedata, _lengthbytedata, 33);
    Bitmap _bitmapdoi = BitmapFactory.decodeByteArray(_newimagebytedata, 0, _newimagebytedata.length);
    // The value of variable _bitmapdoi is null
    _newimagebytedata = Text(_imagebytedata, _textbytedata, 65);
    return saveImage(filepath, _newimagebytedata);
}

It looks as if you are trying to encode a text message in the lower bits of the image (if I understand your code correctly). I actually used this as a Christmas card for fellow geeks this year.
However, when you call Text you encode the text into the byte[] of the image file, thus probably destroying the image (unless you are very lucky). You probably want to apply the text bytes to the decoded image (the Bitmap _bitmapunu) instead.
The javadoc for BitmapFactory.decodeByteArray says that it returns null if the image cannot be decoded.
This is what you need to do:
1. Read the image bytes from the file, say fileArray.
2. Decode fileArray into actual pixels, imageArray.
3. Manipulate the pixels in imageArray.
4. Encode the pixels into an image format again (such as PNG), say newFileArray.
5. Store newFileArray to a file.
What you seem to be doing is trying to manipulate the bytes in fileArray directly, thus breaking the file format and making it impossible to decode the bytes into pixels.
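A minimal sketch of that pipeline on Android, assuming the message goes into the least significant bit of each pixel's blue channel (the method name embedText and the bit layout are illustrative, not code from the question):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import java.io.FileOutputStream;
import java.io.IOException;

// Decode the file into pixels, hide one message byte per 8 pixels in the blue LSB,
// then re-encode as PNG (lossless, so the modified bits survive).
// Note: this sketch does not check that the message actually fits into the image.
private boolean embedText(String filePath, byte[] message) throws IOException {
    Bitmap decoded = BitmapFactory.decodeFile(filePath);
    if (decoded == null) {
        return false; // not a decodable image file
    }
    Bitmap mutable = decoded.copy(Bitmap.Config.ARGB_8888, true);
    int width = mutable.getWidth();
    for (int i = 0; i < message.length; i++) {
        for (int bit = 7; bit >= 0; bit--) {
            int pixelIndex = i * 8 + (7 - bit);
            int x = pixelIndex % width;
            int y = pixelIndex / width;
            int pixel = mutable.getPixel(x, y);
            int b = (message[i] >>> bit) & 1;
            mutable.setPixel(x, y, (pixel & 0xFFFFFFFE) | b); // replace the blue LSB
        }
    }
    FileOutputStream out = new FileOutputStream(filePath);
    try {
        return mutable.compress(Bitmap.CompressFormat.PNG, 100, out);
    } finally {
        out.close();
    }
}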

Related

Android Camera2 API YUV_420_888 to JPEG

I'm getting preview frames using OnImageAvailableListener:
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
byte[] data = new byte[buffer.capacity()];
buffer.get(data);
//data.length=332803; width=3264; height=2448
Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
//TODO data processing
} catch (Exception e) {
e.printStackTrace();
}
if (image != null) {
image.close();
}
}
Each time length of data is different but image width and height are the same.
Main problem: data.length is too small for such resolution as 3264x2448.
Size of data array should be 3264*2448=7,990,272, not 300,000 - 600,000.
What is wrong?
imageReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 5);
I solved this problem by using YUV_420_888 image format and converting it to JPEG image format manually.
imageReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
ImageFormat.YUV_420_888, 5);
imageReader.setOnImageAvailableListener(this, null);
Surface imageSurface = imageReader.getSurface();
List<Surface> surfaceList = new ArrayList<>();
//...add other surfaces
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(imageSurface);
surfaceList.add(imageSurface);
cameraDevice.createCaptureSession(surfaceList,
new CameraCaptureSession.StateCallback() {
//...implement onConfigured, onConfigureFailed for StateCallback
}, null);
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
if (image != null) {
//converting to JPEG
byte[] jpegData = ImageUtil.imageToByteArray(image);
//write to file (for example ..some_path/frame.jpg)
FileManager.writeFrame(FILE_NAME, jpegData);
image.close();
}
}
public final class ImageUtil {
public static byte[] imageToByteArray(Image image) {
byte[] data = null;
if (image.getFormat() == ImageFormat.JPEG) {
Image.Plane[] planes = image.getPlanes();
ByteBuffer buffer = planes[0].getBuffer();
data = new byte[buffer.capacity()];
buffer.get(data);
return data;
} else if (image.getFormat() == ImageFormat.YUV_420_888) {
data = NV21toJPEG(
YUV_420_888toNV21(image),
image.getWidth(), image.getHeight());
}
return data;
}
private static byte[] YUV_420_888toNV21(Image image) {
byte[] nv21;
ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();
int ySize = yBuffer.remaining();
int vuSize = vuBuffer.remaining();
nv21 = new byte[ySize + vuSize];
yBuffer.get(nv21, 0, ySize);
vuBuffer.get(nv21, ySize, vuSize);
return nv21;
}
private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
ByteArrayOutputStream out = new ByteArrayOutputStream();
YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
return out.toByteArray();
}
}
public final class FileManager {
public static void writeFrame(String fileName, byte[] data) {
try {
BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(fileName));
bos.write(data);
bos.flush();
bos.close();
// Log.e(TAG, "" + data.length + " bytes have been written to " + filesDir + fileName + ".jpg");
} catch (IOException e) {
e.printStackTrace();
}
}
}
I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance part).
In my case, I usually transform my image to byte[] in this way.
Image m_img;
Log.v(LOG_TAG,"Format -> "+m_img.getFormat());
Image.Plane Y = m_img.getPlanes()[0];
Image.Plane U = m_img.getPlanes()[1];
Image.Plane V = m_img.getPlanes()[2];
int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();
byte[] data = new byte[Yb + Ub + Vb];
//your data length should be this byte array length.
Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb+ Ub, Vb);
final int width = m_img.getWidth();
final int height = m_img.getHeight();
And I use this byte buffer to transform to RGB; the per-pixel conversion is sketched below.
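A rough sketch of that per-pixel step, assuming you have already extracted the Y, U and V values for one pixel from the planes (doing that correctly depends on each plane's row and pixel stride, which is not shown here):

// Convert one YUV triple (Y in 0..255, U/V centred on 128) into a packed ARGB int,
// using the common integer approximation of the BT.601 conversion.
private static int yuvToArgb(int y, int u, int v) {
    int c = y - 16;
    int d = u - 128;
    int e = v - 128;
    int r = (298 * c + 409 * e + 128) >> 8;
    int g = (298 * c - 100 * d - 208 * e + 128) >> 8;
    int b = (298 * c + 516 * d + 128) >> 8;
    r = Math.max(0, Math.min(255, r));
    g = Math.max(0, Math.min(255, g));
    b = Math.max(0, Math.min(255, b));
    return 0xFF000000 | (r << 16) | (g << 8) | b;
}

The resulting int[] of ARGB values can then be handed to Bitmap.createBitmap(colors, width, height, Bitmap.Config.ARGB_8888) to get a displayable image.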
Hope this helps.
Your code is requesting JPEG-format images, which are compressed. They'll change in size for every frame, and they'll be much smaller than the uncompressed image. If you want to do nothing besides save JPEG images, you can just save what you have in the byte[] data to disk and you're done.
If you want to actually do something with the JPEG, you can use BitmapFactory.decodeByteArray() to convert it to a Bitmap, for example, though that's pretty inefficient.
Or you can switch to YUV, which is more efficient, but you need to do more work to get a Bitmap out of it.
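For example, a rough sketch of both options using the byte[] data from the listener above (the output path is only an illustration):

// Option 1: the buffer already holds a complete JPEG file, so write it out as-is.
try (FileOutputStream fos = new FileOutputStream("/sdcard/frame.jpg")) {
    fos.write(data);
}

// Option 2: decode the compressed JPEG into a Bitmap when you need the pixels.
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
if (bitmap == null) {
    // the bytes were not a complete, valid JPEG
}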

Image transferring between JavaFX and Android via Bluetooth

I'm trying to implement a simple file transferring app.
My app works like this:
1. Capture the current camera preview on Android
2. Send it to the JavaFX application via Bluetooth
3. When the JavaFX app receives the image, save it and show it in the window
4. After some drawing over the image, capture it and send it back to Android
I implemented it like this on the Android side first.
I created a kind of packet which contains the file size, the actual data and an EOF marker.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
data.compress(Bitmap.CompressFormat.JPEG, 100, baos);
// filesize + data + end of file
String fileSize = String.valueOf(baos.size());
fileSize += '\0'; // end of the filesize string
String eof = "eof"; // end of packet
// set packet size
int packetSize = fileSize.length() + baos.size() + eof.length();
ByteBuffer byteBuffer = ByteBuffer.allocate(packetSize);
byteBuffer.put(fileSize.getBytes());
byteBuffer.put(baos.toByteArray());
byteBuffer.put(eof.getBytes());
byte[] data = new byte[byteBuffer.capacity()];
byteBuffer.position(0);
byteBuffer.get(data);
and send the data to bluetooth outputstream socket
On JavaFX side,
bytes = btIn.read(buffer);
// receive packet data
if (fileSize == null) {
for (int i = 0; i < bytes; i++) {
if (buffer[i] == '\0') {
fileSize = new String(buffer, 0, i);
break;
}
}
}
// eof offset
int offset = bytes - 3;
byte[] eofByte = new byte[3];
eofByte = Arrays.copyOfRange(buffer, offset, bytes);
String message = new String(eofByte, 0, 3);
if (message.equals("eof")) {
eof = true;
fos.write(buffer, 0, bytes-3);
} else {
// set buffer size to file size
if (fileBuffer == null) {
fileBuffer = ByteBuffer.allocate(Integer.parseInt(fileSize));
fileBuffer.put(buffer, fileSize.length()+1, bytes);
fos.write(buffer, fileSize.length()+1, bytes);
} else {
fileBuffer.put(buffer, 0, bytes);
fos.write(buffer, 0, bytes);
// for progress bar
percent = (float) fileBuffer.position() / (float) fileBuffer.capacity();
}
}
log(String.valueOf(percent));
if (eof) {
byte[] data = new byte[fileBuffer.capacity()];
fileBuffer.position(0);
fileBuffer.get(data);
ByteArrayInputStream input = new ByteArrayInputStream(data);
bufferedImage = ImageIO.read(input);
image = SwingFXUtils.toFXImage(bufferedImage, null);
if (bufferedImage != null) {
log("got the image");
}
// set null fileBuffer and fileSize for next images
fileSize = null;
fileBuffer = null;
}
}
The above code is for receiving the image from Android,
and the sending part on JavaFX is:
WritableImage writableImage = new WritableImage((int)canvas.getWidth(), (int)canvas.getHeight());
canvas.snapshot(null, writableImage);
BufferedImage bufferedImage = SwingFXUtils.fromFXImage(writableImage, null);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try {
ImageIO.write(bufferedImage, "jpg", baos);
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
byte[] imageInBytes = baos.toByteArray();
log(String.valueOf(imageInBytes.length));
bct.write(imageInBytes);
String eof = "end of file";
byte[] eofbyte = eof.getBytes();
bct.write(eofbyte);
The sending and receiving parts work fine,
but I have problems with the resulting images.
This is the JavaFX side when it received the image from Android; as you can see, the leftmost part of the image is not as expected,
and even stranger, after receiving the image back from the JavaFX side, the resulting image on Android looks like this:
My question is: how should I fix the code to get correct images?
I think you were hit by this bug:
https://bugs.openjdk.java.net/browse/JDK-8041459
You can avoid this problem by using PNG instead of JPG. Another option is to explicitly convert the image into an image without an alpha component before storing it; see the sketch below.
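A sketch of that second option, redrawing the snapshot onto an opaque TYPE_INT_RGB image before handing it to ImageIO (this reuses writableImage and baos from the sending snippet above and assumes java.awt.Graphics2D is imported; it is the usual workaround, not code from the question):

BufferedImage withAlpha = SwingFXUtils.fromFXImage(writableImage, null);
// JPEG has no alpha channel, so redraw onto an opaque image first.
BufferedImage opaque = new BufferedImage(
        withAlpha.getWidth(), withAlpha.getHeight(), BufferedImage.TYPE_INT_RGB);
Graphics2D g = opaque.createGraphics();
g.drawImage(withAlpha, 0, 0, java.awt.Color.WHITE, null);
g.dispose();
ImageIO.write(opaque, "jpg", baos);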

Servlet getContentLength() returns > 0 but getInputStream().available() returns 0 [duplicate]

How do I read an entire InputStream into a byte array?
You can use Apache Commons IO to handle this and similar tasks.
The IOUtils type has a static method to read an InputStream and return a byte[].
InputStream is;
byte[] bytes = IOUtils.toByteArray(is);
Internally this creates a ByteArrayOutputStream and copies the bytes to the output, then calls toByteArray(). It handles large files by copying the bytes in blocks of 4KiB.
You need to read each byte from your InputStream and write it to a ByteArrayOutputStream.
You can then retrieve the underlying byte array by calling toByteArray():
InputStream is = ...
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[16384];
while ((nRead = is.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
return buffer.toByteArray();
Finally, after twenty years, there’s a simple solution without the need for a 3rd party library, thanks to Java 9:
InputStream is;
…
byte[] array = is.readAllBytes();
Note also the convenience methods readNBytes(byte[] b, int off, int len) and transferTo(OutputStream) addressing recurring needs.
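For instance, a small sketch of those two convenience methods (the 16-byte header length is made up for illustration):

byte[] header = new byte[16];
int headerRead = is.readNBytes(header, 0, 16); // keeps reading until 16 bytes or EOF

ByteArrayOutputStream rest = new ByteArrayOutputStream();
is.transferTo(rest);                           // copies everything that is left
byte[] body = rest.toByteArray();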
Use vanilla Java's DataInputStream and its readFully Method (exists since at least Java 1.4):
...
byte[] bytes = new byte[(int) file.length()];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
dis.readFully(bytes);
...
There are some other flavors of this method, but I use this all the time for this use case.
If you happen to use Google Guava, it'll be as simple as using ByteStreams:
byte[] bytes = ByteStreams.toByteArray(inputStream);
Safe solution (close streams correctly):
Java 9 and newer:
final byte[] bytes;
try (inputStream) {
bytes = inputStream.readAllBytes();
}
Java 8 and older:
public static byte[] readAllBytes(InputStream inputStream) throws IOException {
final int bufLen = 4 * 0x400; // 4KB
byte[] buf = new byte[bufLen];
int readLen;
IOException exception = null;
try {
try (ByteArrayOutputStream outputStream = new ByteArrayOutputStream()) {
while ((readLen = inputStream.read(buf, 0, bufLen)) != -1)
outputStream.write(buf, 0, readLen);
return outputStream.toByteArray();
}
} catch (IOException e) {
exception = e;
throw e;
} finally {
if (exception == null) inputStream.close();
else try {
inputStream.close();
} catch (IOException e) {
exception.addSuppressed(e);
}
}
}
Kotlin (when Java 9+ isn't accessible):
@Throws(IOException::class)
fun InputStream.readAllBytes(): ByteArray {
val bufLen = 4 * 0x400 // 4KB
val buf = ByteArray(bufLen)
var readLen: Int = 0
ByteArrayOutputStream().use { o ->
this.use { i ->
while (i.read(buf, 0, bufLen).also { readLen = it } != -1)
o.write(buf, 0, readLen)
}
return o.toByteArray()
}
}
To avoid nested use see here.
Scala (when Java 9+ isn't accessible) (by @Joan, thanks):
def readAllBytes(inputStream: InputStream): Array[Byte] =
Stream.continually(inputStream.read).takeWhile(_ != -1).map(_.toByte).toArray
As always, also Spring framework (spring-core since 3.2.2) has something for you: StreamUtils.copyToByteArray()
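A minimal usage sketch, assuming spring-core (3.2.2 or later) is already on the classpath:

import org.springframework.util.StreamUtils;

byte[] bytes = StreamUtils.copyToByteArray(inputStream);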
public static byte[] getBytesFromInputStream(InputStream is) throws IOException {
ByteArrayOutputStream os = new ByteArrayOutputStream();
byte[] buffer = new byte[0xFFFF];
for (int len = is.read(buffer); len != -1; len = is.read(buffer)) {
os.write(buffer, 0, len);
}
return os.toByteArray();
}
In-case someone is still looking for a solution without dependency and If you have a file.
DataInputStream
byte[] data = new byte[(int) file.length()];
DataInputStream dis = new DataInputStream(new FileInputStream(file));
dis.readFully(data);
dis.close();
ByteArrayOutputStream
InputStream is = new FileInputStream(file);
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[(int) file.length()];
while ((nRead = is.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
RandomAccessFile
RandomAccessFile raf = new RandomAccessFile(file, "r");
byte[] data = new byte[(int) raf.length()];
raf.readFully(data);
Do you really need the image as a byte[]? What exactly do you expect in the byte[] - the complete content of an image file, encoded in whatever format the image file is in, or RGB pixel values?
Other answers here show you how to read a file into a byte[]. Your byte[] will contain the exact contents of the file, and you'd need to decode that to do anything with the image data.
Java's standard API for reading (and writing) images is the ImageIO API, which you can find in the package javax.imageio. You can read in an image from a file with just a single line of code:
BufferedImage image = ImageIO.read(new File("image.jpg"));
This will give you a BufferedImage, not a byte[]. To get at the image data, you can call getRaster() on the BufferedImage. This will give you a Raster object, which has methods to access the pixel data (it has several getPixel() / getPixels() methods).
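A short sketch of getting at the pixel data that way (the file name is just an example):

BufferedImage image = ImageIO.read(new File("image.jpg"));
Raster raster = image.getRaster();

// One pixel as an int[] of band samples (e.g. {R, G, B} for a typical RGB image)
int[] pixel = raster.getPixel(0, 0, (int[]) null);

// Or all samples of the first row at once
int[] firstRow = raster.getPixels(0, 0, image.getWidth(), 1, (int[]) null);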
Look up the API documentation for javax.imageio.ImageIO, java.awt.image.BufferedImage, java.awt.image.Raster etc.
ImageIO supports a number of image formats by default: JPEG, PNG, BMP, WBMP and GIF. It's possible to add support for more formats (you'd need a plug-in that implements the ImageIO service provider interface).
See also the following tutorial: Working with Images
If you don't want to use the Apache commons-io library, this snippet is taken from the sun.misc.IOUtils class. It's nearly twice as fast as the common implementation using ByteBuffers:
public static byte[] readFully(InputStream is, int length, boolean readAll)
throws IOException {
byte[] output = {};
if (length == -1) length = Integer.MAX_VALUE;
int pos = 0;
while (pos < length) {
int bytesToRead;
if (pos >= output.length) { // Only expand when there's no room
bytesToRead = Math.min(length - pos, output.length + 1024);
if (output.length < pos + bytesToRead) {
output = Arrays.copyOf(output, pos + bytesToRead);
}
} else {
bytesToRead = output.length - pos;
}
int cc = is.read(output, pos, bytesToRead);
if (cc < 0) {
if (readAll && length != Integer.MAX_VALUE) {
throw new EOFException("Detect premature EOF");
} else {
if (output.length != pos) {
output = Arrays.copyOf(output, pos);
}
break;
}
}
pos += cc;
}
return output;
}
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
while (true) {
int r = in.read(buffer);
if (r == -1) break;
out.write(buffer, 0, r);
}
byte[] ret = out.toByteArray();
@Adamski: You can avoid the buffer entirely.
Code copied from http://www.exampledepot.com/egs/java.io/File2ByteArray.html (Yes, it is very verbose, but it needs half as much memory as the other solution.)
// Returns the contents of the file in a byte array.
public static byte[] getBytesFromFile(File file) throws IOException {
InputStream is = new FileInputStream(file);
// Get the size of the file
long length = file.length();
// You cannot create an array using a long type.
// It needs to be an int type.
// Before converting to an int type, check
// to ensure that file is not larger than Integer.MAX_VALUE.
if (length > Integer.MAX_VALUE) {
// File is too large
}
// Create the byte array to hold the data
byte[] bytes = new byte[(int)length];
// Read in the bytes
int offset = 0;
int numRead = 0;
while (offset < bytes.length
&& (numRead=is.read(bytes, offset, bytes.length-offset)) >= 0) {
offset += numRead;
}
// Ensure all the bytes have been read in
if (offset < bytes.length) {
throw new IOException("Could not completely read file "+file.getName());
}
// Close the input stream and return bytes
is.close();
return bytes;
}
InputStream is ...
ByteArrayOutputStream bos = new ByteArrayOutputStream();
int next = in.read();
while (next > -1) {
bos.write(next);
next = in.read();
}
bos.flush();
byte[] result = bos.toByteArray();
bos.close();
Java 9 will give you finally a nice method:
InputStream in = ...;
ByteArrayOutputStream bos = new ByteArrayOutputStream();
in.transferTo( bos );
byte[] bytes = bos.toByteArray();
We are seeing some delay on a few AWS transactions while converting an S3 object to a byte array.
Note: the S3 object is a PDF document (max size is 3 MB).
We are using option #1 (org.apache.commons.io.IOUtils) to convert the S3 object to a byte array. We have noticed that the AWS SDK provides a built-in IOUtils method to convert the S3 object to a byte array, so please confirm what the best way is to convert the S3 object to a byte array and avoid the delay.
Option #1:
import org.apache.commons.io.IOUtils;
is = s3object.getObjectContent();
content =IOUtils.toByteArray(is);
Option #2:
import com.amazonaws.util.IOUtils;
is = s3object.getObjectContent();
content =IOUtils.toByteArray(is);
Also let me know if there is any other better way to convert the S3 object to a byte array.
I know it's too late, but here is what I think is a cleaner and more readable solution...
/**
* method converts {@link InputStream} Object into byte[] array.
*
* @param stream the {@link InputStream} Object.
* @return the byte[] array representation of received {@link InputStream} Object.
* @throws IOException if an error occurs.
*/
public static byte[] streamToByteArray(InputStream stream) throws IOException {
byte[] buffer = new byte[1024];
ByteArrayOutputStream os = new ByteArrayOutputStream();
int line = 0;
// read bytes from stream, and store them in buffer
while ((line = stream.read(buffer)) != -1) {
// Writes bytes from byte array (buffer) into output stream.
os.write(buffer, 0, line);
}
stream.close();
os.flush();
os.close();
return os.toByteArray();
}
I tried to edit @numan's answer with a fix for writing garbage data, but the edit was rejected. While this short piece of code is nothing brilliant, I can't see any other better answer. Here's what makes the most sense to me:
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[1024]; // you can configure the buffer size
int length;
while ((length = in.read(buffer)) != -1) out.write(buffer, 0, length); //copy streams
in.close(); // call this in a finally block
byte[] result = out.toByteArray();
By the way, ByteArrayOutputStream need not be closed. try/finally constructs omitted for readability.
See the InputStream.available() documentation:
It is particularly important to realize that you must not use this
method to size a container and assume that you can read the entirety
of the stream without needing to resize the container. Such callers
should probably write everything they read to a ByteArrayOutputStream
and convert that to a byte array. Alternatively, if you're reading
from a file, File.length returns the current length of the file
(though assuming the file's length can't change may be incorrect,
reading a file is inherently racy).
Wrap it in a DataInputStream. If that is off the table for some reason, just use read to hammer on it until it gives you -1 or the entire block you asked for.
public int readFully(InputStream in, byte[] data) throws IOException {
int offset = 0;
int bytesRead;
boolean read = false;
while ((bytesRead = in.read(data, offset, data.length - offset)) != -1) {
read = true;
offset += bytesRead;
if (offset >= data.length) {
break;
}
}
return (read) ? offset : -1;
}
Java 8 way (thanks to BufferedReader and Adam Bien)
private static byte[] readFully(InputStream input) throws IOException {
try (BufferedReader buffer = new BufferedReader(new InputStreamReader(input))) {
return buffer.lines().collect(Collectors.joining("\n")).getBytes(<charset_can_be_specified>);
}
}
Note that this solution wipes carriage return ('\r') and can be inappropriate.
Another case, for getting a correct byte array via a stream after sending a request to the server and waiting for the response:
/**
* Begin setup TCP connection to PC app
* to open integrate connection between mobile app and pc app (or mobile app)
*/
mSocket = new Socket(IP, port);
// mSocket.setSoTimeout(30000);
DataOutputStream mDos = new DataOutputStream(mSocket.getOutputStream());
String str = "MobileRequest#" + params[0] + "#<EOF>";
mDos.write(str.getBytes());
try {
Thread.sleep(1000);
} catch (InterruptedException e) {
e.printStackTrace();
}
/* Since data is received as bytes, it will all be collected in the
following byte array, which is initialised with the length of the received data. */
DataInputStream mDis = new DataInputStream(mSocket.getInputStream());
byte[] data = new byte[mDis.available()];
// Collecting data into byte array
for (int i = 0; i < data.length; i++)
data[i] = mDis.readByte();
// Converting collected data in byte array into String.
String RESPONSE = new String(data);
You're doing an extra copy if you use ByteArrayOutputStream. If you know the length of the stream before you start reading it (e.g. the InputStream is actually a FileInputStream, and you can call file.length() on the file, or the InputStream is a zipfile entry InputStream, and you can call zipEntry.length()), then it's far better to write directly into the byte[] array -- it uses half the memory, and saves time.
// Read the file contents into a byte[] array
byte[] buf = new byte[inputStreamLength];
int bytesRead = Math.max(0, inputStream.read(buf));
// If needed: for safety, truncate the array if the file may somehow get
// truncated during the read operation
byte[] contents = bytesRead == inputStreamLength ? buf
: Arrays.copyOf(buf, bytesRead);
N.B. the last line above deals with files getting truncated while the stream is being read, if you need to handle that possibility. If the file gets longer while the stream is being read, the contents of the byte[] array will not be lengthened to include the new file content; the array will simply be truncated to the old length inputStreamLength.
I use this.
public static byte[] toByteArray(InputStream is) throws IOException {
ByteArrayOutputStream output = new ByteArrayOutputStream();
try {
byte[] b = new byte[4096];
int n = 0;
while ((n = is.read(b)) != -1) {
output.write(b, 0, n);
}
return output.toByteArray();
} finally {
output.close();
}
}
This is my copy-paste version:
@SuppressWarnings("empty-statement")
public static byte[] inputStreamToByte(InputStream is) throws IOException {
if (is == null) {
return null;
}
// Define a size if you have an idea of it.
ByteArrayOutputStream r = new ByteArrayOutputStream(2048);
byte[] read = new byte[512]; // Your buffer size.
for (int i; -1 != (i = is.read(read)); r.write(read, 0, i));
is.close();
return r.toByteArray();
}
Java 7 and later:
import sun.misc.IOUtils;
...
InputStream in = ...;
byte[] buf = IOUtils.readFully(in, -1, false);
You can try Cactoos:
byte[] array = new BytesOf(stream).bytes();
Here is an optimized version, that tries to avoid copying data bytes as much as possible:
private static byte[] loadStream (InputStream stream) throws IOException {
int available = stream.available();
int expectedSize = available > 0 ? available : -1;
return loadStream(stream, expectedSize);
}
private static byte[] loadStream (InputStream stream, int expectedSize) throws IOException {
int basicBufferSize = 0x4000;
int initialBufferSize = (expectedSize >= 0) ? expectedSize : basicBufferSize;
byte[] buf = new byte[initialBufferSize];
int pos = 0;
while (true) {
if (pos == buf.length) {
int readAhead = -1;
if (pos == expectedSize) {
readAhead = stream.read(); // test whether EOF is at expectedSize
if (readAhead == -1) {
return buf;
}
}
int newBufferSize = Math.max(2 * buf.length, basicBufferSize);
buf = Arrays.copyOf(buf, newBufferSize);
if (readAhead != -1) {
buf[pos++] = (byte)readAhead;
}
}
int len = stream.read(buf, pos, buf.length - pos);
if (len < 0) {
return Arrays.copyOf(buf, pos);
}
pos += len;
}
}
Solution in Kotlin (will work in Java too, of course), which includes both cases of when you know the size or not:
fun InputStream.readBytesWithSize(size: Long): ByteArray? {
return when {
size < 0L -> this.readBytes()
size == 0L -> ByteArray(0)
size > Int.MAX_VALUE -> null
else -> {
val sizeInt = size.toInt()
val result = ByteArray(sizeInt)
readBytesIntoByteArray(result, sizeInt)
result
}
}
}
fun InputStream.readBytesIntoByteArray(byteArray: ByteArray,bytesToRead:Int=byteArray.size) {
var offset = 0
while (true) {
val read = this.read(byteArray, offset, bytesToRead - offset)
if (read == -1)
break
offset += read
if (offset >= bytesToRead)
break
}
}
If you know the size, it saves you from using double the memory compared to the other solutions (for a brief moment, but it could still be useful). That's because the other solutions have to read the entire stream to the end and then convert it to a byte array (similar to an ArrayList that you convert to a plain array).
So, if you are on Android, for example, and you got some Uri to handle, you can try to get the size using this:
fun getStreamLengthFromUri(context: Context, uri: Uri): Long {
context.contentResolver.query(uri, arrayOf(MediaStore.MediaColumns.SIZE), null, null, null)?.use {
if (!it.moveToNext())
return#use
val fileSize = it.getLong(it.getColumnIndex(MediaStore.MediaColumns.SIZE))
if (fileSize > 0)
return fileSize
}
//if you wish, you can also get the file-path from the uri here, and then try to get its size, using this: https://stackoverflow.com/a/61835665/878126
FileUtilEx.getFilePathFromUri(context, uri, false)?.use {
val file = it.file
val fileSize = file.length()
if (fileSize > 0)
return fileSize
}
context.contentResolver.openInputStream(uri)?.use { inputStream ->
if (inputStream is FileInputStream)
return inputStream.channel.size()
else {
var bytesCount = 0L
while (true) {
val available = inputStream.available()
if (available == 0)
break
val skip = inputStream.skip(available.toLong())
if (skip < 0)
break
bytesCount += skip
}
if (bytesCount > 0L)
return bytesCount
}
}
return -1L
}
You can use the Cactoos library, which provides reusable object-oriented Java components.
OOP is emphasized by this library, so no static methods, NULLs, and so on; only real objects and their contracts (interfaces).
A simple operation like reading an InputStream can be performed like this:
final InputStream input = ...;
final Bytes bytes = new BytesOf(input);
final byte[] array = bytes.asBytes();
Assert.assertArrayEquals(
array,
new byte[]{65, 66, 67}
);
Having a dedicated type Bytes for working with the byte[] data structure enables us to use OOP tactics for the task at hand,
something that a procedural "utility" method would not let us do.
For example, suppose you need to encode the bytes you've read from this InputStream to Base64.
In this case you use the Decorator pattern and wrap the Bytes object in a Base64 implementation.
Cactoos already provides such an implementation:
final Bytes encoded = new BytesBase64(
new BytesOf(
new InputStreamOf("XYZ")
)
);
Assert.assertEquals(new TextOf(encoded).asString(), "WFla");
You can decode them in the same manner, using the Decorator pattern:
final Bytes decoded = new Base64Bytes(
new BytesBase64(
new BytesOf(
new InputStreamOf("XYZ")
)
)
);
Assert.assertEquals(new TextOf(decoded).asString(), "XYZ");
Whatever your task is, you will be able to create your own implementation of Bytes to solve it.

Using ImageIO.read to read image through inputstream [duplicate]

I am sending a BufferedImage over a socket and I am using the example found in this post:
Sender
BufferedImage image = ....;
ImageIO.write(image, "PNG", socket.getOutputStream());
Receiver
BufferedImage image = ImageIO.read(socket.getInputStream());
It works - IF, and ONLY IF, I close the sender's outputStream after this line:
ImageIO.write(image, "PNG", socket.getOutputStream());
Is there anything I can do apart from closing the outputStream?
Also, is there anything else I can do to avoid using ImageIO altogether? It seems to take ages to do anything.
Also note that reading from or writing to the hard disk in any way should be avoided at all costs due to performance issues. I need to make this transfer as fast as possible (I'm experimenting and trying to create a client similar to VNC, and saving each screenshot to the hard disk would greatly slow everything down).
@Jon Skeet
Edit 3:
Sender (note that I am sending a JPG image, not a PNG):
int filesize;
OutputStream out = c.getClientSocket().getOutputStream();
ByteArrayOutputStream bScrn = new ByteArrayOutputStream();
ImageIO.write(screenshot, "JPG", bScrn);
byte[] imgByte = bScrn.toByteArray();
bScrn.flush();
bScrn.close();
filesize = bScrn.size();
out.write(new String("#FS " + filesize).getBytes()); //Send filesize
out.write(new String("#<IM> \n").getBytes()); //Notify start of image
out.write(imgByte); //Write file
System.out.println("Finished");
Receiver (where input is the socket input stream):
Attempt #1:
String str = input.toString();
imageBytes = str.getBytes();
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage image = ImageIO.read(in);
in.close();
System.out.println("width=" + image.getWidth());
(failed: NullPointerException on the getWidth() line)
I understand this error to mean "corrupt image" because it couldn't be initialized. Correct?
Attempt #2:
byte[] imageBytes = new byte[filesize];
for (int j = 0; j < filesize; j++)
{
imageBytes[j] = (byte) input.read();
}
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage image = ImageIO.read(in);
in.close();
System.out.println("width=" + image.getWidth());
(failed: NullPointerException on the getWidth() line)
Attempt #3:
if (filesize > 0)
{
int writtenBytes = 0;
int bufferSize = client.getReceiveBufferSize();
imageBytes = new byte[filesize]; //Create a byte array as large as the image
byte[] buffer = new byte[bufferSize];//Create buffer
do {
writtenBytes += input.read(buffer); //Fill up buffer
System.out.println(writtenBytes + "/" + filesize); //Show progress
//Copy buffer to the byte array which will contain the full image
System.arraycopy(buffer, 0, imageBytes, writtenBytes, client.getReceiveBufferSize());
writtenBytes+=bufferSize;
} while ((writtenBytes + bufferSize) < filesize);
// Read the remaining bytes
System.arraycopy(buffer, 0, imageBytes, writtenBytes-1, filesize-writtenBytes);
writtenBytes += filesize-writtenBytes;
System.out.println("Finished reading! Total read: " + writtenBytes + "/" + filesize);
}
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage image = ImageIO.read(in);
in.close();
(failed: receiver gives a NullPointerException)
Attempt #4:
int readBytes = 0;
imageBytes = new byte[filesize]; //Create a byte array as large as the image
while (readBytes < filesize)
{
readBytes += input.read(imageBytes);
}
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage image = ImageIO.read(in);
in.close();
System.out.println("width=" + image.getWidth());
(failed: sender gives: java.net.SocketException: Connection reset by peer: socket write error)
Attempt #5:
Using Jon Skeet's code snippet, the image arrives, but only partially. I saved it to a file (1.jpg) to see what was going on, and it actually sends 80% of the image, while the rest of the file is filled with blank spaces. This results in a partially corrupt image. Here is the code I tried (note that captureImg() is not at fault; saving the file directly works):
Sender:
Socket s = new Socket("127.0.0.1", 1290);
OutputStream out = s.getOutputStream();
ByteArrayOutputStream bScrn = new ByteArrayOutputStream();
ImageIO.write(captureImg(), "JPG", bScrn);
byte imgBytes[] = bScrn.toByteArray();
bScrn.close();
out.write((Integer.toString(imgBytes.length)).getBytes());
out.write(imgBytes,0,imgBytes.length);
Receiver:
InputStream in = clientSocket.getInputStream();
long startTime = System.currentTimeMillis();
byte[] b = new byte[30];
int len = in.read(b);
int filesize = Integer.parseInt(new String(b).substring(0, len));
if (filesize > 0)
{
byte[] imgBytes = readExactly(in, filesize);
FileOutputStream f = new FileOutputStream("C:\\Users\\Dan\\Desktop\\Pic\\1.jpg");
f.write(imgBytes);
f.close();
System.out.println("done");
The sender still gives a Connection reset by peer: socket write error.
One option would be to write the image to a ByteArrayOutputStream so you can determine the length, then write that length to the output stream first.
Then on the receiving end, you can read the length, then read that many bytes into a byte array, then create a ByteArrayInputStream to wrap the array and pass that to ImageIO.read().
I'm not entirely surprised that it doesn't work until the output socket is closed normally - after all, a file which contains a valid PNG file and then something else isn't actually a valid PNG file in itself, is it? So the reader needs to read to the end of the stream before it can complete - and the "end" of a network stream only comes when the connection is closed.
EDIT: Here's a method to read the given number of bytes into a new byte array. It's handy to have as a separate "utility" method.
public static byte[] readExactly(InputStream input, int size) throws IOException
{
byte[] data = new byte[size];
int index = 0;
while (index < size)
{
int bytesRead = input.read(data, index, size - index);
if (bytesRead < 0)
{
throw new IOException("Insufficient data in stream");
}
index += bytesRead;
}
return data;
}
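Put together, a length-prefixed exchange could look roughly like this, using the readExactly helper above (DataOutputStream/DataInputStream are just one convenient way to send the length; socket and image are illustrative names):

// Sender: encode to memory first so the exact length is known, then send length + bytes.
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, "PNG", baos);
byte[] imageBytes = baos.toByteArray();

DataOutputStream out = new DataOutputStream(socket.getOutputStream());
out.writeInt(imageBytes.length);
out.write(imageBytes);
out.flush();

// Receiver: read the length, then exactly that many bytes, then decode.
DataInputStream in = new DataInputStream(socket.getInputStream());
int length = in.readInt();
byte[] data = readExactly(in, length);
BufferedImage received = ImageIO.read(new ByteArrayInputStream(data));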
For other Stack Overflow users like me: in Jon Skeet's answer, modify the following line of the readExactly method.
<<original Line>>
index += size;
<<modified Line>>
index += bytesRead;
To get the full image data.
public static void main(String[] args) {
Socket socket = null;
try {
DataInputStream dis;
socket = new Socket("192.168.1.48",8000);
while (true) {
dis = new DataInputStream(socket.getInputStream());
int len = dis.readInt();
byte[] buffer = new byte[len];
dis.readFully(buffer, 0, len);
BufferedImage im = ImageIO.read(new ByteArrayInputStream(buffer));
jlb.setIcon(new ImageIcon(im));
jfr.add(jlb);
jfr.pack();
jfr.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
jfr.setVisible(true);
System.gc();
}
} catch (Exception e) {
e.printStackTrace();
}
finally {
try {
socket.close();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
A Python server is running on the machine at 192.168.1.48:8000, and I receive the stream in the Java code.

Why does converting from a byte array to a Bitmap object return a null value?

I'm trying to develop an application in Android and I'm having a problem I can't figure out how to solve.
Description:
The application involves image processing, and one of the routines is as follows. An image file (PNG) is converted into an array of bytes databyteimage[] with n elements. In a part of this array, e.g. the k consecutive elements from databyteimage[i] to databyteimage[i+k] where i is an offset into databyteimage[], the LSB (least significant bit) of each byte is replaced; the replacement values come from another byte array, e.g. datareplace[] with m elements, where k is m*8. This is done using bitwise operations. After this process, a new databyteimage[] array results.
The problem:
When trying to create the Bitmap object from the new databyteimage[] array in order to display the new image, the result is NULL.
I would appreciate it if you could help me find a solution to this problem, since so far no one has been able to help me.
// GetByte method from Image
private byte[] getByteImageData(String filePath) {
    /*
    Bitmap bitmap = BitmapFactory.decodeFile(filePath);
    Bitmap mutable = bitmap.copy(Bitmap.Config.RGB_565, true);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    mutable.compress(Bitmap.CompressFormat.PNG, 100, baos);
    return baos.toByteArray();
    */
    byte[] _imagebytedata = new byte[1024];
    InputStream _input = null;
    try {
        if (filePath != null && (filePath.length() > 0)) {
            // Create a file for the image
            File _fileimage = new File(filePath);
            if (_fileimage.exists()) {
                // Get the bytes from the image file
                _input = new BufferedInputStream(new FileInputStream(_fileimage));
                _imagebytedata = new byte[(int) _fileimage.length()];
                _input.read(_imagebytedata, 0, (int) _fileimage.length());
                _input.close();
            }
        }
    } catch (Exception e) {
    }
    return _imagebytedata;
}
// Bitwise operations to change the LSB of the image byte array
private byte[] Text(byte[] imagedata, byte[] textmess, int offset) {
    for (int i = 0; i < textmess.length; ++i) {
        int add = textmess[i];
        for (int bit = 7; bit >= 0; --bit, ++offset) {
            int b = (add >>> bit) & 1;
            imagedata[offset] = (byte) ((imagedata[offset] & 0xFE) | b);
        }
    }
    return imagedata;
}
// Save image from new byte array
private boolean saveImage(String pathFile, byte[] encodedimage) {
    OutputStream _output = null;
    File _newFileImage = new File(pathFile);
    byte[] _encodedimage = encodedimage;
    //Bitmap _imagebitmap = BitmapFactory.decodeByteArray(encodedimage, 0, encodedimage.length);
    if (_newFileImage.exists()) {
        try {
            _output = new BufferedOutputStream(new FileOutputStream(_newFileImage));
            _output.write(_encodedimage, 0, _encodedimage.length);
            _output.flush();
            _output.close();
            return true;
        } catch (Exception e) {
        }
    } // _newFileImage.exists()
    return false;
}
public boolean encodeTextInFile(String filepath, String text) {
    byte[] _newimagebytedata;
    byte[] _imagebytedata = getByteImageData(filepath);
    byte[] _textbytedata = text.getBytes();
    byte[] _lengthbytedata = byteConversion(text.length());
    _newimagebytedata = Text(_imagebytedata, _lengthbytedata, 33);
    _newimagebytedata = Text(_imagebytedata, _textbytedata, 65);
    // The value of variable _bitmapdoi is null: here is the problem
    Bitmap _bitmapdoi = BitmapFactory.decodeByteArray(_newimagebytedata, 0, _newimagebytedata.length);
    return saveImage(filepath, _newimagebytedata);
}
What you are reading in getByteImageData is not a bitmap. It is a file, most likely a compressed image. Working on the bytes from this file is very different from working on the image pixels. I suggest you work on the actual Bitmap object:
Load the bitmap:
Bitmap bitmap = BitmapFactory.decodeFile(filePath);
// Not quite sure if the returned bitmap is mutable, so
Bitmap mutable = bitmap.copy(Bitmap.Config.RGB_565, true);
Modify a pixel:
int pixelRGB = mutable.getPixel(x, y);
// Do whatever you have to do
mutable.setPixel(x, y, pixelRGB);
Write it back:
mutable.compress(Bitmap.CompressFormat.PNG, 100, new FileOutputStream(_newFileImage));
