I want to send images through sockets, but I have not been able to get it working on Android. Could someone help me?
System.out.println("iniciooooo");
//converting image to bytes with base64
Bitmap b = BitmapFactory.decodeFile("/sdcard/ajeffer.jpg");
ByteArrayOutputStream byte2= new ByteArrayOutputStream();
b.compress(Bitmap.CompressFormat.JPEG,70,byte2);
byte[] enbytes = byte2.toByteArray();
String bb = Base64.encodeToString(enbytes,Base64.DEFAULT);
System.out.println(Base64.encodeToString(enbytes,Base64.DEFAULT));
data.writeUTF(bb);
FileOutputStream file;
// receive the Base64 string and decode it back into an image
DataInputStream dain = new DataInputStream(s.getInputStream());
msg = dain.readUTF();
File ff = new File("/sdcard/a2jeffer.jpg");
// decode the string that was just read (calling readUTF() a second time would wait for another message)
byte[] deco = Base64.decode(msg, Base64.DEFAULT);
Bitmap bit = BitmapFactory.decodeByteArray(deco, 0, deco.length);
file = new FileOutputStream(ff);
bit.compress(Bitmap.CompressFormat.JPEG, 70, file);
//the image is not created
I realized that my code did not work because I had to add android:requestLegacyExternalStorage="true" to the manifest. I also see that you are right about writeUTF(): it can only send strings of up to about 64 KB, so I have to lower the image quality drastically for it to work. If you have an idea on how to improve this, let me know. Thank you very much.
You were right, this works great for sending and receiving any file.
Send file
OutputStream outputStream = socket.getOutputStream();
InputStream inputStream = new FileInputStream(file);
byte[] datita = new byte[16*1024];
int count;
while((count = inputStream.read(datita))>0){
outputStream.write(datita,0,count);
}
outputStream.close();
inputStream.close();
Receive file
OutputStream outputStream = new FileOutputStream(file);
InputStream inputStream = s.getInputStream();
byte[] datita = new byte[16*1024];
int count;
while((count = inputStream.read(datita))>0){
outputStream.write(datita,0,count);
}
outputStream.close();
inputStream.close();
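One caveat with this approach: the receiver only knows the file has ended when the sender closes the stream. If you want to keep the socket open (for example, to send several files in a row), a length prefix works. Here is a minimal sketch under that assumption; socket and file are placeholder names:
Sender
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
out.writeLong(file.length()); // announce the file size first
InputStream in = new FileInputStream(file);
byte[] buf = new byte[16 * 1024];
int count;
while ((count = in.read(buf)) > 0) {
    out.write(buf, 0, count);
}
out.flush(); // do not close the socket, more files can follow
in.close();
Receiver
DataInputStream in = new DataInputStream(s.getInputStream());
long remaining = in.readLong(); // announced file size
OutputStream out = new FileOutputStream(file);
byte[] buf = new byte[16 * 1024];
while (remaining > 0) {
    int count = in.read(buf, 0, (int) Math.min(buf.length, remaining));
    if (count < 0) throw new EOFException("stream ended early");
    out.write(buf, 0, count);
    remaining -= count;
}
out.close();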
Related
My problem is this: I need to produce a Base64-encoded string of a file, and for that I use this method:
public String getStringFile(File f) {
    InputStream inputStream = null;
    String encodedFile = "", lastVal;
    try {
        inputStream = new FileInputStream(f.getAbsolutePath());
        byte[] buffer = new byte[10240];
        int bytesRead;
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
        while ((bytesRead = inputStream.read(buffer)) != -1) {
            output64.write(buffer, 0, bytesRead);
        }
        output64.close();
        encodedFile = output.toString();
    } catch (Exception e) {
        e.printStackTrace();
    }
    lastVal = encodedFile;
    return lastVal;
}
The thing is, when I try to encode a file of around 20 MB (the exact file size is 19.35 MB), I get an OutOfMemoryException.
What am I doing wrong and how can I fix this issue? Thanks in advance.
What am I doing wrong
You are attempting to encode a ~20MB file using base64 into a string. You will not have adequate heap space on many Android devices to have a single memory allocation that large.
how can I fix this issue?
If "this issue" is "create a ~26MB string of base64-encoded data", there is no reliable way to do this. You would have to find some other solution to whatever problem you are trying to solve by creating such a string.
ByteArrayOutputStream output = new ByteArrayOutputStream();
Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
If you upload the Base64 data yourself with HttpURLConnection, you can do away with the ByteArrayOutputStream and, while uploading directly, replace the above lines with:
OutputStream output = con.getOutputStream();
Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
Untested.
You could also directly base64 encode to a FileOutputStream of course.
OutputStream output = new FileOutputStream(......);
Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
and then upload that file in a separate step.
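Putting the first option together, a rough end-to-end sketch of streaming the Base64 encoding straight into the request body, without ever holding the full encoded string in memory (untested; the URL and file path are placeholders):
HttpURLConnection con = (HttpURLConnection) new URL("http://example.com/upload").openConnection();
con.setDoOutput(true);
con.setRequestMethod("POST");
con.setChunkedStreamingMode(0); // stream the body instead of buffering it

OutputStream output = con.getOutputStream();
Base64OutputStream output64 = new Base64OutputStream(output, Base64.DEFAULT);
InputStream inputStream = new FileInputStream(f); // f: the File from getStringFile() above
byte[] buffer = new byte[10240];
int bytesRead;
while ((bytesRead = inputStream.read(buffer)) != -1) {
    output64.write(buffer, 0, bytesRead); // encodes and uploads as it goes
}
output64.close(); // writes the final Base64 padding
inputStream.close();
int responseCode = con.getResponseCode(); // completes the request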
I'm trying to load an image into a String, do something with that String, and then save the image.
The problem appears when I try to assign the contents of the FileInputStream to the String targetFileStr. If I skip that step and just save the image, everything is OK, but once I put it into the String, the image gets corrupted, no matter whether I save it from the String or from the FileInputStream.
FileInputStream fis = null;
File file = new File("image.png");
fis = new FileInputStream(file);
String targetFileStr = IOUtils.toString(fis, "UTF-8");

// option 1: read back from the String
InputStream inputStream = IOUtils.toInputStream(targetFileStr, "UTF-8");
// option 2: use the FileInputStream directly
//InputStream inputStream = fis;
// no matter which one I use, both ways fail

OutputStream outputStream = null;
try {
    outputStream = new FileOutputStream(new File("image2.png"));
    int read = 0;
    byte[] bytes = new byte[1024];
    while ((read = inputStream.read(bytes)) != -1) {
        outputStream.write(bytes, 0, read);
    }
} catch (Exception e) {
    e.printStackTrace();
}
Reading raw image bytes as UTF-8 text is lossy: byte sequences that are not valid UTF-8 get replaced during decoding, which is why the image changes. You may want to consider converting the image into a String via Base64 encoding/decoding instead. This is an example of encoding.
After encoding, you can modify the String (actually you create new Strings, since you cannot modify an existing one), but be sure to produce valid Base64-encoded output, otherwise you won't be able to decode it.
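For instance, a minimal round-trip sketch using Java 8's java.util.Base64 (an assumption; android.util.Base64 offers equivalent encode/decode calls), with the same placeholder file names as above:
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

// Encode the image file into a Base64 String (binary-safe, unlike reading it as UTF-8 text).
byte[] imageBytes = Files.readAllBytes(Paths.get("image.png"));
String encoded = Base64.getEncoder().encodeToString(imageBytes);

// ...create modified copies of the String here if needed...

// Decode it back and write a valid copy of the image.
byte[] decoded = Base64.getDecoder().decode(encoded);
Files.write(Paths.get("image2.png"), decoded);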
I am currently trying to read in data from a server response. I am using a Socket to connect to the server, sending an HTTP GET request, and then using a BufferedReader to read in the data. Here is what the code looks like, compacted:
Socket conn = new Socket(server, 80);
//Request made here
BufferedReader inFromServer = new BufferedReader(new InputStreamReader(conn.getInputStream()));
String response;
while((response = inFromServer.readLine()) != null){
System.out.println(response);
}
I would like to read in the data as a byte array instead of as a String, and write it to a file. How is this possible? Any help is greatly appreciated, thank you.
You need to use a ByteArrayOutputStream; do something like the code below:
Socket conn = new Socket(server, 80);
//Request made here
InputStream is = conn.getInputStream();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buffer = new byte[1024];
int readBytes;
while ((readBytes = is.read(buffer)) != -1) { // -1 signals end of stream
    baos.write(buffer, 0, readBytes);
}
byte[] responseArray = baos.toByteArray();
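Since the question also asks about writing the data to a file, you can then save the array, for example (the file name is a placeholder):
// save the collected response bytes to disk
FileOutputStream fos = new FileOutputStream("response.bin");
fos.write(responseArray);
fos.close();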
One way is to use Apache commons-io IOUtils
byte[] bytes = IOUtils.toByteArray(inputstream);
With plain java:
ByteArrayOutputStream output = new ByteArrayOutputStream();
try (InputStream stream = new FileInputStream("myFile")) {
    byte[] buffer = new byte[2048];
    int numRead;
    while ((numRead = stream.read(buffer)) != -1) {
        output.write(buffer, 0, numRead);
    }
} catch (IOException e) {
    e.printStackTrace();
}
// and here are your bytes
byte[] myDesiredBytes = output.toByteArray();
If you are not using the Apache commons-io library in your project, here is a pretty simple method that does the same without it:
/*
 * Reads bytes from the InputStream and writes them to a
 * ByteArrayOutputStream, then converts that to a byte array in Java.
 */
public static byte[] toByteArrayUsingJava(InputStream is)
        throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    // Note: this reads one byte at a time, so wrap `is` in a BufferedInputStream for larger streams.
    int reads = is.read();
    while (reads != -1) {
        baos.write(reads);
        reads = is.read();
    }
    return baos.toByteArray();
}
I'm trying to send a JPEG image from my Android phone through a socket, and on the PC side receive the sent data and store it in a .jpg file.
I'm pretty sure that I configured the socket correctly, as I can download data (a binary file) from the PC to Android and save it correctly.
I can also read the stream that is sent from Android to the PC. The packet length and header information are exactly what I expect.
The problem is in reading the image data. I'm getting the same size for the image data, but when I save it to a .jpg file, it is corrupted and I cannot view it.
Here is my Android code that tries to send the image file after sending the header information:
try{
//packagesize is header information which is sent in advance
index.putInt(packagesize);
byte c[]= {index.get(3),index.get(2),index.get(1),index.get(0)};
InputStream jpgimage = new FileInputStream(fileimage);
dataOutputStream = new DataOutputStream(socket.getOutputStream());
dataInputStream = new DataInputStream(socket.getInputStream());
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
dataOutputStream.write(buffer,0,len);
}
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
...
...
...
}
catch statements here
This is the receiving code on the PC side:
// after reading header information I try to read image data
char *buff = malloc(sizeof(char) * jpeg_length);
unsigned int readBytes = 0;
do
{
int ret = recv(socket, buff+readBytes, jpeg_length-readBytes, 0);
if (ret <= 0)
{
fprintf(stderr,"Error receiving jpeg file.\n");
fclose( output );
return 106;
}
readBytes += ret;
fwrite(buff, sizeof(char), readBytes, output);
}
while (readBytes < jpeg_length);
fclose( output );
I also have to mention that the receiving part works fine when I send the image data from a PC client application written in pure C++.
Does anyone have an idea what the problem is and why I get a corrupted image when sending from the Android device?
I appreciate it.
Edited
I added this to the Android application to test whether the bytes being sent can form a good image or not. I saved the image and it was OK.
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
// Here I save all sending bytes to an image called test.jpg
String path = "sdcard/download/images/test.jpg";
FileOutputStream stream = new FileOutputStream(path);
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
stream.write(buffer, 0, len); // write only the bytes actually read
dataOutputStream.write(buffer,0,len);
dataOutputStream.flush();
}
stream.flush();
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
I think you should use the Bitmap class to convert your image to a ByteBuffer, send it across, and on the other end convert the ByteBuffer back to an image.
On Sender Side
Bitmap bitmap = BitmapFactory.decodeFile("ImageD2.jpg");
int bytes = bitmap.getByteCount();
ByteBuffer buffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(buffer);
byte[] array = buffer.array();
Now you can send byte[] as normal data.
On receiving side
Receive the array normally and convert it back to a Bitmap:
Bitmap bitmap = BitmapFactory.decodeByteArray(array, 0, array.length);
For more information you can read the following questions:
Converting bitmap to byteArray android
How to convert byte array to Bitmap
I found the solution. The problem was on the Android side, so I made the following change:
I changed the DataOutputStream and DataInputStream to a BufferedOutputStream and BufferedInputStream, respectively:
try{
//packagesize is header information which is sent in advance
index.putInt(packagesize);
byte c[]= {index.get(3),index.get(2),index.get(1),index.get(0)};
InputStream jpgimage = new FileInputStream(fileimage);
dataOutputStream = new BufferedOutputStream(socket.getOutputStream());
dataInputStream = new BufferedInputStream(socket.getInputStream());
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
dataOutputStream.write(buffer,0,len);
}
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
...
...
...
}
catch statements here
I am able to send strings from my Android mobile phone to my computer, and vice versa. However, I want to send an image from my computer and display it on the mobile phone. In my case, the computer is the server and the mobile phone is the client.
This is part of my code on the server side:
socket = serverSocket.accept();
dataOutputStream = new DataOutputStream(socket.getOutputStream());
captureScreen("C:\\Users\\HP\\Desktop\\capture.png");
File f = new File("C:\\Users\\HP\\Desktop\\capture.png");
byte [] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
Note that captureScreen() is a method that successfully takes a screenshot of the server and saves it as a .png image at the above path.
Now, on the client side, which is the Android mobile phone: if I have an ImageView control, how do I read the image sent from the computer as an InputStream and display it in the ImageView?
Furthermore, did I write the image to the dataOutputStream successfully? I would be glad if anyone could help me!
You can call the setImageBitmap(Bitmap bm) method of your ImageView.
http://developer.android.com/reference/android/widget/ImageView.html
How you get the image data to your client depends on the solution you have chosen, but technically you can use the same libraries that you would use for pure Java.
You can use android.graphics.BitmapFactory to create the Bitmap from your stream.
http://developer.android.com/reference/android/graphics/BitmapFactory.html
Bitmap bitmap1 = BitmapFactory.decodeStream(inputStream);
Bitmap bitmap2 = BitmapFactory.decodeFile(filename);
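A minimal sketch of the client side, assuming a connected Socket named socket and an ImageView named imageView (both placeholder names), that this code runs inside an Activity (for runOnUiThread), and that the server closes the stream after sending the image:
// run the network part on a background thread
InputStream in = socket.getInputStream();
final Bitmap bitmap = BitmapFactory.decodeStream(in);
// update the ImageView back on the UI thread
runOnUiThread(new Runnable() {
    @Override
    public void run() {
        imageView.setImageBitmap(bitmap);
    }
});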
What is this?
byte [] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
You just declared the size of the buffer byte array, but it's empty!
You should read your file into a byte array and then write that to the OutputStream, something like this:
byte[] buffer = System.IO.File.ReadAllBytes("C:\\Users\\HP\\Desktop\\capture.png");
(code for C#)
And then you send it like you did:
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
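Since the server side here is Java, a rough Java equivalent of that read-all-bytes step (a sketch, assuming Java 7+ for java.nio.file) would be:
import java.nio.file.Files;
import java.nio.file.Paths;

byte[] buffer = Files.readAllBytes(Paths.get("C:\\Users\\HP\\Desktop\\capture.png"));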
Try this for receiving the file:
public void fileReceived(InputStream is)
        throws FileNotFoundException, IOException {
    Log.i("IMSERVICE", "FILERECCC-1");
    if (is != null) {
        FileOutputStream fos = null;
        BufferedOutputStream bos = null;
        try {
            fos = new FileOutputStream("/sdcard/chats/gas1.jpg");
            bos = new BufferedOutputStream(fos);
            byte[] aByte = new byte[1024];
            int bytesRead;
            while ((bytesRead = is.read(aByte)) != -1) {
                bos.write(aByte, 0, bytesRead);
            }
            bos.flush();
            bos.close();
            Log.i("IMSERVICE", "FILERECCC-2");
        } catch (IOException ex) {
            // Do exception handling
        }
    }
}
So you'll get the new file on your SD card on Android.
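A minimal usage sketch, assuming a connected Socket (the host and port are placeholders) and that this runs on a background thread, since Android does not allow network I/O on the UI thread:
try {
    Socket socket = new Socket("192.168.0.10", 8080); // placeholder host and port
    fileReceived(socket.getInputStream());
    socket.close();
} catch (IOException e) {
    e.printStackTrace();
}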