I have a server device and N client devices. I want to send pictures to each client from the server.
My problem is that the client doesn't receive the whole image. I convert the Bitmaps to byte arrays before I send them.
Send Bitmaps on the Server:
try {
OutputStream out = clients.get(i).getSocket().getOutputStream();
//Convert Bitmap to byte array:
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(CompressFormat.JPEG, 70, stream);
byte[] byteImage = stream.toByteArray();
//Send the byte array
if(byteImage.length > 0){
out.write(byteImage, 0, byteImage.length);
out.flush();
}
out.close();
} catch (IOException e) {
e.printStackTrace();
}
Receive image on the Client:
try {
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] data = new byte[1024];
int length = 0;
while ((length = in.read(data)) != -1) { //in --> InputStream
out.write(data, 0, length);
out.flush();
}
byte[] bytePicture = out.toByteArray();
out.close();
} catch (IOException e) {
}
Before I send the image, I tell the size of the byte array of the Bitmap to the clients.
My problem is that the number of received bytes is less than the number of sent bytes. Why?
On the server side I created a thread for each client to send the images to them.
EDIT:
I use PrintWriters to communicate with messages between the server and the clients.
I send the length of the byte array from the server to the clients before I start to send the byte array.
PrintWriter writer;
writer = new PrintWriter(new BufferedWriter(new OutputStreamWriter(out)), true);
writer.println("BYTE#" + byteImage.length);
writer.flush();
When a client gets a message that starts with "BYTE#", I start to read the byte array:
reader = new BufferedReader(new InputStreamReader(inputStream));
final String message = reader.readLine();
if (message != null && message.startsWith("BYTE#")){
//Get the length of the byte array (correct)
int length = Integer.valueOf(message.split("#")[1]);
//... Start to read the byte array here...
}
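A minimal sketch of that "read the byte array here" step, assuming the image is read from the raw socket InputStream (note that the BufferedReader above buffers ahead, so mixing it with raw reads can swallow part of the image bytes):
//Read exactly `length` bytes into the picture buffer
byte[] bytePicture = new byte[length];
int offset = 0;
while (offset < length) {
    int read = inputStream.read(bytePicture, offset, length - offset);
    if (read == -1) {
        break; //connection closed before the full image arrived
    }
    offset += read;
}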
Related
First, I get the Uri of the file I want to send. I know the Uri is correct because I was successful in converting the Uri to a Bitmap and displaying it.
Next, I convert the Uri to an InputStream using this code:
public InputStream uriToStream(Uri uri) throws FileNotFoundException {
InputStream is = getContentResolver().openInputStream(uri);
return is;
}
Next, I convert the InputStream to a byte array (byte[]) using this code:
public byte[] streamToByteArray(InputStream is) throws IOException {
int nRead;
byte[] by = new byte[16384];
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
while ((nRead = is.read(by, 0, by.length)) != -1) {
buffer.write(by, 0, nRead);
}
return buffer.toByteArray();
}
Next, I used this code to establish a connection with the server and send the data:
public class threadClient extends Thread {
byte[] ByteFile;
threadClient(byte[] byteFile) {
ByteFile = byteFile;
}
public void run() {
try {
Log.i("network","connecting....");
Socket client = new Socket("192.168.0.120",6666);
Log.i("network","connected");
Log.i("network","sending data");
DataOutputStream stream = new DataOutputStream(client.getOutputStream());
stream.writeInt(ByteFile.length);
stream.flush();
stream.close();
stream.write(ByteFile);
stream.flush();
stream.close();
Log.i("network","data sent");
} catch (Exception e) {
Log.i("network",e.toString());
}
}
}
The server is running Python. After the server accepts the connection from the client, it uses the following code to handle the client:
def client_handler(client, address):
    print(f"Accepted connection")
    size = int(jpysocket.jpydecode(client.recv(1024)))
    print(f"file size = {str(size)}")
    with open("test.pdf", "wb") as file:
        file.write(client.recv(size))
    print("file saved")
First, the server receives the size of the file, and then it receives the byte array and writes the file.
But, surprisingly, the file is corrupt.
Please, can someone help?
Thank you.
That's because you are filling the byte array with more than the image's bytes.
For example, if the image's byte length is 1024 and you initialize the byte array with a size of 16384, the array becomes [1, 2, 3, 4, 0, 0, 0, 0, ... more zeros until the end of the array]
// 1, 2, 3, 4 are your bytes and the 0s are unused bytes
Try initializing the byte array with the exact length of the image's bytes to get the correct result.
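A minimal sketch of that suggestion, assuming the exact image length is already known (imageLength is a hypothetical variable here, and is is the image InputStream):
//Allocate the buffer with the image's exact length and fill it completely
byte[] by = new byte[imageLength];
DataInputStream din = new DataInputStream(is);
din.readFully(by); //throws EOFException if fewer bytes are available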
I'm trying to send an image that was taken by my camera in my application as a byte array. I convert the bitmap to a byte array, then to a Base64 string, then encode that back to a byte array and send it. However, when I try to send it I get an exception that my string is too long to send.
preparing to send through to socket:
if(currentChatPartner==null)
return;
ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
msg.compress(Bitmap.CompressFormat.PNG, 0, byteArrayOutputStream);
byte[] byteArray = byteArrayOutputStream.toByteArray();
String encoded = Base64.encodeToString(byteArray, 0);
sendThroughSocketImage(encoded, clientSocket, currentChatPartner.getIP(), currentChatPartner.getPort());
Sending through to socket:
byte[] decoded = null;
decoded = encoded.getBytes();
try
{
socket.send(new DatagramPacket(decoded, decoded.length, ip, port));
} catch (IOException e)
{
System.err.println("\nUnable to send message");
return false;
}
return true;
Sometimes it works, but mostly it doesn't. Is there a reason for this? Is there a way I could fix it, or shorten the string that is to be sent?
By using DatagramPacket you're sending a UDP message, which has a maximum length a little short of 64 KB. It's easy to get a message-too-long exception. You should send your data using a TCP socket.
Plenty of examples online e.g.
http://systembash.com/content/a-simple-java-tcp-server-and-tcp-client/
With respect to your comments, here is an example of sending the byte array via a client socket:
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
socket = new Socket(serverAddr, SERVERPORT);
int start=0;
int len=decoded.length;
OutputStream out = socket.getOutputStream();
DataOutputStream dos = new DataOutputStream(out);
dos.write(decoded, start, len);
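For completeness, the receiving side could collect everything until end-of-stream (a sketch, not part of the linked example; assumes socket is the accepted server-side Socket):
InputStream in = socket.getInputStream();
ByteArrayOutputStream received = new ByteArrayOutputStream();
byte[] chunk = new byte[8192];
int n;
while ((n = in.read(chunk)) != -1) {
    received.write(chunk, 0, n);
}
byte[] data = received.toByteArray(); //the Base64-encoded bytes sent by the client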
I'm trying to send a JPEG image from my Android phone through a socket and, on the PC side, get the sent data and store it in a .jpg file.
I'm pretty sure that I configured the socket correctly, as I can download data (binary file) from PC to android and save it correctly.
I can also read the stream which is sent from android to PC. The packet length and header information are exactly what I expect.
The problem is in reading the image data. I'm getting the same size for the image data, but when I save it to a .jpg file, it is corrupted and I cannot view it.
Here is my Android code that tries to send the image file after sending the header information:
try{
//packagesize is header information which is sent in advance
index.putInt(packagesize);
byte c[]= {index.get(3),index.get(2),index.get(1),index.get(0)};
InputStream jpgimage = new FileInputStream(fileimage);
dataOutputStream = new DataOutputStream(socket.getOutputStream());
dataInputStream = new DataInputStream(socket.getInputStream());
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
dataOutputStream.write(buffer,0,len);
}
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
...
...
...
}
catch statements here
This is the receiving code on the PC side:
// after reading header information I try to read image data
char *buff = malloc(sizeof(char) * jpeg_length);
unsigned int readBytes = 0;
do
{
int ret = recv(socket, buff+readBytes, jpeg_length-readBytes, 0);
if (ret <= 0)
{
fprintf(stderr,"Error receiving jpeg file.\n");
fclose( output );
return 106;
}
readBytes += ret;
fwrite(buff, sizeof(char), readBytes, output);
}
while (readBytes < jpeg_length);
fclose( output );
I also have to mention that the receiving part works fine when I send image data with a PC client application, which is pure C++.
Does anyone have an idea what the problem is and why I get a corrupted image when sending from the Android device?
I appreciate it.
Edited
I added this to the Android application to test whether the bytes being sent can form a good image or not. I saved the image and it was OK.
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
// Here I save all sending bytes to an image called test.jpg
String path = "sdcard/download/images/test.jpg";
FileOutputStream stream = new FileOutputStream(path);
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
stream.write(buffer, 0, len);
dataOutputStream.write(buffer,0,len);
dataOutputStream.flush();
}
stream.flush();
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
I think you should use the Bitmap class to convert your image to a ByteBuffer, then send it across, and on the other end convert the ByteBuffer back to an image.
On Sender Side
Bitmap bitmap = BitmapFactory.decodeFile("ImageD2.jpg");
int bytes = bitmap.getByteCount();
ByteBuffer buffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(buffer);
byte[] array = buffer.array();
Now you can send byte[] as normal data.
On receiving side
receive the array normally and convert it back to Bitmap
Bitmap bitmap = BitmapFactory.decodeByteArray(array, 0, array.length);
For more information you can read the following questions:
Converting bitmap to byteArray android
How to convert byte array to Bitmap
I found the solution. The problem was on the Android side, so I made the following changes:
I changed DataOutputStream and DataInputStream to BufferedOutputStream and BufferedInputStream respectively:
try{
//packagesize is header information which is sent in advance
index.putInt(packagesize);
byte c[]= {index.get(3),index.get(2),index.get(1),index.get(0)};
InputStream jpgimage = new FileInputStream(fileimage);
dataOutputStream = new BufferedOutputStream(socket.getOutputStream());
dataInputStream = new BufferedInputStream(socket.getInputStream());
int writeBytes = 0,len = 0;
byte buffer[] = new byte[1024];
while((len = jpgimage.read(buffer,0,buffer.length))!=-1)
{
writeBytes+=len;
dataOutputStream.write(buffer,0,len);
}
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
...
...
...
}
catch statements here
I am able to send strings from my Android mobile phone to my computer, and vice versa. However, I want to send an image from my computer and display it on the mobile phone. In my case, the computer is the server and the mobile phone is the client.
This is part of my code on the server side:
socket = serverSocket.accept();
dataOutputStream = new DataOutputStream(socket.getOutputStream());
captureScreen("C:\\Users\\HP\\Desktop\\capture.png");
File f = new File("C:\\Users\\HP\\Desktop\\capture.png");
byte [] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
Note that captureScreen() is a method that successfully takes a screenshot of the server and save it as a .PNG image in the above path.
Now, on the client side which is the Android mobile phone, if I have an ImageView control, how to read the image sent from the computer as an InputStream and display it on the ImageView?
Furthermore, did I successfully write the image to the dataOutputStream? I would be glad if anyone could help me!
You can call the setImageBitmap(Bitmap bm) of your ImageView.
http://developer.android.com/reference/android/widget/ImageView.html
How you get the image data to your client: it depends on the solution you have chosen, but technically you can use the same libraries that you would use for pure Java.
You can use android.graphics.BitmapFactory to create the Bitmap from your stream.
http://developer.android.com/reference/android/graphics/BitmapFactory.html
Bitmap bitmap1 = BitmapFactory.decodeStream(inputStream);
Bitmap bitmap2 = BitmapFactory.decodeFile(filename);
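Putting the two together, a minimal sketch on the client (assuming socket is the connected Socket and imageView is your ImageView):
InputStream inputStream = socket.getInputStream();
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
imageView.setImageBitmap(bitmap);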
What is this?
byte [] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
You just declared the size of a buffer byte array, but it's empty!
You should convert your file to bytes and then transfer them to the OutputStream, something like this:
byte[] buffer = System.IO.File.ReadAllBytes("C:\\Users\\HP\\Desktop\\capture.png");
(code for C#)
And then you send it like you did:
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
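In plain Java the equivalent read could look like this (a sketch using java.nio.file.Files on the server side):
//Read the whole PNG into memory, then send it as before
byte[] buffer = java.nio.file.Files.readAllBytes(
        java.nio.file.Paths.get("C:\\Users\\HP\\Desktop\\capture.png"));
dataOutputStream.write(buffer, 0, buffer.length);
dataOutputStream.flush();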
Try this for receiving the file:
public void fileReceived(InputStream is)
throws FileNotFoundException, IOException {
Log.i("IMSERVICE", "FILERECCC-1");
if (is != null) {
FileOutputStream fos = null;
BufferedOutputStream bos = null;
try {
fos = new FileOutputStream("/sdcard/chats/gas1.jpg");
bos = new BufferedOutputStream(fos);
byte[] aByte = new byte[1024];
int bytesRead;
while ((bytesRead = is.read(aByte)) != -1) {
bos.write(aByte, 0, bytesRead);
}
bos.flush();
bos.close();
Log.i("IMSERVICE", "FILERECCC-2");
} catch (IOException ex) {
// Do exception handling
}
}
}
So you'll get a new file on your SD card on Android.
I got a file sender working over a socket and it worked perfectly, but I couldn't send large files with it; I always got a heap error. Then I changed the client code so that it sends the file in chunks. Now I can send big files, but there is a new problem: small files now arrive empty, and larger files, for example videos, can't be played. Here is the client code that sends the file:
public void send(File file) throws UnknownHostException, IOException {
// Create socket
hostIP = "localhost";
socket = new Socket(hostIP, 22333);
//Send file
FileInputStream fis = new FileInputStream(file);
BufferedInputStream bis = new BufferedInputStream(fis);
DataInputStream dis = new DataInputStream(bis);
OutputStream os = socket.getOutputStream();
//Sending size of file.
DataOutputStream dos = new DataOutputStream(os);
dos.writeUTF(file.getName() + ":" + userName);
byte[] arr = new byte[1024];
try {
int len = 0;
while ((len = dis.read(arr)) != -1) {
dos.write(arr, 0, len);
}
} catch (IOException ex) {
ex.printStackTrace();
}
dos.flush();
socket.close();
}
and here is the server code:
void start() throws IOException {
// Starts server on port.
serverSocket = new ServerSocket(port);
int bytesRead;
while (true) {
connection = serverSocket.accept();
in = connection.getInputStream();
clientData = new DataInputStream(in);
String[] data = clientData.readUTF().split(":");
String fileName = data[0];
String userName = data[1];
output = new FileOutputStream("C:/" + fileName);
long size = clientData.readLong();
byte[] buffer = new byte[1024];
// Build new file
while (size > 0 && (bytesRead = clientData.read(buffer, 0, (int) Math.min(buffer.length, size))) != -1) {
output.write(buffer, 0, bytesRead);
size -= bytesRead;
}
output.close();
}
}
You failed to write out the length of the file to the stream in the client:
long size = clientData.readLong();
So that call in the server is reading the first 8 bytes of the actual file, and who knows what that quantity is. You don't have to read the length from the stream, since you only wrote a single file. After reading the filename and username (not very secure, is it?) you can just read the stream until EOF. If you ever wanted to send multiple files over the same open socket, then you'd need to know the length before reading the file.
Also, your buffers for reading are way too small. You should use a minimum of 8192 instead of 1024. And you'll want to put all .close() calls in a finally block to make sure your server and clients shut down appropriately if there is ever an exception.
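If you do want to keep the readLong() on the server, a minimal sketch of the matching client side (writing the size right after the name, with the bigger buffer suggested above):
dos.writeUTF(file.getName() + ":" + userName);
dos.writeLong(file.length()); //now clientData.readLong() on the server reads the real size
byte[] arr = new byte[8192];
int len;
while ((len = dis.read(arr)) != -1) {
    dos.write(arr, 0, len);
}
dos.flush();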