I am able to send strings from my Android mobile phone to my computer, and vice versa. However, I now want to send an image from my computer and display it on the mobile phone. In my case, the computer is the server and the mobile phone is the client.
This is part of my code on the server side:
socket = serverSocket.accept();
dataOutputStream = new DataOutputStream(socket.getOutputStream());
captureScreen("C:\\Users\\HP\\Desktop\\capture.png");
File f = new File("C:\\Users\\HP\\Desktop\\capture.png");
byte [] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
Note that captureScreen() is a method that successfully takes a screenshot of the server and saves it as a .PNG image at the above path.
Now, on the client side, which is the Android mobile phone: if I have an ImageView control, how do I read the image sent from the computer as an InputStream and display it in the ImageView?
Furthermore, did I successfully write the image to the dataOutputStream? I would be glad if anyone could help!
You can call setImageBitmap(Bitmap bm) on your ImageView.
http://developer.android.com/reference/android/widget/ImageView.html
How you get the image data to your client depends on the solution you have chosen, but technically you can use the same libraries that you would use for pure Java.
You can use android.graphics.BitmapFactory to create the Bitmap from your stream.
http://developer.android.com/reference/android/graphics/BitmapFactory.html
Bitmap bitmap1 = BitmapFactory.decodeStream(inputStream);
Bitmap bitmap2 = BitmapFactory.decodeFile(filename);
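As the answer notes, on the desktop side you can use the same pure-Java libraries. A sketch of the same decode-from-a-stream idea using javax.imageio (the class and helper names below are illustrative, not from the original post; the byte stream stands in for a socket's InputStream):

```java
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class DecodeFromStream {
    // Decode an image from any InputStream, e.g. a socket's input stream.
    static BufferedImage decode(InputStream in) throws IOException {
        return ImageIO.read(in);
    }

    public static void main(String[] args) throws IOException {
        // Round-trip a small PNG through a byte stream to simulate a socket.
        BufferedImage original = new BufferedImage(4, 3, BufferedImage.TYPE_INT_RGB);
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ImageIO.write(original, "png", bytes);
        BufferedImage decoded = decode(new ByteArrayInputStream(bytes.toByteArray()));
        System.out.println(decoded.getWidth() + "x" + decoded.getHeight());
    }
}
```

On Android itself you would use BitmapFactory.decodeStream(inputStream) as shown above instead, since javax.imageio is not available there.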
What is this?
byte[] buffer = new byte[(int)f.length()];
dataOutputStream.write(buffer,0,buffer.length);
You just declared the size of the buffer byte array, but it's empty!
You should convert your file to bytes first and then write those bytes to the OutputStream, something like this:
byte[] buffer = System.IO.File.ReadAllBytes("C:\\Users\\HP\\Desktop\\capture.png");
(that snippet is C#; the Java equivalent is java.nio.file.Files.readAllBytes)
And then you send it as you did:
dataOutputStream.write(buffer,0,buffer.length);
dataOutputStream.flush();
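For completeness, the same fix in Java might look like the sketch below. The 4-byte length prefix is my addition (not in the original code) so the client knows exactly how many bytes make up the image; the path is the asker's:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ImageSender {
    // Read the whole file into memory (this actually fills the array,
    // unlike allocating an empty buffer), then write a 4-byte length
    // prefix followed by the image bytes.
    static void sendFile(Path file, OutputStream rawOut) throws IOException {
        byte[] data = Files.readAllBytes(file);
        DataOutputStream out = new DataOutputStream(rawOut);
        out.writeInt(data.length);        // receiver reads this first
        out.write(data, 0, data.length);  // then reads exactly this many bytes
        out.flush();
    }
}
```

On the client, a matching DataInputStream.readInt() followed by readFully() would consume exactly one image.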
Try this for receiving the file:
public void fileReceived(InputStream is)
        throws FileNotFoundException, IOException {
    Log.i("IMSERVICE", "FILERECCC-1");
    if (is != null) {
        FileOutputStream fos = null;
        BufferedOutputStream bos = null;
        try {
            // note: no trailing slash -- this is a file path, not a directory
            fos = new FileOutputStream("/sdcard/chats/gas1.jpg");
            bos = new BufferedOutputStream(fos);
            byte[] aByte = new byte[1024];
            int bytesRead;
            while ((bytesRead = is.read(aByte)) != -1) {
                bos.write(aByte, 0, bytesRead);
            }
            bos.flush();
            bos.close();
            Log.i("IMSERVICE", "FILERECCC-2");
        } catch (IOException ex) {
            // Do exception handling
        }
    }
}
So you'll get a new file on the SD card on Android.
Related
I need to implement a solution in Java that connects to a CCTV server via a TCP socket, takes the live video stream from it, and outputs that stream to a Spring Boot endpoint (RTSP or another protocol) so it can be shown in a web player.
The problem is that I do not quite know how to achieve that; maybe someone can help. So far, I've got the following piece of code:
try (Socket socket = new Socket(hostname, port)) {
    DataOutputStream outToServer = new DataOutputStream(socket.getOutputStream());
    BufferedReader inFromServer = new BufferedReader(new InputStreamReader(socket.getInputStream()));

    // Login command
    outToServer.writeBytes(login);
    // Start LIVE video
    outToServer.writeBytes(live);

    ByteArrayOutputStream outStreamObj;
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    byte[] byteChunk = new byte[1024];
    InputStream input = socket.getInputStream();
    int c = input.read(byteChunk);
    while (c != -1) {
        buffer.write(byteChunk, 0, c);
        c = input.read(byteChunk); // If placed in its own loop, it will read forever (or until the stream stops) so it will never exit

        BufferedImage image = ImageIO.read(new ByteArrayInputStream(buffer.toByteArray()));

        // create the object of ByteArrayOutputStream class
        outStreamObj = new ByteArrayOutputStream();
        // write the image into the object of ByteArrayOutputStream class
        ImageIO.write(image, "jpg", outStreamObj);
        // create the byte array from image
        byte[] byteArray = outStreamObj.toByteArray();
        // create the object of ByteArrayInputStream class
        // and initialize it with the byte array
        ByteArrayInputStream inStreambj = new ByteArrayInputStream(byteArray);
        // read image from byte array
        BufferedImage newImage = ImageIO.read(inStreambj);
        // write output image
        ImageIO.write(newImage, "jpg", new File("outputImage.jpg"));
        System.out.println("Image generated from the byte array.");
    }
} catch (UnknownHostException ex) {
    ...
} catch (IOException ex) {
    ...
}
So far, it works until BufferedImage image = ImageIO.read(new ByteArrayInputStream(buffer.toByteArray()));, where image is null. I am not even sure if this is correct. I wouldn't really want to save the image to disk anyway, but for now it's okay, I guess.
Basically, it should work as follows:
User visits a web page (Angular)
The web player loads with a live stream URL pointing to the Spring Boot backend
The Spring Boot backend connects to the CCTV server via TCP (not possible via RTSP or another protocol) and sends the live command over the socket
Within the same socket session the server starts pushing the live stream bytes
The Spring Boot app takes these bytes and sends them on to the browser.
Any suggestions?
I want to send images through sockets, but I have not been able to do it in Android. Could someone help me?
System.out.println("iniciooooo");
// converting image to bytes with Base64
Bitmap b = BitmapFactory.decodeFile("/sdcard/ajeffer.jpg");
ByteArrayOutputStream byte2 = new ByteArrayOutputStream();
b.compress(Bitmap.CompressFormat.JPEG, 70, byte2);
byte[] enbytes = byte2.toByteArray();
String bb = Base64.encodeToString(enbytes, Base64.DEFAULT);
System.out.println(Base64.encodeToString(enbytes, Base64.DEFAULT));
data.writeUTF(bb);

FileOutputStream file;
// receiving the image in bytes to convert it into an image
DataInputStream dain = new DataInputStream(s.getInputStream());
msg = dain.readUTF();
File ff = new File("/sdcard/a2jeffer.jpg");
byte[] deco = Base64.decode(dain.readUTF(), Base64.DEFAULT);
Bitmap bit = BitmapFactory.decodeByteArray(deco, 0, deco.length);
file = new FileOutputStream(ff);
bit.compress(Bitmap.CompressFormat.JPEG, 70, file);
// the image is not created
I realized that my code did not work because I had to put android:requestLegacyExternalStorage="true" in the manifest. Also, I see that you are right about writeUTF(): to send images through it, I must drastically lower the quality, but it works. If you have an idea on how to improve this, let me know; thank you very much.
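For context on the writeUTF() limitation mentioned above: DataOutputStream.writeUTF() refuses any string whose modified-UTF-8 encoding exceeds 65,535 bytes, which is why a full-quality Base64-encoded image fails and the quality has to be lowered. A quick plain-Java demonstration (no Android APIs; the helper name is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UTFDataFormatException;
import java.util.Arrays;

public class WriteUtfLimit {
    // Returns true if the string fits through writeUTF(), false if it is too long.
    static boolean fitsInWriteUtf(String s) throws IOException {
        DataOutputStream out = new DataOutputStream(new ByteArrayOutputStream());
        try {
            out.writeUTF(s);
            return true;
        } catch (UTFDataFormatException tooLong) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        char[] big = new char[70_000];                        // ASCII: 1 byte per char
        Arrays.fill(big, 'a');
        System.out.println(fitsInWriteUtf("hello"));          // true
        System.out.println(fitsInWriteUtf(new String(big)));  // false: over 65,535 bytes
    }
}
```

This is why raw byte streaming (as in the snippets below) scales to arbitrarily large files while writeUTF() does not.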
You were right, this works great for sending and receiving any file.
Send file
OutputStream outputStream = socket.getOutputStream();
InputStream inputStream = new FileInputStream(file);
byte[] datita = new byte[16 * 1024];
int count;
while ((count = inputStream.read(datita)) > 0) {
    outputStream.write(datita, 0, count);
}
outputStream.close();
inputStream.close();
Receive file
OutputStream outputStream = new FileOutputStream(file);
InputStream inputStream = s.getInputStream();
byte[] datita = new byte[16 * 1024];
int count;
while ((count = inputStream.read(datita)) > 0) {
    outputStream.write(datita, 0, count);
}
outputStream.close();
inputStream.close();
I'm trying to transfer some image files from an Android phone, over a socket, to a server. The only way I've found to do this on Android so far is using a FileInputStream to read the image as a byte array and send it over the socket to be reconstructed on the server side. This works well; unfortunately, Android (or Java?) does not seem to include metadata, in my case EXIF data, in a FileInputStream. This means that my EXIF data is missing once the images are on the server.
I've tried to solve this issue using both ExifInterface, which doesn't seem to be able to read a lot of the EXIF data I need, and the Metadata library. The Metadata library does seem to get all the EXIF data I want, but I can't figure out how to write it out as bytes that can be sent over my stream; it only has a toString(), which drops some of the data that needs to be transferred.
Ideally I'd love a way to transfer the file with its metadata; however, I'd be happy with a way to turn Metadata tags into bytes which I can add to my socket's output stream.
Here is the code which uploads files over the socket
FileInputStream in = new FileInputStream(lastSavedPath);
byte[] buffer = new byte[1024];
int length = 0;
while ((length = in.read(buffer, 0, buffer.length)) != -1) {
    outputStream.write(buffer, 0, length);
}
ExifInterface exifInterface = new ExifInterface(lastSavedPath);
Metadata metadata = ImageMetadataReader.readMetadata(new File(lastSavedPath));
for (Directory directory : metadata.getDirectories()) {
    for (Tag tag : directory.getTags()) {
        Log.d("Socket Listener", tag.toString());
        if (tag.toString().indexOf("Exif") >= 0)
            Log.d("Socket exif", "Data" + exifInterface.getAttribute(tag.getTagName()));
    }
}
outputStream.flush();
Log.d("Socket Listener", "Data has been sent");
in.close();
socket.close();
The issue here wasn't with Android at all. I had read in another thread that Android's FileInputStream did not include metadata, but that was not the case. I believe now the issue was in my server-side code. I've fixed it with the following code:
Server side (Needs to be in a try catch):
socket = new Socket(args[0], 8888);
dataOutputStream = new DataOutputStream(socket.getOutputStream());
dataInputStream = new DataInputStream(socket.getInputStream());
dataOutputStream.writeUTF(args[1]);
System.out.println("Saving image");
FileOutputStream fileout = new FileOutputStream("/home/jamie/Documents/UMDSummer16/Thermal/TemporalAnalysisSensor/SocketTest/" + args[2]);
byte[] bytes = new byte[1024];
int count;
while ((count = dataInputStream.read(bytes)) > 0) {
    fileout.write(bytes, 0, count); // write only the bytes actually read
}
fileout.close();
dataInputStream.close();
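The write inside the loop is the part that bites people: read() may return fewer bytes than the buffer length, so only the first count bytes are valid on each pass, and writing the whole buffer pads the output with stale bytes. A minimal, self-contained sketch of the correct pattern (names are illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class CopyExact {
    // Copy a stream, writing only the bytes actually returned by each read().
    static byte[] copy(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        int count;
        while ((count = in.read(buf)) != -1) {
            out.write(buf, 0, count); // not out.write(buf): the tail may be stale
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1500]; // deliberately not a multiple of 1024
        int copied = copy(new ByteArrayInputStream(data)).length;
        System.out.println(copied);   // 1500; writing full buffers would give 2048
    }
}
```

The same pattern applies to any binary transfer: a file that is not an exact multiple of the buffer size gets corrupted by full-buffer writes.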
Android side (also in a try catch):
FileInputStream in = new FileInputStream(lastSavedPath);
byte[] buffer = new byte[1024];
int length = 0;
while ((length = in.read(buffer, 0, buffer.length)) != -1) {
    outputStream.write(buffer, 0, length);
}
outputStream.flush();
Log.d("Socket Listener", "Data has been sent");
in.close();
socket.close();
I need to upload images (JPEG, PNG, GIF) and audio (MP3, WAV, AA3) to a web server, so I need to convert the image into a byte array. How do I do that?
Currently I am trying the approach below, but it increases the size. How do other applications upload at the original size and quality, without this increase?
Bitmap uploadedImage = ((BitmapDrawable) temp.getDrawable()).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
uploadedImage.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] bytes = stream.toByteArray();
Original images increase in size when converted via a bitmap because all the image formats you mentioned use some form of compression (some lossless, some not). A bitmap is uncompressed, and re-encoding it, even as JPEG at quality 100, will not reproduce the original file's compression.
You could read the raw data of the drawable into a byte array and send that to the server. You can check that answer Reading a resource sound file into a Byte array to see an example of how this could be done.
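A minimal sketch of that approach, assuming the image is accessible as a file path (the class and method names are hypothetical): reading the raw, already-encoded bytes keeps the upload byte-identical to the original file, with no decode/re-encode step to inflate it.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RawUploadBytes {
    // Return the file's raw bytes, still in their original JPEG/PNG/GIF
    // encoding, ready to be written to an upload stream as-is.
    static byte[] readRaw(Path imageFile) throws IOException {
        return Files.readAllBytes(imageFile);
    }
}
```

Because nothing is decoded, this works identically for the audio formats mentioned in the question.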
Presumably you already have the image in some form? Don't try to decompress or process it; just send it to the server as a stream of bytes. For example, if it is already a file on your computer:
public String postFile(URL url, InputStream in) throws IOException {
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    try {
        connection.setRequestMethod("POST");
        connection.setChunkedStreamingMode(CHUNK_SIZE);
        connection.setDoOutput(true);
        connection.setRequestProperty("Content-Type", "application/octet-stream");
        try (OutputStream out = connection.getOutputStream()) {
            byte[] buffer = new byte[CHUNK_SIZE];
            int len;
            while ((len = in.read(buffer)) != -1) {
                out.write(buffer, 0, len);
            }
        }
        if (connection.getResponseCode() != 201) {
            try (BufferedReader br = new BufferedReader(new InputStreamReader(connection.getErrorStream()))) {
                // TODO: Handle error
            }
        } else {
            try (BufferedReader br = new BufferedReader(new InputStreamReader(connection.getInputStream()))) {
                // TODO: Handle success
            }
        }
        return null; // TODO: return the response body if needed
    } finally {
        connection.disconnect();
    }
}
To send a file, just pass in a FileInputStream. To send bytes, you can simplify the function above so it doesn't need to read from a stream and can just write the byte buffer to the output stream.
To get an InputStream for a local file you just need:
File file = ....
try (InputStream is = new FileInputStream(file)) {
postFile(url, is);
} catch (FileNotFoundException ex) {
// TODO: Handle error
}
No need to compress; just send it as a stream. Once you have the image URI, do the following:
InputStream iStream = context.getContentResolver().openInputStream(uri);
and write iStream out to the HttpsURLConnection.
I'm trying to send a JPEG image from my Android phone through a socket and, on the PC side, receive the data and store it in a .jpg file.
I'm pretty sure that I configured the socket correctly, as I can download data (a binary file) from the PC to Android and save it correctly.
I can also read the stream which is sent from Android to the PC. The packet length and header information are exactly what I expect.
The problem is in reading the image data. I'm getting the same size for the image data, but when I save it to a .jpg file, it is corrupted and I cannot view it.
Here is my Android code that tries to send image file after sending header information:
try {
    // packagesize is header information which is sent in advance
    index.putInt(packagesize);
    byte c[] = {index.get(3), index.get(2), index.get(1), index.get(0)};
    InputStream jpgimage = new FileInputStream(fileimage);
    dataOutputStream = new DataOutputStream(socket.getOutputStream());
    dataInputStream = new DataInputStream(socket.getInputStream());
    int writeBytes = 0, len = 0;
    byte buffer[] = new byte[1024];
    while ((len = jpgimage.read(buffer, 0, buffer.length)) != -1) {
        writeBytes += len;
        dataOutputStream.write(buffer, 0, len);
    }
    dataOutputStream.flush();
    jpgimage.close();
    dataInputStream.close();
    dataOutputStream.close();
    ...
    ...
    ...
}
// catch statements here
This is the receiving code in the PC part:
// after reading header information I try to read image data
char *buff = malloc(sizeof(char) * jpeg_length);
unsigned int readBytes = 0;
do
{
    int ret = recv(socket, buff + readBytes, jpeg_length - readBytes, 0);
    if (ret <= 0)
    {
        fprintf(stderr, "Error receiving jpeg file.\n");
        fclose(output);
        return 106;
    }
    readBytes += ret;
    fwrite(buff, sizeof(char), readBytes, output);
}
while (readBytes < jpeg_length);
fclose(output);
I also have to mention that the receiving part works fine when I send image data from a PC client application written in pure C++.
Any idea what the problem is, and why I get a corrupted image when sending from the Android device?
I'd appreciate it.
Edited
I added this to the Android application to test whether the bytes being sent can form a good image. I saved the image and it was OK.
int writeBytes = 0, len = 0;
byte buffer[] = new byte[1024];
// Here I save all sent bytes to an image called test.jpg
String path = "sdcard/download/images/test.jpg";
FileOutputStream stream = new FileOutputStream(path);
while ((len = jpgimage.read(buffer, 0, buffer.length)) != -1) {
    writeBytes += len;
    stream.write(buffer);
    dataOutputStream.write(buffer, 0, len);
    dataOutputStream.flush();
}
stream.flush();
dataOutputStream.flush();
jpgimage.close();
dataInputStream.close();
dataOutputStream.close();
I think you should use the Bitmap class to convert your image to a ByteBuffer, send it across, and then convert the ByteBuffer back to an image on the other end.
On Sender Side
Bitmap bitmap = BitmapFactory.decodeFile("ImageD2.jpg");
int bytes = bitmap.getByteCount();
ByteBuffer buffer = ByteBuffer.allocate(bytes);
bitmap.copyPixelsToBuffer(buffer);
byte[] array = buffer.array();
Now you can send the byte[] as normal data.
On the receiving side
Receive the array normally and convert it back to a Bitmap:
Bitmap bitmap = BitmapFactory.decodeByteArray(array, 0, array.length);
For more information, you can read the following questions:
Converting bitmap to byteArray android
How to convert byte array to Bitmap
I found the solution. The problem was on the Android side, so I made the following changes:
I changed DataOutputStream and DataInputStream to BufferedOutputStream and BufferedInputStream, respectively:
try {
    // packagesize is header information which is sent in advance
    index.putInt(packagesize);
    byte c[] = {index.get(3), index.get(2), index.get(1), index.get(0)};
    InputStream jpgimage = new FileInputStream(fileimage);
    dataOutputStream = new BufferedOutputStream(socket.getOutputStream());
    dataInputStream = new BufferedInputStream(socket.getInputStream());
    int writeBytes = 0, len = 0;
    byte buffer[] = new byte[1024];
    while ((len = jpgimage.read(buffer, 0, buffer.length)) != -1) {
        writeBytes += len;
        dataOutputStream.write(buffer, 0, len);
    }
    dataOutputStream.flush();
    jpgimage.close();
    dataInputStream.close();
    dataOutputStream.close();
    ...
    ...
    ...
}
// catch statements here