I have a communication app that works over sockets.
The client sends an image to the server:
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
OutputStream os;
try {
os = MyClient.socket.getOutputStream();
os.write(byteArray, 0, byteArray.length);
os.flush();
} catch (IOException e) {
e.printStackTrace();
}
On the server side I want to receive the image, but at the moment it just shows many different characters. If the client just sends text, I receive it with:
BufferedReader input = new BufferedReader(new InputStreamReader(s.getInputStream()));
String text = input.readLine();
But how can I "decode" the byte[] on the server side?
Do it analogously to how you sent the image: simply use an InputStream, like this:
InputStream stream = socket.getInputStream();
byte[] data = new byte[MAX_SIZE];
int count = stream.read(data);
The sending and receiving sides are compatible this way; you just have to know the size of the byte array, and it has to be the same on both ends.
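Note that a single read() is not guaranteed to return the whole image in one call; it returns whatever has arrived so far. A more robust variant (just a sketch, reusing the variable names from the question and assuming a DataOutputStream/DataInputStream pair on top of the sockets) is to send the length first and then read exactly that many bytes:
// Sender (client): prefix the PNG bytes with their length
DataOutputStream out = new DataOutputStream(MyClient.socket.getOutputStream());
out.writeInt(byteArray.length);
out.write(byteArray);
out.flush();
// Receiver (server): read the length, then block until exactly that many bytes have arrived
DataInputStream in = new DataInputStream(s.getInputStream());
int length = in.readInt();
byte[] data = new byte[length];
in.readFully(data);
BufferedImage image = ImageIO.read(new ByteArrayInputStream(data)); // decode the PNG on the server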
I need to implement a solution in Java that connects to a CCTV server via a TCP socket, takes the live video stream from it, and outputs it through a Spring Boot endpoint (RTSP or another protocol) so that it can be shown in a web player.
The problem is that I do not quite know how to achieve that; maybe someone can help. So far, I've got the following piece of code:
try (Socket socket = new Socket(hostname, port)) {
DataOutputStream outToServer = new DataOutputStream(socket.getOutputStream());
BufferedReader inFromServer = new BufferedReader(new InputStreamReader(socket.getInputStream()));
// Login command
outToServer.writeBytes(login);
// Start LIVE video
outToServer.writeBytes(live);
ByteArrayOutputStream outStreamObj;
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
byte[] byteChunk = new byte[1024];
InputStream input = socket.getInputStream();
int c = input.read(byteChunk);
while (c != -1) {
buffer.write(byteChunk, 0, c);
c = input.read(byteChunk); // if this read were in its own loop it would read forever (or until the stream stops), so it would never exit
BufferedImage image = ImageIO.read(new ByteArrayInputStream(buffer.toByteArray()));
// re-encode the image into a ByteArrayOutputStream
outStreamObj = new ByteArrayOutputStream();
ImageIO.write(image, "jpg", outStreamObj);
// get the byte array of the re-encoded image
byte[] byteArray = outStreamObj.toByteArray();
// wrap the byte array in a ByteArrayInputStream and decode it again
ByteArrayInputStream inStreambj = new ByteArrayInputStream(byteArray);
BufferedImage newImage = ImageIO.read(inStreambj);
// write the output image to disk
ImageIO.write(newImage, "jpg", new File("outputImage.jpg"));
System.out.println("Image generated from the byte array.");
}
} catch (UnknownHostException ex) {
...
} catch (IOException ex) {
...
}
So far, it works until BufferedImage image = ImageIO.read(new ByteArrayInputStream(buffer.toByteArray()));, where image is null. I am not even sure if this is correct. I wouldn't really want to save the image to disk anyway, but for now it's okay, I guess.
Basically, this is how it should work:
User visits a web page (Angular)
The web player loads with a live stream URL pointing to the Spring Boot backend
The Spring Boot backend connects to the CCTV server via TCP (not possible via RTSP or another protocol) and sends the live command over the socket
Within the same socket session the server starts pushing the live stream bytes
The Spring Boot app takes these bytes and forwards them to the browser.
Any suggestions?
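One possible shape for the relay step is a streaming endpoint that copies the socket bytes straight to the HTTP response. This is only a rough sketch: the endpoint path, host, port and command fields are placeholders, and a raw proprietary CCTV stream usually still needs to be remuxed or transcoded (e.g. with FFmpeg) into something like MJPEG or HLS before a browser player can actually render it.
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

@RestController
public class LiveStreamController {

    private static final byte[] LOGIN_COMMAND = {}; // placeholder: the login bytes from the question
    private static final byte[] LIVE_COMMAND = {};  // placeholder: the live command bytes from the question

    // Hypothetical endpoint that relays the raw CCTV bytes to the caller as they arrive.
    @GetMapping(value = "/live", produces = "application/octet-stream")
    public StreamingResponseBody live() {
        return (OutputStream browserOut) -> {
            try (Socket socket = new Socket("cctv-host", 9000)) { // placeholder host/port
                OutputStream cctvOut = socket.getOutputStream();
                cctvOut.write(LOGIN_COMMAND);
                cctvOut.write(LIVE_COMMAND);
                cctvOut.flush();

                InputStream cctvIn = socket.getInputStream();
                byte[] chunk = new byte[8192];
                int n;
                while ((n = cctvIn.read(chunk)) != -1) {
                    browserOut.write(chunk, 0, n); // forward bytes as they arrive
                    browserOut.flush();
                }
            }
        };
    }
}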
I want to send images through sockets, but I have not been able to do it in Android. Could someone help me?
System.out.println("iniciooooo");
//converting image to bytes with base64
Bitmap b = BitmapFactory.decodeFile("/sdcard/ajeffer.jpg");
ByteArrayOutputStream byte2= new ByteArrayOutputStream();
b.compress(Bitmap.CompressFormat.JPEG,70,byte2);
byte[] enbytes = byte2.toByteArray();
String bb = Base64.encodeToString(enbytes,Base64.DEFAULT);
System.out.println(Base64.encodeToString(enbytes,Base64.DEFAULT));
data.writeUTF(bb); // data is presumably a DataOutputStream wrapped around the socket's output stream
FileOutputStream file;
//receiving the image in bytes to convert it into an image
DataInputStream dain = new DataInputStream(s.getInputStream());
msg = dain.readUTF();
File ff = new File("/sdcard/a2jeffer.jpg");
byte[] deco = Base64.decode(dain.readUTF(),Base64.DEFAULT); // note: this is a second readUTF() call; the string already read into msg above is discarded
Bitmap bit = BitmapFactory.decodeByteArray(deco,0,deco.length);
file = new FileOutputStream(ff);
bit.compress(Bitmap.CompressFormat.JPEG,70,file);
//the image is not created
I realized that my code did not work because I had to put android:requestLegacyExternalStorage="true" in the manifest. I also see that you are right about writeUTF(): since it can only write up to 65,535 bytes of encoded string data, I have to lower the image quality drastically for it to fit, but it works. If you have an idea of how to improve this, let me know. Thank you very much.
You were right, this works great for sending and receiving any file.
Send file
OutputStream outputStream = socket.getOutputStream();
InputStream inputStream = new FileInputStream(file);
byte[] datita = new byte[16*1024];
int count;
while((count = inputStream.read(datita))>0){
outputStream.write(datita,0,count);
}
outputStream.close();
inputStream.close();
Receive file
OutputStream outputStream = new FileOutputStream(file);
InputStream inputStream = s.getInputStream();
byte[] datita = new byte[16*1024];
int count;
while((count = inputStream.read(datita))>0){
outputStream.write(datita,0,count);
}
outputStream.close();
inputStream.close();
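This works because the receiving loop only ends when read() returns -1, i.e. when the sender closes its side of the connection. If you ever need to send several files over the same socket, one option (a sketch; the length prefix is an addition, not part of the code above) is to send the file size first and have the receiver stop after exactly that many bytes:
// Sender: length prefix followed by the file bytes
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
out.writeLong(file.length());
try (InputStream fileIn = new FileInputStream(file)) {
    byte[] buf = new byte[16 * 1024];
    int count;
    while ((count = fileIn.read(buf)) > 0) {
        out.write(buf, 0, count);
    }
}
out.flush();
// Receiver: read exactly 'remaining' bytes, then the socket can be reused for the next file
DataInputStream dataIn = new DataInputStream(s.getInputStream());
long remaining = dataIn.readLong();
try (OutputStream fileOut = new FileOutputStream(file)) {
    byte[] buf = new byte[16 * 1024];
    while (remaining > 0) {
        int count = dataIn.read(buf, 0, (int) Math.min(buf.length, remaining));
        if (count == -1) break; // connection closed early
        fileOut.write(buf, 0, count);
        remaining -= count;
    }
}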
I have a server application on my PC which reads a JPG file and sends it through a socket to an Android device. The problem is that when the Android device receives the byte array, it can't be converted to a bitmap. I created a PC application to receive that same array, and the data received is different than on Android, even though I am using the same code to receive it.
Hence my assumption is that I somehow need to read it differently on Android.
PC Java Server
ServerSocket serverSocket = new ServerSocket(PORT);
Socket clientSocket = serverSocket.accept();
BufferedImage image = ImageIO.read(new File("D:\\test1\\test.jpg"));
ByteArrayOutputStream byteArrayoutputStream = new ByteArrayOutputStream();
ImageIO.write(image, "jpg", byteArrayoutputStream);
OutputStream outputStream = clientSocket.getOutputStream();
byte[] size = ByteBuffer.allocate(4).putInt(byteArrayoutputStream.size()).array();
outputStream.write(size);
outputStream.write(byteArrayoutputStream.toByteArray());
outputStream.flush();
Thread.sleep((long)5000);
clientSocket.close();
Android receiver
DataInputStream inputStream = new DataInputStream(serverSocket.getInputStream());
byte[] sizeAr = new byte[4];
inputStream.read(sizeAr);
int size = ByteBuffer.wrap(sizeAr).asIntBuffer().get();
byte[] imageAr = new byte[size];
inputStream.read(imageAr);
System.out.println(imageAr.toString());
bMap = BitmapFactory.decodeByteArray(imageAr, 0, imageAr.length);//this returns null
You probably are not receiving the whole thing: a single read() call returns whatever has arrived so far, which may be fewer bytes than you asked for.
Instead of:
inputStream.read(imageAr);
try:
inputStream.readFully(imageAr);
readFully() blocks until the whole buffer has been filled (it is available because you are already using a DataInputStream).
ADDED
Do the same on the first read(): use inputStream.readFully(sizeAr) instead of inputStream.read(sizeAr).
In general, always check the return value of read(); it tells you how many bytes were actually read.
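A minimal sketch of the corrected receiving side (same variable names as in the question); readInt() reads the 4-byte big-endian length written by ByteBuffer on the server, and readFully() blocks until all of the announced bytes have arrived:
DataInputStream inputStream = new DataInputStream(serverSocket.getInputStream());
// read the 4-byte length prefix
int size = inputStream.readInt();
// read exactly 'size' bytes of JPEG data
byte[] imageAr = new byte[size];
inputStream.readFully(imageAr);
Bitmap bMap = BitmapFactory.decodeByteArray(imageAr, 0, imageAr.length);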
I'm trying to make a video file transfer but am having problems getting the server to start sending bytes.
The first step is for the client to connect, and the socket gets accepted. Then the client sends the video file name, but the server never reads it.
This is the code for the server up until it blocks:
try(ServerSocket serverSocket = new ServerSocket(4005))
{
Socket socket = serverSocket.accept();
System.out.println("accepted");
OutputStream os = socket.getOutputStream();
BufferedReader receiveReader = new BufferedReader(new InputStreamReader(socket.getInputStream()));
System.out.println("This gets printed");
String request = receiveReader.readLine();//never passes this line
System.out.println("This doesn't get printed");
and this is the client up until it blocks waiting for the server to send the video bytes:
try(Socket socket = new Socket(IPAddress, 4005))
{
byte[] messageBytes = new byte[10000];
DataOutputStream outputStream = new DataOutputStream(socket.getOutputStream());
outputStream.writeBytes("REQUEST;"+videoPath);//This is the line that should send the bytes for the server to read, so it won't block.
String home = System.getProperty("user.home");
String path = home+"\\Downloads" + videoName;
path = path.trim();
FileOutputStream fos = new FileOutputStream(path);
BufferedOutputStream bos = new BufferedOutputStream(fos);
InputStream is = socket.getInputStream();
int bytesRead = 0;
System.out.println("Downloading file...");
while((bytesRead = is.read(messageBytes))!=-1)//This blocks here
Why on earth isn't the server reading the "REQUEST;" + videoPath bytes that the client is sending? I tried outputStream.flush() as well, no luck.
Usual problem: you're reading a line, but you aren't writing one. Add a line terminator to the sent message.
When you fix this you will then discover that you can't mix buffered streams and readers on the same socket. I suggest you do all the I/O via the DataInput/OutputStream classes, using readUTF()/writeUTF() for the name.
If you're sending multiple files see my answer there.
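For example (a sketch along those lines, using the names from the question):
// Client: send the file name with writeUTF(), which frames the string itself
DataOutputStream out = new DataOutputStream(socket.getOutputStream());
out.writeUTF("REQUEST;" + videoPath);
out.flush();
// Server: read it back with readUTF() on the same socket - no line terminator needed
DataInputStream in = new DataInputStream(socket.getInputStream());
String request = in.readUTF();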
I want to stream audio over a SIP connection in a Java (SE) application. I connected to the server and got a 200 OK message. I want to receive the data sent by the server, so I created a socket and got an InputStream. Here is how I do it; 123.456.789.1 is my IP address and 1234 is the port my application listens on.
Socket socket=new Socket("123.456.789.1",1234);
InputStream in=socket.getInputStream();
System.out.println("inputSream available :"+in.available());
But in.available() is always 0.
However, if I get the content with Object content = response.getContent(); and do
ByteArrayOutputStream bos = new ByteArrayOutputStream();
ObjectOutput out = new ObjectOutputStream(bos);
out.writeObject(content);
byte[] contentBytes = bos.toByteArray();
the length of contentBytes equals the response content length. But when I try to get an InputStream and play it, like the following,
InputStream pp=new ByteArrayInputStream(b);
AudioStream as = new AudioStream(pp);
AudioData data = as.getData();
ContinuousAudioDataStream cas = new ContinuousAudioDataStream (data);
an exception is thrown: java.io.IOException: could not create audio stream from input stream.
Then I tried to read the InputStream with in.read(); it read some bytes and then an IOException was thrown.
Q1. How can I solve this and get a usable InputStream from the socket?
Q2. How do I get an InputStream that can play the audio?
Or let me know where the problem is and how to solve it.
UPDATED: Thank you to everyone who pointed out the fault with in.available().
Then I changed the code.
ByteArrayOutputStream ou=new ByteArrayOutputStream();
int i=0;
System.out.println("Before while");
while((i=in.read())!=-1){
ou.write(i);
System.out.println("Wrote :"+i);
}
Unfortunately, the application doesn't go any further; only "Before while" is printed. The application just shows as running (I use the NetBeans IDE). I don't know why. Any clarification?
When you use getContent you get some kind of object wrapping the content. Then, using an ObjectOutputStream, you write the serialized Java representation of that object, not the actual bytes of the original data.
You should be able to do
AudioStream as = new AudioStream(in);
AudioData data = as.getData();
ContinuousAudioDataStream cas = new ContinuousAudioDataStream (data);
or if you do want to buffer the data
int chunkSize;
byte[] chunk = new byte[2048];
ByteArrayOutputStream outBuffer = new ByteArrayOutputStream();
while ( ( chunkSize = in.read(chunk) ) != -1) {
outBuffer.write(chunk, 0, chunkSize);
}
ByteArrayInputStream inBuffer = new ByteArrayInputStream(outBuffer.toByteArray());
AudioStream as = new AudioStream(inBuffer);
AudioData data = as.getData();
ContinuousAudioDataStream cas = new ContinuousAudioDataStream (data);
available() shows how many bytes are guaranteed to be readable without blocking. It might always return 0.
available() is the number of bytes which can be read without making a blocking call to the OS. If you want to know how much data is available, you should try to read it and see how much you get.