Good day. I develop a server-client application with GlassFish + ObjectDB (embedded mode) on the server side and an Android application on the client side. What is the preferable way (with respect to traffic and security) to send data that is stored as Java objects in ObjectDB to the Android application? (The data must be encrypted.)
I am thinking about:
passing serializable objects through stream input/output,
passing the data in XML/JSON format.
Or maybe there is another way?
Thanks for the help.
You can try Protobuf: http://code.google.com/p/protobuf. It uses less traffic and is easy to integrate.
Binary data has the smallest size, but is less self-describing. XML is self-describing, but has the largest size.
If you only need to send data between your own apps, you can choose a binary format.
In my project I am using the first approach:
pass serializable objects through stream input/output.
This means I am effectively doing "file" uploads and downloads.
However, with this approach you are bound to use Java on both sides (server and Android), which is not an issue in my case.
Your second approach will generate too much overhead.
Encryption should NOT be done at this level. Better to use HTTPS.
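For illustration, here is a minimal sketch of that combination (Java serialization over HTTPS); the endpoint URL is a placeholder, and a matching servlet on the GlassFish side would have to read the object stream back:

    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.net.URL;
    import javax.net.ssl.HttpsURLConnection;

    public class SerializedObjectClient {

        public static Object exchange(Serializable request) throws IOException, ClassNotFoundException {
            // Placeholder endpoint; a servlet on the server would read the object stream back.
            URL url = new URL("https://example.com/api/objects");
            HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);

            // Transport encryption is handled by TLS; here we only serialize the payload.
            try (ObjectOutputStream out = new ObjectOutputStream(conn.getOutputStream())) {
                out.writeObject(request);
            }
            try (ObjectInputStream in = new ObjectInputStream(conn.getInputStream())) {
                return in.readObject(); // whatever serialized object the server writes back
            }
        }
    }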
Related
I'm making an online game using ObjectOutputStream... to exchange data. Since I have different types of data, I'm using only the write/readObject() methods. I was wondering if sending a String for commands is good practice, or if there is a better, safer solution.
When I say I send commands with a String, I mean, for example, I have a chat and I want to ignore a user, so I send "block +username" to the server; if I want to add a friend I send "addfriend +username", etc.
Well, using serialized objects might create a lot of interoperability work if you are going for a serious installation. It can also become a bottleneck. I would (besides the obvious option of using some other messaging protocol) stick to DataOutputStream if you are looking for a compact home-grown protocol.
Sending strings as serialized Java objects is the most surprising thing to do (and won't easily allow you to have clients or servers in different languages).
If you want to be cool, use JSON and WebSocket. :)
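To make the DataOutputStream suggestion above concrete, here is a rough sketch of a compact home-grown command protocol (the opcodes and handler method names are made up for illustration):

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    public class CommandProtocol {
        public static final byte OP_BLOCK = 1;
        public static final byte OP_ADD_FRIEND = 2;

        public static void sendBlock(DataOutputStream out, String username) throws IOException {
            out.writeByte(OP_BLOCK);
            out.writeUTF(username);   // length-prefixed, so no delimiter parsing is needed
            out.flush();
        }

        public static void handle(DataInputStream in) throws IOException {
            byte opcode = in.readByte();
            switch (opcode) {
                case OP_BLOCK:      blockUser(in.readUTF());  break;
                case OP_ADD_FRIEND: addFriend(in.readUTF());  break;
                default: throw new IOException("Unknown opcode: " + opcode);
            }
        }

        private static void blockUser(String user) { /* game-specific logic */ }
        private static void addFriend(String user) { /* game-specific logic */ }
    }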
I have a project that requires me to create a Java server which will be connected to a MySQL database. This server is going to handle requests from clients, to send them data from the database.
The request from the clients will be:
check if a User is registered in the database
add User to the database
get a list of Users and which of them are online (this is where I use the HashMap)
After some searching I've concluded on using NIO, so I won't end up with too many threads handling multiple client requests. My problem is that I can't understand how you retrieve data from the channel when you want to send, for example, a List or a HashMap. I mean, I've seen how the read(buffer) method works. I just can't understand, for example, how you get the HashMap object back from the buffer, or how you retrieve any kind of "structured" data for that matter. If someone could explain (maybe with an example), that would be fantastic.
Maybe there is another way to communicate the data that I need, that would be easier for me to understand. I don't know. Your insight is greatly appreciated.
P.S.: My problem isn't specific to NIO; I have the same problem with the typical Input/Output Streams.
I should mention that the actual project is to create a Java server whose clients will be Android devices. But since I'm a bit of a newbie, I thought I'd start off by testing the communication between two desktop Java applications before going for Android.
I mention this because I've seen something about Java RMI that allows you to use your server's methods remotely, but I think you can't use it on Android.
You can read and write objects using the serialization mechanism. The classes involved are ObjectOutputStream and ObjectInputStream. They are stream based, though, so they don't fit well into the NIO model. They are covered in the official tutorial: http://docs.oracle.com/javase/tutorial/essential/io/objectstreams.html
An alternative is using Google protocol buffers.
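As a hedged example of the ObjectOutputStream/ObjectInputStream route (using plain blocking sockets rather than NIO, with a placeholder port), writing and reading back a whole HashMap could look roughly like this:

    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.util.HashMap;

    public class MapOverSocket {

        // Server side: send the map of users to whoever connects.
        public static void serve(HashMap<String, Boolean> onlineUsers) throws IOException {
            try (ServerSocket server = new ServerSocket(9000);
                 Socket client = server.accept();
                 ObjectOutputStream out = new ObjectOutputStream(client.getOutputStream())) {
                out.writeObject(onlineUsers);   // the map and its contents must be Serializable
            }
        }

        // Client side: read the whole map back as one object.
        @SuppressWarnings("unchecked")
        public static HashMap<String, Boolean> fetch() throws IOException, ClassNotFoundException {
            try (Socket socket = new Socket("localhost", 9000);
                 ObjectInputStream in = new ObjectInputStream(socket.getInputStream())) {
                return (HashMap<String, Boolean>) in.readObject();
            }
        }
    }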
I'm developing a Java application that handles and analyses different types of data, and I want to display this data in a user-friendly way using plots and tables in an HTML webpage with jQuery, Highcharts and a few other JavaScript libraries.
What's the best way to do this? Is Google Web Toolkit my best bet?
My current solution keeps the analysis and the display of the data completely separate: I run all the analysis in Java, then export the results to JSON files and build the plots with JavaScript scripts. However, this way I'm having issues handling the increasingly large JavaScript files. A more integrated interface where I could handle everything together would be the perfect solution.
Thanks.
Is your only problem the size of the resulting JavaScript file? If so, there are a couple of options.
1) Use WebSockets.
WebSockets give you a live connection between the server and the browser. This would allow you to stream the data to the browser instead of shipping it all in one large response.
2) Serialize your data in a more compact manner.
JSON is great. Its biggest strengths are that it's human-readable and plays exceptionally well with JavaScript. However, it does add some extra bytes to your data. Not many, mind you, but some. If size truly is a problem, consider a more compact serialization format where both sides agree in advance on how the data is laid out; then, in the browser, you can deserialize it back into JSON objects and continue as normal.
3) Compress and decompress.
This one is a bit more convoluted, but will probably save you the most bytes at the price of slower performance. You can zip the data up before passing it to the client and unzip it in the browser. See this other question for more details on compression and decompression with JavaScript; a rough Java sketch of the server side follows this answer.
4) Leave it as is.
I guess I don't have enough context, but what you are currently doing may well be fine. There are always ways to improve, but I'm not sure exactly what you want to improve.
Cheers.
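If you go for option 3, a minimal Java-side sketch could look like the following; the file name and JSON string are placeholders, and the browser (or a small JS snippet) is expected to handle the gunzipping:

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    public class JsonGzipExporter {

        // Write the JSON out as a gzipped file instead of plain text.
        public static void export(String json, String path) throws IOException {
            try (OutputStreamWriter writer = new OutputStreamWriter(
                    new GZIPOutputStream(new FileOutputStream(path)), StandardCharsets.UTF_8)) {
                writer.write(json);
            }
        }

        public static void main(String[] args) throws IOException {
            export("{\"series\":[1,2,3]}", "results.json.gz");
        }
    }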
I need to transfer files quickly over the Internet from a Java server to C++ clients, where often many clients will need the same files. I was looking at, say, transferTo() in Java, which sounds like a decently optimized function for sending files. However, I'm not sure, when I use transferTo(), how best to receive the data in C++ (i.e. is it just a raw data transfer, how do I determine on the client side when the file is over, etc.). I need this to work on both Windows and Linux. Also, other than transferTo(), is there some way to be more efficient, especially by taking advantage of the fact that many clients will usually need the same files? I'm not sure how to do, say, multicast, etc. Also, I'm using application-level security rather than a VPN, and on the Java server I'm encrypting with AES and authenticating with a MAC, so I'm also looking for a cross-platform library recommendation to handle the crypto on the C++ side with minimal pain.
I'm very proficient in C++ but have no previous experience with network programming, so please take that into account in any suggestions.
Thanks.
An embedded webserver? Are HTTP transfers efficient enough for you?
The simplest embeddable Java webserver I remember seeing is http://acme.com/java/software/Acme.Serve.Serve.html. We use embedded Jetty 6 in production at work, but that takes more elbow grease.
If your clients don't know where to find your webserver in the first place, consider announcing it using Zeroconf: http://jmdns.sourceforge.net/
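Just to illustrate the embedded-webserver idea (this is not Acme.Serve or Jetty, but the JDK's built-in com.sun.net.httpserver, and the file name and port are placeholders):

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class TinyFileServer {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/data.bin", exchange -> {
                byte[] body = Files.readAllBytes(Paths.get("data.bin")); // file to publish
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });
            server.start();  // C++ clients can now fetch http://host:8080/data.bin with libcurl etc.
        }
    }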
For scalability reasons, Thorbjørn's suggestion of using HTTP seems like a very good idea, as it would allow you to easily set up HTTP proxies for caching, use standard load-balancing tools and so forth.
If you are looking to transfer more than just a blob of data, you might want to have a look at Google's protocol buffers. They allow for very easy and fast encoding/decoding on the Java and C++ ends.
Consider chunking the file and sending it via UDP datagrams; the C++ client can reassemble the file as the chunks arrive. Have you considered implementing/embedding an existing P2P protocol implementation?
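A loose sender-side sketch of that chunking idea, assuming the C++ client reassembles chunks by index (no retransmission or integrity checks, which a real UDP transfer would need):

    import java.io.IOException;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.ByteBuffer;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class UdpFileSender {
        private static final int CHUNK = 1024;  // payload bytes per datagram

        public static void send(String path, String host, int port) throws IOException {
            byte[] file = Files.readAllBytes(Paths.get(path));
            InetAddress addr = InetAddress.getByName(host);
            try (DatagramSocket socket = new DatagramSocket()) {
                int total = (file.length + CHUNK - 1) / CHUNK;
                for (int i = 0; i < total; i++) {
                    int len = Math.min(CHUNK, file.length - i * CHUNK);
                    // 8-byte header: chunk index + total chunk count, then the payload.
                    ByteBuffer buf = ByteBuffer.allocate(8 + len);
                    buf.putInt(i).putInt(total).put(file, i * CHUNK, len);
                    socket.send(new DatagramPacket(buf.array(), buf.position(), addr, port));
                }
            }
        }
    }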
If you need efficient transfer to many clients, then your bottleneck is the server.
For this, have a look at the BitTorrent protocol, as it distributes the transfer between the clients.
I know that Java can use socket programming to send an object. Apart from socket programming, is there any other way to do it?
Java's Remote Method Invocation (RMI) is probably the easiest and most widely supported way.
Java's Advanced Socket Programming describes marshalling objects over a socket.
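A bare-bones RMI sketch for reference; the CatalogService interface and its method are invented for illustration:

    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;
    import java.rmi.server.UnicastRemoteObject;
    import java.util.Arrays;
    import java.util.List;

    public class RmiSketch {

        // The remote interface: every method must declare RemoteException.
        public interface CatalogService extends Remote {
            List<String> listItems() throws RemoteException;
        }

        public static class CatalogServiceImpl implements CatalogService {
            public List<String> listItems() {
                return Arrays.asList("apple", "pear");
            }
        }

        public static void main(String[] args) throws Exception {
            // Export the object and make it reachable by name via the registry.
            CatalogService stub =
                    (CatalogService) UnicastRemoteObject.exportObject(new CatalogServiceImpl(), 0);
            Registry registry = LocateRegistry.createRegistry(1099);
            registry.rebind("catalog", stub);
            // Client side (different JVM):
            //   CatalogService remote = (CatalogService) LocateRegistry.getRegistry("serverhost").lookup("catalog");
            //   List<String> items = remote.listItems();
        }
    }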
Control a high-speed robot hand to type it in.
Pretty much everything else you can do will be a layer built on sockets.
RFC1149!
Attach rubber band to computer A.
Put Object in rubber band.
Pull back, aim at computer B.
Let go.
Via a web service, for example. But that's built on top of sockets again.
SOAP (Simple Object Access Protocol).
Serialize the object and write it to a file; copy the file over the network if the computers are connected, or use a removable disk if they aren't, then deserialize it on the other side.
Objects can be transferred via a shared database.
I work near a production system that's been doing this for 10 years.
Ironically, except in rare cases, DB connections are also implemented via sockets.
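A sketch of the shared-database approach using plain JDBC; the table, the column names, and the assumption that objects are stored as serialized BLOBs are all made up for illustration:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class DbObjectTransfer {

        // Writer side: serialize the object and store it in a BLOB column.
        public static void put(Connection conn, String key, Serializable value)
                throws SQLException, IOException {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(value);
            }
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO object_exchange (obj_key, payload) VALUES (?, ?)")) {
                ps.setString(1, key);
                ps.setBytes(2, bytes.toByteArray());
                ps.executeUpdate();
            }
        }

        // Reader side: fetch the bytes back and deserialize them.
        public static Object get(Connection conn, String key)
                throws SQLException, IOException, ClassNotFoundException {
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT payload FROM object_exchange WHERE obj_key = ?")) {
                ps.setString(1, key);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) return null;
                    try (ObjectInputStream in =
                                 new ObjectInputStream(new ByteArrayInputStream(rs.getBytes(1)))) {
                        return in.readObject();
                    }
                }
            }
        }
    }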
No. Sockets are how computers communicate. Other than writing the data to some medium and physically transporting it between computers, you will have to use sockets.
At the lowest level, data is transferred over sockets as bytes. So first you need to serialize your object to bytes, then you can send it, then on the other side you need to deserialize the object from the bytes.
Approaching your question less literally, there are Java libraries that handle the serialization automatically and hide the nastiness of dealing directly with sockets. I recommend KryoNet. KryoNet can do remote method invocations, and it is a lot simpler and more efficient than Java's built-in RMI support.
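To make the bytes-over-the-wire point above concrete, here is a small helper that turns a Serializable object into a byte[] and back; how those bytes then travel (raw socket, KryoNet, a file, ...) is a separate concern:

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectInputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    public class Bytes {

        public static byte[] toBytes(Serializable obj) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
                out.writeObject(obj);
            }
            return buffer.toByteArray();
        }

        public static Object fromBytes(byte[] data) throws IOException, ClassNotFoundException {
            try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
                return in.readObject();
            }
        }
    }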