I want to use JMeter to benchmark server-to-server communication (Java Spring) with a data serialization format other than JSON.
The article Why not JSON? suggests MessagePack:
MessagePack is an efficient binary serialization format. It lets you exchange data among multiple languages like JSON. But it's faster and smaller.
Can I use JMeter to benchmark sending JSON messages vs. MessagePack, and what can it compare? Can I measure the time to receive a request on the receiver side, or also the time to prepare and send the request on the sender side? Are there other considerations or known issues that prevent either?
You can use JMeter for literally anything. In the case of MessagePack you can go for the MessagePack serializer for Java; this will let you create binary request payloads on the JMeter side, e.g. using a JSR223 PreProcessor and the Groovy language (it's 99.9% Java-compatible, so example Java code will work just fine).
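As a minimal sketch of that packing step, assuming the msgpack-core jar (org.msgpack:msgpack-core) has been dropped into JMETER_HOME/lib, the payload field names (`id`, `name`) below are made-up examples:

```java
// Sketch: build a MessagePack payload equivalent to {"id": 42, "name": "test"}.
// Assumes msgpack-core is on the classpath (for JMeter: JMETER_HOME/lib).
import org.msgpack.core.MessageBufferPacker;
import org.msgpack.core.MessagePack;

public class MsgPackPayload {
    public static byte[] pack(int id, String name) throws Exception {
        MessageBufferPacker packer = MessagePack.newDefaultBufferPacker();
        packer.packMapHeader(2);      // a map with 2 key/value pairs
        packer.packString("id");
        packer.packInt(id);
        packer.packString("name");
        packer.packString(name);
        packer.close();
        return packer.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] payload = pack(42, "test");
        // In a JSR223 PreProcessor you could then expose the bytes to the
        // sampler, e.g.: vars.putObject("payload", payload);
        System.out.println("payload bytes=" + payload.length);
    }
}
```

The same code pasted into a JSR223 PreProcessor (minus the class wrapper) runs as Groovy unchanged.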
With regard to metrics, they should be the same as usual; your main targets should be:
Response time (lower is better)
Throughput - number of requests per unit of time (higher is better)
Given JSON and MessagePack are served by the same host, network-specific metrics like Latency and Connect Time can be ignored. Check out the JMeter Glossary for the main metrics, listed and explained.
I would also pay attention to server-side metrics like CPU or memory usage, as deserialising binary data and serialising it back can potentially be more resource-intensive, so my expectation is that the MessagePack implementation will have a larger footprint. You can use e.g. the SSHMon Listener or the JMeter PerfMon Plugin to check your system under test's resource usage while the test is running; this way you will be able to correlate increasing load with increasing resource consumption.
Related
My application reads an XML request from WebSphere MQ and responds with one or more Java objects. I can use the JMS point-to-point sampler to post the XML request and the subscriber sampler to catch the Java objects posted back by my application. Now I want to deserialize the Java objects so that I can assert on them. I have the required jar(s) that can help me with deserialization, but I don't know how to do this in JMeter. Can someone please provide directions on how I can proceed?
You will need to have all the necessary dependencies in your JMeter's /lib folder.
You can then just add a JSR-223 sampler/post-processor that executes the Java code that you want using those dependencies. You can choose any of the scripting languages there, but be aware of the performance problems that some of them have (BeanShell caused GC lag for me).
Add a JSR223 PostProcessor as a child of the JMS P2P Sampler and put the deserializing code into it. Once you convert the binary response to a String you will be able to assign the value to a JMeter Variable as:
vars.put("variableName", variableValue);
and use it in Assertion (JMeter Assertions can target JMeter Variables).
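A minimal sketch of that deserializing step, assuming the response payload is a standard Java-serialized object (in the real PostProcessor you would get the bytes via `prev.getResponseData()` instead of the simulated input here):

```java
// Sketch of the PostProcessor's deserialization step. In JMeter you would
// call deserialize(prev.getResponseData()) and publish the result with
// vars.put("variableName", result.toString()).
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class ResponseDeserializer {
    public static Object deserialize(byte[] responseData) throws Exception {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(responseData))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulate what the application would post back to the queue:
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject("hello from MQ");   // stand-in for your domain object
        }
        Object result = deserialize(bos.toByteArray());
        System.out.println(result);
    }
}
```

If your objects are custom classes, their jars must be in JMeter's /lib folder (as noted above) so `readObject` can resolve them.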
It is recommended to use Groovy as the JSR223 element language, as the JavaScript, Beanshell, etc. interpreters have performance issues; besides, they're quite out of date, and Groovy scripts can be compiled into bytecode (assuming the test element is properly configured), providing maximum performance.
See the Beanshell vs JSR223 vs Java JMeter Scripting: The Performance-Off You've Been Waiting For! guide for instructions on setting up Groovy scripting engine support, best practices regarding caching, using variables, etc., and a benchmark of the different scripting engines.
One of the benefits of using Jackson for JSON processing is:
all modes [i.e. streaming, tree, and binding to Java objects] fully supported, and best of all, in such a way that it is easy to convert between modes, mix and match. For example, to process very large JSON streams, one typically starts with a streaming parser, but uses data binder to bind sub-sections of data into Java objects: this allows processing of huge files without excessive memory usage, but with full convenience of data binding.
Are there XML processors for Java or Scala which also support this scenario?
Maybe you want to check out Smooks
http://smooks.org
HTH
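The streaming-plus-binding mix that the Jackson quote describes can also be approximated with the JDK's built-in StAX parser: stream through a large document and hand-bind each small subsection to a Java object. A sketch, where the `<item id=...><name>...</name></item>` shape is a made-up example schema:

```java
// Stream a large XML document with StAX (built into the JDK) and bind each
// <item> subsection to a small Java object, without loading the whole file.
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StaxBinding {
    public static final class Item {
        public final String id;
        public final String name;
        public Item(String id, String name) { this.id = id; this.name = name; }
    }

    public static List<Item> parse(String xml) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        List<Item> items = new ArrayList<>();
        String id = null;
        while (r.hasNext()) {
            int event = r.next();
            if (event == XMLStreamConstants.START_ELEMENT && r.getLocalName().equals("item")) {
                id = r.getAttributeValue(null, "id");
            } else if (event == XMLStreamConstants.START_ELEMENT && r.getLocalName().equals("name")) {
                items.add(new Item(id, r.getElementText()));   // bind one subsection
            }
        }
        return items;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<items><item id=\"1\"><name>a</name></item>"
                   + "<item id=\"2\"><name>b</name></item></items>";
        for (Item i : parse(xml)) System.out.println(i.id + "=" + i.name);
    }
}
```

For real projects you'd likely combine StAX with JAXB (or Smooks, as suggested above) rather than binding by hand, but the memory profile is the same: only one subsection is materialized at a time.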
Suppose we have a remoting enabled application with server and client components, which can run on different machines.
Now we have a set of files containing data that need to be saved to DB via server. We can have 2 approaches:
1) Convert the data into a list of objects, serialize them, and send them over to the server
2) Serialize the files and send them over to the server
Is there a difference between the two approaches? How do I test them?
Sending the files as they are is always going to be more efficient than translating them into and out of different formats at both ends.
You should probably define a little API for the server (the file format it expects, e.g. CSV or JSON with some schema) and send it the files in that format. If you are only going to interact with one client then the format might as well be whatever the files are already in; otherwise make it more general, and the client must convert the files to that format. I wouldn't use Java serialisation, as it is very fragile: generally the client and the server have to have the same versions of the classes involved (you can use readObject/writeObject and version numbers to work around this, but it's not worth the hassle).
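A rough illustration of the translation overhead, using a made-up two-row CSV: Java-serializing the parsed object list produces a larger payload than the raw file bytes, on top of the CPU cost of converting at both ends.

```java
// Rough illustration: Java-serializing a parsed object list inflates the
// payload compared to the raw file bytes (class descriptors, headers, etc.).
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class PayloadSizes {
    public static final class Row implements Serializable {
        private static final long serialVersionUID = 1L;
        final String name; final int value;
        Row(String name, int value) { this.name = name; this.value = value; }
    }

    public static int serializedSize(List<Row> rows) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(new ArrayList<>(rows));
        }
        return bos.size();
    }

    public static void main(String[] args) throws Exception {
        String csv = "alice,1\nbob,2\n";                       // the file as-is
        List<Row> rows = List.of(new Row("alice", 1), new Row("bob", 2));
        int raw = csv.getBytes(StandardCharsets.UTF_8).length;
        System.out.println("raw=" + raw + " serialized=" + serializedSize(rows));
    }
}
```

The exact ratio depends on the data, but for small files the fixed serialization metadata dominates.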
Environment
My application (war) has a JavaScript frontend and a Java REST service.
The files to be uploaded will be generated in the frontend, but not directly by user interaction -- this is not a use case where the user is uploading files herself. For that reason, it's necessary to initiate the upload from the JavaScript code.
I need to be able to send metadata (generated by other parts of the application) about the binary data when I'm uploading it -- which is why I need some sort of protocol instead of just uploading a file.
Question
What I haven't been able to determine is the best practice for uploading files, primarily with regard to the protocol used.
I've come across the following protocols:
json
xml
protocol-buffers (via protobuf.js)
However, the internets has, as usual, lots of different info that hasn't given me a coherent picture:
With regards to reliability, the internets seems to say that you're better off using the multipart/mixed type to transfer data, instead of the pure application/octet-stream type.
json doesn't natively support binary data, and apparently, Base64 has a high processing overhead.
It's a JavaScript frontend, so json would be preferred.
Sure, I could use protobuf.js, but I'd rather use leading-edge tech than bleeding-edge tech.
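The Base64 penalty mentioned above is easy to quantify with the JDK alone: encoded output is 4/3 the size of the input, so embedding binary data in JSON costs roughly 33% extra bytes before any CPU cost. A quick check:

```java
// Quantifying the Base64 size penalty: encoded length is 4 * ceil(n / 3),
// i.e. ~33% larger than the raw binary input.
import java.util.Base64;

public class Base64Overhead {
    public static int encodedSize(int rawBytes) {
        byte[] data = new byte[rawBytes];
        return Base64.getEncoder().encodeToString(data).length();
    }

    public static void main(String[] args) {
        int raw = 3 * 1024 * 1024;                 // a 3 MB upload
        System.out.println("raw=" + raw + " base64=" + encodedSize(raw));
        // For 3,145,728 raw bytes the encoded form is 4,194,304 characters.
    }
}
```

This is why multipart/mixed (metadata as one JSON part, the binary blob as a raw octet-stream part) tends to be favoured for payloads in the 1-10 MB range.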
My priorities are:
reliable data transfer of files between 1 and 10 megabytes.
performant and efficient data transfer.
readable code/architecture
In short, which of the 3 formats mentioned above best fits those requirements, given that I'm using a Java REST service on the backend?
(If the fact that I'm using a Java REST service -- instead of say, a servlet -- to upload the files is going to be the biggest slowdown, that's also a good answer!)
EDIT: added information asked by the comments -- thanks!
I'm currently working on a project which needs some server-client communication. We're planning to use Websockets and a Java server (Jetty) on the server side. So, messages sent must be interpreted with Java from the server and with JavaScript from the client.
Now we're thinking about a protocol and what structure the messages should have. We already have a reference implementation which uses XML messages. But since JSON is designed to be used with JavaScript, we're also considering JSON strings.
Messages will contain data consisting of XML strings plus some meta information needed to process this data (e.g. store it in a database, redirect it to other clients, ...). It is important that processing the messages (parsing and creating) be easy and fast on both server and client side, since the application should run at real-time speed.
Since we don't have the time to test both technologies, I would be happy about some suggestions based on personal experience or technical aspects. Is one of the techniques more usable than the other, or are there any drawbacks to either of them?
Thanks in advance.
JSON is infinitely easier to work with, in my opinion. It is far easier to access something like data.foo.bar.name than trying to work your way to the corresponding node in XML.
XML is okay for data files, albeit still iffy, but for client-server communication I highly recommend JSON.
You are opening a can of worms (again, not the first time).
Have a look at this: JSON vs XML. A quick search on Stack Overflow will also be good.
This question might be a duplicate; see this Stack Overflow XML vs JSON question, for example.
In the end the answer stays the same: it depends on you. I do agree with many comments there that sometimes XML is overkill (and sometimes not).
I agree with Kolink,
The reason it is better to use JSON is that XML has a big header, which means each transfer has a big overhead.
For iOS or Android clients, especially over WLAN, you should prefer JSON to XML.
I agree with Kolink, but if you already have an XML schema in place, I'd use XML to save you some headaches on the Java side. It really depends on who's doing the most work.
Also, JSON is more compact, so you could save bandwidth by using it.
There seem to be some libraries for parsing JSON in Java, so it may not be too hard to switch formats.
http://json.org/java/
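To make the compactness point concrete, here is the same (made-up) record in both notations; the XML tags roughly double the byte count for small payloads like this:

```java
// The same record as XML vs JSON: tag pairs cost more bytes than JSON's
// punctuation, which is where the bandwidth saving comes from.
public class SizeComparison {
    public static final String XML  = "<user><id>42</id><name>alice</name></user>";
    public static final String JSON = "{\"id\":42,\"name\":\"alice\"}";

    public static void main(String[] args) {
        System.out.println("xml=" + XML.length() + " json=" + JSON.length());
    }
}
```

The gap narrows for data-heavy payloads where the values dominate the markup, so it's worth measuring with your actual messages.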