Getting an object into a ChannelBuffer - java

I've written a small HTTP server using Netty by following the example HTTP server, and now I'm trying to adapt it to my needs (a small app that should send JSON). I began by manually encoding my POJOs to JSON using Jackson and then using the StringEncoder to get a ChannelBuffer. Now I'm trying to generalize it slightly by extracting the bit that encodes the POJOs to JSON into a HttpContentEncoder, and I've managed to implement that more or less.
The part I can't figure out is how to set the content on the HttpResponse. It expects a ChannelBuffer, but how do I get my object into a ChannelBuffer?
Edit
Say I have a handler with code like below, and a HttpContentEncoder that knows how to serialize SomeSerializableObject. How do I get my content (SomeSerializableObject) to the HttpContentEncoder? That's what I'm looking for.
SomeSerializableObject obj = ...
// This won't work because HttpMessage expects a ChannelBuffer
HttpRequest res = ...
res.setContent(obj);
Channel ch = ...
ch.write(res);
After looking into it a bit more, though, I'm unsure whether this is what HttpContentEncoder is meant to do, or whether it's rather for things like compression?

Most object serialization/deserialization libraries use InputStream and OutputStream. You could create a dynamic buffer (or a wrapped buffer for deserialization), wrap it with ChannelBufferOutputStream (or ChannelBufferInputStream), and feed that to the serialization library. For example:
// Deserialization
HttpMessage m = ...;
ChannelBuffer content = m.getContent();
InputStream in = new ChannelBufferInputStream(content);
Object contentObject = myDeserializer.decode(in);
// Serialization
HttpMessage m = ...;
Object contentObject = ...;
ChannelBuffer content = ChannelBuffers.dynamicBuffer();
OutputStream out = new ChannelBufferOutputStream(content);
mySerializer.encode(contentObject, out);
m.setContent(content);
If the serialization library allows you to use a byte array instead of streams, this can be much simpler using ChannelBuffer.array() and ChannelBuffer.arrayOffset().
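The stream-wrapping pattern above can be sketched with only JDK classes. This is a minimal, self-contained illustration using Java's built-in ObjectOutputStream as a stand-in for whatever serializer you use; with Netty you would substitute ChannelBufferOutputStream over a dynamic buffer for the ByteArrayOutputStream, and ChannelBufferInputStream for the ByteArrayInputStream:

```java
import java.io.*;

public class StreamSerializationSketch {
    // Serialize any Serializable object into a byte array via an OutputStream,
    // mirroring how a serializer would write into a ChannelBufferOutputStream.
    static byte[] serialize(Serializable obj) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(obj);
        }
        return buf.toByteArray();
    }

    // Deserialize from a byte array via an InputStream,
    // mirroring a read from ChannelBufferInputStream.
    static Object deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        byte[] bytes = serialize("hello");
        System.out.println(deserialize(bytes)); // prints "hello"
    }
}
```

The point is that any serializer taking an OutputStream never needs to know what backs the stream, which is exactly why wrapping a ChannelBuffer works.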

Related

Best way to send a JSon object from Servlet

The question can seem simple, but I haven't found a good answer yet. I need to send a JSON structure (built with a library I'm currently developing) from a Servlet to a remote page.
I'm interested in the best way to send the structure.
I mean, in my Servlet, inside the doPost() method, how should I manage the send?
I was thinking about 2 scenarios:
try (PrintWriter out = response.getWriter()) {
    out.print(myJSon.toString()); // <- recursive function that overrides
                                  // toString() and returns the entire JSon
                                  // structure
} (...)
or
try (OutputStream os = response.getOutputStream()) {
    myJSon.write(os, StandardCharsets.UTF_8); // <- function that
    // recursively writes chunks of my JSon structure
    // in a BufferedWriter created inside the root write function,
    // forcing UTF-8 encoding
} (...)
Or something different, if there's a better approach.
Note that the JSON structure contains an array of objects with long text fields (descriptions with more than 1000 characters), so it can be quite memory-consuming.
As for why I'm not using standard JSON libraries: I don't know them, I don't know if I can trust them yet, and I don't know if I'll be able to install them on the production server.
Thanks for your answers.
From your question I see multiple points to address:
How to send your JSon
What JSon library can you use
How to use the library in production
How to send your JSon
From your code, this seems to be an HTTP response rather than a POST to your Servlet, so you need to know how to send a JSON string as an HTTP response's body.
Do you use a framework for your web server, or are you handling everything manually? If you use a framework, it usually does this for you; just pass the JSON string.
If you're doing it manually:
try (PrintWriter pw = response.getWriter()) {
pw.write(myJson.toString());
}
or
try (OutputStream os = response.getOutputStream()) {
os.write(myJson.toString().getBytes(StandardCharsets.UTF_8));
}
Both are valid, see Writer or OutputStream?
Given what you're saying, your JSON's size shouldn't matter; it's just text, so it won't be big enough to cause problems.
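One caveat with the OutputStream variant: String.getBytes() without an argument uses the platform default charset, which can silently corrupt non-ASCII text. A minimal demonstration of why the charset should always be named explicitly (the JSON string here is made up for illustration):

```java
import java.nio.charset.StandardCharsets;

public class CharsetPitfall {
    // Encode text with an explicit charset instead of the platform default.
    static byte[] encode(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String json = "{\"desc\":\"caf\u00e9\"}";
        byte[] utf8 = encode(json);
        // 'é' takes two bytes in UTF-8, so the byte array is one
        // byte longer than the string's character count.
        System.out.println(utf8.length - json.length()); // prints 1
    }
}
```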
What libraries can you use
There are a lot of JSON libraries for Java, mainly:
Jackson
GSon
json-io
Genson
Go for the one you prefer; there is extensive documentation and plenty of resources all over Google.
How to use in production
If you are not sure you will be able to install dependencies on the production server, you can always create an uber-jar (see @Premraj's answer).
Basically, you bundle the dependency in your Jar
Using Gson is a good way to send JSON:
Gson gson = new Gson();
String jsonData = gson.toJson(student);
try (PrintWriter out = response.getWriter()) {
    out.println(jsonData);
}
For details, see json response from servlet in java.

Thrift: Serialize + Deserialize changes object

I have a thrift struct something like this:
struct GeneralContainer {
1: required string identifier;
2: required binary data;
}
The idea is to be able to pass different types of thrift objects on a single "pipe", and still be able to deserialize at the other side correctly.
But serializing a GeneralContainer object, and then deserializing it changes the contents of the data field. I am using the TBinaryProtocol:
TSerializer serializer = new TSerializer(new TBinaryProtocol.Factory());
TDeserializer deserializer = new TDeserializer(new TBinaryProtocol.Factory());
GeneralContainer container = new GeneralContainer();
container.setIdentifier("my-thrift-type");
container.setData(ByteBuffer.wrap(serializer.serialize(myThriftTypeObject)));
byte[] serializedContainer = serializer.serialize(container);
GeneralContainer testContainer = new GeneralContainer();
deserializer.deserialize(testContainer, serializedContainer);
Assert.assertEquals(container, testContainer); // fails
My guess is that some sort of markers are getting messed up when we serialize an object containing a binary field using TBinaryProtocol. Is that correct? If yes, what are my options for the protocol? My goal is to minimize the size of the resulting serialized byte array.
Thanks,
Aman
Tracked it down to a bug in Thrift 0.4 serialization. It works fine in Thrift 0.8.

Java XML: Keeping a copy of a partial XML tree when parsing from a socket

I have a socket feeding into a SAX parser, formatted as an ISO 8859-1 stream. Every so often there is an invalid character, and I get a SAXParseException with the row and column where it happened, so I need to see what the data is at that point (or, more importantly, log it).
Originally the lines that processed the data were:
InputSource is = new InputSource(new InputStreamReader(socket.getInputStream(), "ISO8859_1"));
XMLReader reader = XMLReaderFactory.createXMLReader();
reader.setContentHandler(new ResponseParseHandler(etc, id));
reader.parse(is);
The problem is that I can't get at the data after this event happens, so I changed it to read into a large byte buffer, convert that to a string, and parse the data with a StringReader. Unfortunately the data coming from the socket is spread out in small chunks over a long time, so it starts with the root tag when it first connects, but then there are thousands of separate messages without a closing tag.
Because I am parsing these strings individually as they come in, the first one errors because it doesn't have a closing tag, and the following ones error because they don't have a root tag. This doesn't happen with the socket, as I assume the stream is still open.
Presumably I can feed these strings to another reader / writer but it seems to be getting really complicated just to find out what the block of data was at the time of the error.
Is there something really simple I am missing here?
The last time I had a problem similar to this, I solved it with a SplittingWriter. This was a decorator-style class around two other Writers: when something "wrote" to the SplittingWriter, it simply delegated the write call to both of its underlying Writers.
In your case, you would want something like a SplittingInputStreamReader, which would extend Reader and which you would pass to InputSource instead of the InputStreamReader you are using at the moment.
In its constructor, the SplittingInputStreamReader would take your current InputStreamReader and some other object; let's call it Foo. The read methods on SplittingInputStreamReader would then delegate to the underlying InputStreamReader, push the results of those calls to Foo, and return them to the caller. So your implementation of the int read() method would be something like:
@Override
public int read() throws IOException {
    int r = this.inputStreamReader.read();
    this.foo.submit(r);
    return r;
}
That way, as you read via the SplittingInputStreamReader, you also write to Foo, allowing you to see where reading stopped, assuming you gave Foo a decent interface. In the end, after implementing SplittingInputStreamReader and Foo, your code would look something like this:
Foo streamCapture = new Foo();
SplittingInputStreamReader streamReader = new SplittingInputStreamReader(
new InputStreamReader(socket.getInputStream(), "ISO8859_1"), streamCapture);
InputSource is = new InputSource(streamReader);
XMLReader reader = XMLReaderFactory.createXMLReader();
reader.setContentHandler(new ResponseParseHandler(etc, id));
reader.parse(is);
// After parse, if there was an error, check what is in Foo streamCapture
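The decorator idea above can be sketched with only JDK classes. This is a minimal, hypothetical TeeReader (the class name and the unbounded StringWriter are my own choices; a production version would cap the capture buffer, as the second answer below suggests):

```java
import java.io.*;

public class TeeReader extends Reader {
    // Decorator that copies everything read from the wrapped Reader into a
    // StringWriter, so the raw text is still available after a parse error.
    private final Reader source;
    private final StringWriter capture = new StringWriter();

    public TeeReader(Reader source) {
        this.source = source;
    }

    @Override
    public int read(char[] cbuf, int off, int len) throws IOException {
        int n = source.read(cbuf, off, len);
        if (n > 0) {
            capture.write(cbuf, off, n); // tee the chars as they pass through
        }
        return n;
    }

    @Override
    public void close() throws IOException {
        source.close();
    }

    // Everything read so far, for logging after a SAXParseException.
    public String captured() {
        return capture.toString();
    }

    public static void main(String[] args) throws IOException {
        // Simulate a stream that breaks off mid-document.
        TeeReader tee = new TeeReader(new StringReader("<root><bad/"));
        char[] buf = new char[4];
        while (tee.read(buf, 0, buf.length) != -1) {
            // consume, as a SAX parser would
        }
        System.out.println(tee.captured()); // prints "<root><bad/"
    }
}
```

An XMLReader would consume the TeeReader through InputSource exactly as it consumed the original InputStreamReader; on a SAXParseException you log captured().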
You can provide your own custom InputStreamReader implementation (e.g. MyInputStreamReader) that keeps a reference to the content you need and provides methods to get at the decoded content, or the last 1024 bytes of it (or some other capped amount).
Let the existing InputStreamReader implementation do what it is already doing; just wrap it with some additional logic in a custom class, then pass that when creating the InputSource.

Implementing zLib compression in Flex and Java

I am sending some JSON data from my Flex application to the Java side for business processing. On top of that, I have added some code to compress (zlib) the data on the Flex side, pass it through the request, and decompress it on the Java side.
But at the Java layer, the decompressed data is still not in a readable/usable format.
Putting the code in here for reference.
Flex code for encoding
var bytes:ByteArray = new ByteArray();
bytes.writeObject(JSON.encode(someObj));
bytes.position = 0;
bytes.compress();
variables.encodeJSONStr = bytes;
requester.data = variables;
loader.load(requester);
Java code for decoding
String json = req.getParameter("encodeJSONStr");
byte[] input = json.getBytes();
Inflater decompresser = new Inflater();
decompresser.setInput(input);
byte[] result = new byte[1000];
int resultLength=0;
resultLength = decompresser.inflate(result);
decompresser.end();
String outputString = new String(result, 0, resultLength, "UTF-8");
System.out.println("\n\n resultLength>>>"+resultLength); // O/P comes as Zero
Can someone point out the issue here, or suggest a better approach for compressing data sent from Flex to Java?
Some time ago I wrote a short post about sending compressed data between Flex and Java; maybe it helps: http://cornelcreanga.com/2008/07/actionscript-compressing-strings/
First you should check whether Flex performs the zlib compression properly (by decompressing the data it sends with another tool).
On the Java side you can try InflaterInputStream, which is easier to handle than the lower-level Inflater. I had some issues with the Java native implementation and ended up using JZlib, which offers zlib compression and decompression in pure Java.
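To isolate where the corruption happens, it helps to have a known-good zlib round trip on the Java side to test against. A minimal sketch using only java.util.zip (the sample JSON string is made up for illustration):

```java
import java.io.*;
import java.util.zip.*;

public class ZlibRoundTrip {
    // Compress a UTF-8 string with zlib via DeflaterOutputStream.
    static byte[] compress(String s) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (DeflaterOutputStream out = new DeflaterOutputStream(buf)) {
            out.write(s.getBytes("UTF-8"));
        }
        return buf.toByteArray();
    }

    // Decompress with InflaterInputStream, as suggested above.
    static String decompress(byte[] data) throws IOException {
        InflaterInputStream in = new InflaterInputStream(new ByteArrayInputStream(data));
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[256];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        return new String(buf.toByteArray(), "UTF-8");
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"test\"}";
        System.out.println(decompress(compress(json)).equals(json)); // prints true
    }
}
```

If this round trip works but the Flex-produced bytes don't inflate, the bytes are likely being mangled in transit (e.g. by reading binary data through getParameter as a string) rather than by the compression itself.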

How to change endianness when unmarshalling a CDR stream containing valuetype objects in Java

I've got marshaled CDR data all by itself in the form of a file (i.e., not packed in a GIOP message) which I need to unmarshal and display on screen. I know what type the data is, and I have working code that does this successfully:
ValueFactory myFactory = (ValueFactory)myConstructor.newInstance( objParam );
StreamableValue myObject = myFactory.init();
myObject._read( myCDRInputStream );
where init() calls the constructor of myObjectImpl(), and _read is the org.omg.CORBA.portable.Streamable _read(InputStream) method.
This works as long as the marshaled data has the same endianness as the computer running my reader program, but I need to handle cases where the endianness of the data differs from that of the reader's machine. I know that endianness is indicated in GIOP messages, which I don't have. Assuming I figure out that I need to change the endianness, how can I tell the stream reader?
Thanks!
If you can get access to the underlying ByteBuffer of your input stream, you can set the endianness there. For example, I use this to open MATLAB files myself:
File file = new File("swiss_roll_data.matlab5");
FileChannel channel = new FileInputStream(file).getChannel();
ByteBuffer scan = channel.map(MapMode.READ_ONLY,0,channel.size());
scan.order(ByteOrder.BIG_ENDIAN);
However, I don't know whether your CORBA framework is happy to read from a ByteBuffer (CORBA is so '90s), so maybe that won't work for you.
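The effect of ByteOrder on decoding can be seen in a few lines: the same four bytes produce different ints depending on the order set on the buffer, which is exactly what goes wrong when CDR data is read with the wrong endianness.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndiannessDemo {
    // Decode four bytes as an int under the given byte order.
    static int readInt(byte[] bytes, ByteOrder order) {
        return ByteBuffer.wrap(bytes).order(order).getInt();
    }

    public static void main(String[] args) {
        byte[] bytes = {0x00, 0x00, 0x00, 0x01};
        System.out.println(readInt(bytes, ByteOrder.BIG_ENDIAN));    // prints 1
        System.out.println(readInt(bytes, ByteOrder.LITTLE_ENDIAN)); // prints 16777216
    }
}
```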