Is there some subtle reason why java.nio.ByteBuffer does not implement java.io.DataOutput or java.io.DataInput, or did the authors just not choose to do this? It would seem straightforward to map the calls (e.g. putInt() -> writeInt()).
The basic problem I (and apparently some others) have is older classes that know how to serialize/deserialize themselves using the generic interfaces DataInput/DataOutput. I would like to reuse my custom serialization without writing a custom proxy for ByteBuffer.
Just bridge to a ByteArrayInputStream or ByteArrayOutputStream, using the put() or wrap() methods to move the bytes across. The problem with having a ByteBuffer directly emulate a DataInput/DataOutput stream has to do with not knowing the sizes in advance. What if there's an overrun?
What is needed is a ByteBufferOutputStream in which you can wrap / expose the required behaviors. Examples of this exist; the Apache Avro serialization scheme has such a thing. It's not too hard to roll your own. Why is there not one by default? Well, it's not a perfect world...
ByteArrayOutputStream backing = new ByteArrayOutputStream();
DataOutput foo = new DataOutputStream(backing);
// do your serialization out to foo
foo.close();
ByteBuffer buffer = ByteBuffer.wrap(backing.toByteArray());
// now you've got a bytebuffer...
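To cover the DataInput side of the question as well, a minimal read-back sketch (assuming the buffer was created with wrap() or allocate(), so buffer.array() is accessible; needs java.io.DataInput, DataInputStream and ByteArrayInputStream):
// Expose the buffer's remaining bytes to existing DataInput-based code.
DataInput in = new DataInputStream(
        new ByteArrayInputStream(buffer.array(), buffer.position(), buffer.remaining()));
// reuse the custom deserialization: in.readInt(), in.readUTF(), ...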
A better way that works with direct buffers too:
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;

class ByteBufferOutputStream extends OutputStream
{
    private final ByteBuffer buffer;

    public ByteBufferOutputStream(ByteBuffer buffer)
    {
        this.buffer = buffer;
    }

    @Override
    public void write(int b) throws IOException
    {
        buffer.put((byte) b);   // throws BufferOverflowException if the buffer is full
    }
}
Note that this requires calling buffer.flip() after you are done writing to it, before you can read from it.
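A minimal usage sketch, assuming the ByteBufferOutputStream class above plus java.io.DataOutput/DataOutputStream:
ByteBuffer buffer = ByteBuffer.allocate(64);
DataOutput out = new DataOutputStream(new ByteBufferOutputStream(buffer));
out.writeInt(42);              // each byte lands in the buffer via put()
out.writeUTF("hello");
buffer.flip();                 // switch the buffer from writing to reading
int value = buffer.getInt();   // 42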
How can I get the bytes of an OutputStream, or how can I convert an OutputStream to a byte array?
From a theoretical perspective (i.e., irrespective of whether it makes sense in practice as a use case), this is an interesting question that essentially requires the implementation of a method like
public abstract byte[] convert(OutputStream out);
The Java OutputStream class, as its name implies, only supports write() methods for I/O: a write() call takes either an int (representing a single byte) or a byte array, the contents of which it sends to an output (e.g., a file).
For example, the following code saves the bytes already present in the data array, to the output.txt file:
byte[] data = ... // Get some data
try (OutputStream fos = new FileOutputStream("path/to/output.txt")) {
    fos.write(data);
}
In order to get all the data that a given OutputStream will output into a byte array (i.e., into a byte[] object), the class the OutputStream was instantiated from has to keep storing every byte passed to its write() methods and provide a special method, such as toByteArray(), that returns them all on request.
This is exactly what the ByteArrayOutputStream class does, making the convert() method trivial (and unnecessary):
public byte[] convert(ByteArrayOutputStream out) {
return out.toByteArray();
}
For any other type of OutputStream that does not inherently support a similar conversion to a byte[] object, there is no way to make the conversion before the OutputStream has been drained, i.e., before the desired calls to its write() methods have completed.
If that assumption (that the writes have completed) can be made, and if the original OutputStream object can be replaced, then one option is to wrap it inside a delegate class that essentially "grabs" the bytes supplied via its write() methods. For example:
import java.io.ByteArrayOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class DrainableOutputStream extends FilterOutputStream {
    private final ByteArrayOutputStream buffer;

    public DrainableOutputStream(OutputStream out) {
        super(out);
        this.buffer = new ByteArrayOutputStream();
    }

    @Override
    public void write(byte[] b) throws IOException {
        this.buffer.write(b);
        // Write to the wrapped stream directly (this.out) rather than via super,
        // so FilterOutputStream does not re-dispatch into these overridden methods
        // and buffer the same bytes more than once.
        this.out.write(b);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        this.buffer.write(b, off, len);
        this.out.write(b, off, len);
    }

    @Override
    public void write(int b) throws IOException {
        this.buffer.write(b);
        this.out.write(b);
    }

    public byte[] toByteArray() {
        return this.buffer.toByteArray();
    }
}
The calls to the write() methods of the internal "buffer" (ByteArrayOutputStream) precede the calls to the original stream, which is accessed via this.out (the corresponding field of FilterOutputStream is protected). Writing to this.out directly, rather than through super, keeps FilterOutputStream from re-dispatching back into the overridden methods and buffering the same bytes more than once. This ordering also makes sure that the bytes are buffered even if an exception is thrown while writing to the original stream.
To reduce the overhead, the write-through calls to this.out in the above class can be omitted - e.g., if only the "conversion" to a byte array is desired. Even the ByteArrayOutputStream or OutputStream classes can be used as parent classes, with a bit more work and some assumptions (e.g., about the reset() method).
In any case, enough memory has to be available for the draining to take place and for the toByteArray() method to work.
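A usage sketch, just to illustrate (the file name and payload are made up):
DrainableOutputStream out = new DrainableOutputStream(new FileOutputStream("path/to/output.txt"));
out.write("some data".getBytes("UTF-8"));
byte[] copy = out.toByteArray();   // everything written so far, regardless of the target stream
out.close();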
For @Obicere's comment, an example:
ByteArrayOutputStream btOs = new ByteArrayOutputStream();
btOs.write("test bytes".getBytes());
String restoredString = new String(btOs.toByteArray());
System.out.println(restoredString);
I'm working with some code that I have no control over, and the ByteBuffer I'm working with gets passed to a native method. I don't have access to the native code, but it expects "buf" to be a ByteBuffer. Also note that the code below doesn't really make complete sense as shown; there is a lot of it, so I'm distilling it down to the issue.
public class otherClass {
    public final void setParams(Bundle params) {
        final String key = params.keySet().iterator().next();
        Object buf = params.get(key);
        nativeSet(key, buf);
    }

    private native final void nativeSet(String key, Object buf);
}
Here is my code:
public void myMethod(ByteBuffer myBuffer) {
final Bundle myBundle = new Bundle();
myBundle.putByteBuffer("param", myBuffer);
otherClass.setParams(myBundle);
}
The problem? There is no putByteBuffer method in Bundle. Seems kind of weird that there is a get() that returns an object, but no generic put().
But what seems weirder to me is that the native code wants a ByteBuffer. When it gets passed from Java layer, won't it have a bunch of metadata with it? Can code in the native layer predict the metadata and extract from the ByteBuffer?
Is there any way to reliably pass a ByteBuffer here? It can be a little hacky. I was thinking maybe I could figure out what the ByteBuffer object would be in bits, convert to integer, and use putInt(). Not sure how to go from ByteBuffer object to raw data.
Hypothetically this should work (note that round-tripping arbitrary binary data through a String is only lossless if the bytes really are valid UTF-8 text). Pull the bytes out of the buffer, turn them into a String, and pass that into your bundle like this:
byte[] bytes = new byte[myBuffer.remaining()];
myBuffer.get(bytes);
String byteString = new String(bytes, Charset.forName("UTF-8"));
myBundle.putString("param", byteString);
then reconstruct the ByteBuffer from the string:
byte[] byteArray = byteString.getBytes(Charset.forName("UTF-8"));
ByteBuffer byteBuffer = ByteBuffer.allocate(byteArray.length + 8);
byteBuffer.put(byteArray);
When writing to a file using an OutputStream, what is the difference between using writeInt():
public static void makeFile(String name) throws Exception{
try (
OutputStream ostr = new FileOutputStream(name); ) {
//Uses writeInt() method
ostr.writeInt(1);
ostr.close();
}
}
and using write():
public static void makeFile(String name) throws Exception{
try (
OutputStream ostr = new FileOutputStream(name); ) {
// Uses the write() method with an int as input
ostr.write(1);
ostr.close();
}
}
What do both methods mean?
writeInt is not a member of OutputStream, so it won't compile. Assuming you use DataOutputStream or similar, it will write the four bytes of the 32-bit integer in big-endian order. write will just write a single byte (the least significant byte of the int).
Arguably it wasn't a great idea to mix these two different ideas in the same interface. DataOutputStream should not have extended OutputStream, but too late to fix that now.
writeInt(int) comes from the DataOutput interface. (ObjectOutputStream implements the ObjectOutput interface, and ObjectOutput extends DataOutput.) As you can see from the JavaDoc documentation for DataOutput's writeInt method, it writes four bytes in big-endian order to the underlying stream.
write(int) comes from the OutputStream class, which ObjectOutputStream extends. This method writes only the low-order byte of the int argument (the "rightmost" eight bits). Again, you can see this in the JavaDoc documentation.
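A small sketch of the difference in output (file names are just placeholders):
try (DataOutputStream dataOut = new DataOutputStream(new FileOutputStream("writeInt.bin"))) {
    dataOut.writeInt(1);   // file contains 4 bytes: 00 00 00 01 (big-endian)
}
try (OutputStream rawOut = new FileOutputStream("write.bin")) {
    rawOut.write(1);       // file contains 1 byte: 01
}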
I'm trying to use javax.crypto.Cipher.doFinal(byte[]) method to encrypt an object. But, for security reasons, the object cannot be serializable.
So, how to convert the object to byte array without serialization?
--update
Is using serialization the only way to use this Cipher method? As far as I know, important data should not be serializable.
I used com.fasterxml.jackson.databind.ObjectMapper.
private static byte[] serialize(Object obj) throws IOException {
ByteArrayOutputStream os = new ByteArrayOutputStream();
ObjectMapper mapper = new ObjectMapper();
mapper.enable(SerializationFeature.INDENT_OUTPUT);
mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
mapper.writeValue(os, obj);
return os.toByteArray();
}
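The reverse direction would be a matching deserialize along these lines (a sketch; the target type is whatever class was serialized):
private static <T> T deserialize(byte[] data, Class<T> type) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    // Rebuild the object from the JSON bytes produced by serialize() above.
    return mapper.readValue(data, type);
}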
You just serialize each of its components. Recurse. Eventually you end up with native types that you can serialize.
If you implement this by implementing java's serialization methods, java will ensure that you do not serialize any object twice and will take care of references for you.
In short, make the object serializable.
Solved:
Instead of using a getByteArray() method and calling Cipher.doFinal() outside the class, I'll call Cipher.doFinal() inside the class, in a getEncryptedByteArray() method. That way I serialize the data inside the class without making the class itself serializable, and the returned result is encrypted.
Any objections to this approach will be considered.. :)
Here is a simple example of serializing a class to a byte array.
public class Foo {
    private boolean isHappy;
    private short happyCount;
    private Bar bar;

    public byte[] serializeData() throws IOException {
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(stream);
        out.writeBoolean(isHappy);
        out.writeShort(happyCount);
        // Serialize bar, which will just append to this byte stream
        bar.doSerializeData(out);
        // Return the serialized object.
        byte[] data = stream.toByteArray();
        // Clean up.
        out.close();
        return data;
    }
}
Of course, a lot of the details in your case depend on your class structure but hopefully this gets you pointed in the right direction.
To deserialize you just need to reverse the above.
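For instance, a rough sketch of the reverse, assuming Bar exposes a matching doDeserializeData(DataInputStream) counterpart (that method name is made up here):
public void deserializeData(byte[] data) throws IOException {
    DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
    // Read the fields back in exactly the order they were written.
    isHappy = in.readBoolean();
    happyCount = in.readShort();
    bar.doDeserializeData(in);   // assumed counterpart to doSerializeData(out)
    in.close();
}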
java.beans.XMLEncoder/Decoder.
I'm writing a network app, which sends and receives a lot of different kinds of binary packets, and I'm trying to make adding new kinds of packets to my app as easy as possible.
For now, I created a Packet class, and I create subclasses of it for each different kind of packet. However, it isn't as clean as it seems; I've ended up with code like this:
static class ItemDesc extends Packet {
public final int item_id;
public final int desc_type;
public final String filename;
public final String buf;
public ItemDesc(Type t, int item_id, int desc_type, String filename, String buf) {
super(t); // sets type for use in packet header
this.item_id = item_id;
this.desc_type = desc_type;
this.filename = filename;
this.buf = buf;
}
public ItemDesc(InputStream i) throws IOException {
super(i); // reads packet header and sets this.input
item_id = input.readInt();
desc_type = input.readByte();
filename = input.readStringWithLength();
buf = input.readStringWithLength();
}
public void writeTo(OutputStream o) throws IOException {
MyOutputStream dataOutput = new MyOutputStream();
dataOutput.writeInt(item_id);
dataOutput.writeByte(desc_type);
dataOutput.writeStringWithLength(filename);
dataOutput.writeStringWithLength(buf);
super.write(dataOutput.toByteArray(), o);
}
}
What bothers me about this approach is the code repetition - I'm repeating the packet structure four times. I'd be glad to avoid this, but I can't see a reasonable way to simplify it.
If I was writing in Python I would create a dictionary of all possible field types, and then define new packet types like this:
ItemDesc = [('item_id', 'int'), ('desc_type', 'byte'), ...]
I suppose that I could do something similar in any functional language. However, I can't see a way to take this approach to Java.
(Maybe I'm just being too pedantic, or I've gotten used to functional programming and writing code that writes code, where I could avoid any repetition :))
Thank you in advance for any suggestions.
I agree with @silky that your current code is a good solution. A bit of repetitious (though not duplicated) code is not a bad thing, IMO.
If you wanted a more python-like solution, you could:
1. Replace the member attributes of ItemDesc with some kind of order-preserving map structure, and do the serialization with a common writeTo method that iterates over the map. You would also need to add getters for each attribute and replace all uses of the existing fields.
2. Replace the member attributes with a Properties object and use Properties serialization instead of binary writes.
3. Write a common writeTo method that uses Java reflection to access the member attributes and their types and serialize them (a rough sketch follows below).
But in all 3 cases, the code will be slower, more complicated and potentially more fragile than the current "ugly" code. I wouldn't do this.
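For completeness, a rough sketch of option 3 (reflection-based, using java.lang.reflect.Field). Note that getDeclaredFields() does not guarantee any particular field order, so a real implementation would have to fix the order explicitly; treat this as illustrative only:
public void writeTo(OutputStream o) throws IOException {
    DataOutputStream out = new DataOutputStream(o);
    try {
        for (Field f : getClass().getDeclaredFields()) {
            f.setAccessible(true);
            Class<?> type = f.getType();
            if (type == int.class) {
                out.writeInt(f.getInt(this));
            } else if (type == byte.class) {
                out.writeByte(f.getByte(this));
            } else if (type == String.class) {
                out.writeUTF((String) f.get(this));
            } // ... handle whatever other field types the packets actually use
        }
    } catch (IllegalAccessException e) {
        throw new IOException(e);
    }
    out.flush();
}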
Seems okay to me. You may just want to abstract some of the 'general' parts of the packet up the inheritance chain, so you don't need to read them, but it makes sense to be repeating the format like you are, because you've got a case for constructing from raw values, reading from a stream, and writing. I see nothing wrong with it.
I am not sure you can do this in Java, but maybe you could reuse one of the constructors:
public ItemDesc(InputStream i) throws IOException {
super(i); // reads packet header and sets this.input
this(input.readInt(), input.readByte(), input.readStringWithLength(), input.readStringWithLength());
}
Where 'this' means a call to this class's constructor, whatever the syntax might be. (Note that Java requires a this(...) call to be the first statement in a constructor, so this exact form would not compile as written.)