When sending a file, you can do ctx.writeAndFlush(new ChunkedFile(new File("file.png")));.
How about a List<Object>?
The list contains a String and the bytes of an image.
From the documentation there's ChunkedInput, but I can't figure out how to use it.
UPDATE
Let's say that in my handler, inside the channelRead0(ChannelHandlerContext ctx, Object o) method where I want to send the List<Object>, I've done the following:
@Override
protected void channelRead0(ChannelHandlerContext ctx, Object o) throws Exception {
List<Object> msg = new ArrayList<>();
/**getting the bytes of image**/
byte[] imageInByte;
BufferedImage originalImage = ImageIO.read(new File(fileName));
// convert BufferedImage to byte array
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(originalImage, "png", baos);
baos.flush();
imageInByte = baos.toByteArray();
baos.close();
msg.clear();
msg.add(0, "String"); //add the String into List
msg.add(1, imageInByte); //add the bytes of images into list
/**Chunk the List<Object> and Send it just like the chunked file**/
ctx.writeAndFlush(new ChunkedInput(DONT_KNOW_WHAT_TO_DO_HERE)); //
}
Just implement your own ChunkedInput<ByteBuf>. Following the implementations shipped with Netty, you can implement it as follows:
public class ChunkedList implements ChunkedInput<ByteBuf> {
private static final byte[] EMPTY = new byte[0];
private byte[] previousPart = EMPTY;
private final int chunkSize;
private final Iterator<Object> iterator;
public ChunkedList(int chunkSize, List<Object> objs) {
//chunk size in bytes
this.chunkSize = chunkSize;
this.iterator = objs.iterator();
}
public ByteBuf readChunk(ChannelHandlerContext ctx) {
return readChunk(ctx.alloc());
}
public ByteBuf readChunk(ByteBufAllocator allocator) {
if (isEndOfInput())
return null;
else {
ByteBuf buf = allocator.buffer(chunkSize);
boolean release = true;
try {
int bytesRead = 0;
if (previousPart.length > 0) {
if (previousPart.length > chunkSize) {
throw new IllegalStateException();
}
bytesRead += previousPart.length;
buf.writeBytes(previousPart);
}
boolean done = false;
while (!done) {
if (!iterator.hasNext()) {
done = true;
previousPart = EMPTY;
} else {
Object o = iterator.next();
//depending on the encoding
byte[] bytes = o instanceof String ? ((String) o).getBytes() : (byte[]) o;
bytesRead += bytes.length;
if (bytesRead > chunkSize) {
done = true;
previousPart = bytes;
} else {
buf.writeBytes(bytes);
}
}
}
release = false;
} finally {
if (release)
buf.release();
}
return buf;
}
}
public long length() {
return -1;
}
public boolean isEndOfInput() {
return !iterator.hasNext() && previousPart.length == 0;
}
public long progress() {
return 0;
}
public void close(){
//close
}
}
In order to write chunked content, there is a special handler shipped with Netty: io.netty.handler.stream.ChunkedWriteHandler. Just add it to your pipeline. Here is a quote from the documentation:
A ChannelHandler that adds support for writing a large data stream
asynchronously neither spending a lot of memory nor getting
OutOfMemoryError. Large data streaming such as file transfer requires
complicated state management in a ChannelHandler implementation.
ChunkedWriteHandler manages such complicated states so that you can
send a large data stream without difficulties.
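For completeness, here is a minimal sketch of how the pieces could be wired together; the handler names and the 8 KiB chunk size are illustrative, not from the original post:
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.stream.ChunkedWriteHandler;
import java.util.ArrayList;
import java.util.List;

// Sketch: ChunkedWriteHandler must sit in the pipeline before the handler
// that writes the ChunkedList, so it can pull chunks and write them out.
public class ChunkedListInitializer extends ChannelInitializer<SocketChannel> {
    @Override
    protected void initChannel(SocketChannel ch) {
        ChannelPipeline p = ch.pipeline();
        p.addLast("chunkedWriter", new ChunkedWriteHandler());
        p.addLast("myHandler", new SimpleChannelInboundHandler<Object>() {
            @Override
            protected void channelRead0(ChannelHandlerContext ctx, Object o) {
                List<Object> msg = new ArrayList<Object>();
                msg.add("String");            // the text part
                msg.add(new byte[]{1, 2, 3}); // the image bytes would go here
                ctx.writeAndFlush(new ChunkedList(8192, msg)); // 8 KiB chunks
            }
        });
    }
}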
Related
I have to transfer a file using the SCTP protocol. I have written the code in Java, but it does not work when I use a 4G hotspot network. I came across this RFC, which describes UDP encapsulation of SCTP. I want to know whether there is an implementation I can use to encapsulate SCTP packets in UDP and send them over a UDP channel, so that they can traverse heavily NATted networks. My current code for sending the data packets is as follows:
import java.io.*;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.util.*;
import com.sun.nio.sctp.MessageInfo;
import com.sun.nio.sctp.SctpChannel;
import com.sun.nio.sctp.SctpServerChannel;
public class Main {
SctpChannel connectionChannelPrimary;
SctpChannel connectionChannelSecondary;
InetSocketAddress serverSocketAddressPrimary;
InetSocketAddress serverSocketAddressSecondary;
String directoryPath;
public Main() {
serverSocketAddressPrimary = new InetSocketAddress(6002);
serverSocketAddressSecondary = new InetSocketAddress(6003);
}
public void setDirectoryPath(String directoryPath) {
this.directoryPath = directoryPath;
}
public String getDirectoryPath() {
return directoryPath;
}
public void establishConnection(int connId) throws IOException {
SctpServerChannel sctpServerChannel = SctpServerChannel.open();
if (connId == 0) {
sctpServerChannel.bind(serverSocketAddressPrimary);
connectionChannelPrimary = sctpServerChannel.accept();
System.out.println("connection established for primary");
} else {
sctpServerChannel.bind(serverSocketAddressSecondary);
connectionChannelSecondary = sctpServerChannel.accept();
System.out.println("connection established for helper");
}
}
ArrayList<String> getAllFiles() {
File directory = new File(this.directoryPath);
ArrayList<String> fileNames = new ArrayList<>();
for (File fileEntry : Objects.requireNonNull(directory.listFiles())) {
if (fileEntry.isFile()) {
fileNames.add(fileEntry.getName());
}
}
Collections.sort(fileNames);
return fileNames;
}
public byte[] readFile(String filename) throws IOException {
String extraString = "\n\n\n\nNRL\n\n\n";
File file = new File(filename);
FileInputStream fl = new FileInputStream(file);
ByteBuffer finalBuffer = ByteBuffer.allocate((int) (file.length() + extraString.length()));
byte[] arr = new byte[(int) file.length()];
int res = fl.read(arr);
if (res < 0) {
System.out.println("Error in reading file");
fl.close();
return null;
}
fl.close();
finalBuffer.put(arr);
finalBuffer.put(extraString.getBytes());
byte[] tmp = new byte[extraString.length()];
finalBuffer.position((int) (file.length() - 1));
finalBuffer.get(tmp, 0, tmp.length);
return finalBuffer.array();
}
public void sendBytes(String filename, int connId) throws IOException {
byte[] message = readFile(filename);
assert message != null;
System.out.println(message.length);
int tmp = 0;
int cntIndex = 60000;
int prevIndex = 0;
boolean isBreak = false;
while (!isBreak) {
byte[] slice;
if (prevIndex + 60000 >= message.length) {
slice = Arrays.copyOfRange(message, prevIndex, message.length);
isBreak = true;
} else {
slice = Arrays.copyOfRange(message, prevIndex, cntIndex);
prevIndex = cntIndex;
cntIndex = cntIndex + 60000;
}
final ByteBuffer byteBuffer = ByteBuffer.allocate(64000);
final MessageInfo messageInfo = MessageInfo.createOutgoing(null, 0);
byteBuffer.put(slice);
byteBuffer.flip();
tmp += slice.length;
try {
if (connId == 0) connectionChannelPrimary.send(byteBuffer, messageInfo);
else connectionChannelSecondary.send(byteBuffer, messageInfo);
} catch (Exception e) {
e.printStackTrace();
}
}
System.out.println(tmp);
}
public static void main(String[] args) throws IOException {
String bgFilePath = "/home/iiitd/Desktop/background/";
String fgFilePath = "/home/iiitd/Desktop/foreground/";
Main myObj = new Main();
myObj.setDirectoryPath("/home/iiitd/Desktop/tmp/");
myObj.establishConnection(1);
myObj.establishConnection(0);
ArrayList<String> files = myObj.getAllFiles();
for (String tmpFile : files) {
String cntFilePath = myObj.getDirectoryPath() + tmpFile;
myObj.sendBytes(cntFilePath,0);
}
}
}
RFC Link: https://datatracker.ietf.org/doc/html/draft-ietf-tsvwg-sctp-udp-encaps-09
In C, I think usrsctp is a popular implementation of SCTP over UDP. If I understand correctly, it was used by Google Chrome at some point (though I see they have mentioned moving to "dcsctp"). I have also seen it in a mirror of the Firefox sources in 2016; I'm not sure what the state is today.
So one solution would be to wrap usrsctp with JNI, and it appears that this is exactly what jitsi-sctp does. I haven't used it, but I would have a look.
I'm using Android's HttpURLConnection to download data over the network. Here is my code:
HttpURLConnection conn = null;
try {
conn = (HttpURLConnection) new URL("http://XXXXXXXXXX").openConnection();
InputStream inputStream = conn.getInputStream();
Log.d("Reflection",inputStream.getClass().getCanonicalName());
int TOTAL_LEN = conn.getContentLength();
byte[] buf = new byte[1024 * 8];
int len, total = 0;
// read until EOF; note that -1 must not be added to the running total
while ((len = inputStream.read(buf)) != -1) {
total += len;
Log.d("Download Process:", total * 1.0 / TOTAL_LEN * 100 + "");
}
}catch (Exception e) {
e.printStackTrace();
}
Since InputStream is an abstract class, I used reflection to find its concrete implementation: inputStream.getClass().getCanonicalName(). But the result is null.
So I tried inputStream.getClass(), and the result is class com.android.okio.RealBufferedSource$1.
Why? How can I figure out the real implementation of the InputStream here?
PROBLEM SOLVED
The answer is an anonymous class. I went through the Android source code and found it. The location is /external/okhttp/okio/src/main/java/okio/RealBufferedSource.java, and the code is below:
public InputStream inputStream() {
return new InputStream() {
@Override
public int read() throws IOException {
if (closed) throw new IOException("closed");
if (buffer.size == 0) {
long count = source.read(buffer, Segment.SIZE);
if (count == -1) return -1;
}
return buffer.readByte() & 0xff;
}
@Override
public int read(byte[] data, int offset, int byteCount) throws IOException {
if (closed) throw new IOException("closed");
checkOffsetAndCount(data.length, offset, byteCount);
if (buffer.size == 0) {
long count = source.read(buffer, Segment.SIZE);
if (count == -1) return -1;
}
return buffer.read(data, offset, byteCount);
}
@Override
public int available() throws IOException {
if (closed) throw new IOException("closed");
return (int) Math.min(buffer.size, Integer.MAX_VALUE);
}
@Override
public void close() throws IOException {
RealBufferedSource.this.close();
}
@Override
public String toString() {
return RealBufferedSource.this + ".inputStream()";
}
};
}
Notice that this method returns a new InputStream(), which is an anonymous class.
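A quick standalone sketch (not from the Android sources) that shows the same behavior:
// Sketch: an anonymous class has no canonical name, so getCanonicalName() returns null.
public class CanonicalNameDemo {
    public static void main(String[] args) {
        Runnable anon = new Runnable() {
            @Override
            public void run() {
            }
        };
        System.out.println(anon.getClass());                    // class CanonicalNameDemo$1
        System.out.println(anon.getClass().getCanonicalName()); // null
        System.out.println(anon.getClass().getName());          // CanonicalNameDemo$1
    }
}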
Is there any way to create an InputStream wrapping a list of UTF-8 Strings? I'd like to do something like:
InputStream in = new XyzInputStream( List<String> lines )
You can create your source byte[] array using a ByteArrayOutputStream and then read from it using a ByteArrayInputStream.
So create the array as follows:
List<String> source = new ArrayList<String>();
source.add("one");
source.add("two");
source.add("three");
ByteArrayOutputStream baos = new ByteArrayOutputStream();
for (String line : source) {
baos.write(line.getBytes());
}
byte[] bytes = baos.toByteArray();
And reading from it is as simple as:
InputStream in = new ByteArrayInputStream(bytes);
Alternatively, depending on what you're trying to do, a StringReader might be better.
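For example, a minimal StringReader sketch (joining with a newline is just an assumption about the desired format):
// Sketch: read the lines as characters rather than bytes.
List<String> lines = Arrays.asList("one", "two", "three");
Reader reader = new StringReader(String.join("\n", lines));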
You can concatenate all the lines together to create a String then convert it to a byte array using String#getBytes and pass it into ByteArrayInputStream. However this is not the most efficient way of doing it.
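A sketch of that approach (UTF-8 assumed):
// Sketch: join the strings, then wrap the resulting bytes.
List<String> lines = Arrays.asList("one", "two", "three");
InputStream in = new ByteArrayInputStream(
        String.join("", lines).getBytes(StandardCharsets.UTF_8));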
In short, no, there is no way of doing this using existing JDK classes. You could, however, implement your own InputStream that reads from a List of Strings.
EDIT: Dave Web has an answer above, which I think is the way to go. If you need a reusable class, then something like this might do:
public class StringsInputStream<T extends Iterable<String>> extends InputStream {
private ByteArrayInputStream bais = null;
public StringsInputStream(final T strings) throws IOException {
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
for (String line : strings) {
outputStream.write(line.getBytes());
}
bais = new ByteArrayInputStream(outputStream.toByteArray());
}
@Override
public int read() throws IOException {
return bais.read();
}
@Override
public int read(byte[] b) throws IOException {
return bais.read(b);
}
@Override
public int read(byte[] b, int off, int len) throws IOException {
return bais.read(b, off, len);
}
@Override
public long skip(long n) throws IOException {
return bais.skip(n);
}
@Override
public int available() throws IOException {
return bais.available();
}
@Override
public void close() throws IOException {
bais.close();
}
@Override
public synchronized void mark(int readlimit) {
bais.mark(readlimit);
}
@Override
public synchronized void reset() throws IOException {
bais.reset();
}
@Override
public boolean markSupported() {
return bais.markSupported();
}
public static void main(String[] args) throws Exception {
List<String> source = new ArrayList<String>();
source.add("foo ");
source.add("bar ");
source.add("baz");
StringsInputStream<List<String>> in = new StringsInputStream<List<String>>(source);
int read = in.read();
while (read != -1) {
System.out.print((char) read);
read = in.read();
}
}
}
This is basically an adapter for ByteArrayInputStream.
You can create some kind of IterableInputStream:
public class IterableInputStream<T> extends InputStream {
public static final int EOF = -1;
private static final InputStream EOF_IS = new InputStream() {
@Override public int read() throws IOException {
return EOF;
}
};
private final Iterator<T> iterator;
private final Function<T, byte[]> mapper;
private InputStream current;
public IterableInputStream(Iterable<T> iterable, Function<T, byte[]> mapper) {
this.iterator = iterable.iterator();
this.mapper = mapper;
next();
}
@Override
public int read() throws IOException {
int n = current.read();
while (n == EOF && current != EOF_IS) {
next();
n = current.read();
}
return n;
}
private void next() {
current = iterator.hasNext()
? new ByteArrayInputStream(mapper.apply(iterator.next()))
: EOF_IS;
}
}
To use it
public static void main(String[] args) throws IOException {
Iterable<String> strings = Arrays.asList("1", "22", "333", "4444");
try (InputStream is = new IterableInputStream<String>(strings, String::getBytes)) {
for (int b = is.read(); b != -1; b = is.read()) {
System.out.print((char) b);
}
}
}
In my case I had to convert a list of strings into the equivalent of a file (with a line feed after each line).
This was my solution:
List<String> inputList = Arrays.asList("line1", "line2", "line3");
byte[] bytes = inputList.stream().collect(Collectors.joining("\n", "", "\n")).getBytes();
InputStream inputStream = new ByteArrayInputStream(bytes);
You can do something similar to this:
https://commons.apache.org/sandbox/flatfile/xref/org/apache/commons/flatfile/util/ConcatenatedInputStream.html
It just implements the read() method of InputStream and has a list of InputStreams it is concatenating. Once it reads an EOF it starts reading from the next InputStream. Just convert the Strings to ByteArrayInputStreams.
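For illustration, the JDK's SequenceInputStream already does this kind of concatenation, so a sketch along those lines could look like this (UTF-8 assumed):
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.SequenceInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Sketch: turn each string into a ByteArrayInputStream and concatenate them.
public class ConcatenatedStringsExample {
    public static void main(String[] args) throws IOException {
        List<String> lines = Arrays.asList("one", "two", "three");
        List<InputStream> parts = new ArrayList<InputStream>();
        for (String line : lines) {
            parts.add(new ByteArrayInputStream(line.getBytes(StandardCharsets.UTF_8)));
        }
        try (InputStream in = new SequenceInputStream(Collections.enumeration(parts))) {
            for (int b = in.read(); b != -1; b = in.read()) {
                System.out.print((char) b);
            }
        }
    }
}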
You can also do it this way: create a serializable List.
List<String> quarks = Arrays.asList(
"up", "down", "strange", "charm", "top", "bottom"
);
//serialize the List
//note the use of abstract base class references
try{
//use buffering
OutputStream file = new FileOutputStream( "quarks.ser" );
OutputStream buffer = new BufferedOutputStream( file );
ObjectOutput output = new ObjectOutputStream( buffer );
try{
output.writeObject(quarks);
}
finally{
output.close();
}
}
catch(IOException ex){
fLogger.log(Level.SEVERE, "Cannot perform output.", ex);
}
//deserialize the quarks.ser file
//note the use of abstract base class references
try{
//use buffering
InputStream file = new FileInputStream( "quarks.ser" );
InputStream buffer = new BufferedInputStream( file );
ObjectInput input = new ObjectInputStream ( buffer );
try{
//deserialize the List
List<String> recoveredQuarks = (List<String>)input.readObject();
//display its data
for(String quark: recoveredQuarks){
System.out.println("Recovered Quark: " + quark);
}
}
finally{
input.close();
}
}
catch(ClassNotFoundException ex){
fLogger.log(Level.SEVERE, "Cannot perform input. Class not found.", ex);
}
catch(IOException ex){
fLogger.log(Level.SEVERE, "Cannot perform input.", ex);
}
I'd like to propose my simple solution:
public class StringListInputStream extends InputStream {
private final List<String> strings;
private int pos = 0;
private byte[] bytes = null;
private int i = 0;
public StringListInputStream(List<String> strings) {
this.strings = strings;
this.bytes = strings.get(0).getBytes();
}
@Override
public int read() throws IOException {
if (pos >= bytes.length) {
if (!next()) return -1;
else return read();
}
return bytes[pos++] & 0xff; // mask so that byte values >= 0x80 are not returned as negative
}
private boolean next() {
if (i + 1 >= strings.size()) return false;
pos = 0;
bytes = strings.get(++i).getBytes();
return true;
}
}
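A brief usage sketch:
// Sketch: drain the stream and print its contents.
List<String> lines = Arrays.asList("foo ", "bar ", "baz");
try (InputStream in = new StringListInputStream(lines)) {
    for (int b = in.read(); b != -1; b = in.read()) {
        System.out.print((char) b);
    }
}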
I am trying to figure out an object design to implement a large-file (~600 MB) repository in the database using Hibernate.
Please suggest a correct approach/design.
class ModelClass{
String name; //meta data
...
Option 1.
byte[] file; // don't want to load the entire file content into memory
// by using this, but Hibernate recognizes this datatype
Option 2.
InputStream inputStream;
OutputStream outputStream;
// I can have methods to provide the input or output stream,
// but I don't think it's a clean approach. I am not sure how
// I would work with Hibernate using streams
Option 3.
File fileHandle;
}
Any other options?
I would like to call the save(Object) method of HibernateTemplate to save the object in the database. I don't know whether I should keep just the metadata in the class and handle the file save and retrieve separately.
Thanks in advance.
Another workable solution is to use Hibernate's Work interface. The purpose is to avoid loading the file content into memory.
session.doWork(new Work(){
@Override
public void execute(Connection conn) {
//direct sql queries go here
}
});
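For example, a hedged sketch of what could go inside execute(), streaming the file into a BLOB column with plain JDBC; the table and column names here are made up for illustration:
// Sketch: stream the file into a BLOB column without loading it into memory.
// "file_store", "name" and "content" are illustrative names only.
session.doWork(new Work() {
    @Override
    public void execute(Connection conn) throws SQLException {
        File file = new File("/path/to/large.file"); // illustrative path
        try (PreparedStatement ps = conn.prepareStatement(
                "insert into file_store (name, content) values (?, ?)");
             InputStream in = new FileInputStream(file)) {
            ps.setString(1, file.getName());
            ps.setBinaryStream(2, in, file.length()); // the driver streams the content
            ps.executeUpdate();
        } catch (IOException e) {
            throw new SQLException(e);
        }
    }
});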
I have written a SerializableFile class that keeps its data in a file. When the object is deserialized, it creates a temporary file.
Here it is:
public class SerializableFile implements Serializable {
private static final File TEMP_DIR = getTempDir();
private transient boolean temporary;
private transient String name;
private transient File file;
public SerializableFile() {
}
public SerializableFile(File file) {
this.file = file;
this.name = file.getName();
this.temporary = false;
}
@Override
protected void finalize() throws Throwable {
dispose();
super.finalize();
}
public void dispose() {
if (temporary && file != null) {
file.delete();
file = null;
}
}
public File keep(String name) throws IOException {
if (temporary) {
temporary = false;
} else {
File newFile = new File(TEMP_DIR, name);
keepAs(newFile);
file = newFile;
}
return file;
}
public void keepAs(File outFile) throws IOException {
if ((temporary || file.equals(outFile)) && file.renameTo(outFile)) {
temporary = false;
file = outFile;
} else {
InputStream in = new FileInputStream(file);
try {
OutputStream out = new FileOutputStream(outFile);
try {
byte buf[] = new byte[4096];
for (int n = in.read(buf); n > 0; n = in.read(buf)) {
out.write(buf, 0, n);
}
} finally {
out.close();
}
} finally {
in.close();
}
outFile.setLastModified(file.lastModified());
}
}
public String getName() {
return name;
}
public File getFile() {
return file;
}
public long lastModified() {
return file.lastModified();
}
private void writeObject(ObjectOutputStream out) throws IOException {
int size = (int)file.length();
long date = file.lastModified();
out.writeUTF(name);
out.writeInt(size);
out.writeLong(date);
InputStream in = new FileInputStream(file);
try {
byte buf[] = new byte[4096];
while (size > 0) {
int n = in.read(buf);
if (n <= 0 || n > size) {
throw new IOException("Unexpected file size");
}
out.write(buf, 0, n);
size -= n;
}
} finally {
in.close();
}
}
private void readObject(ObjectInputStream in) throws IOException {
name = in.readUTF();
int size = in.readInt();
long date = in.readLong();
file = File.createTempFile("tmp", ".tmp", TEMP_DIR);
OutputStream out = new FileOutputStream(file);
try {
byte buf[] = new byte[4096];
while (size > 0) {
int n = in.read(buf, 0, size <= buf.length ? size : buf.length);
if (n <= 0 || n > size) {
throw new IOException("Unexpected file size");
}
out.write(buf, 0, n);
size -= n;
}
} finally {
out.close();
}
file.setLastModified(date);
temporary = true;
}
private static File getTempDir() {
File dir;
String temp = System.getProperty("com.lagalerie.live.temp-dir");
if (temp != null) {
dir = new File(temp);
} else {
String home = System.getProperty("user.home");
dir = new File(home, "temp");
}
if (!dir.isDirectory() && !dir.mkdirs()) {
throw new RuntimeException("Could not create temp dir " + dir);
}
return dir;
}
}
OpenJPA supports a @Persistent annotation with some databases:
MySQL
Oracle
PostgreSQL
SQL Server
DB2
Even if you are using an RDBMS as the data store, you should consider storing the binary data on a file system and saving its directory/path in the database, instead of storing it as a BLOB or CLOB in the database.
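For example, a sketch of the entity under that approach; the class, field names, and example path are assumptions, not from the question:
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

// Sketch: persist only the metadata plus the location; keep the ~600 MB payload on disk.
@Entity
public class StoredFile {
    @Id
    @GeneratedValue
    private Long id;

    private String name;     // metadata
    private String filePath; // e.g. "/data/files/abc123.bin" (illustrative)

    // getters/setters omitted for brevity
}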
In a traditional blocking-thread server, I would do something like this
class ServerSideThread {
ObjectInputStream in;
ObjectOutputStream out;
Engine engine;
public ServerSideThread(Socket socket, Engine engine) {
in = new ObjectInputStream(socket.getInputStream());
out = new ObjectOutputStream(socket.getOutputStream());
this.engine = engine;
}
public void sendMessage(Message m) {
out.writeObject(m);
}
public void run() {
while(true) {
Message m = (Message)in.readObject();
engine.queueMessage(m,this); // give the engine a message with this as a callback
}
}
}
Now, the object can be expected to be quite large. In my NIO loop, I can't simply wait for the object to come through; all my other connections (with much smaller workloads) would be waiting on me.
How can I get notified only once a connection has received the entire object, rather than whenever the NIO channel reports it is readable?
You can write the object to a ByteArrayOutputStream, which lets you send the length before the object itself. On the receiving side, read the required amount of data before attempting to decode it.
However, you are likely to find it much simpler and more efficient to use blocking IO (rather than NIO) with Object*Stream.
Edit: something like this
public static void send(SocketChannel socket, Serializable serializable) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
for(int i=0;i<4;i++) baos.write(0);
ObjectOutputStream oos = new ObjectOutputStream(baos);
oos.writeObject(serializable);
oos.close();
final ByteBuffer wrap = ByteBuffer.wrap(baos.toByteArray());
wrap.putInt(0, baos.size()-4);
socket.write(wrap);
}
private final ByteBuffer lengthByteBuffer = ByteBuffer.wrap(new byte[4]);
private ByteBuffer dataByteBuffer = null;
private boolean readLength = true;
public Serializable recv(SocketChannel socket) throws IOException, ClassNotFoundException {
if (readLength) {
socket.read(lengthByteBuffer);
if (lengthByteBuffer.remaining() == 0) {
readLength = false;
dataByteBuffer = ByteBuffer.allocate(lengthByteBuffer.getInt(0));
lengthByteBuffer.clear();
}
} else {
socket.read(dataByteBuffer);
if (dataByteBuffer.remaining() == 0) {
ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(dataByteBuffer.array()));
final Serializable ret = (Serializable) ois.readObject();
// clean up
dataByteBuffer = null;
readLength = true;
return ret;
}
}
return null;
}
Inspired by the code above, I've created a GoogleCode project.
It includes a simple unit test:
SeriServer server = new SeriServer(6001, nthreads);
final SeriClient client[] = new SeriClient[nclients];
//write the data with multiple threads to flood the server
for (int cnt = 0; cnt < nclients; cnt++) {
final int counterVal = cnt;
client[cnt] = new SeriClient("localhost", 6001);
Thread t = new Thread(new Runnable() {
public void run() {
try {
for (int cnt2 = 0; cnt2 < nsends; cnt2++) {
String msg = "[" + counterVal + "]";
client[counterVal].send(msg);
}
} catch (IOException e) {
e.printStackTrace();
fail();
}
}
});
t.start();
}
HashMap<String, Integer> counts = new HashMap<String, Integer>();
int nullCounts = 0;
for (int cnt = 0; cnt < nsends * nclients;) {
//read the data from a vector (that the server pool automatically fills
SeriDataPackage data = server.read();
if (data == null) {
nullCounts++;
System.out.println("NULL");
continue;
}
if (counts.containsKey(data.getObject())) {
Integer c = counts.get(data.getObject());
counts.put((String) data.getObject(), c + 1);
} else {
counts.put((String) data.getObject(), 1);
}
cnt++;
System.out.println("Received: " + data.getObject());
}
// asserts the results
Collection<Integer> values = counts.values();
for (Integer value : values) {
int ivalue = value;
assertEquals(nsends, ivalue);
System.out.println(value);
}
assertEquals(counts.size(), nclients);
System.out.println(counts.size());
System.out.println("Finishing");
server.shutdown();