I'm learning Netty and I'm trying to implement a simple counter: all clients share an Integer, each client sends a number, and the shared value is incremented by that number.
Here is my code:
Server.java
package nettyincvalue;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import java.util.logging.Level;
import java.util.logging.Logger;
public class Server {
private int port;
private Integer value;
public Server(int port) {
this.port = port;
this.value = 0;
}
public void run(){
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap server = new ServerBootstrap();
server.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.childHandler(new ServerInit(this.value))
.option(ChannelOption.SO_BACKLOG, 128)
.childOption(ChannelOption.SO_KEEPALIVE, true);
ChannelFuture f = server.bind(port).sync();
f.channel().closeFuture().sync();
} catch (InterruptedException ex) {
Logger.getLogger(Server.class.getName()).log(Level.SEVERE, null, ex);
} finally {
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
}
}
public static void main(String[] args) {
new Server(12345).run();
}
}
ServerInit.java
package nettyincvalue;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.string.StringDecoder;
public class ServerInit extends ChannelInitializer<SocketChannel> {
private Integer value;
public ServerInit(Integer value) {
this.value = value;
}
@Override
protected void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline pipeline = ch.pipeline();
pipeline.addLast(new StringDecoder());
pipeline.addLast(new ServerHandler(this.value));
}
}
ServerHandler.java
package nettyincvalue;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
public class ServerHandler extends ChannelInboundHandlerAdapter {
private Integer value;
public ServerHandler(Integer value) {
this.value = value;
}
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
String s = (String) msg;
try {
Integer i = Integer.parseInt(s.substring(0, s.length() - 1));
this.value += i;
System.out.println("Value its now: " + this.value);
} catch (NumberFormatException n) {
System.out.println("Not a number received");
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
System.err.println(cause.getMessage());
ctx.close();
}
}
Individually it works: when a client sends a number via nc, the value increments. But it isn't global; when a different client connects, the counter is back at 0.
Your variable isn't actually shared the way you think. Integer is immutable, so this.value += i unboxes the value, adds, and rebinds the handler's field to a brand-new Integer object; the Server's field, and every handler constructed afterwards, still reference the original 0, which is why each new client starts from scratch. Beyond that, the code has no proper synchronization: a mutable shared counter would generally need to be declared volatile so all threads see updated values, and you'd need to synchronize the blocks doing the read-modify-write. In your case, sharing a single AtomicInteger instance is simpler and more efficient.
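A minimal sketch of the fix, reusing the classes from the question (only ServerHandler is shown; Server would hold a single private final AtomicInteger value = new AtomicInteger(0) and pass that same instance through ServerInit into every handler):
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import java.util.concurrent.atomic.AtomicInteger;
public class ServerHandler extends ChannelInboundHandlerAdapter {
    private final AtomicInteger value; // the one instance shared by all connections
    public ServerHandler(AtomicInteger value) {
        this.value = value;
    }
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        String s = (String) msg;
        try {
            int i = Integer.parseInt(s.trim());
            // addAndGet mutates the shared object atomically, so concurrent updates are not lost
            System.out.println("Value is now: " + this.value.addAndGet(i));
        } catch (NumberFormatException n) {
            System.out.println("Not a number received");
        }
    }
}
ServerInit changes the same way: its field and constructor parameter become AtomicInteger. Because every handler now mutates one shared object instead of rebinding its own field, the count survives across connections.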
SEDA was first described in the paper "SEDA: An Architecture for Well-Conditioned, Scalable Internet Services".
SEDA consists of a set of stages, where each stage has a separate thread pool.
Sandstorm is a Java API for SEDA, available at https://github.com/chenhaodong/seda-sandstorm, and Apache MINA uses SEDA internally. However, neither implementation has any documentation on how to implement a server using SEDA.
Does anyone know how to build a very simple echo service using SEDA? (Java)
Apache MINA is the open-source implementation of SEDA.
This StackOverflow question has an answer which shows how to build a simple HTTP service using Apache MINA.
Apache MINA is more or less deprecated now (as of 2019), and the technologies used there are very old. Hence I wrote a simple new lightweight SEDA library and an HTTP server example from scratch, as follows.
SEDA-CORE
Event.java
import com.pasindu.queue.seda.handler.EventHandler;
public interface Event {
public EventHandler getHandler();
}
EventHandler.java
import com.pasindu.queue.seda.queue.Queue;
public interface EventHandler extends Runnable {
public void onEvent() throws InterruptedException ;
public void run();
public void setOutQueue(Queue queue);
public String getName();
public void setName(String name);
}
Logger.java
public class Logger {
public static void log(String className, String msg){
// System.out.println("SEDA-CORE LOG ----- "+ className+ "----- \t \t \t"+ msg+" -----");
}
}
Queue.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.helper.Logger;
import java.util.concurrent.ArrayBlockingQueue;
public class Queue {
private int capacity;
private ArrayBlockingQueue<Event> queue;
private String name;
public Queue (int capacity, String name){
this.setCapacity(capacity);
this.name = name;
setQueue(new ArrayBlockingQueue<Event>(capacity));
}
public String getName(){return this.name;}
public void enqueu(Event event) throws InterruptedException{
Logger.log(this.toString(), "Enqueuing attempt for "+event.toString()+" to "+this.toString());
getQueue().put(event); // if the queue is full, the calling thread waits until the put succeeds (in our case the main thread or one of the event-handler threads in the executor pool)
}
public Event dequeue() throws InterruptedException{
Logger.log(this.toString(), "Dequeuing attempt "+" from "+this.toString());
return this.getQueue().take(); // if the queue is empty, the calling thread (the stage thread) waits until an event becomes available
}
}
public int getCapacity() {
return capacity;
}
public void setCapacity(int capacity) {
this.capacity = capacity;
}
public ArrayBlockingQueue<Event> getQueue() {
return queue;
}
public void setQueue(ArrayBlockingQueue<Event> queue) {
this.queue = queue;
}
public int getNumElements(){
return queue.size();
}
}
Stage.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.thread.pool.ThreadPool;
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
public class Stage extends Thread {
private Queue inputQueue;
private Queue outputQueue;
private int batchSize;
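// note: batchSize is stored but never used below; this simple stage submits events one at a time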
private ThreadPool threadPool;
public Stage(Queue inputQueue, Queue outputQueue, int batchSize){
this.threadPool = new ThreadPool();
this.batchSize = batchSize;
this.inputQueue = inputQueue;
this.outputQueue = outputQueue;
}
@Override
public void run(){
while(true){
Event event = null;
try{
event = inputQueue.dequeue();
Logger.log(this.toString(), "Dequeued "+event.toString()+" from "+inputQueue.toString());
}catch (InterruptedException ex){
}
if(event != null) {
EventHandler handler = event.getHandler();
handler.setOutQueue(outputQueue);
handler.setName(this.getName()+"'s Event Handler");
threadPool.submit(handler);
Logger.log(this.toString(), "Enqueued " + event.toString() + " to " + outputQueue);
}else{
try {
Thread.sleep(10);
}catch(InterruptedException ex){
}
}
}
}
}
ThreadPool.java
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.helper.Logger;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class ThreadPool {
ExecutorService executorService;
public ThreadPool (){
this.executorService = Executors.newFixedThreadPool(4);
}
public void submit(EventHandler handler){
Logger.log(this.toString(),"Calling submit of "+executorService.toString());
this.executorService.submit(handler);
}
}
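Since the question asked for a very simple echo service, here is a minimal sketch of how the core above composes into one, before we get to the HTTP example. This is my own illustration: the EchoEvent and EchoEventHandler classes are not part of the library, and the handler simply prints its payload and forwards the event downstream.
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.queue.Queue;
import com.pasindu.queue.seda.stage.Stage;
class EchoEvent implements Event {
    final String payload;
    private final EventHandler handler;
    EchoEvent(String payload) {
        this.payload = payload;
        this.handler = new EchoEventHandler(this);
    }
    public EventHandler getHandler() { return handler; }
}
class EchoEventHandler implements EventHandler {
    private final EchoEvent event;
    private Queue outQueue;
    private String name;
    EchoEventHandler(EchoEvent event) { this.event = event; }
    public void onEvent() throws InterruptedException {
        System.out.println("echo: " + event.payload); // the "echo" itself
        outQueue.enqueu(event);                       // hand the event to the next stage's queue
    }
    public void run() {
        try { onEvent(); } catch (InterruptedException ex) { }
    }
    public void setOutQueue(Queue queue) { this.outQueue = queue; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
public class EchoMain {
    public static void main(String[] args) throws InterruptedException {
        Queue in = new Queue(100, "in");
        Queue out = new Queue(100, "out");
        Stage echoStage = new Stage(in, out, 10); // dequeues from in, runs handlers on its pool
        echoStage.setName("echo stage");
        echoStage.start();
        in.enqueu(new EchoEvent("hello SEDA")); // prints "echo: hello SEDA"
        // the stage and its pool threads then run forever, as in the library above
    }
}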
SEDA-HTTP-SERVER
BufferEvent.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.handler.EventHandler;
import handler.BufferEventHandler;
import java.nio.ByteBuffer;
public class BufferEvent implements Event {
private EventHandler handler;
private ByteBuffer buffer;
private String requestId;
private int numRead;
public BufferEvent(ByteBuffer byteBuffer, String requestId, int numRead){
this.setBuffer(byteBuffer);
this.setRequestId(requestId);
this.setNumRead(numRead);
this.setHandler(new BufferEventHandler(this));
}
public EventHandler getHandler(){
return this.handler;
}
public ByteBuffer getBuffer() {
return buffer;
}
public void setBuffer(ByteBuffer buffer) {
this.buffer = buffer;
}
public String getRequestId() {
return requestId;
}
public void setRequestId(String requestId) {
this.requestId = requestId;
}
public int getNumRead() {
return numRead;
}
public void setNumRead(int numRead) {
this.numRead = numRead;
}
public void setHandler(EventHandler handler) {
this.handler = handler;
}
}
ByteArrayEvent.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.handler.EventHandler;
import handler.ByteArrayEventHandler;
import java.nio.ByteBuffer;
public class ByteArrayEvent implements Event {
private EventHandler handler;
private ByteBuffer buffer;
private String requestId;
private byte[] data ;
private int numRead;
public ByteArrayEvent(ByteBuffer byteBuffer, String requestId, byte[] data, int numRead ){
this.setBuffer(byteBuffer);
this.setRequestId(requestId);
this.setData(data);
this.setHandler(new ByteArrayEventHandler(this));
this.numRead = numRead;
}
public EventHandler getHandler(){
return this.handler;
}
public ByteBuffer getBuffer() {
return buffer;
}
public void setBuffer(ByteBuffer buffer) {
this.buffer = buffer;
}
public String getRequestId() {
return requestId;
}
public void setRequestId(String requestId) {
this.requestId = requestId;
}
public void setHandler(EventHandler handler) {
this.handler = handler;
}
public byte[] getData() {
return data;
}
public void setData(byte[] data) {
this.data = data;
}
public int getNumRead() {
return numRead;
}
public void setNumRead(int numRead) {
this.numRead = numRead;
}
}
HttpRequestEvent.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.handler.EventHandler;
import handler.HttpRequestEventHandler;
import java.nio.ByteBuffer;
public class HttpRequestEvent implements Event {
private EventHandler handler;
private ByteBuffer buffer;
private String requestId;
private String request;
public HttpRequestEvent(ByteBuffer byteBuffer, String requestId, String request){
this.setBuffer(byteBuffer);
this.setRequestId(requestId);
this.setRequest(request);
this.setHandler(new HttpRequestEventHandler(this));
}
public EventHandler getHandler(){
return this.handler;
}
public ByteBuffer getBuffer() {
return buffer;
}
public void setBuffer(ByteBuffer buffer) {
this.buffer = buffer;
}
public String getRequestId() {
return requestId;
}
public void setRequestId(String requestId) {
this.requestId = requestId;
}
public void setHandler(EventHandler handler) {
this.handler = handler;
}
public String getRequest() {
return request;
}
public void setRequest(String request) {
this.request = request;
}
}
HttpResponseEvent.java
import com.pasindu.queue.seda.event.Event;
import com.pasindu.queue.seda.handler.EventHandler;
public class HttpResponseEvent implements Event {
private String requestId;
public HttpResponseEvent(String requestId){
this.setRequestId(requestId);
}
public EventHandler getHandler(){
return null;
}
public String getRequestId() {
return requestId;
}
public void setRequestId(String requestId) {
this.requestId = requestId;
}
}
BufferEventHandler.java
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
import event.BufferEvent;
import event.ByteArrayEvent;
import java.nio.ByteBuffer;
public class BufferEventHandler implements EventHandler {
private BufferEvent event;
private Queue outQueue;
private String name;
public BufferEventHandler(BufferEvent event){
this.event = event;
}
public String getName(){
return this.name;
}
public void setName(String name){
this.name = name;
}
public void setOutQueue(Queue queue){
this.outQueue = queue;
}
public void onEvent() throws InterruptedException{
ByteBuffer buffer = this.event.getBuffer();
String requestId = this.event.getRequestId();
int numRead = this.event.getNumRead();
Logger.log(this.toString(), "Recieved "+event.toString());
buffer.flip();
byte[] data = new byte[numRead];
System.arraycopy(buffer.array(), 0, data, 0, numRead);
ByteArrayEvent byteEvent = new ByteArrayEvent(buffer, requestId, data, numRead );
Logger.log(this.toString(), "Set new object to "+byteEvent.toString());
outQueue.enqueu(byteEvent);
Logger.log(this.toString(), byteEvent.toString()+" added to "+outQueue.toString());
}
public void run(){
Logger.log(this.toString(), "Running "+ this.toString()+" for "+event.toString());
try{
this.onEvent();
}catch (InterruptedException ex){
}
}
}
ByteArrayEventHandler.java
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
import event.ByteArrayEvent;
import event.HttpRequestEvent;
import java.io.UnsupportedEncodingException;
import java.nio.ByteBuffer;
public class ByteArrayEventHandler implements EventHandler {
private ByteArrayEvent event;
private Queue outQueue;
private String name;
public String getName(){
return this.name;
}
public void setName(String name){
this.name = name;
}
public ByteArrayEventHandler(ByteArrayEvent event){
this.event = event;
}
public void onEvent() throws InterruptedException{
Logger.log(this.toString(), "Recieved event "+event.toString());
ByteBuffer buffer = this.event.getBuffer();
String requestId = this.event.getRequestId();
byte[] data = this.event.getData();
int numRead = this.event.getNumRead();
String request = null;
try {
request = new String(data, "US-ASCII");
}catch (UnsupportedEncodingException ex){
}
request = request.split("\n")[0].trim();
HttpRequestEvent httpRequestEvent = new HttpRequestEvent(buffer, requestId, request);
outQueue.enqueu(httpRequestEvent);
Logger.log(this.toString(), "Enqueued "+httpRequestEvent.toString() +" to "+outQueue.toString());
}
public void setOutQueue(Queue queue){
this.outQueue = queue;
}
public void run(){
Logger.log(this.toString(), "Running "+ this.toString()+" for "+event.toString());
try{
this.onEvent();
}catch (InterruptedException ex){
}
}
}
HttpRequestEventHandler.java
import com.pasindu.queue.seda.handler.EventHandler;
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
import event.HttpRequestEvent;
import event.HttpResponseEvent;
import java.nio.ByteBuffer;
public class HttpRequestEventHandler implements EventHandler {
private HttpRequestEvent event;
private Queue outQueue;
private String name;
public String getName(){
return this.name;
}
public void setName(String name){
this.name = name;
}
public HttpRequestEventHandler(HttpRequestEvent event){
this.event = event;
}
public void setOutQueue(Queue queue){
this.outQueue = queue;
}
private String serverRequest(String request) {
String response = "";
if (request.startsWith("GET")) {
// parse the request and generate the response here
response = "response";
}
return response;
}
public void onEvent() throws InterruptedException{
Logger.log(this.toString(),"Recieved "+event.toString());
ByteBuffer buffer = this.event.getBuffer();
String requestId = this.event.getRequestId();
String request = this.event.getRequest();
Logger.log(this.toString(), "Recieved object inside is "+event);
String response = serverRequest(request);
buffer.clear();
buffer.put(response.getBytes());
HttpResponseEvent responseEvent= new HttpResponseEvent(requestId);
Logger.log(this.toString(), "Set new object inside "+event.toString());
outQueue.enqueu(responseEvent);
Logger.log(this.toString(), responseEvent.toString()+" added to "+outQueue.toString());
}
public void run(){
Logger.log(this.toString(), "Running "+ this.toString()+" for "+event.toString());
try{
this.onEvent();
}catch (InterruptedException ex){
}
}
}
QueueMonitor.java
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
public class QueueMonitor extends Thread {
private Queue[] queues;
public QueueMonitor(Queue[] queues){
this.queues= queues;
}
@Override
public void run(){
while(true){
try{
Thread.sleep(9000);
}catch(InterruptedException ex){
}
for(Queue queue: queues){
Logger.log(this.toString(), queue.getName()+" is "+queue.getNumElements());
}
}
}
}
ThreadMonitor.java
import com.pasindu.queue.seda.helper.Logger;
public class ThreadMonitor extends Thread{
private Thread [] threads;
public ThreadMonitor(Thread [] threads){
this.threads= threads;
}
@Override
public void run(){
while(true){
try{
Thread.sleep(11000);
}catch(InterruptedException ex){
}
for(Thread thread: threads){
Logger.log(this.toString(), thread.getName()+" is "+thread.getState());
}
}
}
}
HttpEventMain.java
import com.pasindu.queue.seda.queue.Queue;
import com.pasindu.queue.seda.stage.Stage;
import event.BufferEvent;
import monitor.QueueMonitor;
import monitor.ThreadMonitor;
import org.apache.commons.lang3.RandomStringUtils;
import java.io.IOException;
import java.net.*;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.util.Iterator;
import java.util.concurrent.ConcurrentHashMap;
public class HttpEventMain extends Thread
{
private InetAddress addr;
private int port;
private Selector selector;
private ConcurrentHashMap concurrentHashMapResponse;
private ConcurrentHashMap concurrentHashMapKey;
public HttpEventMain(InetAddress addr, int port) throws IOException {
this.setAddr(addr);
this.setPort(port);
this.setConcurrentHashMapResponse(new ConcurrentHashMap<>());
this.concurrentHashMapKey = new ConcurrentHashMap<>();
}
@Override
public void run(){
System.out.println("----- Running the server on machine with "+Runtime.getRuntime().availableProcessors()+" cores -----");
try {
System.out.println("\n====================Server Details====================");
System.out.println("Server Machine: "+ InetAddress.getLocalHost().getCanonicalHostName());
System.out.println("Port number: " + this.getPort());
} catch (UnknownHostException e1) {
e1.printStackTrace();
}
try {
this.startServer();
} catch (IOException e) {
System.err.println("Error occured in runnable.HttpEventMain:" + e.getMessage());
System.exit(0);
}
}
public static void main(String[] args) throws Exception
{
HttpEventMain server = new HttpEventMain(null, 4333);
server.start();
}
private void startServer() throws IOException {
this.selector = Selector.open();
ServerSocketChannel serverChannel = ServerSocketChannel.open();
serverChannel.configureBlocking(false);
InetSocketAddress listenAddr = new InetSocketAddress(this.addr, this.port);
serverChannel.socket().bind(listenAddr);
serverChannel.register(this.selector, SelectionKey.OP_ACCEPT);
System.out.println("Server ready. Ctrl-C to stop.");
Queue inQueue = new Queue(100, "In Queue");
Queue outQueue1 = new Queue(100, "Out Queue 1");
Queue outQueue2 = new Queue(100, "Out Queue 2");
Queue outQueue3 = new Queue(100, "Out Queue 3");
int batchSize = 10;
// Stage stage = new Stage(inQueue, outQueue, batchSize);
this.setName("Event Main");
Stage bufferstage = new Stage(inQueue, outQueue1, batchSize);
bufferstage.setName("bufferstage");
Stage byteArrayStage = new Stage(outQueue1, outQueue2, batchSize);
byteArrayStage.setName("byteArrayStage");
Stage httpRequestStage = new Stage(outQueue2, outQueue3, batchSize);
httpRequestStage.setName("httpRequestStage");
ResponseMannager responseMannager = new ResponseMannager(concurrentHashMapResponse, outQueue3);
responseMannager.setName("responseMannager");
Thread [] threads = {this, bufferstage, byteArrayStage, httpRequestStage, responseMannager};
ThreadMonitor monitor = new ThreadMonitor(threads);
monitor.start();
Queue [] queues = {inQueue, outQueue1, outQueue2, outQueue3};
QueueMonitor queueMonitor = new QueueMonitor(queues);
queueMonitor.start();
bufferstage.start();
byteArrayStage.start();
httpRequestStage.start();
responseMannager.start();
while (true) {
this.selector.select();
Iterator keys = this.selector.selectedKeys().iterator();
while (keys.hasNext()) {
SelectionKey key = (SelectionKey) keys.next();
keys.remove();
if (! key.isValid()) {
continue;
}
if (key.isAcceptable()) {
this.accept(key);
}
else if (key.isReadable()) {
this.read(key, inQueue);
}
else if (key.isWritable()) {
this.write(key);
}
}
}
}
private void accept(SelectionKey key) throws IOException {
ServerSocketChannel serverChannel = (ServerSocketChannel) key.channel();
SocketChannel channel = serverChannel.accept();
channel.configureBlocking(false);
Socket socket = channel.socket();
SocketAddress remoteAddr = socket.getRemoteSocketAddress();
channel.register(this.selector, SelectionKey.OP_READ);
}
private void read(SelectionKey key, Queue inQueue) throws IOException {
SocketChannel channel = (SocketChannel) key.channel();
ByteBuffer buffer = ByteBuffer.allocate(8192);
int numRead = -1;
try {
numRead = channel.read(buffer);
}
catch (IOException e) {
e.printStackTrace();
}
if (numRead == -1) {
Socket socket = channel.socket();
SocketAddress remoteAddr = socket.getRemoteSocketAddress();
channel.close();
key.cancel();
return;
}
String requestID = RandomStringUtils.random(32, true, true);
while(concurrentHashMapKey.containsValue(requestID) || concurrentHashMapResponse.containsKey(requestID)){
requestID = RandomStringUtils.random(15, true, true);
}
concurrentHashMapKey.put(key, requestID);
try {
inQueue.enqueu(new BufferEvent(buffer, requestID, numRead));
}catch (InterruptedException ex){
}
concurrentHashMapResponse.put(requestID, false);
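// interest now switches to OP_WRITE with the request buffer attached; write() below
// only flushes and closes once ResponseMannager has flipped this request's flag to true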
channel.register(this.selector, SelectionKey.OP_WRITE, buffer);
}
private boolean responseReady(SelectionKey key){
String requestId = concurrentHashMapKey.get(key).toString();
Boolean response = (Boolean) concurrentHashMapResponse.get(requestId);
if(response==true){
concurrentHashMapKey.remove(key);
concurrentHashMapResponse.remove(requestId);
return true;
}else{
return false;
}
}
private void write(SelectionKey key) throws IOException {
if(responseReady(key)) {
SocketChannel channel = (SocketChannel) key.channel();
ByteBuffer inputBuffer = (ByteBuffer) key.attachment();
inputBuffer.flip();
channel.write(inputBuffer);
channel.close();
key.cancel();
}else{
}
}
public ConcurrentHashMap getConcurrentHashMapResponse() {
return concurrentHashMapResponse;
}
public void setConcurrentHashMapResponse(ConcurrentHashMap concurrentHashMapResponse) {
this.concurrentHashMapResponse = concurrentHashMapResponse;
}
public InetAddress getAddr() {
return addr;
}
public void setAddr(InetAddress addr) {
this.addr = addr;
}
public int getPort() {
return port;
}
public void setPort(int port) {
this.port = port;
}
public Selector getSelector() {
return selector;
}
public void setSelector(Selector selector) {
this.selector = selector;
}
}
ResponseMannager.java
import com.pasindu.queue.seda.helper.Logger;
import com.pasindu.queue.seda.queue.Queue;
import event.HttpResponseEvent;
import java.util.concurrent.ConcurrentHashMap;
public class ResponseMannager extends Thread{
ConcurrentHashMap concurrentHashMapResponse;
Queue inQueue;
public ResponseMannager(ConcurrentHashMap concurrentHashMap, Queue queue){
this.concurrentHashMapResponse = concurrentHashMap;
this.inQueue = queue;
}
@Override
public void run() {
while(true){
HttpResponseEvent event = null;
try {
event = (HttpResponseEvent) inQueue.dequeue();
}catch(InterruptedException ex){
}
if(event!=null) {
Logger.log(this.toString(), "Dequeued " + event.toString() + " from " + inQueue.toString());
concurrentHashMapResponse.put(event.getRequestId(), true);
Logger.log(this.toString(), "Set response availabliity for " + event.getRequestId() + " in " + concurrentHashMapResponse.toString());
}else{
try{
Thread.sleep(10);
}catch(InterruptedException ex){
}
}
}
}
}
Problem description:
My program creates a Netty server and client, then makes 2^17 connections to that server. At some point the client starts to receive this exception:
java.io.IOException: Istniejące połączenie zostało gwałtownie zamknięte przez zdalnego hosta.
The english equivalent is:
java.io.IOException: An existing connection was forcibly closed by the remote host
Obviously it is not desirable for the server to forcibly close existing connections.
Steps to reproduce:
For the convenience of anyone willing to reproduce this problem, I've created this "single runnable java file" program that reproduces it; it needs only the netty-all-4.1.12.Final.jar dependency. It starts a Netty server on some free port, then creates a client, performs the requests, and waits a bit to give the server a chance to process them. It then prints statistics: how many connections were made, how many connections the server processed, how many connections were lost, and how many and what kind of exceptions the server and the client encountered.
package netty.exception.tst;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.net.InetSocketAddress;
import java.util.Collections;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import io.netty.bootstrap.Bootstrap;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;
public class NettyException {
public static void main(String[] args) throws InterruptedException {
System.out.println("starting server");
NettyServer server = new NettyServer(0);
int port = server.getPort();
System.out.println("server started at port: " + port);
System.out.println("staring client");
NettyClient client = new NettyClient();
System.out.println("client started");
int noOfConnectionsToPerform = 1 << 17;
System.out.println("performing " + noOfConnectionsToPerform + " connections");
for (int n = 0; n < noOfConnectionsToPerform; n++) {
// send a request
ChannelFuture f = client.getBootstrap().connect("localhost", port);
}
System.out.println("client performed " + noOfConnectionsToPerform + " connections");
System.out.println("wait a bit to give a chance for server to finish processing incoming requests");
Thread.sleep(80000);
System.out.println("shutting down server and client");
server.stop();
client.stop();
System.out.println("stopped, server received: " + server.connectionsCount() + " connections");
int numberOfLostConnections = noOfConnectionsToPerform - server.connectionsCount();
if (numberOfLostConnections > 0) {
System.out.println("Where do we lost " + numberOfLostConnections + " connections?");
}
System.out.println("srerver exceptions: ");
printExceptions(server.getExceptions());
System.out.println("client exceptions: ");
printExceptions(client.getExceptions());
}
private static void printExceptions(Map<String, Integer> exceptions) {
if (exceptions.isEmpty()) {
System.out.println("There was no exceptions");
}
for (Entry<String, Integer> exception : exceptions.entrySet()) {
System.out.println("There was " + exception.getValue() + " times this exception:");
System.out.println(exception.getKey());
}
}
public static class NettyServer {
private ChannelFuture channelFuture;
private EventLoopGroup bossGroup;
private EventLoopGroup workerGroup;
private AtomicInteger connections = new AtomicInteger(0);
private ExceptionCounter exceptionCounter = new ExceptionCounter();
public NettyServer(int port) throws InterruptedException {
bossGroup = new NioEventLoopGroup();
workerGroup = new NioEventLoopGroup();
ServerBootstrap serverBootstrap = new ServerBootstrap();
serverBootstrap.group(bossGroup, workerGroup).channel(NioServerSocketChannel.class)
.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new TimeServerHandler() {
@Override
public void channelActive(final ChannelHandlerContext ctx) {
connections.incrementAndGet();
super.channelActive(ctx);
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
exceptionCounter.countException(cause);
super.exceptionCaught(ctx, cause);
}
});
}
}).option(ChannelOption.SO_BACKLOG, 128).childOption(ChannelOption.SO_KEEPALIVE, true);
channelFuture = serverBootstrap.bind(port).sync();
}
public int getPort() {
return ((InetSocketAddress) channelFuture.channel().localAddress()).getPort();
}
public int connectionsCount() {
return connections.get();
}
public Map<String, Integer> getExceptions() {
return exceptionCounter.getExceptions();
}
public void stop() {
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
try {
bossGroup.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);
workerGroup.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
public static class NettyClient {
private Bootstrap bootstrap;
private EventLoopGroup workerGroup;
private ExceptionCounter exceptionCounter = new ExceptionCounter();
public NettyClient() {
workerGroup = new NioEventLoopGroup();
bootstrap = new Bootstrap();
bootstrap.group(workerGroup);
bootstrap.channel(NioSocketChannel.class);
bootstrap.option(ChannelOption.SO_KEEPALIVE, true);
bootstrap.handler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new TimeClientHandler() {
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
exceptionCounter.countException(cause);
super.exceptionCaught(ctx, cause);
}
});
}
});
}
public Bootstrap getBootstrap() {
return bootstrap;
}
public void stop() {
workerGroup.shutdownGracefully();
try {
workerGroup.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
public Map<String, Integer> getExceptions() {
return exceptionCounter.getExceptions();
}
}
public static class TimeServerHandler extends ChannelInboundHandlerAdapter {
@Override
public void channelActive(final ChannelHandlerContext ctx) {
final ByteBuf time = ctx.alloc().buffer(4);
time.writeInt((int) (System.currentTimeMillis() / 1000L + 2208988800L));
final ChannelFuture f = ctx.writeAndFlush(time);
f.addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture future) {
assert f == future;
ctx.close();
}
});
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
ctx.close();
}
}
public static class TimeClientHandler extends ChannelInboundHandlerAdapter {
private ThreadLocal<ByteBuf> buf = new ThreadLocal<ByteBuf>();
@Override
public void handlerAdded(ChannelHandlerContext ctx) {
buf.set(ctx.alloc().buffer(4));
}
@Override
public void handlerRemoved(ChannelHandlerContext ctx) {
buf.get().release();
buf.remove();
}
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) {
ByteBuf m = (ByteBuf) msg;
buf.get().writeBytes(m);
m.release();
if (buf.get().readableBytes() >= 4) {
long currentTimeMillis = (buf.get().readUnsignedInt() - 2208988800L) * 1000L;
ctx.close();
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
ctx.close();
}
}
public static class ExceptionCounter {
private ConcurrentHashMap<String, AtomicInteger> exceptions = new ConcurrentHashMap<String, AtomicInteger>();
private void countException(Throwable cause) {
StringWriter writer = new StringWriter();
cause.printStackTrace(new PrintWriter(writer));
String stackTrace = writer.toString();
AtomicInteger exceptionCount = exceptions.get(stackTrace);
if (exceptionCount == null) {
exceptionCount = new AtomicInteger(0);
AtomicInteger prevCount = exceptions.putIfAbsent(stackTrace, exceptionCount);
if (prevCount != null) {
exceptionCount = prevCount;
}
}
exceptionCount.incrementAndGet();
}
public Map<String, Integer> getExceptions() {
Map<String, Integer> newMap = exceptions.entrySet().stream()
.collect(Collectors.toMap(Map.Entry::getKey, e -> e.getValue().get()));
return Collections.unmodifiableMap(newMap);
}
}
}
The output is:
starting server
server started at port: 56069
starting client
client started
performing 131072 connections
client performed 131072 connections
wait a bit to give the server a chance to finish processing incoming requests
shutting down server and client
stopped, server received: 34735 connections
Where did we lose 96337 connections?
server exceptions:
There were no exceptions
client exceptions:
There were 258 occurrences of this exception:
java.io.IOException: Istniejące połączenie zostało gwałtownie zamknięte przez zdalnego hosta
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1100)
at io.netty.buffer.WrappedByteBuf.writeBytes(WrappedByteBuf.java:813)
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:372)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:745)
There were 30312 occurrences of this exception:
java.io.IOException: Istniejące połączenie zostało gwałtownie zamknięte przez zdalnego hosta
at sun.nio.ch.SocketDispatcher.read0(Native Method)
at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:192)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:288)
at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1100)
at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:372)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:123)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:745)
Questions:
Why is this exception thrown?
Where have the lost connections gone? Why is there no error for them?
How can I avoid this? What is the correct way to program this kind of "high throughput" application so that existing connections are not lost or broken?
Not related to the subject, but maybe some Netty expert will know: why, when I change the field declaration of private ThreadLocal<ByteBuf> buf in TimeClientHandler to also be static, do I get a NullPointerException in TimeClientHandler.handlerRemoved? This is very strange; is this class somehow replicated, or are the threads from NioEventLoopGroup somehow unusual?
Environment:
Netty version: netty-all-4.1.12.Final.jar
JVM version: jdk1.8.0_111 64 bit
OS version: Windows 10 64 bit
There is a limit of 64k ports per IP address, so you cannot hold 2^17 concurrent connections open from a single address. And since each socket uses a file handle, you might also be hitting the per-process limit on open files; see the "Max open files" limit for the running process.
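If you still want a test like this to complete cleanly, one option (a sketch, not part of the original answer; the cap of 1024 is an arbitrary assumption) is to bound the number of in-flight connections in the client loop and open a new one only when an old one has closed. This would replace the plain connect loop in main() above:
import java.util.concurrent.Semaphore;
import io.netty.channel.ChannelFutureListener;

final Semaphore inFlight = new Semaphore(1024); // assumed cap; tune to your OS limits
for (int n = 0; n < noOfConnectionsToPerform; n++) {
    inFlight.acquire(); // blocks until a slot is free
    client.getBootstrap().connect("localhost", port)
            .addListener((ChannelFutureListener) f -> {
                if (f.isSuccess()) {
                    // release the slot only when the channel is fully closed
                    f.channel().closeFuture().addListener(cf -> inFlight.release());
                } else {
                    inFlight.release();
                }
            });
}
With a bound in place, the server is never asked to hold more sockets than the OS allows at once.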
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.Channel;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.epoll.EpollChannelOption;
import io.netty.channel.epoll.EpollEventLoopGroup;
import io.netty.channel.epoll.EpollServerSocketChannel;
import io.netty.channel.socket.SocketChannel;
import sun.misc.Signal;
import sun.misc.SignalHandler;
import java.net.InetSocketAddress;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.LinkedList;
import java.util.List;
public class ReusePortServer {
private final int port;
private List<Channel> bindingChannels = new LinkedList<>();
public ReusePortServer(int port) {
this.port = port;
}
private void initSignals() {
Signal.handle(new Signal("BUS"), new SignalHandler() {
@Override public void handle(Signal signal) {
System.out.println("signal arrived");
closeChannels();
}
});
}
synchronized private void closeChannels() {
for (Channel channel : bindingChannels) {
channel.close();
}
bindingChannels.clear();
}
synchronized private void registerChannel(Channel channel) {
bindingChannels.add(channel);
}
public void start() throws Exception {
initSignals();
EventLoopGroup group = new EpollEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(group)
.channel(EpollServerSocketChannel.class)
.option(EpollChannelOption.SO_REUSEPORT, true)
.localAddress(new InetSocketAddress(port))
.childHandler(new ChannelInitializer<SocketChannel>(){
@Override
public void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new ReusePortHandler());
registerChannel(ch);
}
});
for (StackTraceElement e : Thread.currentThread().getStackTrace()) {
System.out.println(e.toString());
}
ChannelFuture f = b.bind().sync();
log(String.format("%s started and listen on %s", ReusePortServer.class.getName(), f.channel().localAddress()));
// registerChannel(ch); // ---------------I also tried to register this channel, but after my signaling, it closes my client's connection, rather than keeping it.
f.channel().closeFuture().sync();
} finally {
group.shutdownGracefully().sync();
}
}
private final static SimpleDateFormat datefmt = new SimpleDateFormat("HH:mm:ss ");
public static void log(final String msg) {
System.out.print(datefmt.format(new Date()));
System.out.println(msg);
System.out.flush();
}
public static void main(final String[] args) throws Exception {
int port = 12355;
new ReusePortServer(port).start();
}
}
Hi, I am looking for a way to stop Netty from listening and accepting on the server socket, but to finish any ongoing work on the current connections.
I came across the following question:
How to stop netty from listening and accepting on server socket
and following it, I wrote the code above, which receives a signal (kill -7) to do the closing.
But the result is not as expected: it closes the TCP connections, and Netty can still accept new connections.
Am I using the correct way to stop Netty from listening and accepting?
What's wrong here?
You should close the ServerChannel like this:
ChannelFuture f = b.bind().sync();
// Call this once you want to stop accepting new connections.
f.channel().close().sync();
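For the other half of the question, letting ongoing work finish, a common follow-up (a sketch, assuming the usual Netty 4.x two-group setup rather than the single group above) is to close only the server channel first and then shut the event loop groups down gracefully, which waits through a quiet period for pending tasks:
ChannelFuture f = b.bind().sync();
// ... later, on shutdown:
f.channel().close().sync();              // stop listening; accepted child channels stay open
bossGroup.shutdownGracefully();          // nothing left to accept
workerGroup.shutdownGracefully().sync(); // drains in-flight reads/writes before terminating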
I've spent about two days on this problem. I tried the following, and it works for me:
First, I declare:
static ChannelFuture future;
Then, when I bind the port, I assign the future variable:
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.childHandler(new ServerInitializer())
.option(ChannelOption.SO_BACKLOG, NetUtil.SOMAXCONN)
.childOption(ChannelOption.SO_KEEPALIVE, true);
future = b.bind(WS_PORT).sync();
Finally, I add a function to handle the event when the web application stops.
@PreDestroy
void shutdownWorkers() throws InterruptedException {
future.channel().close().sync();
}
I have a simple ECHO server and client written using Netty. The server and client are on the same machine. I was expecting a mean latency on the order of a couple of milliseconds; however, regardless of what I try, I can never bring the latency down to sub-millisecond durations. Any help would be greatly appreciated.
Update: even when using System.nanoTime, I see latencies around 25-30 ms.
EchoClient
import org.jboss.netty.bootstrap.ClientBootstrap;
import org.jboss.netty.channel.*;
import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory;
import org.jboss.netty.handler.execution.ExecutionHandler;
import org.jboss.netty.handler.execution.OrderedMemoryAwareThreadPoolExecutor;
import java.net.InetSocketAddress;
import java.util.concurrent.Executors;
public class EchoClient {
public static void main(String[] args) {
if (args.length != 1) {
System.err.println(String.format("usage: %s <num-msgs>", EchoClient.class.getCanonicalName()));
System.exit(1);
}
final long NUM_MSGS = Integer.parseInt(args[0]);
final EchoClientHandler echoClientHandler = new EchoClientHandler();
final ExecutionHandler e =
new ExecutionHandler(new OrderedMemoryAwareThreadPoolExecutor(4, 128 * 1024L, 128 * 1024L));
ChannelFactory factory =
new NioClientSocketChannelFactory(Executors.newCachedThreadPool(),
Executors.newCachedThreadPool());
ClientBootstrap bootstrap = new ClientBootstrap(factory);
bootstrap.setPipelineFactory(new ChannelPipelineFactory() {
@Override
public ChannelPipeline getPipeline() throws Exception {
return Channels.pipeline(new TestPayloadEncoder(),
new TestPayloadDecoder(),
e,
echoClientHandler);
}
});
bootstrap.setOption("tcpNoDelay", true);
bootstrap.setOption("keepAlive", false);
bootstrap.setOption("child.keepAlive", false);
bootstrap.setOption("sendBufferSize", 128 * 1024L);
bootstrap.setOption("receiveBufferSize", 128 * 1024L);
for (int i = 0; i < NUM_MSGS; i++) {
final InetSocketAddress serverAddr =
new InetSocketAddress("localhost", 8080);
bootstrap.connect(serverAddr).addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture f) throws Exception {
if (f.isSuccess()) {
f.getChannel().write(new TestPayload());
}
}
});
}
while (echoClientHandler.numMsgs.get() < NUM_MSGS);
System.out.println(echoClientHandler.numMsgs);
System.out.println(echoClientHandler.aggTime);
System.out.println(String.format("mean transfer time: %.2fms",
((float) echoClientHandler.aggTime.get()) /
echoClientHandler.numMsgs.get()));
System.out.flush();
e.releaseExternalResources();
factory.releaseExternalResources();
}
}
EchoClientHandler
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.MessageEvent;
import org.jboss.netty.channel.SimpleChannelHandler;
import java.util.concurrent.atomic.AtomicLong;
public class EchoClientHandler extends SimpleChannelHandler {
public final AtomicLong numMsgs = new AtomicLong(0);
public final AtomicLong aggTime = new AtomicLong(0);
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
long recvTime = System.currentTimeMillis();
TestPayload m = (TestPayload) e.getMessage();
aggTime.addAndGet(recvTime - m.getTime());
numMsgs.incrementAndGet();
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) throws Exception {
e.getCause().printStackTrace();
e.getChannel().close();
}
}
EchoServer
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.execution.ExecutionHandler;
import org.jboss.netty.handler.execution.OrderedMemoryAwareThreadPoolExecutor;
import java.net.InetSocketAddress;
import java.util.concurrent.Executors;
public class EchoServer {
public static void main(String[] args) {
ChannelFactory factory =
new NioServerSocketChannelFactory(Executors.newFixedThreadPool(4),
Executors.newFixedThreadPool(32),
32);
ServerBootstrap bootstrap = new ServerBootstrap(factory);
final ExecutionHandler e =
new ExecutionHandler(new OrderedMemoryAwareThreadPoolExecutor(4, 128 * 1024L, 128 * 1024L));
bootstrap.setPipelineFactory(new ChannelPipelineFactory() {
@Override
public ChannelPipeline getPipeline() throws Exception {
return Channels.pipeline(e, new EchoServerHandler());
}
});
bootstrap.setOption("reuseAddr", true);
bootstrap.setOption("keepAlive", false);
bootstrap.setOption("child.reuseAddr", true);
bootstrap.setOption("child.soLinger", 0);
bootstrap.setOption("child.keepAlive", false);
bootstrap.setOption("child.tcpNoDelay", true);
bootstrap.setOption("child.sendBufferSize", 128 * 1024L);
bootstrap.setOption("child.receiveBufferSize", 128 * 1024L);
bootstrap.bind(new InetSocketAddress("localhost", 8080));
}
}
EchoServerHandler
import org.jboss.netty.channel.*;
public class EchoServerHandler extends SimpleChannelHandler {
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
e.getChannel().write(e.getMessage());
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) throws Exception {
e.getCause().printStackTrace();
e.getChannel().close();
}
}
TestPayload
import org.jboss.netty.buffer.ChannelBuffer;
import java.util.Date;
import java.util.Random;
public class TestPayload {
private static final int PREAMBLE_LEN = (Long.SIZE + Integer.SIZE) / 8;
private static final Random RNG;
static {
RNG = new Random();
RNG.setSeed(new Date().getTime());
}
private final int paddingLen;
private final byte[] padding;
private final long time;
public TestPayload() {
this(65536);
}
public TestPayload(int sizeInBytes) {
this.paddingLen = sizeInBytes;
this.padding = new byte[this.paddingLen];
RNG.nextBytes(this.padding);
this.time = System.currentTimeMillis();
}
private TestPayload(long time, int paddingLen, byte[] padding) {
this.paddingLen = paddingLen;
this.padding = padding;
this.time = time;
}
public long getTime() {
return this.time;
}
public void writeTo(ChannelBuffer buf) {
buf.writeLong(this.time);
buf.writeInt(this.paddingLen);
buf.writeBytes(this.padding);
}
public static TestPayload readFrom(ChannelBuffer buf) {
if (buf.readableBytes() < PREAMBLE_LEN) {
return null;
}
buf.markReaderIndex();
long time = buf.readLong();
int paddingLen = buf.readInt();
if (buf.readableBytes() < paddingLen) {
buf.resetReaderIndex();
return null;
}
byte[] padding = new byte[paddingLen];
buf.readBytes(padding);
return new TestPayload(time, paddingLen, padding);
}
public int getLength() {
return PREAMBLE_LEN + this.paddingLen;
}
}
Are you running your client and your server in different JVMs? If so, measuring time across JVM boundaries is not as straightforward as you would think. For example, using System.nanoTime() is not necessarily going to work, according to the Oracle Java doc:
The values returned by this method become meaningful only when the difference between two such values, obtained within the same instance of a Java virtual machine, is computed.
Assuming you can find a reliable way to measure time across JVMs, and if your goal is to isolate how long it takes a Netty client to send to a Netty server, then simplify your use case to isolate this as much as possible. For example, in the above code you are counting the time to send/receive an array of 65536 bytes. Remove this from the timing experiment to help isolate where the bottlenecks are.
How many runs are you collecting timings from? Are you excluding the initialization time of Netty itself (by running a few messages between client and server before taking timings)?
Also, how does adjusting your configuration impact performance? There are plenty of knobs to tweak (thread pool size, send/receive buffer size, etc.).
What version of Netty are you using, and is there an option to force a flush after you write?
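As a sketch of the warm-up point above (sendAndAwaitEcho() is a hypothetical blocking helper standing in for one complete round trip with the client from the question):
// exclude Netty/JIT initialization from the measurement: run warm-up round
// trips first, then time only the steady-state ones inside a single JVM
static double measureMeanLatencyMs(int warmup, int measured) throws Exception {
    for (int i = 0; i < warmup; i++) {
        sendAndAwaitEcho(); // warms JIT, socket buffers, thread pools
    }
    long start = System.nanoTime(); // nanoTime deltas are valid within one JVM
    for (int i = 0; i < measured; i++) {
        sendAndAwaitEcho();
    }
    return (System.nanoTime() - start) / 1000000.0 / measured;
}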
I don't see the code for EchoClient. It looks like you copy/pasted the code for EchoClientHandler where EchoClient's code should be.
When I tested a simple producer/consumer example, I got a very strange result, as below.
If I use main() to run the following code, I get the correct and expected result.
But under JUnit I only get the first directory correctly; the remaining work is dropped.
What is the exact reason?
Working code:
import java.io.File;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import org.junit.Test;
public class TestProducerAndConsumer {
public static void main(String[] args) {
BlockingQueue<File> queue = new LinkedBlockingQueue<File>(1000);
new Thread(new FileCrawler(queue, new File("C:\\"))).start();
new Thread(new Indexer(queue)).start();
}
}
Bad Code:
import java.io.File;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import org.junit.Test;
public class TestProducerAndConsumer {
@Test
public void start2() {
BlockingQueue<File> queue = new LinkedBlockingQueue<File>(1000);
new Thread(new FileCrawler(queue, new File("C:\\"))).start();
new Thread(new Indexer(queue)).start();
}
}
Other function code:
import java.io.File;
import java.util.Arrays;
import java.util.concurrent.BlockingQueue;
public class FileCrawler implements Runnable {
private final BlockingQueue<File> fileQueue;
private final File root;
private int i = 0;
public FileCrawler(BlockingQueue<File> fileQueue, File root) {
this.fileQueue = fileQueue;
this.root = root;
}
@Override
public void run() {
try {
craw(root);
} catch (InterruptedException e) {
System.out.println("shit!");
e.printStackTrace();
Thread.currentThread().interrupt();
}
}
private void craw(File file) throws InterruptedException {
File[] entries = file.listFiles();
//System.out.println(Arrays.toString(entries));
if (entries != null && entries.length > 0) {
for (File entry : entries) {
if (entry.isDirectory()) {
craw(entry);
} else {
fileQueue.offer(entry);
i++;
System.out.println(entry);
System.out.println(i);
}
}
}
}
public static void main(String[] args) throws InterruptedException {
FileCrawler fc = new FileCrawler(null, null);
fc.craw(new File("C:\\"));
System.out.println(fc.i);
}
}
import java.io.File;
import java.util.concurrent.BlockingQueue;
public class Indexer implements Runnable {
private BlockingQueue<File> queue;
public Indexer(BlockingQueue<File> queue) {
this.queue = queue;
}
@Override
public void run() {
try {
while (true) {
indexFile(queue.take());
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private void indexFile(File file) {
System.out.println("Indexing ... " + file);
}
}
JUnit presumably allows the JVM and its threads to terminate once the test method finishes, so your threads do not complete their work.
Try waiting for the threads to join:
Thread crawlerThread = new Thread(new FileCrawler(queue, new File("C:\\")));
Thread indexerThread = new Thread(new Indexer(queue));
crawlerThread.start();
indexerThread.start();
//
// wait for them to finish.
crawlerThread.join();
indexerThread.join();
This should help.
The other thing that can go wrong is that log output (via Log4J) can sometimes be truncated at the end of execution; flushing and pausing can help. But I don't think that will affect you here.