I'm trying to create a TCP server that periodically reads data from a database (Redis) and sends it to the appropriate client.
However, since I'm pretty new to Netty, I don't know how to schedule this. I do know that I need to use a ScheduledExecutorService, like this:
ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
e.scheduleAtFixedRate(() -> {
System.out.println("Calling...");
// Do something
}, 1, 1, TimeUnit.SECONDS);
However, when I put that in the server code, the method is only called once. I've tried putting it in different places but still can't seem to get it right. What should I do?
Here's the code of the server:
package com.example.test.app;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class Server {
public static void main(String[] args) throws Exception
{
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
final ServerHandler handler = new ServerHandler();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup);
b.channel(NioServerSocketChannel.class);
b.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception
{
ch.pipeline().addLast(handler);
}
});
b.option(ChannelOption.SO_BACKLOG, 128);
b.childOption(ChannelOption.SO_KEEPALIVE, true);
ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
e.scheduleAtFixedRate(() -> {
System.out.println("Calling...");
handler.saySomething();
}, 1, 1, TimeUnit.SECONDS);
ChannelFuture f = b.bind(1337).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
}
And here's the server handler:
package com.example.test.app;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
public class ServerHandler extends ChannelInboundHandlerAdapter {
private ChannelHandlerContext ctx;
@Override
public void channelActive(ChannelHandlerContext ctx)
{
this.ctx = ctx;
System.out.println("Someone's connedted!");
}
public void saySomething()
{
final ChannelFuture f = ctx.writeAndFlush("Sup!");
f.addListener((ChannelFutureListener) (ChannelFuture future) -> {
System.out.println("Something has been said!");
});
}
}
The saySomething() method throws a NullPointerException at final ChannelFuture f = ctx.writeAndFlush("Sup!"); because ctx is still null when the scheduled task runs.
The EventExecutorGroup.scheduleAtFixedRate javadoc says: "If any execution of the task encounters an exception, subsequent executions are suppressed." That is why your task is only called once.
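One common way to keep the schedule alive even when a run fails is to guard the task body; a minimal sketch against the e and handler variables from your own main():
e.scheduleAtFixedRate(() -> {
    try {
        System.out.println("Calling...");
        handler.saySomething();
    } catch (Exception ex) {
        // Log instead of letting the exception escape, so later runs are not suppressed.
        ex.printStackTrace();
    }
}, 1, 1, TimeUnit.SECONDS);
This only masks the symptom, though; the real fix is below.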
Also, Netty only allows you to reuse a handler instance across different pipelines if you annotate the handler's class with @Sharable; otherwise it will throw an exception. If your handler is stateless (which is not your case, as yours has the ctx member), you should annotate it with @Sharable and reuse it for all created pipelines. If it is stateful, create a new instance for every new pipeline (i.e. every new client connection).
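For the stateless case, a minimal sketch of a sharable handler (the class name is illustrative); note it never stores ctx in a field:
import io.netty.channel.ChannelHandler;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
@ChannelHandler.Sharable
public class StatelessServerHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // Operate only on the ctx passed in; the same instance serves every channel.
        System.out.println("Someone's connected! " + ctx.channel());
    }
}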
Finally, to schedule the task for each connected client, you can use the executor reachable through the ctx of that client's channel (by default, as in your case, the channel's EventLoop) in your channelActive() implementation. That executor implements ScheduledExecutorService, so it also offers scheduleAtFixedRate.
Take a look at my version of your code and see if it suits you.
Server:
package com.example.test.app;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class Server {
public static void main(String[] args) throws Exception
{
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup);
b.channel(NioServerSocketChannel.class);
b.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception
{
ch.pipeline().addLast(new ServerHandler());
}
});
b.option(ChannelOption.SO_BACKLOG, 128);
b.childOption(ChannelOption.SO_KEEPALIVE, true);
// ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
// e.scheduleAtFixedRate(() -> {
// System.out.println("Calling...");
// handler.saySomething();
// }, 1, 1, TimeUnit.SECONDS);
ChannelFuture f = b.bind(1337).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
}
ServerHandler:
package com.example.test.app;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
public class ServerHandler extends ChannelInboundHandlerAdapter {
private ScheduledFuture<?> sf;
@Override
public void channelActive(ChannelHandlerContext ctx)
{
System.out.println("Someone's connedted! "+ctx.channel());
sf = ctx.executor().scheduleAtFixedRate(() -> {
System.out.println("Calling...");
saySomething(ctx);
}, 1, 1, TimeUnit.SECONDS);
}
@Override
public void channelInactive(ChannelHandlerContext ctx) {
System.out.println("Someone's disconnected! "+ctx.channel());
sf.cancel(false);
}
private void saySomething(ChannelHandlerContext ctx)
{
final ChannelFuture f = ctx.writeAndFlush("Sup!");
f.addListener((ChannelFutureListener) (ChannelFuture future) -> {
System.out.println("Something has been said!");
});
}
}
Related
I'm using Netty for a server that needs to handle hundreds of thousands of requests per second while maintaining as little variance as possible on response latencies. I'm doing some final optimizations and I'm currently looking into reducing unnecessary memory allocation by reusing whatever objects I can. A simplified example of a server highlighting my issue is the following:
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.http.HttpServerCodec;
import io.netty.handler.codec.http.HttpObjectAggregator;
public class NettyServer {
public void run() throws Exception {
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
p.addLast(new HttpServerCodec());
p.addLast(new HttpObjectAggregator(1048576));
p.addLast(new NettyHandler());
}
});
ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
public static void main(String[] args) throws Exception {
new NettyServer().run();
}
}
The handler code is the following:
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.handler.codec.http.DefaultFullHttpResponse;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.FullHttpResponse;
import io.netty.handler.codec.http.HttpUtil;
import io.netty.handler.codec.http.HttpHeaderNames;
import io.netty.handler.codec.http.HttpHeaderValues;
import io.netty.handler.codec.http.HttpResponseStatus;
import io.netty.handler.codec.http.HttpVersion;
import io.netty.handler.codec.http.QueryStringDecoder;
import io.netty.util.CharsetUtil;
public class NettyHandler extends SimpleChannelInboundHandler<Object> {
private static final FullHttpResponse okResponse = OkResponse();
private static final FullHttpResponse failResponse = FailResponse();
@Override
public void channelReadComplete(ChannelHandlerContext ctx) {
ctx.flush();
}
@Override
protected void channelRead0(ChannelHandlerContext ctx, Object msg) {
FullHttpRequest request = (FullHttpRequest) msg;
QueryStringDecoder query = new QueryStringDecoder(request.getUri());
String path = query.path();
ChannelFuture f;
boolean keepAlive = HttpUtil.isKeepAlive(request);
if ("/ok".equals(path)) {
f = ctx.write(okResponse);
} else {
f = ctx.write(failResponse);
keepAlive = false;
}
if (!keepAlive) {
f.addListener(ChannelFutureListener.CLOSE);
}
}
private static FullHttpResponse OkResponse() {
String data = "{ \"status\": ok }";
FullHttpResponse response = new DefaultFullHttpResponse(
HttpVersion.HTTP_1_1,
HttpResponseStatus.OK,
Unpooled.copiedBuffer(data, CharsetUtil.UTF_8)
);
response.headers().set(HttpHeaderNames.CONTENT_TYPE, HttpHeaderValues.APPLICATION_JSON);
response.headers().set(HttpHeaderNames.CACHE_CONTROL, "max-age=0, no-cache, must-revalidate, proxy-revalidate");
return response;
}
private static FullHttpResponse FailResponse() {
String data = "{ \"status\": fail }";
FullHttpResponse response = new DefaultFullHttpResponse(
HttpVersion.HTTP_1_1,
HttpResponseStatus.OK,
Unpooled.copiedBuffer(data, CharsetUtil.UTF_8)
);
response.headers().set(HttpHeaderNames.CONTENT_TYPE, HttpHeaderValues.APPLICATION_JSON);
response.headers().set(HttpHeaderNames.CACHE_CONTROL, "max-age=0, no-cache, must-revalidate, proxy-revalidate");
return response;
}
}
The handler shows what I'm trying to accomplish: it contains static instances of fixed HTTP responses. For this server, all responses except error codes come from a small group and can be preconstructed. With the above code the second request to a handler fails, since Netty's reference count for the response has already dropped to zero. I was expecting that just calling retain() on the object would be enough, but it doesn't look like it is.
What would be the most efficient way to reuse the HTTP response objects between requests?
You should call retainedDuplicate(), as otherwise the readerIndex etc. may become "invalid".
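As a rough sketch of what channelRead0 could look like with that change (same fields and imports as in your handler, assuming Netty 4.1's retainedDuplicate() on the shared response; not tested):
@Override
protected void channelRead0(ChannelHandlerContext ctx, Object msg) {
    FullHttpRequest request = (FullHttpRequest) msg;
    String path = new QueryStringDecoder(request.getUri()).path();
    boolean keepAlive = HttpUtil.isKeepAlive(request);
    // Write a retained duplicate: the shared response keeps its refCnt and readerIndex,
    // while each write gets its own independent indices on the same underlying buffer.
    FullHttpResponse shared = "/ok".equals(path) ? okResponse : failResponse;
    ChannelFuture f = ctx.write(shared.retainedDuplicate());
    if (!"/ok".equals(path) || !keepAlive) {
        f.addListener(ChannelFutureListener.CLOSE);
    }
}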
I'm new to vert.x and decided to start out light, but can't get a simple unit test to work (using maven to build). Code as follows
FileRepo.java:
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Future;
import io.vertx.core.http.HttpServer;
import io.vertx.core.http.HttpServerResponse;
import io.vertx.ext.web.Router;
public class FileRepo extends AbstractVerticle {
@Override
public void start(Future<Void> fut) {
HttpServer server = vertx.createHttpServer();
Router router = Router.router(vertx);
router.route("/upload").handler(routingContext -> {
HttpServerResponse response = routingContext.response();
response.putHeader("content-type", "text/plain");
response.end("Hello world!");
});
System.out.println("Starting server!");
server.requestHandler(router::accept).listen(8080);
System.out.println("Server started!");
}
}
FileRepoTest.java:
import FileRepo;
import io.vertx.core.Vertx;
import io.vertx.ext.unit.TestContext;
import io.vertx.ext.unit.junit.VertxUnitRunner;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
@RunWith(VertxUnitRunner.class)
public class FileRepoTest {
private Vertx vertx;
@Before
public void setUp(TestContext context) {
vertx = Vertx.vertx();
System.out.println("SetUp Vertx");
vertx.deployVerticle(FileRepo.class.getName(), context.asyncAssertSuccess());
System.out.println("SetUp done");
}
@After
public void tearDown(TestContext context) {
System.out.println("tearDown Vertx");
vertx.close(context.asyncAssertSuccess());
}
@Test
public void testUpload(TestContext context) {
System.out.println("testUpload");
}
}
Result:
SetUp Vertx
SetUp done
Starting server!
Server started!
tearDown Vertx
java.util.concurrent.TimeoutException
at io.vertx.ext.unit.impl.TestContextImpl$Step.lambda$run$0(TestContextImpl.java:112)
at java.lang.Thread.run(Thread.java:745)
Process finished with exit code -1
Browsing http://localhost:8080/upload while waiting for the TimeoutException shows a Hello world! page, but the @Test never seems to execute. What am I doing wrong here?
Regards,
Mattias
The exception you are getting occurs because you never acknowledge the server start state: start(Future<Void> fut) must complete fut once the server is actually listening.
Change your Verticle to following:
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Future;
import io.vertx.core.http.HttpServerResponse;
import io.vertx.ext.web.Router;
public class FileRepo extends AbstractVerticle {
@Override
public void start(Future<Void> fut) {
Router router = Router.router(vertx);
router.route("/upload").handler(routingContext -> {
HttpServerResponse response = routingContext.response();
response.putHeader("content-type", "text/plain");
response.end("Hello world!");
});
System.out.println("Starting server!");
vertx.createHttpServer()
.requestHandler(router::accept)
.listen(8080, result -> {
if (result.succeeded()) {
System.out.println("Server started!");
fut.complete();
} else {
System.out.println("Server start failed!");
fut.fail(result.cause());
}
});
}
}
I want to modify the client handler to use Foo instead of DatagramPacket -- what changes are required in the client itself?
Surely it's not necessary to stick strictly to datagrams to send and receive with Netty? The Factorial example uses BigInteger, so surely it's possible to use POJOs.
Any and all attempts to create a class like:
class FooClientHandler extends SimpleChannelInboundHandler<Foo> are just non-starters for me; it literally won't send or receive with a server. (Yes, both client and server use similar handlers, generic classes with Foo.) So I'm coming at this now from working code.
What's the key distinction between the factorial handler and the datagram handler below? Or is the primary distinction in how it's used in the client?
client:
package net.bounceme.dur.netty;
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.Unpooled;
import io.netty.channel.Channel;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.DatagramPacket;
import io.netty.channel.socket.nio.NioDatagramChannel;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
import net.bounceme.dur.client.gui.MyProps;
public final class Client {
private static final Logger log = Logger.getLogger(Client.class.getName());
public void connect() throws InterruptedException {
MyProps p = new MyProps();
String host = p.getHost();
int port = p.getServerPort();
pingPongDatagram(host, port);
}
public void pingPongDatagram(String host, int port) throws InterruptedException {
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioDatagramChannel.class)
.option(ChannelOption.SO_BROADCAST, true)
.handler(new DatagramClientHandler());
Channel ch = b.bind(0).sync().channel();
ch.writeAndFlush(new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress(host, port))).sync();
log.info("wrote packet");
if (!ch.closeFuture().await(5000)) {
log.warning("server timed out");
}
} finally {
group.shutdownGracefully();
}
}
}
handler:
package net.bounceme.dur.netty;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.socket.DatagramPacket;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
public class DatagramClientHandler extends SimpleChannelInboundHandler<DatagramPacket> {
private static final Logger log = Logger.getLogger(DatagramClientHandler.class.getName());
@Override
public void channelRead0(ChannelHandlerContext ctx, DatagramPacket msg) throws Exception {
String response = msg.content().toString(CharsetUtil.UTF_8);
log.info(response);
DatagramPacket foo = new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress("localhost", 4454));
ctx.writeAndFlush(foo);
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
}
I omitted the server code; it's almost exactly as in the Gandhi quote example.
What changes do I need to make to the client so that the handler can use Foo instead of DatagramPacket?
All I can say with certainty is that this client:
package net.bounceme.dur.netty;
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.Unpooled;
import io.netty.channel.Channel;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.DatagramPacket;
import io.netty.channel.socket.nio.NioDatagramChannel;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
import net.bounceme.dur.client.gui.MyProps;
import net.bounceme.dur.client.jdbc.Title;
public final class Client {
private static final Logger log = Logger.getLogger(Client.class.getName());
public void connect() throws InterruptedException {
MyProps p = new MyProps();
String host = p.getHost();
int port = p.getServerPort();
pingPongDatagram(host, port);
}
public void pingPongDatagram(String host, int port) throws InterruptedException {
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioDatagramChannel.class)
.option(ChannelOption.SO_BROADCAST, true)
.handler(new TitleClientHandler());
Channel ch = b.bind(0).sync().channel();
ch.writeAndFlush(new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress(host, port))).sync();
ch.writeAndFlush(new Title());
log.info("wrote packets");
if (!ch.closeFuture().await(5000)) {
log.warning("server timed out");
}
} finally {
group.shutdownGracefully();
}
}
}
and handler:
package net.bounceme.dur.netty;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import java.util.logging.Logger;
import net.bounceme.dur.client.jdbc.Title;
public class TitleClientHandler extends SimpleChannelInboundHandler<Title> {
private static final Logger log = Logger.getLogger(TitleClientHandler.class.getName());
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
@Override
protected void channelRead0(ChannelHandlerContext chc, Title title) throws Exception {
log.info(title.toString());
}
}
don't seem to communicate with the server at all, even when the server has been modified accordingly.
9184 bytes of memory are leaking in the following code.
Main Class:
import java.net.InetSocketAddress;
import java.util.concurrent.Executors;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.ChannelFactory;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.timeout.IdleStateHandler;
import org.jboss.netty.util.HashedWheelTimer;
import org.jboss.netty.util.Timer;
public class NettyMemoryLeakTest {
public static void main(String[] args) {
final Timer timer = new HashedWheelTimer();
final IdleStateHandler idle = new IdleStateHandler(timer, 0, 0, 1);
final ChannelFactory cf = new NioServerSocketChannelFactory(Executors.newCachedThreadPool(), Executors.newCachedThreadPool());
final ServerBootstrap bootStrap = new ServerBootstrap(cf);
final ServerHandler objServerHandler = new ServerHandler();
bootStrap.setPipelineFactory(new ChannelPipelineFactory() {
@Override
public ChannelPipeline getPipeline() throws Exception {
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("IDLE", idle);
pipeline.addLast("Handler", objServerHandler);
return pipeline;
}
});
bootStrap.bind(new InetSocketAddress(8080));
}
}
Handler :
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.handler.timeout.IdleState;
import org.jboss.netty.handler.timeout.IdleStateAwareChannelHandler;
import org.jboss.netty.handler.timeout.IdleStateEvent;
public class ServerHandler extends IdleStateAwareChannelHandler {
@Override
public void channelIdle(ChannelHandlerContext ctx, IdleStateEvent e) {
if (e.getState() == IdleState.ALL_IDLE) {
System.gc();
}
}
}
I'm using JDK 1.7 and Netty 3.5.8 Final.
I'm checking the memory leak through the NetBeans profiler, which shows the variation in memory used: after 48 minutes the leakage is 2020 KB.
Please help me find the issue. Thank you.
System.gc() is not guaranteed to actually trigger a GC. If you really think it is a memory leak, take a heap dump in binary form and provide it for inspection.
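For example, on a HotSpot JVM you can take a binary heap dump with jmap -dump:format=b,file=heap.hprof <pid>, or programmatically via the diagnostic MXBean; a minimal sketch (the file name is illustrative):
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;
public class HeapDumper {
    public static void main(String[] args) throws Exception {
        // Obtain the HotSpot diagnostic bean and write a binary .hprof dump of live objects.
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        bean.dumpHeap("heap.hprof", true);
    }
}
The resulting .hprof file can then be opened in a profiler or Eclipse MAT for inspection.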
Using JBoss Netty, I'm trying to send data continuously to the connected client. In the example below, I try to send the time to the client every 5 seconds, as soon as the client gets connected (channelConnected).
But this is not working; it works only if I comment out the while loop.
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.Date;
import java.util.concurrent.Executors;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.codec.string.StringEncoder;
public class SRNGServer {
public static void main(String[] args) throws Exception {
// Configure the server.
ServerBootstrap bootstrap = new ServerBootstrap(
new NioServerSocketChannelFactory(
Executors.newCachedThreadPool(),
Executors.newCachedThreadPool()));
// Configure the pipeline factory.
bootstrap.setPipelineFactory(new SRNGServerPipelineFactoryP());
// Bind and start to accept incoming connections.
bootstrap.bind(new InetSocketAddress(8080));
}
private static class SRNGServerHandlerP extends SimpleChannelUpstreamHandler {
private static final Logger logger = Logger.getLogger(SRNGServerHandlerP.class.getName());
@Override
public void channelConnected(
ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
// Send greeting for a new connection.
e.getChannel().write("Welcome to " + InetAddress.getLocalHost().getHostName() + "!\r\n");
while(true){
e.getChannel().write("It is " + new Date() + " now.\r\n");
Thread.sleep(1000*5);
}
}
@Override
public void exceptionCaught(
ChannelHandlerContext ctx, ExceptionEvent e) {
logger.log(
Level.WARNING,
"Unexpected exception from downstream.",
e.getCause());
e.getChannel().close();
}
}
private static class SRNGServerPipelineFactoryP implements ChannelPipelineFactory {
public ChannelPipeline getPipeline() throws Exception {
// Create a default pipeline implementation.
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("encoder", new StringEncoder());
pipeline.addLast("handler", new SRNGServerHandlerP());
return pipeline;
}
}
}
The Netty documentation actually states that you should never make a Handler wait because it might eventually deadlock. The reason is that handler methods are called directly by I/O threads. One I/O thread in Netty performs multiple I/O operations in a sequence, so it's not one thread per operation.
In the channelConnected method you should start a new thread with a reference to the channel and make that thread send the time every 5 seconds. This would spawn one thread per connection.
Alternatively, you can have one single thread looping over a list of clients every 5 seconds and sending the time to each of them in a sequence.
Anyway, it's important to use a different thread for sending than the one that calls the Handler.
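A rough sketch of the first suggestion against the question's Netty 3 handler, using a shared java.util.concurrent scheduler instead of blocking the I/O thread (names are illustrative; extra imports such as ScheduledExecutorService, ScheduledFuture, TimeUnit, Channel, ChannelFuture and ChannelFutureListener are assumed):
// Inside SRNGServerHandlerP: schedule writes off the I/O thread, cancel them when the channel closes.
private static final ScheduledExecutorService scheduler =
        Executors.newSingleThreadScheduledExecutor();
@Override
public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
    final Channel channel = e.getChannel();
    channel.write("Welcome to " + InetAddress.getLocalHost().getHostName() + "!\r\n");
    final ScheduledFuture<?> task = scheduler.scheduleAtFixedRate(new Runnable() {
        public void run() {
            if (channel.isConnected()) {
                channel.write("It is " + new Date() + " now.\r\n");
            }
        }
    }, 5, 5, TimeUnit.SECONDS);
    // Stop writing once the connection goes away.
    channel.getCloseFuture().addListener(new ChannelFutureListener() {
        public void operationComplete(ChannelFuture future) {
            task.cancel(false);
        }
    });
}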
For what it's worth, I figured out a solution, and here's the working code. After the "write" of the time, I register the future with my ChannelFutureListener, and then from operationComplete I keep registering a new future for every write. This works for what I want to accomplish, without using any extra threads.
import java.net.InetSocketAddress;
import java.nio.channels.ClosedChannelException;
import java.util.Date;
import java.util.concurrent.Executors;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.codec.string.StringEncoder;
public class SRNGServer {
public static void main(String[] args) throws Exception {
// Configure the server.
ServerBootstrap bootstrap = new ServerBootstrap(
new NioServerSocketChannelFactory(
Executors.newCachedThreadPool(),
//Executors.newCachedThreadPool()
Executors.newFixedThreadPool(2),2
));
// Configure the pipeline factory.
bootstrap.setPipelineFactory(new SRNGServerPipelineFactoryP());
// Bind and start to accept incoming connections.
bootstrap.bind(new InetSocketAddress(8080));
}
private static class SRNGServerHandlerP extends SimpleChannelUpstreamHandler {
private static final Logger logger = Logger.getLogger(SRNGServerHandlerP.class.getName());
@Override
public void channelConnected(
ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
// Send greeting for a new connection.
Channel ch=e.getChannel();
ChannelFuture writeFuture=e.getChannel().write("It is " + new Date() + " now.\r\n");
SRNGChannelFutureListener srngcfl=new SRNGChannelFutureListener();
writeFuture.addListener(srngcfl);
}
@Override
public void exceptionCaught(
ChannelHandlerContext ctx, ExceptionEvent e) {
logger.log(
Level.WARNING,
"Unexpected exception from downstream.",
e.getCause());
if(e.getCause() instanceof ClosedChannelException){
logger.log(Level.INFO, "****** Connection closed by client - Closing Channel");
}
e.getChannel().close();
}
}
private static class SRNGServerPipelineFactoryP implements ChannelPipelineFactory {
public ChannelPipeline getPipeline() throws Exception {
// Create a default pipeline implementation.
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("encoder", new StringEncoder());
pipeline.addLast("handler", new SRNGServerHandlerP());
return pipeline;
}
}
private static class SRNGChannelFutureListener implements ChannelFutureListener{
public void operationComplete(ChannelFuture future) throws InterruptedException{
Thread.sleep(1000*5);
Channel ch=future.getChannel();
if(ch!=null && ch.isConnected()){
ChannelFuture writeFuture=ch.write("It is " + new Date() + " now.\r\n");
//-- Add this instance as listener itself.
writeFuture.addListener(this);
}
}
}
}
It seems that the I/O thread is getting blocked by the sleep, so try using 2 worker threads instead:
ServerBootstrap bootstrap = new ServerBootstrap(
new NioServerSocketChannelFactory( Executors.newCachedThreadPool(),
Executors.newCachedThreadPool(), 2 ) );