Getting the main thread back in Netty ServerSocket - java

I have a question about how to get the main thread back in Netty when creating a TCP server socket.
In the code below, taken from here, "Hello Hello" is never written to the output, because the thread that starts the server waits on this line: f.channel().closeFuture().sync();. Do I need to create a separate thread to get the main thread back in this case, or is there a way in Netty that would allow me to do so (getting the main thread back while the TCP server keeps running in the background)?
public void start() throws Exception {
    NioEventLoopGroup group = new NioEventLoopGroup();
    try {
        ServerBootstrap b = new ServerBootstrap();
        b.group(group)
         .channel(NioServerSocketChannel.class)
         .localAddress(new InetSocketAddress(port))
         .childHandler(new ChannelInitializer<SocketChannel>() {
             @Override
             public void initChannel(SocketChannel ch) throws Exception {
                 ch.pipeline().addLast(new EchoServerHandler());
             }
         });
        ChannelFuture f = b.bind().sync();
        System.out.println(EchoServer.class.getName() + " started and listen on " + f.channel().localAddress());
        f.channel().closeFuture().sync();
    } finally {
        group.shutdownGracefully().sync();
    }
}

public static void main(String[] args) throws Exception {
    if (args.length != 1) {
        System.err.println("Usage: " + EchoServer.class.getSimpleName() + " <port>");
        return;
    }
    int port = Integer.parseInt(args[0]);
    new EchoServer(port).start();
    System.out.println("Hello Hello");
}

You are not required to wait on the closeFuture. This is only done in the tutorials to make sure the event loop group is properly closed.
You can remove the f.channel().closeFuture().sync(); and group.shutdownGracefully().sync(); lines from your program to make it non-blocking.
Make sure to call f.channel().close(), then f.channel().closeFuture().sync(), and finally group.shutdownGracefully().sync() when shutting down your main program, so that the Netty stack is properly stopped.
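Putting that together, a minimal sketch of a non-blocking start() (the stop() method and the fields holding the channel and the group are my own additions, not part of the original example):

public class EchoServer {
    private final int port;
    private NioEventLoopGroup group;
    private Channel serverChannel;

    public EchoServer(int port) { this.port = port; }

    public void start() throws Exception {
        group = new NioEventLoopGroup();
        ServerBootstrap b = new ServerBootstrap();
        b.group(group)
         .channel(NioServerSocketChannel.class)
         .localAddress(new InetSocketAddress(port))
         .childHandler(new ChannelInitializer<SocketChannel>() {
             @Override
             public void initChannel(SocketChannel ch) {
                 ch.pipeline().addLast(new EchoServerHandler());
             }
         });
        serverChannel = b.bind().sync().channel(); // returns once bound; does not block
    }

    public void stop() throws Exception {
        serverChannel.close();              // close the server socket
        serverChannel.closeFuture().sync(); // wait until it is actually closed
        group.shutdownGracefully().sync();  // stop the event loop threads
    }
}

With this shape, main() can call start(), print "Hello Hello", and later call stop() (for example from a JVM shutdown hook).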

Related

Netty retry-connect client will freeze

I am trying to create a client that retries the connection when the previous connection attempt times out. The program tries to connect to localhost:8007, where port 8007 has no service behind it, so it retries after each connection timeout. But the code freezes after running for a while, once there are about 3600 threads. I expected it to keep retrying rather than freeze.
The last line written to standard output is "retry connect begin".
Does anyone know why it freezes?
JProfiler's thread statistics show 2 threads blocked on java.lang.ThreadGroup:
[JProfiler screenshot: thread statistics]
public final class EchoClient2 {
    static final boolean SSL = System.getProperty("ssl") != null;
    static final String HOST = System.getProperty("host", "127.0.0.1");
    static final int PORT = Integer.parseInt(System.getProperty("port", "8007"));
    static final int SIZE = Integer.parseInt(System.getProperty("size", "256"));

    public static void main(String[] args) throws Exception {
        // Configure SSL.
        EchoClient2 echoClient2 = new EchoClient2();
        echoClient2.connect();
    }

    public void connect() throws InterruptedException {
        final SslContext sslCtx;
        // Configure the client.
        EventLoopGroup group = new NioEventLoopGroup();
        try {
            Bootstrap b = new Bootstrap();
            b.group(group)
             .channel(NioSocketChannel.class)
             .option(ChannelOption.TCP_NODELAY, true)
             .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 10)
             .handler(new ChannelInitializer<SocketChannel>() {
                 @Override
                 public void initChannel(SocketChannel ch) throws Exception {
                     ChannelPipeline p = ch.pipeline();
                     //p.addLast(new LoggingHandler(LogLevel.INFO));
                     p.addLast(new EchoClientHandler());
                 }
             });
            // Start the client.
            ChannelFuture f = b.connect(HOST, PORT);
            f.addListener(new ConnectionListener());
            System.out.println("add listener");
            f.sync();
            System.out.println("connect sync finish");
            // Wait until the connection is closed.
            f.channel().closeFuture().sync();
            System.out.println("channel close");
        } finally {
            // Shut down the event loop to terminate all threads.
            //group.shutdownGracefully();
        }
    }
}
public class ConnectionListener implements ChannelFutureListener {
    @Override
    public void operationComplete(ChannelFuture channelFuture) throws Exception {
        System.out.println("enter listener");
        EventLoop eventLoop = channelFuture.channel().eventLoop();
        eventLoop.schedule(new Runnable() {
            @Override
            public void run() {
                try {
                    System.out.println("retry connect begin");
                    new EchoClient2().connect();
                    System.out.println("retry connect exit");
                } catch (InterruptedException e) {
                    System.out.println(e);
                    e.printStackTrace();
                }
            }
        }, 10, TimeUnit.MILLISECONDS);
        System.out.println("exit listener");
    }
}
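No answer is recorded here, but note that every retry goes through new EchoClient2().connect(), which constructs a brand-new NioEventLoopGroup whose threads are never shut down (the group.shutdownGracefully() call is commented out), which would explain the roughly 3600 threads before the freeze. As a hedged sketch of the usual fix, the retry can reuse a single shared group (class and handler names follow the question; this is my reading, not a confirmed answer):

public final class RetryingEchoClient {
    // One shared group for all connection attempts; its threads are created once.
    private final EventLoopGroup group = new NioEventLoopGroup();

    public void connect(final String host, final int port) {
        Bootstrap b = new Bootstrap();
        b.group(group)
         .channel(NioSocketChannel.class)
         .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 10)
         .handler(new ChannelInitializer<SocketChannel>() {
             @Override
             public void initChannel(SocketChannel ch) {
                 ch.pipeline().addLast(new EchoClientHandler());
             }
         });
        b.connect(host, port).addListener(new ChannelFutureListener() {
            @Override
            public void operationComplete(ChannelFuture future) {
                if (!future.isSuccess()) {
                    // Retry on the same event loop: no new group, no new threads.
                    future.channel().eventLoop().schedule(new Runnable() {
                        @Override
                        public void run() {
                            connect(host, port);
                        }
                    }, 10, TimeUnit.MILLISECONDS);
                }
            }
        });
    }
}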

My java multi-thread (client-server) program throws a NullPointerException when socket object instance is created using the accept() method

I am trying to create a multi-threaded client-server communication program that uses 2 threads to connect to multiple clients (but only 2 at a time).
The characteristics of the program are:
The clients can terminate the communication program from their side, but the server threads do not exit.
The threads in the server do not close the ServerSocket until the exit condition is fulfilled by the server program, i.e. the server keeps running continuously, connecting to various clients as requested.
Every time a client terminates, only the communication-related streams are closed.
Now the problem is at the line of code where the Socket object is created: when the accept() method of the ServerSocket object is called, a NullPointerException is thrown. Any insight as to where I am going wrong would be very helpful.
My Server side code:
class Clientconnect implements Runnable
{
    ServerSocket ss;
    Socket s;
    String n;
    int f;

    Clientconnect() throws Exception
    {
        new ServerSocket(776);
        new Socket();
    }

    public void run()
    {
        n = Thread.currentThread().getName();
        while (true) // thread iteration
        {
            try
            {
                System.out.println("Thread " + Thread.currentThread().getName() + " is ready to accept a connection.....");
                s = ss.accept(); // ------ The NullPointerException occurs here
                System.out.println("Thread " + Thread.currentThread().getName() + " has accepted a connection:\n----------");
                PrintStream ps = new PrintStream(s.getOutputStream());
                BufferedReader cl = new BufferedReader(new InputStreamReader(s.getInputStream()));
                BufferedReader kb = new BufferedReader(new InputStreamReader(System.in));
                String in, out;
                ps.println("you are connected via thread " + n);
                ps.println("----------------------");
                while (true)
                {
                    in = cl.readLine();
                    if (in.equalsIgnoreCase("system_exit"))
                    {
                        break;
                    }
                    System.out.println("Client : " + in);
                    System.out.print("Server " + n + " :");
                    out = kb.readLine();
                    ps.println(out);
                }
                s.close();
                ps.close();
                cl.close();
                System.out.print("do you want to close the server socket\n1:close\n2:continue\nenter");
                f = Integer.parseInt(kb.readLine());
                if (f == 1)
                {
                    ss.close();
                    break;
                }
                else
                {
                    continue;
                }
            }
            catch (Exception e) { e.printStackTrace(); }
        }
    }
}

class test2g
{
    public static void main(String args[]) throws Exception
    {
        Clientconnect cc = new Clientconnect();
        Thread t1 = new Thread(cc, "t1");
        Thread t2 = new Thread(cc, "t2");
        t1.start();
        t2.start();
    }
}
It is a fairly simple communications program with no complex resource access or retrieval. I am running the client end on the same machine, so it's "localhost".
My client-side code is merely a reciprocation of the try{} block.
P.S. I have tried declaring the ServerSocket & Socket objects as static in the Clientconnect class, but it did not help.
I believe ss needs to be assigned in the constructor: as written, new ServerSocket(776) creates a server socket but discards the reference, so the field ss keeps its default value null, and ss.accept() then throws the NullPointerException:
Clientconnect() throws Exception
{
    ss = new ServerSocket(776);
    new Socket(); // Not needed, because it creates a Socket that is immediately thrown away.
}

Java sockets won't connect to server after some time

I have a Java client which connects to my Java server. The Java clients are in different networks and run on MTX-GTW (embedded Linux).
When I start the server and the clients, everything works fine and the clients send data every minute. But after a day or more, the clients stop sending data, one by one. The time it takes varies.
The rest of the program keeps running fine: it uses HTTP to communicate with some API, and there we still receive data.
I checked the server debug output and can't see any errors or exceptions. Restarting the server also didn't help. My next step will be to run a similar client on my PC so that I can see its debug log, but that may take some time. So would any of you have an idea what the problem could be?
I use Java 7. Here I call the method that opens the socket:
static private void createHomeCallTimer()
{
    new java.util.Timer().schedule(
        new java.util.TimerTask()
        {
            public void run()
            {
                log.info("homeCall run");
                Main main = new Main();
                String data = "xxxxx";
                try
                {
                    log.info("Start of HOMECALL with data: " + data);
                    new TCPClient().openSocketAndSendData(data);
                    createHomeCallTimer();
                } catch (Exception e)
                {
                    log.error("Exception on homeCall: " + e);
                    createHomeCallTimer();
                }
            }
        },
        HOMECALLTIME
    );
}
And this is client which is called by that method:
public class TCPClient
{
    public void openSocketAndSendData(String data) throws IOException
    {
        Logger log = Logger.getLogger(TCPClient.class);
        String ip = "xx.xx.xx.xx";
        int port = 9000;
        Socket clientSocket = new Socket(ip, port);
        log.info("SOCKET TO IKU SERVER OPENED");
        DataOutputStream outToServer = new DataOutputStream(clientSocket.getOutputStream());
        BufferedReader inFromServer = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
        log.info("IKU SERVER: " + inFromServer.readLine());
        outToServer.writeBytes(data);
        clientSocket.close();
        log.info("SOCKET CLOSED");
    }
}
After you said that the device has very little memory, here's my suspicion.
I didn't run your code, but I stripped it down a little and sped it up to see what happens.
Here's the code:
public class Main {
    private static final int HOMECALLTIME = 10;

    static private void createHomeCallTimer() {
        new java.util.Timer().schedule(new java.util.TimerTask() {
            public void run() {
                // log.info("homeCall run");
                // Main main = new Main();
                String data = "xxxxx";
                System.out.println(data);
                // log.info("Start of HOMECALL with data: " + data);
                // new TCPClient().openSocketAndSendData(data);
                createHomeCallTimer();
            }
        }, HOMECALLTIME);
    }

    public static void main(String[] args) {
        createHomeCallTimer();
    }
}
And here's the output after a few minutes:
Exception in thread "Timer-21424" java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:714)
    at java.util.Timer.<init>(Timer.java:160)
    at java.util.Timer.<init>(Timer.java:132)
    at pkg.Main.createHomeCallTimer(Main.java:13)
    at pkg.Main.access$0(Main.java:12)
    at pkg.Main$1.run(Main.java:22)
    at java.util.TimerThread.mainLoop(Timer.java:555)
    at java.util.TimerThread.run(Timer.java:505)
I suspect the recursive call keeps constructing new Timer instances, each of which starts its own background thread, so nothing is freed and you run out of memory on your device. That's just a suspicion, but it doesn't quite fit in a comment.
Here's the same code without the recursion, using a single Timer with scheduleAtFixedRate:
static private void createHomeCallTimer() {
    new java.util.Timer().scheduleAtFixedRate(new java.util.TimerTask() {
        public void run() {
            // log.info("homeCall run");
            // Main main = new Main();
            String data = "xxxxx";
            System.out.println(data);
            // log.info("Start of HOMECALL with data: " + data);
            // new TCPClient().openSocketAndSendData(data);
        }
    }, 0, HOMECALLTIME);
}
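If the original spacing matters (the next call is scheduled only after the previous one finishes, so a slow openSocketAndSendData call never overlaps the next), the recursive rescheduling can be kept as long as a single Timer is reused. A sketch under that assumption:

// One Timer means one background thread for the whole program, so the
// recursive rescheduling no longer creates new threads.
private static final java.util.Timer TIMER = new java.util.Timer();

static private void createHomeCallTimer() {
    TIMER.schedule(new java.util.TimerTask() {
        public void run() {
            String data = "xxxxx";
            System.out.println(data);
            createHomeCallTimer(); // reschedule on the same Timer
        }
    }, HOMECALLTIME);
}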

Why does a Netty client behave like a zombie?

I use Netty as a socket client:
public void run() {
    isRunning = true;
    EventLoopGroup group = new NioEventLoopGroup(EventLoopsPerGetter);
    Bootstrap b = new Bootstrap();
    b.group(group).channel(NioSocketChannel.class)
     .handler(new ChannelInitializer<SocketChannel>() {
         @Override
         protected void initChannel(SocketChannel ch) throws Exception {
             ChannelPipeline p = ch.pipeline();
             p.addLast(
                 new ProtobufVarint32FrameDecoder(),
                 ZlibCodecFactory.newZlibDecoder(ZlibWrapper.GZIP),
                 new ProtobufDecoder(Protocol.Packet.getDefaultInstance()),
                 new ProtobufVarint32LengthFieldPrepender(),
                 ZlibCodecFactory.newZlibEncoder(ZlibWrapper.GZIP),
                 new ProtobufEncoder(),
                 session
             );
         }
     });
    try {
        while (isRunning) {
            try {
                b.connect(host, port).sync().channel().closeFuture().sync();
            } catch (Exception e) {
                if (e instanceof InterruptedException) {
                    throw e;
                }
                retryLogger.warn("try to connect to " + host + " : " + port + " , but", e);
            }
            if (isRunning) {
                retryLogger.info("netty connection lost, retry!");
                Thread.sleep(RetryInterval);
            }
        }
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    } finally {
        group.shutdownGracefully();
    }
}
The session code is very simple: send a Get packet to the server, get the response, write the file, then send the next Get packet.
In this program I start two Netty client threads, but after running for several days one of them behaves like a zombie thread: even if I kill the Netty server, the zombie client prints no log, while the other client prints the expected logs. By the way, the jstack dump shows both threads are alive, not dead.
I am using Netty 5.
You don't have any mechanism for a read timeout. What happens is that there is no traffic for around 10 minutes (depending on the router model), so the NAT table in the router decides the connection is done and closes it.
You have multiple ways to solve this problem:
Using ReadTimeoutHandler
ReadTimeoutHandler closes the channel and throws a ReadTimeoutException if a timeout is detected. You can catch this exception if needed via exceptionCaught. With your existing logic, you don't need to catch it.
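For illustration, a sketch of the bootstrap from the question with a read timeout added (the 60 second value is an arbitrary example; it should exceed the expected gap between packets):

b.handler(new ChannelInitializer<SocketChannel>() {
    @Override
    protected void initChannel(SocketChannel ch) throws Exception {
        ChannelPipeline p = ch.pipeline();
        // Fires ReadTimeoutException and closes the channel after 60 seconds
        // without reads; the existing retry loop then reconnects as usual.
        p.addLast(new ReadTimeoutHandler(60));
        p.addLast(
            new ProtobufVarint32FrameDecoder(),
            ZlibCodecFactory.newZlibDecoder(ZlibWrapper.GZIP),
            new ProtobufDecoder(Protocol.Packet.getDefaultInstance()),
            new ProtobufVarint32LengthFieldPrepender(),
            ZlibCodecFactory.newZlibEncoder(ZlibWrapper.GZIP),
            new ProtobufEncoder(),
            session
        );
    }
});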
This handler can also be used in combination with a WriteTimeoutHandler to write "ping" messages to the remote peer. However, the following solution is better suited for that purpose.
Using IdleStateHandler
You can also use an IdleStateHandler for this purpose. This handler takes 3 arguments that stand for readerIdleTime, writerIdleTime and allIdleTime. The advantage of this class is that it doesn't throw exceptions and instead uses Netty's userEventTriggered mechanism to dispatch its notifications; while this makes the class slightly harder to use, you can do more things with it.
For example, if your protocol supports ping messages, you can use this class to send them. It is really easy to use in handlers, as follows:
public class MyChannelInitializer extends ChannelInitializer<Channel> {
    @Override
    public void initChannel(Channel channel) {
        channel.pipeline().addLast("idleStateHandler", new IdleStateHandler(60, 30, 0));
        channel.pipeline().addLast("myHandler", new MyHandler());
    }
}

// Handler should handle the IdleStateEvent triggered by IdleStateHandler.
public class MyHandler extends ChannelHandlerAdapter {
    @Override
    public void userEventTriggered(ChannelHandlerContext ctx, Object evt) throws Exception {
        if (evt instanceof IdleStateEvent) {
            IdleStateEvent e = (IdleStateEvent) evt;
            if (e.state() == IdleState.READER_IDLE) {
                ctx.close();
            } else if (e.state() == IdleState.WRITER_IDLE) {
                ctx.writeAndFlush(new PingMessage());
            }
        }
    }
}

UnsupportedOperationException when sending messages through ServerBootstrap ChannelPipeline in Netty

I am using Netty 5.0.
I have a complementary client bootstrap, for which I took the SecureChatClient.java example from the Netty GitHub repository.
When I send messages from the client bootstrap to the server, it works perfectly fine. When I try to send messages from the server bootstrap to the client (after successfully initiating a connection/channel through the client first), I get a java.lang.UnsupportedOperationException without any further information. Sending messages from server to client is done via the code below.
Is a ServerBootstrap for receiving only?
Is a ServerBootstrap not meant to be able to write messages back to the client as attempted below? By that I mean: messages can enter a ChannelPipeline from a socket and travel up through the ChannelHandlers, but only the ChannelHandlers are supposed to write responses back down the ChannelPipeline and out the socket. So in a ServerBootstrap, a user is not meant to send messages down the ChannelPipeline from outside the pipeline. (Hope that makes sense.)
Or am I simply missing something?
My code follows:
// Ports.
int serverPort = 8080;

EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
    ServerBootstrap b = new ServerBootstrap();
    b.group(bossGroup, workerGroup)
     .channel(NioServerSocketChannel.class)
     .childHandler(new ChannelInitializer<SocketChannel>() {
         @Override
         public void initChannel(SocketChannel ch) throws Exception {
             ch.pipeline().addLast("MyMessageHandler", new MyMessageHandler());
         }
     })
     .option(ChannelOption.SO_BACKLOG, 128)
     .childOption(ChannelOption.SO_KEEPALIVE, true);

    // Bind and start to accept incoming connections.
    ChannelFuture f = b.bind(serverPort).sync();
    Channel ch = f.channel();
    System.out.println("Server: Running!");

    // Read commands from the stdin.
    ChannelFuture lastWriteFuture = null;
    BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
    while (true) {
        String line = in.readLine();
        if (line == null) break;
        ByteBuf getOut = buffer(64);
        getOut.writeBytes(line.getBytes());
        // Sends the read line to the client.
        lastWriteFuture = ch.writeAndFlush(getOut);
        lastWriteFuture.addListener(new ChannelFutureListener() {
            @Override
            public void operationComplete(ChannelFuture cf) throws Exception {
                if (cf.isSuccess()) {
                    System.out.println("CFListener: SUCCESS! YEAH! HELL! YEAH!");
                } else {
                    System.out.println("CFListener: failure! FAILure! FAILURE!");
                    System.out.println(cf.cause());
                }
            }
        });
    }

    // Wait until all messages are flushed before closing the channel.
    if (lastWriteFuture != null) {
        lastWriteFuture.sync();
    }

    // Wait until the server socket is closed.
    // In this example, this does not happen, but you can do that to gracefully
    // shut down your server.
    f.channel().closeFuture().sync();
} catch (InterruptedException | UnsupportedOperationException e) {
    e.printStackTrace();
} finally {
    workerGroup.shutdownGracefully();
    bossGroup.shutdownGracefully();
}
I started using the following example: https://github.com/netty/netty/tree/4.1/example/src/main/java/io/netty/example/securechat
My problem is that I get the following exception when calling ch.writeAndFlush:
java.lang.UnsupportedOperationException
at io.netty.channel.socket.nio.NioServerSocketChannel.filterOutboundMessage(NioServerSocketChannel.java:184)
at io.netty.channel.AbstractChannel$AbstractUnsafe.write(AbstractChannel.java:784)
at io.netty.channel.DefaultChannelPipeline$HeadContext.write(DefaultChannelPipeline.java:1278)
at io.netty.channel.ChannelHandlerInvokerUtil.invokeWriteNow(ChannelHandlerInvokerUtil.java:158)
at io.netty.channel.DefaultChannelHandlerInvoker$WriteTask.run(DefaultChannelHandlerInvoker.java:440)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:328)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
at io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
at io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
at io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
at io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
at io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
at io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
You cannot write to a ServerChannel: it only accepts incoming connections, while reads and writes happen on the per-client child channels. Your call to writeAndFlush is failing for this reason.
To send a message to every client, you should store the channel of every client inside a ChannelGroup and invoke writeAndFlush() on that.
A quick way to do this is adding another handler to your ServerBootstrap that puts the incoming connections inside the ChannelGroup; a quick implementation would be this:
// In your main:
ChannelGroup allChannels =
    new DefaultChannelGroup(GlobalEventExecutor.INSTANCE);

// In your ChannelInitializer<SocketChannel>:
ch.pipeline().addLast("grouper", new GlobalSendHandler());

// New class:
public class GlobalSendHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelActive(ChannelHandlerContext ctx) throws Exception {
        allChannels.add(ctx.channel());
        super.channelActive(ctx);
    }
}
Then we can call the following to send a message to every connection; this returns a ChannelGroupFuture instead of a normal ChannelFuture:
allChannels.writeAndFlush(getOut);
Your total code would look like this with the fixes from above:
// Ports.
int serverPort = 8080;

ChannelGroup allChannels =
    new DefaultChannelGroup(GlobalEventExecutor.INSTANCE);

EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
    ServerBootstrap b = new ServerBootstrap();
    b.group(bossGroup, workerGroup)
     .channel(NioServerSocketChannel.class)
     .childHandler(new ChannelInitializer<SocketChannel>() {
         @Override
         public void initChannel(SocketChannel ch) throws Exception {
             ch.pipeline().addLast("MyMessageHandler", new MyMessageHandler());
             ch.pipeline().addLast("grouper", new GlobalSendHandler());
         }
     })
     .option(ChannelOption.SO_BACKLOG, 128)
     .childOption(ChannelOption.SO_KEEPALIVE, true);

    // Bind and start to accept incoming connections.
    ChannelFuture f = b.bind(serverPort).sync();
    Channel ch = f.channel();
    System.out.println("Server: Running!");

    // Read commands from the stdin.
    ChannelGroupFuture lastWriteFuture = null;
    BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
    while (true) {
        String line = in.readLine();
        if (line == null) break;
        ByteBuf getOut = buffer(64);
        getOut.writeBytes(line.getBytes());
        // Sends the read line to every connected client.
        lastWriteFuture = allChannels.writeAndFlush(getOut);
        lastWriteFuture.addListener(new ChannelGroupFutureListener() {
            @Override
            public void operationComplete(ChannelGroupFuture cf) throws Exception {
                if (cf.isSuccess()) {
                    System.out.println("CFListener: SUCCESS! YEAH! HELL! YEAH!");
                } else {
                    System.out.println("CFListener: failure! FAILure! FAILURE!");
                    System.out.println(cf.cause());
                }
            }
        });
    }

    // Wait until all messages are flushed before closing the channel.
    if (lastWriteFuture != null) {
        lastWriteFuture.sync();
    }

    // Wait until the server socket is closed.
    // In this example, this does not happen, but you can do that to gracefully
    // shut down your server.
    f.channel().closeFuture().sync();
} catch (InterruptedException | UnsupportedOperationException e) {
    e.printStackTrace();
} finally {
    workerGroup.shutdownGracefully();
    bossGroup.shutdownGracefully();
}
I think your Netty server has no decoder or encoder in its pipeline.
If you want to send String data, add the String codecs:
serverBootstrap.group(bossGroup, workerGroup).childHandler(new ChannelInitializer<SocketChannel>() {
    @Override
    protected void initChannel(SocketChannel channel) throws Exception {
        ChannelPipeline channelPipeline = channel.pipeline();
        channelPipeline.addLast("String Encoder", new StringEncoder(CharsetUtil.UTF_8));
        channelPipeline.addLast("String Decoder", new StringDecoder(CharsetUtil.UTF_8));
    }
});
Add these to your server's initializer!
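One caveat, as a hedged note: a StringDecoder on its own does not split the inbound byte stream into messages, so it is usually paired with a frame decoder. A sketch with line-based framing (the line-delimited protocol and the 8192 maximum frame length are assumptions):

ChannelPipeline channelPipeline = channel.pipeline();
// Split the inbound stream on line endings before decoding to String.
channelPipeline.addLast("Framer", new LineBasedFrameDecoder(8192));
channelPipeline.addLast("String Decoder", new StringDecoder(CharsetUtil.UTF_8));
channelPipeline.addLast("String Encoder", new StringEncoder(CharsetUtil.UTF_8));
// Handlers can then write plain Strings, e.g. ctx.writeAndFlush("hello\n");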
