Using JBoss Netty, I'm trying to send data continuously to a connected client. In the example below, I try to send the time to the client every 5 seconds, as soon as the client connects (channelConnected). But this is not working; it only works if I comment out the while loop.
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.util.Date;
import java.util.concurrent.Executors;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.codec.string.StringEncoder;
public class SRNGServer {
public static void main(String[] args) throws Exception {
// Configure the server.
ServerBootstrap bootstrap = new ServerBootstrap(
new NioServerSocketChannelFactory(
Executors.newCachedThreadPool(),
Executors.newCachedThreadPool()));
// Configure the pipeline factory.
bootstrap.setPipelineFactory(new SRNGServerPipelineFactoryP());
// Bind and start to accept incoming connections.
bootstrap.bind(new InetSocketAddress(8080));
}
private static class SRNGServerHandlerP extends SimpleChannelUpstreamHandler {
private static final Logger logger = Logger.getLogger(SRNGServerHandlerP.class.getName());
@Override
public void channelConnected(
ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
// Send greeting for a new connection.
e.getChannel().write("Welcome to " + InetAddress.getLocalHost().getHostName() + "!\r\n");
while(true){
e.getChannel().write("It is " + new Date() + " now.\r\n");
Thread.sleep(1000*5);
}
}
@Override
public void exceptionCaught(
ChannelHandlerContext ctx, ExceptionEvent e) {
logger.log(
Level.WARNING,
"Unexpected exception from downstream.",
e.getCause());
e.getChannel().close();
}
}
private static class SRNGServerPipelineFactoryP implements ChannelPipelineFactory {
public ChannelPipeline getPipeline() throws Exception {
// Create a default pipeline implementation.
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("encoder", new StringEncoder());
pipeline.addLast("handler", new SRNGServerHandlerP());
return pipeline;
}
}
}
The Netty documentation actually states that you should never make a handler wait, because it might eventually deadlock. The reason is that handler methods are called directly by I/O threads. One I/O thread in Netty performs multiple I/O operations in sequence, so it's not one thread per operation.
In the channelConnected method you should start a new thread with a reference to the channel and have that thread send the time every 5 seconds. This spawns one thread per connection.
Alternatively, you can have one single thread looping over a list of clients every 5 seconds and sending the time to each of them in sequence.
Either way, it's important that the sending happens on a different thread from the one that calls the handler, as sketched below.
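For illustration, a minimal sketch of the thread-per-connection approach against the Netty 3 API used above (it assumes an extra import for org.jboss.netty.channel.Channel; the anonymous Runnable and the isConnected() check are illustrative):
@Override
public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
    final Channel channel = e.getChannel();
    // Do the periodic writes on a separate thread so the I/O thread is never blocked.
    new Thread(new Runnable() {
        public void run() {
            try {
                while (channel.isConnected()) {
                    channel.write("It is " + new Date() + " now.\r\n");
                    Thread.sleep(1000 * 5);
                }
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt(); // stop sending if interrupted
            }
        }
    }).start();
}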
For what it's worth, I figured out a solution, and here's the working code. After each write of the time, I register the resulting future with my ChannelFutureListener, and from operationComplete I register a new listener on the future of every subsequent write. This accomplishes what I want, without using any extra threads.
import java.net.InetSocketAddress;
import java.nio.channels.ClosedChannelException;
import java.util.Date;
import java.util.concurrent.Executors;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.jboss.netty.bootstrap.ServerBootstrap;
import org.jboss.netty.channel.Channel;
import org.jboss.netty.channel.ChannelFuture;
import org.jboss.netty.channel.ChannelFutureListener;
import org.jboss.netty.channel.ChannelHandlerContext;
import org.jboss.netty.channel.ChannelPipeline;
import org.jboss.netty.channel.ChannelPipelineFactory;
import org.jboss.netty.channel.ChannelStateEvent;
import org.jboss.netty.channel.Channels;
import org.jboss.netty.channel.ExceptionEvent;
import org.jboss.netty.channel.SimpleChannelUpstreamHandler;
import org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory;
import org.jboss.netty.handler.codec.string.StringEncoder;
public class SRNGServer {
public static void main(String[] args) throws Exception {
// Configure the server.
ServerBootstrap bootstrap = new ServerBootstrap(
new NioServerSocketChannelFactory(
Executors.newCachedThreadPool(),
//Executors.newCachedThreadPool()
Executors.newFixedThreadPool(2), 2
));
// Configure the pipeline factory.
bootstrap.setPipelineFactory(new SRNGServerPipelineFactoryP());
// Bind and start to accept incoming connections.
bootstrap.bind(new InetSocketAddress(8080));
}
private static class SRNGServerHandlerP extends SimpleChannelUpstreamHandler {
private static final Logger logger = Logger.getLogger(SRNGServerHandlerP.class.getName());
@Override
public void channelConnected(
ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
// Send greeting for a new connection.
Channel ch = e.getChannel();
ChannelFuture writeFuture = ch.write("It is " + new Date() + " now.\r\n");
SRNGChannelFutureListener srngcfl = new SRNGChannelFutureListener();
writeFuture.addListener(srngcfl);
}
@Override
public void exceptionCaught(
ChannelHandlerContext ctx, ExceptionEvent e) {
logger.log(
Level.WARNING,
"Unexpected exception from downstream.",
e.getCause());
if(e.getCause() instanceof ClosedChannelException){
logger.log(Level.INFO, "****** Connection closed by client - Closing Channel");
}
e.getChannel().close();
}
}
private static class SRNGServerPipelineFactoryP implements ChannelPipelineFactory {
public ChannelPipeline getPipeline() throws Exception {
// Create a default pipeline implementation.
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("encoder", new StringEncoder());
pipeline.addLast("handler", new SRNGServerHandlerP());
return pipeline;
}
}
private static class SRNGChannelFutureListener implements ChannelFutureListener{
public void operationComplete(ChannelFuture future) throws InterruptedException{
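// Note: this sleep runs on the I/O worker thread that completed the write,
// which is why the bootstrap above was given a second worker thread.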
Thread.sleep(1000*5);
Channel ch=future.getChannel();
if(ch!=null && ch.isConnected()){
ChannelFuture writeFuture=ch.write("It is " + new Date() + " now.\r\n");
//-- Add this instance as listener itself.
writeFuture.addListener(this);
}
}
}
}
It seems the I/O thread is getting blocked by the sleep, so try using 2 worker threads instead:
ServerBootstrap bootstrap = new ServerBootstrap(
    new NioServerSocketChannelFactory(
        Executors.newCachedThreadPool(),
        Executors.newCachedThreadPool(), 2));
I'm using Netty for a server that needs to handle hundreds of thousands of requests per second while keeping variance in response latencies to a minimum. I'm doing some final optimizations, and I'm currently looking into reducing unnecessary memory allocation by reusing whatever objects I can. A simplified example of a server highlighting my issue is the following:
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.http.HttpServerCodec;
import io.netty.handler.codec.http.HttpObjectAggregator;
public class NettyServer {
public void run() throws Exception {
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
p.addLast(new HttpServerCodec());
p.addLast(new HttpObjectAggregator(1048576));
p.addLast(new NettyHandler());
}
});
ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
public static void main(String[] args) throws Exception {
new NettyServer().run();
}
}
The handler code is the following:
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.handler.codec.http.DefaultFullHttpResponse;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.FullHttpResponse;
import io.netty.handler.codec.http.HttpUtil;
import io.netty.handler.codec.http.HttpHeaderNames;
import io.netty.handler.codec.http.HttpHeaderValues;
import io.netty.handler.codec.http.HttpResponseStatus;
import io.netty.handler.codec.http.HttpVersion;
import io.netty.handler.codec.http.QueryStringDecoder;
import io.netty.util.CharsetUtil;
public class NettyHandler extends SimpleChannelInboundHandler<Object> {
private static final FullHttpResponse okResponse = OkResponse();
private static final FullHttpResponse failResponse = FailResponse();
@Override
public void channelReadComplete(ChannelHandlerContext ctx) {
ctx.flush();
}
@Override
protected void channelRead0(ChannelHandlerContext ctx, Object msg) {
FullHttpRequest request = (FullHttpRequest) msg;
QueryStringDecoder query = new QueryStringDecoder(request.getUri());
String path = query.path();
ChannelFuture f;
boolean keepAlive = HttpUtil.isKeepAlive(request);
if ("/ok".equals(path)) {
f = ctx.write(okResponse);
} else {
f = ctx.write(failResponse);
keepAlive = false;
}
if (!keepAlive) {
f.addListener(ChannelFutureListener.CLOSE);
}
}
private static FullHttpResponse OkResponse() {
String data = "{ \"status\": ok }";
FullHttpResponse response = new DefaultFullHttpResponse(
HttpVersion.HTTP_1_1,
HttpResponseStatus.OK,
Unpooled.copiedBuffer(data, CharsetUtil.UTF_8)
);
response.headers().set(HttpHeaderNames.CONTENT_TYPE, HttpHeaderValues.APPLICATION_JSON);
response.headers().set(HttpHeaderNames.CACHE_CONTROL, "max-age=0, no-cache, must-revalidate, proxy-revalidate");
return response;
}
private static FullHttpResponse FailResponse() {
String data = "{ \"status\": fail }";
FullHttpResponse response = new DefaultFullHttpResponse(
HttpVersion.HTTP_1_1,
HttpResponseStatus.OK,
Unpooled.copiedBuffer(data, CharsetUtil.UTF_8)
);
response.headers().set(HttpHeaderNames.CONTENT_TYPE, HttpHeaderValues.APPLICATION_JSON);
response.headers().set(HttpHeaderNames.CACHE_CONTROL, "max-age=0, no-cache, must-revalidate, proxy-revalidate");
return response;
}
}
The handler shows what I'm trying to accomplish. The handler contains static instances of fixed HTTP responses. For this server, all responses except error codes come from a small group and can be preconstructed. With the above code, the second request to a handler will fail, since Netty's ref count for the response has gone down to zero. I was expecting that just calling retain() on the object would be enough, but it doesn't look like it is.
What would be the most efficient way to reuse the HTTP response objects between requests?
You should call retainedDuplicate(), as otherwise the readerIndex etc. may become "invalid".
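A minimal sketch of how that could look in the handler above; retainedDuplicate() bumps the reference count and gives each write its own reader/writer indices over the shared content buffer:
if ("/ok".equals(path)) {
    // Each write gets an independent duplicate, so the shared buffer's
    // readerIndex is never advanced and its ref count never hits zero.
    f = ctx.write(okResponse.retainedDuplicate());
} else {
    f = ctx.write(failResponse.retainedDuplicate());
    keepAlive = false;
}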
I'm trying to create a TCP server that periodically reads data from a database (Redis) and sends it to the appropriate client.
However, since I'm pretty new to Netty, I don't know how I could schedule this. I do know that I need to use a ScheduledExecutorService, like this:
ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
e.scheduleAtFixedRate(() -> {
System.out.println("Calling...");
// Do something
}, 1, 1, TimeUnit.SECONDS);
However, when I put that in the server code, it only calls the method once. I've tried putting it in different places but still can't seem to get it right. What should I do?
Here's the code of the server:
package com.example.test.app;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class Server {
public static void main(String[] args) throws Exception
{
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
final ServerHandler handler = new ServerHandler();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup);
b.channel(NioServerSocketChannel.class);
b.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception
{
ch.pipeline().addLast(handler);
}
});
b.option(ChannelOption.SO_BACKLOG, 128);
b.childOption(ChannelOption.SO_KEEPALIVE, true);
ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
e.scheduleAtFixedRate(() -> {
System.out.println("Calling...");
handler.saySomething();
}, 1, 1, TimeUnit.SECONDS);
ChannelFuture f = b.bind(1337).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
}
And here's the server handler:
package com.example.test.app;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
public class ServerHandler extends ChannelInboundHandlerAdapter {
private ChannelHandlerContext ctx;
@Override
public void channelActive(ChannelHandlerContext ctx)
{
this.ctx = ctx;
System.out.println("Someone's connedted!");
}
public void saySomething()
{
final ChannelFuture f = ctx.writeAndFlush("Sup!");
f.addListener((ChannelFutureListener) (ChannelFuture future) -> {
System.out.println("Something has been said!");
});
}
}
The method saySomething() throws a NullPointerException at final ChannelFuture f = ctx.writeAndFlush("Sup!"); because ctx is still null.
The EventExecutorGroup.scheduleAtFixedRate javadoc says that "If any execution of the task encounters an exception, subsequent executions are suppressed". So this is why your task is called only once: saySomething() throws, and the schedule stops.
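If you do want a plain ScheduledExecutorService task to survive failures, one option is to catch everything inside the task body; a minimal sketch:
e.scheduleAtFixedRate(() -> {
    try {
        handler.saySomething();
    } catch (Exception ex) {
        // Swallowing the exception keeps subsequent executions from being suppressed.
        System.err.println("Scheduled task failed: " + ex);
    }
}, 1, 1, TimeUnit.SECONDS);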
Also, it seems Netty allows you to re-use a handler instance across different pipeline instances only if you annotate the handler's class with @Sharable; otherwise it will throw an exception. If your handler is stateless (which is not your case, as yours has the ctx member), you should annotate it as @Sharable (see the sketch below) and re-use it for all created pipelines. If it is stateful, create a new instance for every new pipeline (new client connection).
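For reference, a minimal sketch of the sharable variant (the class name is illustrative; this is only valid if the handler keeps no per-connection state):
import io.netty.channel.ChannelHandler.Sharable;
import io.netty.channel.ChannelInboundHandlerAdapter;

@Sharable // one instance may be added to many pipelines concurrently
public class StatelessServerHandler extends ChannelInboundHandlerAdapter {
    // no per-connection fields allowed here
}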
Finally, to schedule your task for each connected client, you can use the executor referenced by the ctx of the connected client's channel (by default, as in your case, the channel's EventLoop) in your channelActive() implementation. This executor implements ScheduledExecutorService, so it also offers scheduleAtFixedRate.
Take a look at my version of your code and see if it suits you.
Server:
package com.example.test.app;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class Server {
public static void main(String[] args) throws Exception
{
EventLoopGroup bossGroup = new NioEventLoopGroup();
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup);
b.channel(NioServerSocketChannel.class);
b.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception
{
ch.pipeline().addLast(new ServerHandler());
}
});
b.option(ChannelOption.SO_BACKLOG, 128);
b.childOption(ChannelOption.SO_KEEPALIVE, true);
// ScheduledExecutorService e = Executors.newSingleThreadScheduledExecutor();
// e.scheduleAtFixedRate(() -> {
// System.out.println("Calling...");
// handler.saySomething();
// }, 1, 1, TimeUnit.SECONDS);
ChannelFuture f = b.bind(1337).sync();
f.channel().closeFuture().sync();
} finally {
workerGroup.shutdownGracefully();
bossGroup.shutdownGracefully();
}
}
}
ServerHandler:
package com.example.test.app;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;
public class ServerHandler extends ChannelInboundHandlerAdapter {
private ScheduledFuture<?> sf;
@Override
public void channelActive(ChannelHandlerContext ctx)
{
System.out.println("Someone's connedted! "+ctx.channel());
sf = ctx.executor().scheduleAtFixedRate(() -> {
System.out.println("Calling...");
saySomething(ctx);
}, 1, 1, TimeUnit.SECONDS);
}
@Override
public void channelInactive(ChannelHandlerContext ctx) {
System.out.println("Someone's disconnected! "+ctx.channel());
sf.cancel(false);
}
private void saySomething(ChannelHandlerContext ctx)
{
final ChannelFuture f = ctx.writeAndFlush("Sup!");
f.addListener((ChannelFutureListener) (ChannelFuture future) -> {
System.out.println("Something has been said!");
});
}
}
I would like to make a game using LibGDX and the KryoNet library, using RMI. So I created a clean project. What I want to do for now is set up the server to listen on port 10048 and, on a new connection, print the client's name, which I will get by calling a method on the client's class...
Here is the code:
ICardsTableGameImpl.java
package clzola.cardstable.client;
public interface ICardsTableGameImpl {
public String getName();
}
CardsTableServer.java
package clzola.cardstable.server;
import clzola.cardstable.client.ICardsTableGameImpl;
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryonet.Connection;
import com.esotericsoftware.kryonet.Server;
import com.esotericsoftware.kryonet.rmi.ObjectSpace;
import com.esotericsoftware.minlog.Log;
import java.io.IOException;
import java.util.HashMap;
public class CardsTableServer extends Server {
private HashMap<Integer, Connection> connections;
public CardsTableServer() throws IOException {
connections = new HashMap<Integer, Connection>();
addListener(new NetworkListener(this));
Kryo kryo = getKryo();
ObjectSpace.registerClasses(kryo);
kryo.register(ICardsTableGameImpl.class);
bind(10048);
}
@Override
protected Connection newConnection() {
Player player = new Player();
addConnection(player);
return player;
}
public void addConnection(Connection connection) {
this.connections.put(connection.getID(), connection);
}
public Connection getConnection(int connectionId) {
return this.connections.get(connectionId);
}
public Connection removeConnection(int connectionId) {
return this.connections.remove(connectionId);
}
public static void main(String[] args) throws IOException {
Log.set(Log.LEVEL_DEBUG);
CardsTableServer server = new CardsTableServer();
server.start();
}
}
NetworkListener.java
package clzola.cardstable.server;
import clzola.cardstable.client.ICardsTableGameImpl;
import com.badlogic.gdx.Gdx;
import com.esotericsoftware.kryonet.Connection;
import com.esotericsoftware.kryonet.Listener;
import com.esotericsoftware.kryonet.rmi.ObjectSpace;
public class NetworkListener extends Listener {
private CardsTableServer server;
public NetworkListener(CardsTableServer server) {
this.server = server;
}
@Override
public void connected(Connection connection) {
Player player = ((Player) connection);
ICardsTableGameImpl game = ObjectSpace.getRemoteObject(player, 0, ICardsTableGameImpl.class);
player.name = game.getName(); // This is where I get the exception...
Gdx.app.log("Server", "Player name: " + player.name);
}
@Override
public void disconnected(Connection connection) {
server.removeConnection(connection.getID());
}
}
Player.java
package clzola.cardstable.server;
import com.esotericsoftware.kryonet.Connection;
public class Player extends Connection {
public String name;
}
CardsTableGame.java
package clzola.cardstable.client;
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;
import com.badlogic.gdx.scenes.scene2d.Stage;
import com.badlogic.gdx.utils.viewport.ScreenViewport;
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryonet.Client;
import com.esotericsoftware.kryonet.rmi.ObjectSpace;
public class CardsTableGame extends ApplicationAdapter implements ICardsTableGameImpl {
SpriteBatch batch;
Stage stage;
Client client;
String name = "Lazar";
ObjectSpace objectSpace;
@Override
public void create () {
batch = new SpriteBatch();
stage = new Stage(new ScreenViewport(), batch);
try {
client = new Client();
client.start();
Kryo kryo = client.getKryo();
ObjectSpace.registerClasses(kryo);
kryo.register(ICardsTableGameImpl.class);
ObjectSpace objectSpace = new ObjectSpace();
objectSpace.register(0, this);
objectSpace.addConnection(client);
client.connect(5000, "127.0.0.1", 10048);
} catch (Exception e) {
Gdx.app.log("CardsTableGame", e.getMessage(), e);
}
}
@Override
public void render () {
Gdx.gl.glClearColor(0, 0, 0, 1);
Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
}
@Override
public String getName() {
return this.name;
}
}
After running it, I get this exception on the server side:
Exception in thread "Server" java.lang.IllegalStateException: Cannot wait for an RMI response on the connection's update thread.
at com.esotericsoftware.kryonet.rmi.ObjectSpace$RemoteInvocationHandler.waitForResponse(ObjectSpace.java:420)
at com.esotericsoftware.kryonet.rmi.ObjectSpace$RemoteInvocationHandler.invoke(ObjectSpace.java:408)
at com.sun.proxy.$Proxy0.getName(Unknown Source)
at clzola.cardstable.server.NetworkListener.connected(NetworkListener.java:24)
at com.esotericsoftware.kryonet.Server$1.connected(Server.java:48)
at com.esotericsoftware.kryonet.Connection.notifyConnected(Connection.java:214)
at com.esotericsoftware.kryonet.Server.acceptOperation(Server.java:417)
at com.esotericsoftware.kryonet.Server.update(Server.java:249)
at com.esotericsoftware.kryonet.Server.run(Server.java:372)
at java.lang.Thread.run(Thread.java:745)
And I have no idea why... What am I doing wrong??
(This is the first time ever I am trying to use RMI)
The Listener is executed by the KryoNet update thread. This thread checks the socket regularly to receive messages. Calling game.getName() makes the caller wait until the answer has been delivered over the network. If you do that on the update thread, you will probably deadlock your server, because KryoNet cannot receive the answer it is waiting for while you are blocking the update thread. This is why it throws the exception.
In an RMI example from the KryoNet git repository, they solve this problem by using a Listener that works on its own thread:
// The ThreadedListener means the network thread won't be blocked when waiting for RMI responses.
client.addListener(new ThreadedListener(new Listener() {
public void connected (final Connection connection) {
TestObject test = ObjectSpace.getRemoteObject(connection, 42, TestObject.class);
// Normal remote method call.
assertEquals(43.21f, test.other());
// Make a remote method call that returns another remote proxy object.
OtherObject otherObject = test.getOtherObject();
// Normal remote method call on the second object.
assertEquals(12.34f, otherObject.value());
// When a remote proxy object is sent, the other side receives its actual remote object.
connection.sendTCP(otherObject);
}
}));
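Applied to the server above, the listener registration in the CardsTableServer constructor could be wrapped the same way (a sketch using KryoNet's Listener.ThreadedListener):
// Run NetworkListener callbacks on their own thread so RMI calls like
// game.getName() no longer block KryoNet's update thread.
addListener(new Listener.ThreadedListener(new NetworkListener(this)));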
I have the following client-endpoint class for a WebSocket in Tomcat 7.0.53. It is based on this example: https://blog.openshift.com/how-to-build-java-websocket-applications-using-the-jsr-356-api/
import java.io.IOException;
import java.net.URI;
import java.util.ArrayList;
import javax.websocket.ClientEndpoint;
import javax.websocket.CloseReason;
import javax.websocket.ContainerProvider;
import javax.websocket.DeploymentException;
import javax.websocket.OnClose;
import javax.websocket.OnError;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;
@ClientEndpoint
public class connect {
private static ArrayList<Session> sessionList = new ArrayList<Session>();
public connect(URI endpointURI) throws DeploymentException, IOException
{
WebSocketContainer container = ContainerProvider.getWebSocketContainer();
container.connectToServer(this, endpointURI);
}
@OnOpen
public void onOpen(Session session) throws IOException
{
sessionList.add(session);
System.out.println(session.getId());
session.getBasicRemote().sendText("hello");
}
public void sendMessage(String message) throws IOException
{
for(Session session : sessionList){
// synchronous communication via the basic remote endpoint
session.getBasicRemote().sendText(message);
}
}
@OnClose
public void onClose(Session session){
sessionList.remove(session);
System.out.println("here");
}
@OnError
public void onError(Throwable t, Session session){
System.out.println("tedt");
}
}
I then have the following code to start the client endpoint
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;
import javax.websocket.DeploymentException;
public class test {
public static void main(String[] args) throws DeploymentException, IOException, URISyntaxException {
// TODO Auto-generated method stub
connect connect = new connect(new URI("ws://localhost:8080/example/talk"));
connect.sendMessage("now");
}
}
The client does successfully connect to the WebSocket server, but then it gets disconnected right away when I try to send a message or do anything. I know this because the onError function is called when I try to send a message from the onOpen function. Why is the WebSocket closed immediately after it connects to the server?
You are being disconnected because your main thread in your client application is ending. After you send "now", your program simply exits. If you want to do anything else (like wait for a response from the server, for instance), then you'll have to prevent the main thread from exiting. Try something like this at the end of your main method:
System.in.read();
This will cause your process to sit and wait for input from standard input. Simply wait for your test to complete and then press ENTER on the command-line to terminate the client.
You will, of course, want to register a handler for receiving messages back from the server to the client. Right now, you can only send messages from the client to the server.
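In the annotated-endpoint style used above, that's an @OnMessage method on the connect class; a minimal sketch:
@OnMessage
public void onMessage(String message, Session session) {
    // Called whenever the server pushes a text frame to this client.
    System.out.println("Received from server: " + message);
}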
I want to modify the client handler to use Foo instead of DatagramPacket; what changes are required in the client itself?
Surely it's not necessary to keep strictly to datagrams to send and receive with Netty? The Factorial example uses BigInteger, so surely it's possible to use POJOs.
Any and all attempts to create a class like:
class FooClientHandler extends SimpleChannelInboundHandler<Foo>
are non-starters for me; it simply won't send or receive with the server. (Yes, both client and server use similar handlers: generic classes with Foo.) So I'm coming at this now from working code.
What's the key distinction between the factorial handler and the datagram handler below? Or is the primary distinction in how it's used in the client?
client:
package net.bounceme.dur.netty;
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.Unpooled;
import io.netty.channel.Channel;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.DatagramPacket;
import io.netty.channel.socket.nio.NioDatagramChannel;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
import net.bounceme.dur.client.gui.MyProps;
public final class Client {
private static final Logger log = Logger.getLogger(Client.class.getName());
public void connect() throws InterruptedException {
MyProps p = new MyProps();
String host = p.getHost();
int port = p.getServerPort();
pingPongDatagram(host, port);
}
public void pingPongDatagram(String host, int port) throws InterruptedException {
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioDatagramChannel.class)
.option(ChannelOption.SO_BROADCAST, true)
.handler(new DatagramClientHandler());
Channel ch = b.bind(0).sync().channel();
ch.writeAndFlush(new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress(host, port))).sync();
log.info("wrote packet");
if (!ch.closeFuture().await(5000)) {
log.warning("server timed out");
}
} finally {
group.shutdownGracefully();
}
}
}
handler:
package net.bounceme.dur.netty;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.socket.DatagramPacket;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
public class DatagramClientHandler extends SimpleChannelInboundHandler<DatagramPacket> {
private static final Logger log = Logger.getLogger(DatagramClientHandler.class.getName());
@Override
public void channelRead0(ChannelHandlerContext ctx, DatagramPacket msg) throws Exception {
String response = msg.content().toString(CharsetUtil.UTF_8);
log.info(response);
DatagramPacket foo = new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress("localhost", 4454));
ctx.writeAndFlush(foo);
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
}
I omitted the server code; it's almost exactly as in the Gandhi quote example.
What changes do I need to make to the client so that the handler can use Foo instead of DatagramPacket?
All I can say with certainty is that this client:
package net.bounceme.dur.netty;
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.Unpooled;
import io.netty.channel.Channel;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.DatagramPacket;
import io.netty.channel.socket.nio.NioDatagramChannel;
import io.netty.util.CharsetUtil;
import java.net.InetSocketAddress;
import java.util.logging.Logger;
import net.bounceme.dur.client.gui.MyProps;
import net.bounceme.dur.client.jdbc.Title;
public final class Client {
private static final Logger log = Logger.getLogger(Client.class.getName());
public void connect() throws InterruptedException {
MyProps p = new MyProps();
String host = p.getHost();
int port = p.getServerPort();
pingPongDatagram(host, port);
}
public void pingPongDatagram(String host, int port) throws InterruptedException {
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioDatagramChannel.class)
.option(ChannelOption.SO_BROADCAST, true)
.handler(new TitleClientHandler());
Channel ch = b.bind(0).sync().channel();
ch.writeAndFlush(new DatagramPacket(
Unpooled.copiedBuffer("QOTM?", CharsetUtil.UTF_8),
new InetSocketAddress(host, port))).sync();
ch.writeAndFlush(new Title());
log.info("wrote packets");
if (!ch.closeFuture().await(5000)) {
log.warning("server timed out");
}
} finally {
group.shutdownGracefully();
}
}
}
and handler:
package net.bounceme.dur.netty;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import java.util.logging.Logger;
import net.bounceme.dur.client.jdbc.Title;
public class TitleClientHandler extends SimpleChannelInboundHandler<Title> {
private static final Logger log = Logger.getLogger(TitleClientHandler.class.getName());
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
@Override
protected void channelRead0(ChannelHandlerContext chc, Title title) throws Exception {
log.info(title.toString());
}
}
don't, seemingly, communicate with the server at all, even when the server has been modified accordingly.
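For what it's worth, the likely missing piece is a codec: the factorial example only moves BigInteger across the wire because its pipeline registers encoder/decoder handlers, whereas a bare NioDatagramChannel hands the handler raw DatagramPackets, so a SimpleChannelInboundHandler<Title> never sees a matching message. A sketch of a TCP client pipeline carrying serializable POJOs via Netty's stock serialization codecs, assuming Title implements java.io.Serializable and the server installs the matching ObjectEncoder/ObjectDecoder pair:
Bootstrap b = new Bootstrap();
b.group(group)
    .channel(NioSocketChannel.class)
    .handler(new ChannelInitializer<SocketChannel>() {
        @Override
        protected void initChannel(SocketChannel ch) {
            ch.pipeline().addLast(
                new ObjectEncoder(), // serializes outgoing POJOs
                new ObjectDecoder(ClassResolvers.cacheDisabled(null)), // deserializes incoming ones
                new TitleClientHandler());
        }
    });
Channel ch = b.connect(host, port).sync().channel();
ch.writeAndFlush(new Title());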