Multiple Channels with Different Service Paths - java

[I am using Netty WebSocket.]
I have a use case where different service paths should be served on the same port. I tried many different approaches; the reasons I could not get it working are:
- The ServerBootstrap class has only one slot for a child ChannelHandler, so I cannot add multiple child handlers with different service paths.
- It is not possible to create multiple groups in a single ServerBootstrap.
This is what my initChannel method looks like:
@Override
protected void initChannel(SocketChannel socketChannel) throws Exception {
    logger.debug(1, "Initializing the SocketChannel : {}", socketChannel.id());
    socketChannel.pipeline().addLast(
            new HttpRequestDecoder(),
            new HttpObjectAggregator(maxPayloadSize),
            new HttpResponseEncoder(),
            new IdleStateHandler(0, 0, listenerConfig.getSocketTimeout(), TimeUnit.SECONDS),
            new WebSocketServerProtocolHandler(ingressConfig.getURI().getPath()), // (A)
            new WebSocketServerCompressionHandler(),
            new WebSocketIO(listenerConfig, manager), // a handler
            new WebSocketMessageListener(messageReceiver, manager) // a handler
    );
    logger.debug(2, "Successfully initialized the Socket Channel : {}", socketChannel.id());
}
Line (A) registers the protocol handler with the given service path (the service path is ingressConfig.getURI().getPath()). The bootstrap code:
int maxPayloadSize = listenerConfig.getMaxPayloadSize();
try {
    bossGroup = new NioEventLoopGroup(listenerConfig.getBossThreadCount());
    workerGroup = new NioEventLoopGroup(listenerConfig.getWorkerThreadCount());
    ServerBootstrap bootstrap = new ServerBootstrap();
    bootstrap.group(bossGroup, workerGroup)
            .channel(NioServerSocketChannel.class)
            .handler(new LoggingHandler(LogLevel.INFO))
            .childHandler(new WebSocketListenerInitializer(messageReceiver, maxPayloadSize,
                    listenerConfig, ingressConfig))
            .option(ChannelOption.SO_BACKLOG, 128)
            .childOption(ChannelOption.SO_KEEPALIVE, true);
    ChannelFuture channelFuture = bootstrap.bind(port);
    channelFuture.sync();
    channel = channelFuture.channel();
    if (channelFuture.isSuccess()) {
        logger.info(1, "WebSocket listener started on port : {} successfully", port);
    } else {
        logger.error(2, "Failed to start WebSocket server on port : {}", port,
                channelFuture.cause());
        throw new TransportException("Failed to start WebSocket server", channelFuture.cause());
    }
} catch (InterruptedException ex) {
    logger.error(1, "Interrupted Exception from : {}", WebSocketListener.class);
    throw new TransportException("Interrupted Exception", ex);
}
Can anyone suggest a way to do this?
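One common workaround is to keep a single child initializer, delay adding the WebSocket handlers until the HTTP upgrade request arrives, and pick the handler set by the request path, so line (A) is no longer fixed at pipeline-build time. The routing decision itself is plain Java; this is only a sketch, and the names ServicePathRouter and PipelineConfigurer are hypothetical, not Netty API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Sketch: map each service path to a pipeline-configuring action. A single
// ChannelInitializer would consult this router once the HTTP upgrade request
// reveals which path the client asked for, then add the matching
// WebSocketServerProtocolHandler for that path. All names here are
// hypothetical stand-ins.
class ServicePathRouter {

    /** Stand-in for "configure the rest of the pipeline for this path". */
    interface PipelineConfigurer {
        void configure();
    }

    private final Map<String, PipelineConfigurer> routes = new HashMap<>();

    void register(String path, PipelineConfigurer configurer) {
        routes.put(path, configurer);
    }

    /** Resolve the configurer for a request URI, ignoring any query string. */
    Optional<PipelineConfigurer> resolve(String requestUri) {
        int q = requestUri.indexOf('?');
        String path = q >= 0 ? requestUri.substring(0, q) : requestUri;
        return Optional.ofNullable(routes.get(path));
    }
}
```

With this shape, the initializer adds only the HTTP codec plus one routing handler; that handler calls resolve() on the upgrade request's URI and installs the path-specific WebSocket handlers before removing itself.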

Related

Java netty tcp server with postgres. Query hangs indefinitely

I am writing a Spring Boot application with a TCP server on Netty. The service receives messages and checks rows in a Postgres database. The problem is that while the records are being checked in the database, the service hangs and stops processing other messages from the TCP channel.
Configuration:
@Bean
public void start() throws InterruptedException {
    log.info("Starting server at: {} ", tcpPort);
    EventLoopGroup bossGroup = new NioEventLoopGroup();
    EventLoopGroup workerGroup = new NioEventLoopGroup();
    ServerBootstrap b = new ServerBootstrap();
    b.group(workerGroup, bossGroup)
            .channel(NioServerSocketChannel.class)
            .childHandler(simpleTCPChannelInitializer)
            .childOption(ChannelOption.SO_KEEPALIVE, true);
    // Bind and start to accept incoming connections.
    ChannelFuture f = b.bind(tcpPort).sync();
    if (f.isSuccess())
        log.info("Server started successfully");
    f.channel().closeFuture().sync();
}
Channel initialization:
private final EventExecutorGroup sqlExecutorGroup = new DefaultEventExecutorGroup(16);

protected void initChannel(SocketChannel socketChannel) {
    socketChannel.pipeline().addLast(new StringEncoder());
    socketChannel.pipeline().addLast(new StringDecoder());
    socketChannel.pipeline().addLast(sqlExecutorGroup, simpleTCPChannelHandler);
}
and the method that queries the database:
@Override
public void processMessage(String atmRequest) {
    log.info("Receive tcp atmRequest: {}", atmRequest);
    checkDeviceInDatabase(deviceUid);
    log.info("Receive power up command");
}

private void checkDeviceInDatabase(String deviceUid) {
    statusConnectRepository.findById(deviceUid).orElseThrow(()
            -> new DeviceNotFoundException("DeviceUid: " + deviceUid + " was not found in database"));
}
In the checkDeviceInDatabase(deviceUid) method the query hangs forever.
Has anyone run into this problem?
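The pipeline above already routes the handler through a DefaultEventExecutorGroup, which is the right idea: blocking work must never run on the Netty I/O thread. A common cause of this kind of hang is the blocking call ending up on the event loop anyway (for example, the handler being registered without the executor somewhere else). The offload idea itself can be sketched with a plain stdlib executor; the names and the sleep below are stand-ins for the asker's repository, not real code from the question:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the idea behind DefaultEventExecutorGroup: the blocking lookup
// runs on a dedicated pool, never on the I/O thread, so the channel keeps
// reading other messages while the query is in flight.
class BlockingLookupOffload {
    private final ExecutorService dbPool = Executors.newFixedThreadPool(16);

    Future<Boolean> checkDeviceAsync(String deviceUid) {
        return dbPool.submit(() -> {
            Thread.sleep(50); // stand-in for statusConnectRepository.findById(...)
            return !deviceUid.isEmpty();
        });
    }

    void shutdown() {
        dbPool.shutdown();
    }
}
```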

Netty Add Channel to ServerBootstrap

I have a ServerBootstrap accepting data from clients. Most connections come from arbitrary endpoints, but I also want to handle data coming from one specific endpoint.
Basically, I am reading and writing strings over n+1 connections. If the one specific connection ever closes, I need to reopen it.
Currently I am trying to run a Bootstrap connected to the specific endpoint alongside a ServerBootstrap handling all of the incoming connections, but the sync() that starts one of them blocks the rest of the application, so I cannot start the other.
Or is it possible to just create a channel from scratch, connect it, and add it to the EventLoopGroup?
Here is an example of what I have so far. Currently startServer() blocks at channelFuture.channel().closeFuture().sync().
private Channel mChannel;
private EventLoopGroup mListeningGroup;
private EventLoopGroup mSpeakingGroup;

public void startServer() {
    try {
        ServerBootstrap bootstrap = new ServerBootstrap()
                .group(mListeningGroup, mSpeakingGroup)
                .channel(NioServerSocketChannel.class)
                .option(ChannelOption.SO_BACKLOG, 1024)
                .childOption(ChannelOption.SO_KEEPALIVE, true)
                .childHandler(new ServerInitializer());
        ChannelFuture channelFuture = bootstrap.bind(mListeningPort).sync();
        channelFuture.channel().closeFuture().sync();
    } catch (InterruptedException e) {
        e.printStackTrace();
    } finally {
        mListeningGroup.shutdownGracefully();
        mSpeakingGroup.shutdownGracefully();
    }
}

public void startClient() throws InterruptedException {
    Bootstrap bootstrap = new Bootstrap()
            .group(mSpeakingGroup)
            .channel(NioSocketChannel.class)
            .option(ChannelOption.SO_BACKLOG, 1024)
            .option(ChannelOption.TCP_NODELAY, true)
            .option(ChannelOption.SO_KEEPALIVE, true)
            .handler(new ClientInitializer());
    ChannelFuture future = bootstrap.connect(mAddress, mPort).sync();
    mChannel = future.channel();
    mChannel.closeFuture().addListener((ChannelFutureListener) futureListener -> mChannel = null).sync();
}
Once data is read by any of the n+1 sockets, its message is put into a PriorityQueue, and a while loop continuously pops off the queue and writes the data to every Channel. Does anyone have ideas about the best way to approach this?
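The blocking comes from sync()-ing on a close future, which parks the calling thread until the channel dies. The usual alternative is to attach a listener to the future and return immediately, so one thread can start both bootstraps. A minimal stdlib sketch of that shape, with CompletableFuture standing in for Netty's ChannelFuture (in Netty the equivalent is closeFuture().addListener(f -> reconnect())):

```java
import java.util.concurrent.CompletableFuture;

// Sketch: "start" registers a completion callback instead of blocking, so
// starting the server does not prevent starting the client. The log field is
// only here to make the ordering observable.
class NonBlockingStartup {
    static final StringBuilder log = new StringBuilder();

    static CompletableFuture<Void> start(String name) {
        CompletableFuture<Void> closeFuture = new CompletableFuture<>();
        // When the channel closes, the listener fires; nobody blocks waiting.
        closeFuture.thenRun(() -> log.append(name).append(" closed; "));
        log.append(name).append(" started; ");
        return closeFuture;
    }
}
```

The reopen logic for the one specific connection then lives inside the close listener rather than after a sync() call.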

Netty systemd lazy initialization

Is it possible to have Netty lazily initialized via systemd/inetd, using an inherited server socket channel?
We used this in our old Jetty based server, where Jetty would call Java's System.inheritedChannel() to get the socket created by systemd on lazy initialization.
I have searched a lot, and all I found is a Jira ticket saying it is supposedly supported since version 4: https://issues.jboss.org/browse/NETTY-309.
But this Jira ticket has no example, and I could not find any documentation, nor anything in the source code, that points to how to achieve this in Netty.
Any help would be appreciated.
Thanks
EDIT:
Just to make it clearer: what I want to know is whether it is possible to have my Java application socket-activated by systemd and then somehow pass the socket reference to Netty.
EDIT 2:
Here is an approach suggested by Norman Maurer, but it actually fails with the exception below:
public class MyServerBootStrap {
    private ServiceContext ctx;
    private Config config;
    private Collection<Channel> channels;
    private Collection<Connector> connectors;

    public MyServerBootStrap(List<Connector> connectors) {
        this.ctx = ApplicationContext.getInstance();
        this.config = ctx.getMainConfig();
        this.connectors = connectors;
        this.channels = new ArrayList<>(connectors.size());
    }

    public void run(Connector connector) throws RuntimeException, IOException, InterruptedException {
        EventLoopGroup bossGroup = new NioEventLoopGroup(config.getInt("http_acceptor_threads", 0));
        EventLoopGroup workerGroup = new NioEventLoopGroup(config.getIntError("http_server_threads"));
        final SocketAddress addr;
        final ChannelFactory<ServerChannel> channelFactory;
        if (connector.geEndpoint().isInherited()) {
            System.out.println(
                    "Trying to bootstrap inherited channel: " + connector.geEndpoint().getDescription());
            ServerSocketChannel channel = (ServerSocketChannel) System.inheritedChannel();
            addr = channel.getLocalAddress();
            System.out.println("Channel localAddress(): " + addr);
            channelFactory = new MyChannelFactory(channel);
        } else {
            System.out.println(
                    "Trying to bootstrap regular channel: " + connector.geEndpoint().getDescription());
            addr = connector.geEndpoint().getSocketAdress();
            channelFactory = new MyChannelFactory(null);
        }
        ServerBootstrap b = new ServerBootstrap();
        b
                .group(bossGroup, workerGroup)
                .localAddress(addr)
                .channelFactory(channelFactory)
                .childHandler(new ChannelInitializerRouter(Collections.singletonList(connector)))
                .childOption(ChannelOption.SO_KEEPALIVE, true);
        if (config.contains("tcp_max_syn_backlog")) {
            b.option(ChannelOption.SO_BACKLOG, config.getIntError("tcp_max_syn_backlog"));
        }
        Channel serverChannel = b.bind().sync().channel();
        channels.add(serverChannel);
    }

    public void run() throws RuntimeException {
        try {
            for (Connector connector : connectors) {
                run(connector);
            }
            for (Channel channel : channels) {
                channel.closeFuture().sync();
            }
        } catch (Throwable exc) {
            throw new RuntimeException("Failed to start web-server", exc);
        } finally {
            // TODO: fix this
            // workerGroup.shutdownGracefully();
            // bossGroup.shutdownGracefully();
        }
    }
}

class MyChannelFactory implements io.netty.channel.ChannelFactory<ServerChannel> {
    private ServerSocketChannel channel;

    public MyChannelFactory(ServerSocketChannel ch) {
        this.channel = ch;
    }

    @Override
    public ServerChannel newChannel() {
        if (channel == null) {
            return new NioServerSocketChannel();
        } else {
            return new NioServerSocketChannel(channel);
        }
    }
}
log:
Trying to bootstrap inherited channel: public (tcp port: 8080)
Channel localAddress(): /0:0:0:0:0:0:0:0:8080
java.lang.RuntimeException: Failed to start web-server
at MyServerBootStrap.run(MyServerBootStrap.java:85)
at MyServer.run(MyServer.java:61)
at Main.start(Main.java:96)
at Main.main(Main.java:165)
Caused by: java.nio.channels.AlreadyBoundException
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:216)
at sun.nio.ch.InheritedChannel$InheritedServerSocketChannelImpl.bind(InheritedChannel.java:92)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1338)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:999)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Yes, it should be possible.
NioServerSocketChannel allows you to wrap an existing Channel via its constructor. So all you need to do is write your own ChannelFactory and use it with ServerBootstrap to ensure you create a NioServerSocketChannel that wraps the inherited channel.
Another approach is to not use ServerBootstrap at all and instead call register etc. yourself on the manually created NioServerSocketChannel.
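The stack trace above bottoms out in plain NIO: ServerSocketChannelImpl.bind refuses a second bind. A systemd-inherited channel is already bound, so ServerBootstrap's bind() is what triggers the failure, and this stdlib-only reproduction (no Netty involved) raises the same exception:

```java
import java.net.InetSocketAddress;
import java.nio.channels.AlreadyBoundException;
import java.nio.channels.ServerSocketChannel;

// Reproduces the root cause of the failure above: an already-bound
// ServerSocketChannel (which is what System.inheritedChannel() hands you
// under socket activation) cannot be bound a second time.
class DoubleBindDemo {
    static boolean secondBindFails() throws Exception {
        try (ServerSocketChannel ch = ServerSocketChannel.open()) {
            ch.bind(new InetSocketAddress("127.0.0.1", 0)); // first bind: OK
            try {
                ch.bind(new InetSocketAddress("127.0.0.1", 0)); // second bind
                return false;
            } catch (AlreadyBoundException expected) {
                return true; // same exception as in the Netty stack trace
            }
        }
    }
}
```

Which is why the second suggestion, registering the wrapped channel with an EventLoop instead of going through ServerBootstrap.bind(), is the route that avoids the exception.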

Netty 10000 connections at the same time

I am trying to simulate 10000 client connections at the same time to a server using Netty. When 956 clients connect to the server everything works great, but the 957th client causes an exception.
Note: I am running the server and the clients on the same machine (Win7, 8GB RAM, i7 CPU).
The error:
java.lang.IllegalStateException: failed to create a child event loop
io.netty.channel.ChannelException: failed to open a new selector
My code:
try {
    con.connect();
} catch (Exception e) {
    logger.error("Client: error connect to ip {} and port {}, ", id, ip, port, e);
    return;
}
The code of connect method is:
public void connect() {
    workerGroup = new NioEventLoopGroup();
    Bootstrap bs = new Bootstrap();
    bs.group(workerGroup).channel(NioSocketChannel.class);
    bs.handler(new ChannelInitializer<SocketChannel>() {
        @Override
        protected void initChannel(SocketChannel ch) throws Exception {
            ch.pipeline().addLast("idleStateHandler", new IdleStateHandler(0, 0, 300));
            ch.pipeline().addLast("idleStateActionHandler", new IdleStateEventHandler());
            ch.pipeline().addLast("logger", new LoggingHandler());
            ch.pipeline().addLast("commandDecoder", new CommandDecoder());
            ch.pipeline().addLast("commandEncoder", new CommandEncoder());
        }
    });
You should use the same NioEventLoopGroup instance for every connect call. Otherwise you will create a lot of threads (and a new selector per event loop, which is what fails once the OS limit is hit).
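The fix can be sketched with a plain stdlib pool: one shared group serves every connection, instead of a fresh group, with its own threads and selectors, being created inside each connect(). This is a stand-in illustration, not Netty code; the submit() call stands in for an actual connect:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch of the answer: a single shared pool (one NioEventLoopGroup in Netty
// terms) handles all clients, so the 10000-connection loop costs a fixed
// number of threads rather than thousands of groups.
class SharedGroupSketch {
    private static final ExecutorService SHARED_GROUP =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

    static List<Future<Integer>> connectAll(int clients) {
        List<Future<Integer>> handles = new ArrayList<>();
        for (int i = 0; i < clients; i++) {
            final int id = i;
            handles.add(SHARED_GROUP.submit(() -> id)); // stand-in for connect()
        }
        return handles;
    }

    static void shutdown() {
        SHARED_GROUP.shutdown();
    }
}
```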

send large message between client and server with netty

I want to send large messages between a client and server with Netty, but when I send a large message to the server, the server does not receive the complete message in a single read. On the server I use a ChannelHandlerAdapter; when the client sends a large message, channelReadComplete is invoked a second time, but it should complete on the first. Please see my client code and tell me what my problem is.
try {
    Bootstrap b = new Bootstrap();
    b.group(group).channel(NioSocketChannel.class)
            .handler(new ChannelInitializer<SocketChannel>() {
                @Override
                protected void initChannel(SocketChannel ch) throws Exception {
                    ChannelPipeline p = ch.pipeline();
                    // if (sslCtx != null) {
                    //     p.addLast(sslCtx.newHandler(ch.alloc(), HOST, PORT));
                    // }
                    System.out.println("initChannel-client");
                    p.addLast(new DiscardClientHandler(),
                            new LengthFieldBasedFrameDecoder(100 * 1024, 0, 8));
                }
            });
    // Make the connection attempt.
    ChannelFuture f = b.connect(HOST, PORT).sync();
    // Wait until the connection is closed.
    // add by test
    DiscardClient discardClient = new DiscardClient();
    String message = discardClient.reafFile("D:\\log\\log1.txt");
    ByteBuf encoded = f.channel().alloc().buffer(message.length());
    encoded.writeBytes(message.getBytes());
    f.channel().write(encoded);
    f.channel().flush();
    f.channel().closeFuture().sync();
} finally {
    // group.shutdownGracefully();
}
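TCP is a byte stream, so a large write arrives in multiple reads unless the receiver reassembles frames. The LengthFieldBasedFrameDecoder(100 * 1024, 0, 8) in the pipeline expects every message to start with an 8-byte length field, but this client writes raw bytes with no prefix (Netty's LengthFieldPrepender would normally add it). The framing contract the decoder relies on can be sketched in plain stdlib Java; the class and method names below are illustrative only:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Sketch of the length-prefix contract: sender writes an 8-byte big-endian
// length then the payload; receiver reads the length, then exactly that many
// payload bytes, regardless of how TCP split the stream.
class LengthPrefixFraming {

    /** Encode: 8-byte big-endian length, then the payload bytes. */
    static byte[] frame(String message) throws IOException {
        byte[] payload = message.getBytes(StandardCharsets.UTF_8);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DataOutputStream data = new DataOutputStream(out);
        data.writeLong(payload.length);
        data.write(payload);
        return out.toByteArray();
    }

    /** Decode one frame, the way a server-side frame decoder would. */
    static String unframe(byte[] framed) {
        ByteBuffer buf = ByteBuffer.wrap(framed);
        long length = buf.getLong();
        byte[] payload = new byte[(int) length];
        buf.get(payload);
        return new String(payload, StandardCharsets.UTF_8);
    }
}
```

Note also that in the client code above the frame decoder sits in the client pipeline, while the reassembly problem described is on the server side.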
Best Regards