Java Netty TCP server with Postgres: query hangs indefinitely

I am writing a Spring Boot application with a TCP server on Netty. The service receives messages and checks rows in a Postgres database. The problem is that at the moment of checking the records in the database, the service hangs and stops processing other messages from the TCP channel.
Configuration:
@Bean
public void start() throws InterruptedException {
    log.info("Starting server at: {}", tcpPort);
    EventLoopGroup bossGroup = new NioEventLoopGroup();
    EventLoopGroup workerGroup = new NioEventLoopGroup();
    ServerBootstrap b = new ServerBootstrap();
    b.group(workerGroup, bossGroup)
            .channel(NioServerSocketChannel.class)
            .childHandler(simpleTCPChannelInitializer)
            .childOption(ChannelOption.SO_KEEPALIVE, true);
    // Bind and start to accept incoming connections.
    ChannelFuture f = b.bind(tcpPort).sync();
    if (f.isSuccess())
        log.info("Server started successfully");
    f.channel().closeFuture().sync();
}
Channel initialization:
private final EventExecutorGroup sqlExecutorGroup = new DefaultEventExecutorGroup(16);

@Override
protected void initChannel(SocketChannel socketChannel) {
    socketChannel.pipeline().addLast(new StringEncoder());
    socketChannel.pipeline().addLast(new StringDecoder());
    socketChannel.pipeline().addLast(sqlExecutorGroup, simpleTCPChannelHandler);
}
and method for database:
@Override
public void processMessage(String atmRequest) {
    log.info("Receive tcp atmRequest: {}", atmRequest);
    checkDeviceInDatabase(deviceUid);
    log.info("Receive power up command");
}

private void checkDeviceInDatabase(String deviceUid) {
    statusConnectRepository.findById(deviceUid).orElseThrow(()
            -> new DeviceNotFoundException("DeviceUid: " + deviceUid + " was not found in database"));
}
In the checkDeviceInDatabase(deviceUid) method, the query hangs forever.
Has anyone run into this problem?

Related

Netty systemd lazy initialization

Is it possible to have Netty lazily initialized via systemd/inetd, using an inherited server socket channel?
We used this in our old Jetty-based server, where Jetty would call Java's System.inheritedChannel() to get the socket created by systemd on lazy initialization.
I have searched a lot, and all I found is a Jira ticket saying this is supposedly supported in version 4: https://issues.jboss.org/browse/NETTY-309.
But that Jira ticket has no example, and I couldn't find any documentation, or anything in the source code, that points to how to achieve this in Netty.
Any help would be appreciated.
Thanks
EDIT:
Just to make it clearer: what I want to know is whether it is possible to have my Java application socket-activated by systemd, and then somehow pass the socket reference to Netty.
EDIT 2:
Here is an approach suggested by Norman Maurer, but it actually fails with the exception below:
public class MyServerBootStrap {

    private ServiceContext ctx;
    private Config config;
    private Collection<Channel> channels;
    private Collection<Connector> connectors;

    public MyServerBootStrap(List<Connector> connectors) {
        this.ctx = ApplicationContext.getInstance();
        this.config = ctx.getMainConfig();
        this.connectors = connectors;
        this.channels = new ArrayList<>(connectors.size());
    }

    public void run(Connector connector) throws RuntimeException, IOException, InterruptedException {
        EventLoopGroup bossGroup = new NioEventLoopGroup(config.getInt("http_acceptor_threads", 0));
        EventLoopGroup workerGroup = new NioEventLoopGroup(config.getIntError("http_server_threads"));
        final SocketAddress addr;
        final ChannelFactory<ServerChannel> channelFactory;
        if (connector.geEndpoint().isInherited()) {
            System.out.println(
                    "Trying to bootstrap inherited channel: " + connector.geEndpoint().getDescription());
            ServerSocketChannel channel = (ServerSocketChannel) System.inheritedChannel();
            addr = channel.getLocalAddress();
            System.out.println("Channel localAddress(): " + addr);
            channelFactory = new MyChannelFactory(channel);
        } else {
            System.out.println(
                    "Trying to bootstrap regular channel: " + connector.geEndpoint().getDescription());
            addr = connector.geEndpoint().getSocketAdress();
            channelFactory = new MyChannelFactory(null);
        }
        ServerBootstrap b = new ServerBootstrap();
        b.group(bossGroup, workerGroup)
                .localAddress(addr)
                .channelFactory(channelFactory)
                .childHandler(new ChannelInitializerRouter(Collections.singletonList(connector)))
                .childOption(ChannelOption.SO_KEEPALIVE, true);
        if (config.contains("tcp_max_syn_backlog")) {
            b.option(ChannelOption.SO_BACKLOG, config.getIntError("tcp_max_syn_backlog"));
        }
        Channel serverChannel = b.bind().sync().channel();
        channels.add(serverChannel);
    }

    public void run() throws RuntimeException {
        try {
            for (Connector connector : connectors) {
                run(connector);
            }
            for (Channel channel : channels) {
                channel.closeFuture().sync();
            }
        } catch (Throwable exc) {
            throw new RuntimeException("Failed to start web-server", exc);
        } finally {
            // TODO: fix this
            // workerGroup.shutdownGracefully();
            // bossGroup.shutdownGracefully();
        }
    }
}

class MyChannelFactory implements io.netty.channel.ChannelFactory<ServerChannel> {

    private ServerSocketChannel channel;

    public MyChannelFactory(ServerSocketChannel ch) {
        this.channel = ch;
    }

    @Override
    public ServerChannel newChannel() {
        if (channel == null) {
            return new NioServerSocketChannel();
        } else {
            return new NioServerSocketChannel(channel);
        }
    }
}
log:
Trying to bootstrap inherited channel: public (tcp port: 8080)
Channel localAddress(): /0:0:0:0:0:0:0:0:8080
java.lang.RuntimeException: Failed to start web-server
at MyServerBootStrap.run(MyServerBootStrap.java:85)
at MyServer.run(MyServer.java:61)
at Main.start(Main.java:96)
at Main.main(Main.java:165)
Caused by: java.nio.channels.AlreadyBoundException
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:216)
at sun.nio.ch.InheritedChannel$InheritedServerSocketChannelImpl.bind(InheritedChannel.java:92)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:128)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:558)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1338)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:501)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:486)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:999)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:254)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:886)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Yes, it should be possible.
The NioServerSocketChannel allows you to wrap an existing Channel via its constructor. So all you need to do is write your own ChannelFactory and use it with ServerBootstrap to ensure you create a NioServerSocketChannel that wraps the inherited channel.
Another approach would be to not use ServerBootstrap at all, but to call register etc. yourself on a manually created NioServerSocketChannel.
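A minimal sketch combining the two suggestions, assuming AbstractBootstrap#register() (public in Netty 4.x): systemd has already bound the inherited socket, so registering the wrapped channel with the event loop, instead of binding it a second time, avoids the AlreadyBoundException from the stack trace above.

// Sketch: reuse MyChannelFactory from the question; the only change is to
// register() the already-bound inherited channel instead of calling bind() on it.
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
        .channelFactory(channelFactory)
        .childHandler(new ChannelInitializerRouter(Collections.singletonList(connector)))
        .childOption(ChannelOption.SO_KEEPALIVE, true);

Channel serverChannel;
if (connector.geEndpoint().isInherited()) {
    // Already bound by systemd: bind() would throw AlreadyBoundException,
    // so only register the channel with the event loop.
    serverChannel = b.register().sync().channel();
} else {
    serverChannel = b.bind(addr).sync().channel();
}
channels.add(serverChannel);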

Netty: Start client after server has bootstrapped, why is another thread needed?

I want to start both a TCP echo server and a client in one app, the client after the server.
What I do is:
Start the client in a ChannelFutureListener added to the future returned by server.bind().sync(), like this:
public void runClientAndServer() {
    server.run().addListener((ChannelFutureListener) future -> {
        // client.run(); // (1) this doesn't work!
        new Thread(() -> client.run()).start(); // (2) this works!
    });
}
and server.run() is like this:
public ChannelFuture run() {
    ServerBootstrap b = new ServerBootstrap();
    // doing channel config stuff
    return b.bind(6666).sync();
}
and client.run() is like this:
public void run() {
    Bootstrap b = new Bootstrap();
    // do some config stuff
    f = b.connect(host, port).sync(); // wait till connected to server
}
What happens is: statement (2) works just fine, while with statement (1) the client sends the message (this can be observed in Wireshark) and the server replies with a TCP ACK segment, but no channelRead() method is invoked in the server-side ChannelInboundHandlerAdapter, nor can any attempt to write a message to the socket be observed, as in this capture:
[Wireshark capture]
I guess there must be something wrong with Netty threads, but I just cannot figure out what.
I have prepared an example based on the latest Netty version (4.1.16.Final) and the code you posted. It works fine without an extra thread, so maybe you did something wrong when initializing your server or client.
private static final NioEventLoopGroup EVENT_LOOP_GROUP = new NioEventLoopGroup();

private static ChannelFuture runServer() throws Exception {
    return new ServerBootstrap()
            .group(EVENT_LOOP_GROUP)
            .channel(NioServerSocketChannel.class)
            .childHandler(new ChannelInitializer<Channel>() {
                @Override
                protected void initChannel(Channel channel) throws Exception {
                    System.out.println("S - Channel connected: " + channel);
                    channel.pipeline().addLast(new SimpleChannelInboundHandler<ByteBuf>() {
                        @Override
                        protected void channelRead0(ChannelHandlerContext ctx, ByteBuf msg) throws Exception {
                            System.out.println("S - read: " + msg.toString(StandardCharsets.UTF_8));
                        }
                    });
                }
            })
            .bind(6666).sync();
}

private static void runClient() throws Exception {
    new Bootstrap()
            .group(EVENT_LOOP_GROUP)
            .channel(NioSocketChannel.class)
            .handler(new ChannelInitializer<Channel>() {
                @Override
                protected void initChannel(Channel channel) throws Exception {
                    System.out.println("C - Initialized client");
                    channel.pipeline().addLast(new SimpleChannelInboundHandler<ByteBuf>() {
                        @Override
                        public void channelActive(ChannelHandlerContext ctx) throws Exception {
                            super.channelActive(ctx);
                            System.out.println("C - write: Hey this is a test message enjoy!");
                            ctx.writeAndFlush(Unpooled.copiedBuffer(
                                    "Hey this is a test message enjoy!".getBytes(StandardCharsets.UTF_8)));
                        }

                        @Override
                        protected void channelRead0(ChannelHandlerContext ctx, ByteBuf msg) throws Exception { }
                    });
                }
            })
            .connect("localhost", 6666).sync();
}

public static void main(String[] args) throws Exception {
    runServer().addListener(future -> {
        runClient();
    });
}
That's what the output should look like:
C - Initialized client
C - write: Hey this is a test message enjoy!
S - Channel connected: [id: 0x1d676489, L:/127.0.0.1:6666 - R:/127.0.0.1:5079]
S - read: Hey this is a test message enjoy!
I also tried this example with a single-threaded event loop group, which also worked fine but threw a BlockingOperationException that did not affect the functionality of the program. If this code works fine for you, you should check your own code and align it with this example. (Please don't create inline ChannelHandlers in your real project like I did in this example.)
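The BlockingOperationException is worth a note: with a single-threaded group, the bind listener runs on the only event loop thread, and runClient()'s connect(...).sync() would block that same thread, so Netty fails fast instead of deadlocking. A minimal sketch of a non-blocking client start that avoids it (clientInitializer() is a hypothetical helper returning the same ChannelInitializer used in runClient()):

private static void runClientNonBlocking() {
    new Bootstrap()
            .group(EVENT_LOOP_GROUP)
            .channel(NioSocketChannel.class)
            .handler(clientInitializer()) // hypothetical: same initializer as in runClient()
            .connect("localhost", 6666)   // no sync(): let the future complete asynchronously
            .addListener(f -> System.out.println("C - connected: " + f.isSuccess()));
}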

Multiple Channels with Different Service Paths

[I am using Netty WebSocket]
I have a use case where different service paths should be served on the same port. I tried many different approaches; the reasons I couldn't get it to work were:
In the ServerBootstrap class there is only one slot for a child ChannelHandler, so I cannot add multiple child handlers with different service paths.
In the ServerBootstrap class it is not possible to create multiple groups.
This is what my initChannel looks like:
@Override
protected void initChannel(SocketChannel socketChannel) throws Exception {
    logger.debug(1, "Initializing the SocketChannel : {}", socketChannel.id());
    socketChannel.pipeline().addLast(
            new HttpRequestDecoder(),
            new HttpObjectAggregator(maxPayloadSize),
            new HttpResponseEncoder(),
            new IdleStateHandler(0, 0, listenerConfig.getSocketTimeout(), TimeUnit.SECONDS),
            new WebSocketServerProtocolHandler(ingressConfig.getURI().getPath()), // (A)
            new WebSocketServerCompressionHandler(),
            new WebSocketIO(listenerConfig, manager), // a handler
            new WebSocketMessageListener(messageReceiver, manager) // a handler
    );
    logger.debug(2, "Successfully initialized the Socket Channel : {}", socketChannel.id());
}
Line (A) registers a handler with the given service path (the service path is ingressConfig.getURI().getPath()).
int maxPayloadSize = listenerConfig.getMaxPayloadSize();
try {
    bossGroup = new NioEventLoopGroup(listenerConfig.getBossThreadCount());
    workerGroup = new NioEventLoopGroup(listenerConfig.getWorkerThreadCount());
    ServerBootstrap bootstrap = new ServerBootstrap();
    bootstrap.group(bossGroup, workerGroup)
            .channel(NioServerSocketChannel.class)
            .handler(new LoggingHandler(LogLevel.INFO))
            .childHandler(new WebSocketListenerInitializer(messageReceiver, maxPayloadSize,
                    listenerConfig, ingressConfig))
            .option(ChannelOption.SO_BACKLOG, 128)
            .childOption(ChannelOption.SO_KEEPALIVE, true);
    ChannelFuture channelFuture = bootstrap.bind(port);
    channelFuture.sync();
    channel = channelFuture.channel();
    if (channelFuture.isSuccess()) {
        logger.info(1, "WebSocket listener started on port : {} successfully", port);
    } else {
        logger.error(2, "Failed to start WebSocket server on port : {}", port, channelFuture.cause());
        throw new TransportException("Failed to start WebSocket server", channelFuture.cause());
    }
} catch (InterruptedException ex) {
    logger.error(1, "Interrupted Exception from : {}", WebSocketListener.class);
    throw new TransportException("Interrupted Exception", ex);
}
Can anyone suggest a way to do this?
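One possible approach, as a minimal sketch rather than a tested solution: keep the single childHandler, add the HTTP codec and aggregator as in the initChannel above, and end the pipeline with a router that inspects the handshake request's URI and extends the pipeline per service path. ServiceAHandler and ServiceBHandler are hypothetical placeholders for the per-path handlers:

// Sketch: added after HttpRequestDecoder/HttpObjectAggregator/HttpResponseEncoder;
// routes the first HTTP request by its path, then removes itself and replays
// the request so the handshake reaches the newly added handlers.
public class ServicePathRouter extends SimpleChannelInboundHandler<FullHttpRequest> {

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, FullHttpRequest req) {
        ChannelPipeline p = ctx.pipeline();
        if (req.uri().startsWith("/serviceA")) {
            p.addLast(new WebSocketServerProtocolHandler("/serviceA"));
            p.addLast(new ServiceAHandler()); // hypothetical per-path handler
        } else if (req.uri().startsWith("/serviceB")) {
            p.addLast(new WebSocketServerProtocolHandler("/serviceB"));
            p.addLast(new ServiceBHandler()); // hypothetical per-path handler
        } else {
            ctx.writeAndFlush(new DefaultFullHttpResponse(
                            req.protocolVersion(), HttpResponseStatus.NOT_FOUND))
                    .addListener(ChannelFutureListener.CLOSE);
            return;
        }
        p.remove(this);
        ctx.fireChannelRead(req.retain()); // replay the request for the handshake
    }
}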

Netty - Send failed: UnsupportedOperationException: unsupported message type

I am writing a server-client app using Netty 4.1.9.
Recently I successfully made the communication work using ByteBuf.
Then I tried with String and succeeded again.
But in the last step I wanted to send a custom object, and there is my problem:
Send failed: java.lang.UnsupportedOperationException: unsupported message type: GIdPacket (expected: ByteBuf, FileRegion)
Some code parts:
Server:
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
        .channel(NioServerSocketChannel.class)
        .childHandler(new ChannelInitializer<SocketChannel>() {
            @Override
            public void initChannel(SocketChannel ch) throws Exception {
                ch.pipeline().addLast(new ObjectDecoder(Integer.MAX_VALUE, ClassResolvers.cacheDisabled(null)));
                ch.pipeline().addLast(new ObjectEncoder());
                ch.pipeline().addLast(new ServerHandler());
                ch.pipeline().addLast(loggingHandler);
            }
        })
        .option(ChannelOption.SO_BACKLOG, 128)
        .childOption(ChannelOption.SO_KEEPALIVE, true)
        .option(ChannelOption.AUTO_READ, true);
ChannelFuture f = b.bind(port).sync();
ServerHandler:
@Override
public void channelActive(ChannelHandlerContext ctx) throws Exception {
    System.out.println("CHANNEL >> channelActive fired");
    clientData = new ClientData(Server.newId(), ctx.channel());
    Server.clients.put(clientData.id, clientData);
    GIdPacket packet = new GIdPacket(/* some init attrs */);
    ChannelFuture cf = ctx.writeAndFlush(packet);
    if (!cf.isSuccess()) {
        System.out.println("Send failed: " + cf.cause());
    }
    Log.out(packet.toString());
}
I tried many different options and spent a lot of time on this, but I didn't find a solution. Any help is appreciated.
Looks like GIdPacket doesn't implement the Serializable interface. According to the docs, ObjectEncoder supports only serializable objects: class ObjectEncoder extends MessageToByteEncoder<Serializable>.
By the way, you're checking channelFuture.isSuccess() immediately after calling writeAndFlush(..). Because Netty is completely asynchronous, this test fails because the future is neither successful nor failed yet. You must wait until the future is completed, either by calling cf.await() or cf.awaitUninterruptibly() (never from the event loop thread itself) or by registering a listener.
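Putting both fixes together, a minimal sketch (the fields of GIdPacket are whatever the original class contains):

import java.io.Serializable;

// The packet class must implement Serializable for ObjectEncoder/ObjectDecoder.
public class GIdPacket implements Serializable {
    private static final long serialVersionUID = 1L;
    // ... existing fields of GIdPacket ...
}

And in channelActive, check the write result from a listener instead of immediately:

ChannelFuture cf = ctx.writeAndFlush(packet);
cf.addListener(future -> {
    if (!future.isSuccess()) {
        System.out.println("Send failed: " + future.cause());
    }
});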

Netty 10000 connections at the same time

I am trying to simulate 10000 clients connecting at the same time to a server using Netty. When 956 clients connect to the server everything works great, but the 957th client causes an exception.
Note: I am running the server and the clients on the same machine (Win7, 8GB RAM, i7 CPU).
The error:
java.lang.IllegalStateException: failed to create a child event loop
io.netty.channel.ChannelException: failed to open a new selector
My code:
try {
    con.connect();
} catch (Exception e) {
    logger.error("Client: error connect to ip {} and port {}, ", id, ip, port, e);
    return;
}
The code of the connect method is:
public void connect() {
    workerGroup = new NioEventLoopGroup();
    Bootstrap bs = new Bootstrap();
    bs.group(workerGroup).channel(NioSocketChannel.class);
    bs.handler(new ChannelInitializer<SocketChannel>() {
        @Override
        protected void initChannel(SocketChannel ch) throws Exception {
            ch.pipeline().addLast("idleStateHandler", new IdleStateHandler(0, 0, 300));
            ch.pipeline().addLast("idleStateActionHandler", new IdleStateEventHandler());
            ch.pipeline().addLast("logger", new LoggingHandler());
            ch.pipeline().addLast("commandDecoder", new CommandDecoder());
            ch.pipeline().addLast("commandEncoder", new CommandEncoder());
        }
    });
You should use the same NioEventLoopGroup instance for each connect call. Otherwise you will create a lot of threads, and each group opens its own selectors, which is what eventually fails with "failed to open a new selector".
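A minimal sketch of the shared-group variant of connect(), assuming the same pipeline setup as in the question:

// One group, created once and shared by every client connection, instead of a
// new NioEventLoopGroup (with its own threads and selectors) per connect() call.
private static final NioEventLoopGroup SHARED_GROUP = new NioEventLoopGroup();

public void connect() {
    Bootstrap bs = new Bootstrap();
    bs.group(SHARED_GROUP).channel(NioSocketChannel.class);
    bs.handler(new ChannelInitializer<SocketChannel>() {
        @Override
        protected void initChannel(SocketChannel ch) throws Exception {
            // same handlers as in the question: IdleStateHandler, LoggingHandler, codecs
        }
    });
    bs.connect(ip, port);
}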
