Exception while invoking gRPC method - java

I am invoking my Java gRPC service from a PHP client (the PHP client runs in CLI mode for the sake of debugging and will run in FPM mode in production). Yet the connection on the Java side seems to be refused for some reason:
14:28:57.679 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf - -Dio.grpc.netty.shaded.io.netty.buffer.checkAccessible: true
14:28:57.679 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf - -Dio.grpc.netty.shaded.io.netty.buffer.checkBounds: true
14:28:57.681 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector#7663df67
14:28:57.777 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
14:28:57.782 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.maxSharedCapacityFactor: 2
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.linkCapacity: 16
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.ratio: 8
14:28:57.812 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
14:28:57.821 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=4194303, MAX_FRAME_SIZE=4194303, MAX_HEADER_LIST_SIZE=8192, ︃=1}
14:28:57.823 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND SETTINGS: ack=true
14:28:57.825 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=4128768
14:28:57.826 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND SETTINGS: ack=true
14:28:57.839 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND HEADERS: streamId=1 headers=GrpcHttp2RequestHeaders[:path: /microservices.bitrix.leads.BitrixLeadService/CreateLead, :authority: localhost:50051, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, grpc-accept-encoding: identity, deflate, gzip, user-agent: grpc-c/29.0.0 (linux; chttp2)] padding=0 endStream=false
14:28:57.879 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND DATA: streamId=1 padding=0 endStream=true length=55 bytes=00000000320a0b3739323431303137313035120cd09cd0bed181d0bad0b2d0b018d7162212d0a1d182d0b0d0bdd0b8d181d0bbd0b0d0b2
14:28:57.881 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND RST_STREAM: streamId=1 errorCode=8
14:28:57.899 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] OUTBOUND RST_STREAM: streamId=1 errorCode=8
14:28:57.900 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] OUTBOUND GO_AWAY: lastStreamId=1 errorCode=2 length=0 bytes=
14:28:57.902 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] Sending GOAWAY failed: lastStreamId '1', errorCode '2', debugData ''. Forcing shutdown of the connection.
For some reason, a different exception is sometimes thrown, even though no changes were made to the code:
14:19:56.666 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xf3d688e2, L:/127.0.0.1:50051 - R:/127.0.0.1:52836] INBOUND RST_STREAM: streamId=1 errorCode=8
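In both traces the server receives RST_STREAM with errorCode=8, which is the HTTP/2 CANCEL code and surfaces in gRPC as CANCELLED. That usually means the client gave up on the call (for example, a client-side timeout in the PHP stub) rather than the Java server refusing the connection. As a debugging aid, a gRPC server interceptor can log such cancellations explicitly; this is a minimal sketch (the class name is hypothetical) built on the standard io.grpc interceptor API:

import io.grpc.ForwardingServerCallListener;
import io.grpc.Metadata;
import io.grpc.ServerCall;
import io.grpc.ServerCallHandler;
import io.grpc.ServerInterceptor;

// Hypothetical interceptor: logs when the client cancels an in-flight call.
public final class CancellationLoggingInterceptor implements ServerInterceptor {
    @Override
    public <ReqT, RespT> ServerCall.Listener<ReqT> interceptCall(
            ServerCall<ReqT, RespT> call, Metadata headers,
            ServerCallHandler<ReqT, RespT> next) {
        ServerCall.Listener<ReqT> delegate = next.startCall(call, headers);
        return new ForwardingServerCallListener.SimpleForwardingServerCallListener<ReqT>(delegate) {
            @Override
            public void onCancel() {
                // Fires when the peer sends RST_STREAM (CANCEL) for this call.
                System.err.println("Client cancelled "
                        + call.getMethodDescriptor().getFullMethodName());
                super.onCancel();
            }
        };
    }
}

Register it with serverBuilder.intercept(new CancellationLoggingInterceptor()) and check whether the PHP client's timeout option lines up with how long the server takes to respond.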

Related

How do I recover a ManagedChannel that is wrapped by Resilience4J and is in the Half-Open state?

I have a gRPC client proxy implementation that I wrapped in a Resilience4J CircuitBreaker for failure scenarios.
The circuit breaker works fine and it opens, but it never recovers. When it goes back to Half-Open and I hit the gRPC endpoint, I get the same error as I do when it first breaks.
2022-09-18 19:34:11.258 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND HEADERS: streamId=4835 headers=GrpcHttp2OutboundHeaders[:authority: 10.0.9.2:50000, :path: /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.48.1, x-b3-traceid: 632772b3bbb53b9c47dddb9e00629897, x-b3-spanid: 5ad57d4d348297cb, x-b3-parentspanid: 4caf63e791f1a5ce, x-b3-sampled: 1, grpc-accept-encoding: gzip, grpc-timeout: 998924u] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
2022-09-18 19:34:11.259 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND DATA: streamId=4835 padding=0 endStream=true length=8 bytes=00000000033a012a
2022-09-18 19:34:11.260 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND PING: ack=false bytes=1234
2022-09-18 19:34:11.261 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND PING: ack=true bytes=1234
2022-09-18 19:34:11.261 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND HEADERS: streamId=4835 headers=GrpcHttp2ResponseHeaders[:status: 200, content-type: application/grpc, grpc-encoding: identity, grpc-accept-encoding: gzip] padding=0 endStream=false
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND DATA: streamId=4835 padding=0 endStream=false length=96 bytes=000000005b12033a012a32540a2a0a28677270632e7265666c656374696f6e2e7631616c7068612e5365727665725265666c656374696f6e0a260a246e65742e...
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND PING: ack=false bytes=1234
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND HEADERS: streamId=4835 headers=GrpcHttp2ResponseHeaders[grpc-status: 0] padding=0 endStream=true
2022-09-18 19:34:11.263 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND PING: ack=true bytes=1234
2022-09-18 19:34:11.263 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : Window: 1048576
2022-09-18 19:34:16.257 ERROR [,632772b3bbb53b9c47dddb9e00629897,47dddb9e00629897] 1 --- [or-http-epoll-5] request : POST http://localhost:28082/grpc/Echo/echo 503 5005.583ms
Restarting the client makes it work again, so I know the server is still working, but I am trying to avoid restarting the client.
I already put in enableRetry, a deadline, and keepalive (see the channel sketch below); none of them seem to make the channel reconfigure itself.
At least the HTTP service recovery still works, but the gRPC proxying service does not.
The weird part is that it is still receiving pings, according to the logs.
In case you need the source, it's at https://github.com/trajano/spring-cloud-demo/tree/rework
Note that the only way I can trigger this problem is with an artillery script that generates enough load to trip the circuit breaker.
UPDATE: I also tried using a DirectExecutor on the client; still no luck.
UPDATE: Removing resilience4j basically kills the gateway server under load.
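For reference, the channel options mentioned above (enableRetry, keepalive, deadline) look roughly like the following sketch, assuming plaintext to the backend at 10.0.9.2:50000 from the log; the keepalive values here are illustrative placeholders, not the project's actual settings:

import java.util.concurrent.TimeUnit;

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

// Illustrative channel configuration; values are placeholders.
ManagedChannel channel = ManagedChannelBuilder.forAddress("10.0.9.2", 50000)
        .usePlaintext()
        .enableRetry()                          // transparent retries for retryable status codes
        .keepAliveTime(30, TimeUnit.SECONDS)    // ping the connection when idle
        .keepAliveTimeout(10, TimeUnit.SECONDS) // drop it if the ping is not acked in time
        .keepAliveWithoutCalls(true)            // keep pinging even with no active RPCs
        .build();

Note that none of these settings control how Resilience4J leaves the Half-Open state; that depends on the recorded outcomes of the permitted trial calls.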

Not able to call GCP NLP via java cloud sdk

I need technical help with the Google Cloud Services integration in a Spring Boot application.
Introduction: I am trying to fetch the sentiment of a statement using the Google Cloud Java SDK. I have created a Spring Boot application, and from there I'm trying to send the request via the GCP Cloud Language client jar.
Problem: the GCP Cloud Language client jar is not able to make a request to the Google services. It times out after roughly 10 minutes.
Extra info: I have tried the same thing using Python and it works flawlessly in no time. Hence, I can conclude the connection is there and my credentials are working properly.
Project URL: https://github.com/chaundhyan/manas-gateway
Logs:
19:50:25.638 [main] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
Text Received: My Name is Mukul
Document Created: My Name is Mukul
19:50:26.116 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId - -Dio.netty.processId: 3109 (auto-detected)
19:50:26.118 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
19:50:26.119 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
19:50:26.121 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - Loopback interface: lo0 (lo0, 0:0:0:0:0:0:0:1%lo0)
19:50:26.122 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file /proc/sys/net/core/somaxconn. Default: 128
19:50:26.125 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId - -Dio.netty.machineId: a0:99:9b:ff:fe:15:5a:49 (auto-detected)
19:50:26.180 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector#76ed8938
19:50:26.471 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.netty.handler.ssl.SslHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] HANDSHAKEN: protocol:TLSv1.3 cipher suite:TLS_AES_128_GCM_SHA256
19:50:26.476 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
19:50:26.479 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
19:50:26.491 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=100, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=65536}
19:50:26.492 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND SETTINGS: ack=true
19:50:26.493 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
19:50:26.494 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND SETTINGS: ack=true
19:54:26.502 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=17 bytes=73657373696f6e5f74696d65645f6f7574
19:54:26.515 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND PING: ack=false bytes=0
19:54:26.515 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND PING: ack=true bytes=0
19:54:26.525 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=17 bytes=73657373696f6e5f74696d65645f6f7574
Exception in thread "main" com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 599.986342726s.
at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1041)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1215)
at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:983)
at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:771)
at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:563)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
at io.grpc.internal.DelayedClientCall$CloseListenerRunnable.runInContext(DelayedClientCall.java:406)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: com.google.api.gax.rpc.AsyncTaskException: Asynchronous task failed
at com.google.api.gax.rpc.ApiExceptions.callAndTranslateApiException(ApiExceptions.java:57)
at com.google.api.gax.rpc.UnaryCallable.call(UnaryCallable.java:112)
at com.google.cloud.language.v1.LanguageServiceClient.analyzeSentiment(LanguageServiceClient.java:217)
at com.google.cloud.language.v1.LanguageServiceClient.analyzeSentiment(LanguageServiceClient.java:164)
at com.mukul.pro.manasgateway.ManasGatewayApplication.main(ManasGatewayApplication.java:22)
Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 599.986342726s.
at io.grpc.Status.asRuntimeException(Status.java:535)
... 10 more
I can see the first two sysout lines in the logs, but after that it all goes dark.
import com.google.cloud.language.v1.Document;
import com.google.cloud.language.v1.LanguageServiceClient;
import com.google.cloud.language.v1.Sentiment;

String text = "My Name is Mukul";
try (LanguageServiceClient language = LanguageServiceClient.create()) {
    System.out.println("Text Received: " + text);
    Document doc = Document.newBuilder().setContent(text).setType(Document.Type.PLAIN_TEXT).build();
    System.out.println("Document Created: " + doc.getContent());
    // Detects the sentiment of the text
    Sentiment sentiment = language.analyzeSentiment(doc).getDocumentSentiment();
    System.out.println("Got Sentiment: " + sentiment.getScore());
    System.out.printf("Text: %s%n", text);
    System.out.printf("Sentiment: %s, %s%n", sentiment.getScore(), sentiment.getMagnitude());
}
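For what it's worth, the GO_AWAY debug bytes in the log (73657373696f6e5f74696d65645f6f7574), decoded as ASCII, read session_timed_out: the Google frontend is closing the connection before any response arrives, and the client then waits out its full default deadline. To make the failure surface faster while debugging, the call's total timeout can be lowered through the client settings. A minimal sketch following the standard gax retry-settings pattern (the 30-second value is arbitrary; Duration is org.threeten.bp.Duration in gax versions of this era):

import org.threeten.bp.Duration;

import com.google.cloud.language.v1.LanguageServiceClient;
import com.google.cloud.language.v1.LanguageServiceSettings;

// Sketch: cap analyzeSentiment's total timeout at 30s so connectivity
// problems fail fast instead of after the default ~10-minute deadline.
LanguageServiceSettings.Builder builder = LanguageServiceSettings.newBuilder();
builder.analyzeSentimentSettings()
        .setRetrySettings(
                builder.analyzeSentimentSettings()
                        .getRetrySettings()
                        .toBuilder()
                        .setTotalTimeout(Duration.ofSeconds(30))
                        .build());
LanguageServiceClient language = LanguageServiceClient.create(builder.build());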

Error: Caused by: io.netty.handler.codec.http2.Http2Exception: Window size overflow for stream: 0

Error while calling a remote machine using gRPC without SSL.
The first execution works fine. On the second client execution, the error occurs.
On the client side I open and close a channel on each request, like this:
channel = ManagedChannelBuilder.forAddress(host, port).usePlaintext().build();
stub = ApiServiceGrpc.newBlockingStub(channel);
stub.CreateRequest(request);
channel.shutdown().awaitTermination(5, TimeUnit.SECONDS);
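For comparison, gRPC channels are designed to be long-lived and shared across calls. A minimal sketch of the reuse pattern, using the generated ApiServiceGrpc stub from the snippet above (createRequest is assumed to be the generated lowerCamelCase method name):

import java.util.concurrent.TimeUnit;

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;

// Sketch: create the channel once and reuse it for every RPC,
// shutting it down only when the application exits.
ManagedChannel channel = ManagedChannelBuilder.forAddress(host, port)
        .usePlaintext()
        .build();
ApiServiceGrpc.ApiServiceBlockingStub stub = ApiServiceGrpc.newBlockingStub(channel);

stub.createRequest(request);   // first call
stub.createRequest(request);   // subsequent calls reuse the same connection

channel.shutdown().awaitTermination(5, TimeUnit.SECONDS);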
Server side:
"io.grpc:grpc-protobuf:1.23.0"
"io.grpc:grpc-stub:1.23.0"
"io.grpc:grpc-okhttp:1.23.0"
server = NettyServerBuilder
        .forAddress(new InetSocketAddress("0.0.0.0", port))
        .addService(myService)
        .build()
        .start();
Below are the logs, with the client captured at the FINEST logging level.
I tried using everything at defaults, with no customizations:
Using io.grpc:grpc-all:1.23.0
Logs:
Success Execution
Server
2020-03-05 09:51:53,818 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
2020-03-05 09:51:53,842 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
2020-03-05 09:51:53,864 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND SETTINGS: ack=false settings={HEADER_TABLE_SIZE=4096, ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_FRAME_SIZE=16384, MAX_HEADER_LIST_SIZE=8192}
2020-03-05 09:51:53,866 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND SETTINGS: ack=true
2020-03-05 09:51:53,868 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
2020-03-05 09:51:53,869 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND SETTINGS: ack=true
2020-03-05 09:51:53,917 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND HEADERS: streamId=3 headers=GrpcHttp2RequestHeaders[:path: /service.KongApiService/CreateConfig, :authority: machine_02:12577, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, user-agent: grpc-java-netty/1.23.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
2020-03-05 09:51:53,945 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND DATA: streamId=3 padding=0 endStream=true length=276 bytes=000000010f0a0a636f6e6669674e616d65120a636f6e6669674e616d651a0a636f6e6669674e616d65220a636f6e6669674e616d652a0a636f6e6669674e616d...
2020-03-05 09:51:53,995 INFO [grpc-default-executor-0|c.c.s.k.g.KongApiService] Create/Replace configuration: configName
2020-03-05 09:51:54,008 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND HEADERS: streamId=3 headers=GrpcHttp2OutboundHeaders[:status: 200, content-type: application/grpc, grpc-encoding: identity, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
2020-03-05 09:51:54,021 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND DATA: streamId=3 padding=0 endStream=false length=276 bytes=000000010f0a0a636f6e6669674e616d65120a636f6e6669674e616d651a0a636f6e6669674e616d65220a636f6e6669674e616d652a0a636f6e6669674e616d...
2020-03-05 09:51:54,024 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] OUTBOUND HEADERS: streamId=3 headers=GrpcHttp2OutboundHeaders[grpc-status: 0] streamDependency=0 weight=16 exclusive=false padding=0 endStream=true
2020-03-05 09:51:54,057 DEBUG [grpc-nio-worker-ELG-3-1|i.g.n.NettyServerHandler] [id: 0x623f6427, L:/10.0.0.2:12577 - R:/10.0.0.3:51022] INBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=0 bytes=
Client
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND SETTINGS: ack=false settings={HEADER_TABLE_SIZE=4096, MAX_CONCURRENT_STREAMS=100, INITIAL_WINDOW_SIZE=1048576, MAX_FRAME_SIZE=16384, MAX_HEADER_LIST_SIZE=8192}
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND SETTINGS: ack=true
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND SETTINGS: ack=true
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND HEADERS: streamId=3 headers=GrpcHttp2OutboundHeaders[:authority: machine_02:12577, :path: /service.KongApiService/CreateConfig, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.23.0, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
Mar 05, 2020 9:51:51 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND DATA: streamId=3 padding=0 endStream=true length=276 bytes=000000010f0a0a636f6e6669674e616d65120a636f6e6669674e616d651a0a636f6e6669674e616d65220a636f6e6669674e616d652a0a636f6e6669674e616d...
Mar 05, 2020 9:51:52 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND HEADERS: streamId=3 headers=GrpcHttp2ResponseHeaders[:status: 200, content-type: application/grpc, grpc-encoding: identity, grpc-accept-encoding: gzip] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
Mar 05, 2020 9:51:52 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND DATA: streamId=3 padding=0 endStream=false length=276 bytes=000000010f0a0a636f6e6669674e616d65120a636f6e6669674e616d651a0a636f6e6669674e616d65220a636f6e6669674e616d652a0a636f6e6669674e616d...
Mar 05, 2020 9:51:52 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] INBOUND HEADERS: streamId=3 headers=GrpcHttp2ResponseHeaders[grpc-status: 0] streamDependency=0 weight=16 exclusive=false padding=0 endStream=true
Mar 05, 2020 9:51:52 AM io.grpc.netty.NettyClientHandler close FINE: Network channel being closed by the application.
Mar 05, 2020 9:51:52 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x1e2d0467, L:/10.0.0.3:51022 - R:machine_02/10.0.0.2:12577] OUTBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=0 bytes=
Mar 05, 2020 9:51:52 AM io.grpc.netty.NettyClientHandler channelInactive FINE: Network channel is closed
The second execution: the server is still running and the client has been started again.
Error Execution
Server:
2020-03-05 09:52:57,613 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
2020-03-05 09:52:57,617 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
2020-03-05 09:52:57,640 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] INBOUND SETTINGS: ack=false settings={HEADER_TABLE_SIZE=4096, ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_FRAME_SIZE=16384, MAX_HEADER_LIST_SIZE=8192}
2020-03-05 09:52:57,641 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] OUTBOUND SETTINGS: ack=true
2020-03-05 09:52:57,641 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
2020-03-05 09:52:57,658 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] INBOUND SETTINGS: ack=true
2020-03-05 09:52:57,659 DEBUG [grpc-nio-worker-ELG-3-2|i.g.n.NettyServerHandler] [id: 0x40afc9b8, L:/10.0.0.2:12577 - R:/10.0.0.3:51078] INBOUND GO_AWAY: lastStreamId=0 errorCode=3 length=34 bytes=57696e646f772073697a65206f766572666c6f7720666f722073747265616d3a2030
Client:
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] INBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] OUTBOUND SETTINGS: ack=true
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] INBOUND SETTINGS: ack=true
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=2147418112
Mar 05, 2020 9:52:55 AM io.grpc.netty.NettyClientHandler onConnectionError FINE: Caught a connection error io.netty.handler.codec.http2.Http2Exception: Window size overflow for stream: 0
Mar 05, 2020 9:52:55 AM io.netty.util.internal.logging.AbstractInternalLogger log FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] OUTBOUND GO_AWAY: lastStreamId=0 errorCode=3 length=34 bytes=57696e646f772073697a65206f766572666c6f7720666f722073747265616d3a2030
Mar 05, 2020 9:52:55 AM io.netty.handler.codec.http2.Http2ConnectionHandler processGoAwayWriteResult FINE: [id: 0x56cb1182, L:/10.0.0.3:51078 - R:machine_02/10.0.0.2:12577] Sent GOAWAY: lastStreamId '0', errorCode '3', debugData 'Window size overflow for stream: 0'. Forcing shutdown of the connection.
Mar 05, 2020 9:52:55 AM io.grpc.netty.NettyClientHandler channelInactive FINE: Network channel is closed io.grpc.StatusRuntimeException: INTERNAL: http2 exception
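Reading the failing client log: the connection-level send window starts at the HTTP/2 default of 65,535 bytes, the server's first WINDOW_UPDATE adds 983,041, and the second INBOUND WINDOW_UPDATE adds 2,147,418,112, pushing the total past Integer.MAX_VALUE. That is exactly the "Window size overflow for stream: 0" Netty reports. A quick arithmetic check (my interpretation of the log, not authoritative):

// Values taken from the failing client log above.
long initialConnectionWindow = 65_535L;        // HTTP/2 default connection window (RFC 7540 §6.9.2)
long firstIncrement = 983_041L;                // first INBOUND WINDOW_UPDATE on stream 0
long secondIncrement = 2_147_418_112L;         // second, oversized INBOUND WINDOW_UPDATE
long total = initialConnectionWindow + firstIncrement + secondIncrement;
System.out.println(total);                     // 2148466688
System.out.println(total > Integer.MAX_VALUE); // true -> window size overflow for stream 0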

Why doesn't AsyncHttpClient close the thread after throwing an exception?

I'm using the Zendesk Java Client. When I supply the correct credentials, it works as expected. However, I stumbled on a scenario where, if the wrong credentials are passed in, an error is thrown and the thread just hangs. Whether I run the code with or without the debugger, the code just stays running; it doesn't exit or return control to the debugger.
Is this an issue with the library, or am I misunderstanding how the AsyncHttpClient works?
My code is below:
fun zdtestWrongCredentials() {
    val client = asyncHttpClient(
        config()
            .setRequestTimeout(5000)
            .setReadTimeout(5000)
            .setShutdownTimeout(3000)
            .setPooledConnectionIdleTimeout(5000)
            .setKeepAlive(false)
    )
    var zd = Zendesk.Builder("https://website.zendesk.com")
        .setClient(client)
        .setUsername("john.doe@website.com")
        .setPassword("abcd")
        .build()
    var ticket = Ticket(123, "a", Comment("abc"))
    // The code hangs here. It's unclear why it exhibits this behavior.
    var test = zd.createTicket(ticket)
    // The code does not reach this line.
    client.close()
    return
}
Although I'm using Kotlin, I tried replicating this issue in a simple Java project and the issue persists.
The stack trace is below. There is an exception at the bottom, but even after this exception, the program does not exit/give control back to the debugger.
"C:\Program Files\Java\jdk-11.0.4\bin\java.exe"
09:47:01.463 [main] DEBUG io.netty.util.internal.logging.InternalLoggerFactory - Using SLF4J as the default logging framework
09:47:01.480 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple
09:47:01.480 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4
09:47:01.550 [main] DEBUG io.netty.util.internal.PlatformDependent - Platform: Windows
09:47:01.562 [main] DEBUG io.netty.util.internal.PlatformDependent0 - -Dio.netty.noUnsafe: false
09:47:01.562 [main] DEBUG io.netty.util.internal.PlatformDependent0 - Java version: 11
09:47:01.566 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
09:47:01.567 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
09:47:01.568 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Buffer.address: available
09:47:01.575 [main] DEBUG io.netty.util.internal.PlatformDependent0 - direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:224)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:218)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:171)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:213)
at io.netty.util.ResourceLeakDetectorFactory$DefaultResourceLeakDetectorFactory.newResourceLeakDetector(ResourceLeakDetectorFactory.java:201)
at io.netty.util.HashedWheelTimer.<clinit>(HashedWheelTimer.java:89)
at org.asynchttpclient.DefaultAsyncHttpClient.newNettyTimer(DefaultAsyncHttpClient.java:96)
at org.asynchttpclient.DefaultAsyncHttpClient.<init>(DefaultAsyncHttpClient.java:87)
at org.asynchttpclient.Dsl.asyncHttpClient(Dsl.java:32)
at com.website.MainKt.test123(Main.kt:321)
at com.website.MainKt.main(Main.kt:288)
at com.website.MainKt.main(Main.kt)
09:47:01.577 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: available, true
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module #4de5031f
at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:361)
at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:591)
at java.base/java.lang.reflect.Method.invoke(Method.java:558)
at io.netty.util.internal.PlatformDependent0$6.run(PlatformDependent0.java:334)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:325)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:171)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:213)
at io.netty.util.ResourceLeakDetectorFactory$DefaultResourceLeakDetectorFactory.newResourceLeakDetector(ResourceLeakDetectorFactory.java:201)
at io.netty.util.HashedWheelTimer.<clinit>(HashedWheelTimer.java:89)
at org.asynchttpclient.DefaultAsyncHttpClient.newNettyTimer(DefaultAsyncHttpClient.java:96)
at org.asynchttpclient.DefaultAsyncHttpClient.<init>(DefaultAsyncHttpClient.java:87)
at org.asynchttpclient.Dsl.asyncHttpClient(Dsl.java:32)
at com.website.MainKt.test123(Main.kt:321)
at com.website.MainKt.main(Main.kt:288)
at com.website.MainKt.main(Main.kt)
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent - sun.misc.Unsafe: available
09:47:01.634 [main] DEBUG io.netty.util.internal.PlatformDependent - maxDirectMemory: 6404702208 bytes (maybe)
09:47:01.635 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.tmpdir: C:\Users\john~1.doe\AppData\Local\Temp (java.io.tmpdir)
09:47:01.635 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
09:47:01.639 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes
09:47:01.639 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1
09:47:01.649 [main] DEBUG io.netty.util.internal.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available
09:47:01.649 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
09:47:01.649 [main] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector#13acb0d1
09:47:01.705 [main] DEBUG io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
09:47:02.621 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default protocols (JDK): [TLSv1.2, TLSv1.1, TLSv1]
09:47:02.622 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default cipher suites (JDK): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_AES_128_GCM_SHA256, TLS_AES_256_GCM_SHA384]
09:47:02.651 [main] DEBUG io.netty.channel.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
09:47:02.674 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
09:47:02.675 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
09:47:02.714 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
09:47:02.714 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 8
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 8
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.tinyCacheSize: 512
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: true
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.allocator.type: pooled
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384
09:47:03.522 [main] DEBUG org.zendesk.client.v2.Zendesk - Request POST https://website.zendesk.com/api/v2/tickets.json application/json; charset=UTF-8 92 bytes
09:47:03.568 [main] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkAccessible: true
09:47:03.568 [main] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkBounds: true
09:47:03.569 [main] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector#604c5de8
09:47:03.650 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.processId: 12648 (auto-detected)
09:47:03.653 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
09:47:03.653 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
09:47:04.352 [main] DEBUG io.netty.util.NetUtil - Loopback interface: [lo, Software Loopback Interface 1, 127.0.0.1] ([lo, Software Loopback Interface 1, 127.0.0.1], {})
09:47:04.353 [main] DEBUG io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
09:47:05.043 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.machineId: 00:50:b6:ff:fe:ae:6e:01 (auto-detected)
09:47:05.172 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096
09:47:05.172 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxSharedCapacityFactor: 2
09:47:05.173 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.linkCapacity: 16
09:47:05.173 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.ratio: 8
09:47:05.614 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.channel.NettyConnectListener - Using new Channel '[id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443]' for 'POST' to '/api/v2/tickets.json'
09:47:05.630 [AsyncHttpClient-3-1] DEBUG io.netty.handler.ssl.SslHandler - [id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443] HANDSHAKEN: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
09:47:05.810 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.handler.HttpHandler -
Request DefaultFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: UnpooledHeapByteBuf(freed))
POST /api/v2/tickets.json HTTP/1.1
Content-type: application/json; charset=UTF-8
content-length: 92
connection: close
host: website.zendesk.com
authorization: Basic RGVyZWsudG93cmlzc0BkZWxhd2FyZWxpZmUuY29tOlRlczE=
accept: */*
user-agent: AHC/2.1
Response DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
HTTP/1.1 401 Unauthorized
Date: Tue, 20 Aug 2019 13:47:06 GMT
Content-Type: application/json; charset=UTF-8
Content-Length: 37
Connection: close
Set-Cookie: __cfduid=d807076f1918856a9ecbded67e619ee901566308826; expires=Wed, 19-Aug-20 13:47:06 GMT; path=/; domain=.website.zendesk.com; HttpOnly
WWW-Authenticate: Basic realm="Web Password"
Strict-Transport-Security: max-age=31536000;
Cache-Control: no-cache
X-Zendesk-Origin-Server: app23.pod20.usw2.zdsys.com
X-Request-Id: 5094d137ce35e1fa-SEA
X-Runtime: 0.060433
X-Zendesk-Request-Id: 7d89cc0062f1b7e6f05c
Set-Cookie: __cfruid=b0a77a0d73109c7862b0ab39be944601c81b0353-1566308826; path=/; domain=.website.zendesk.com; HttpOnly
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Server: cloudflare
CF-RAY: 5094d137ce35e1fa-ORD
09:47:05.829 [AsyncHttpClient-3-1] INFO org.asynchttpclient.netty.handler.intercept.Unauthorized401Interceptor - Can't handle 401 as auth was already performed
09:47:05.839 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.channel.ChannelManager - Closing Channel [id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443]
09:47:05.844 [AsyncHttpClient-3-1] DEBUG org.zendesk.client.v2.Zendesk - Response HTTP/401 Unauthorized
{"error":"Couldn't authenticate you"}
09:47:05.846 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.AsyncCompletionHandler - HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk$BasicAsyncCompletionHandler.onCompleted(Zendesk.java:1997)
at org.asynchttpclient.AsyncCompletionHandler.onCompleted(AsyncCompletionHandler.java:66)
at org.asynchttpclient.netty.NettyResponseFuture.loadContent(NettyResponseFuture.java:223)
at org.asynchttpclient.netty.NettyResponseFuture.done(NettyResponseFuture.java:258)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.finishUpdate(AsyncHttpClientHandler.java:239)
at org.asynchttpclient.netty.handler.HttpHandler.handleChunk(HttpHandler.java:113)
at org.asynchttpclient.netty.handler.HttpHandler.handleRead(HttpHandler.java:142)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.channelRead(AsyncHttpClientHandler.java:76)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1436)
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1203)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1247)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Exception in thread "main" org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk.complete(Zendesk.java:2252)
at org.zendesk.client.v2.Zendesk.createTicket(Zendesk.java:307)
at com.website.MainKt.test123(Main.kt:349)
at com.website.MainKt.main(Main.kt:288)
at com.website.MainKt.main(Main.kt)
09:47:05.849 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.handler.HttpHandler - Channel Closed: [id: 0xf398d3e1, L:/192.168.108.56:64305 ! R:website.zendesk.com/104.16.54.111:443] with attribute DISCARD
Caused by: org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk$BasicAsyncCompletionHandler.onCompleted(Zendesk.java:1997)
at org.asynchttpclient.AsyncCompletionHandler.onCompleted(AsyncCompletionHandler.java:66)
at org.asynchttpclient.netty.NettyResponseFuture.loadContent(NettyResponseFuture.java:223)
at org.asynchttpclient.netty.NettyResponseFuture.done(NettyResponseFuture.java:258)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.finishUpdate(AsyncHttpClientHandler.java:239)
at org.asynchttpclient.netty.handler.HttpHandler.handleChunk(HttpHandler.java:113)
at org.asynchttpclient.netty.handler.HttpHandler.handleRead(HttpHandler.java:142)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.channelRead(AsyncHttpClientHandler.java:76)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1436)
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1203)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1247)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Depending on how the async client is implemented, it can throw an error in one thread/coroutine and then wait for a completion message or trigger in another part of its code.
Because the error was thrown, the trigger is never sent, and so the thread hangs in perpetuity.
I "built" a problem like this once, so yes it is possible.
Obviously, I don't know if this is actually the issue at hand.
Cheers and all the best!
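A hedged illustration of that theory: if the exception propagates out of main before client.close() runs, AsyncHttpClient's non-daemon event-loop threads keep the JVM alive. A minimal Java sketch of the defensive pattern, using the same placeholder values as the question's snippet and assuming the hang is simply the unclosed client:

import java.io.IOException;

import org.asynchttpclient.AsyncHttpClient;
import org.asynchttpclient.Dsl;
import org.zendesk.client.v2.Zendesk;
import org.zendesk.client.v2.model.Comment;
import org.zendesk.client.v2.model.Ticket;

// Sketch: close the client in finally so its non-daemon worker threads
// are released even when createTicket throws (e.g. on HTTP 401).
static void zdTestWrongCredentials() throws IOException {
    AsyncHttpClient client = Dsl.asyncHttpClient(
            Dsl.config().setRequestTimeout(5000).setReadTimeout(5000));
    try {
        Zendesk zd = new Zendesk.Builder("https://website.zendesk.com")
                .setClient(client)
                .setUsername("john.doe@website.com")
                .setPassword("abcd")
                .build();
        zd.createTicket(new Ticket(123L, "a", new Comment("abc")));
    } finally {
        client.close(); // without this, the JVM never exits after the exception
    }
}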

SSL handshake times out randomly on local server (NettyClientHandler)

The SSL handler randomly times out: it works about 30% of the time and times out the other 70%. I am enabling SSL in gRPC hosted on Kubernetes. It works without timeouts from outside the cluster and from inside the cluster, but it times out when calling from the container where the server is running.
Any help is greatly appreciated; I will post more details if needed. A sketch of the client-side TLS setup follows, and the full debug log is below.
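For context, a minimal sketch of a TLS-enabled grpc-java client channel, assuming a CA certificate file (the path is a placeholder) and the localhost:6868 address from the log; this is the standard GrpcSslContexts/NettyChannelBuilder wiring, not the poster's exact code:

import java.io.File;

import javax.net.ssl.SSLException;

import io.grpc.ManagedChannel;
import io.grpc.netty.GrpcSslContexts;
import io.grpc.netty.NettyChannelBuilder;
import io.netty.handler.ssl.SslContext;

// Sketch: standard TLS client setup; /path/to/ca.crt is a placeholder.
static ManagedChannel buildTlsChannel() throws SSLException {
    SslContext sslContext = GrpcSslContexts.forClient()
            .trustManager(new File("/path/to/ca.crt")) // trust the server's CA
            .build();
    return NettyChannelBuilder.forAddress("localhost", 6868)
            .sslContext(sslContext)
            .build();
}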
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] SETTINGS: ack=false,
settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
------------------------------------
2017-10-16 04:26:10/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] WINDOW_UPDATE: streamId=0, windowSizeIncrement=983041
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] GO_AWAY: lastStreamId=0, errorCode=0, length=0, bytes=
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x8146a138, L:/127.0.0.1:51160 - R:localhost/127.0.0.1:6868] SETTINGS: ack=false,
settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x8146a138, L:/127.0.0.1:51160 - R:localhost/127.0.0.1:6868] WINDOW_UPDATE: streamId=0, windowSizeIncrement=983041
------------------------------------
Exception in thread "main" io.grpc.StatusRuntimeException: UNAVAILABLE
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:227)
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:208)
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
at com.x.x.x.commons.probes.TestProbesGrpc$TestProbesBlockingStub.liveness(TestProbesGrpc.java:147)
at com.x.x.x.client.TestClient.main(TestClient.java:69)
Caused by: javax.net.ssl.SSLException: handshake timed out
at io.netty.handler.ssl.SslHandler.handshake(...)(Unknown Source)
2017-10-16 04:26:23/GMT DEBUG io.netty.handler.codec.http2.Http2ConnectionHandler:83 -
[id: 0x993a7494, L:/127.0.0.1:51082 ! R:localhost/127.0.0.1:6868] Sending GOAWAY failed: lastStreamId '0', errorCode '0',
debugData ''. Forcing shutdown of the connection.
java.nio.channels.ClosedChannelException: null
