Not able to call GCP NLP via java cloud sdk - java

I need help integrating Google Cloud services with a Spring Boot application.
Introduction: I am trying to fetch the sentiment of a statement using the Google Cloud Java SDK. I have created a Spring Boot application, and from there I'm trying to send the request via the GCP Cloud Language client jar.
Problem: The GCP Cloud Language client jar is not able to make a request to the Google services. It times out after roughly 10 to 15 minutes.
Extra Info: I have tried the same thing using Python and it works flawlessly, in no time. Hence, I can conclude the connection is there and my credentials are working properly.
Project URL: https://github.com/chaundhyan/manas-gateway
Logs:
19:50:25.638 [main] DEBUG io.grpc.netty.shaded.io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
Text Received: My Name is Mukul
Document Created: My Name is Mukul
19:50:26.116 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId - -Dio.netty.processId: 3109 (auto-detected)
19:50:26.118 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
19:50:26.119 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
19:50:26.121 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - Loopback interface: lo0 (lo0, 0:0:0:0:0:0:0:1%lo0)
19:50:26.122 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file /proc/sys/net/core/somaxconn. Default: 128
19:50:26.125 [grpc-default-executor-0] DEBUG io.grpc.netty.shaded.io.netty.channel.DefaultChannelId - -Dio.netty.machineId: a0:99:9b:ff:fe:15:5a:49 (auto-detected)
19:50:26.180 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector@76ed8938
19:50:26.471 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.netty.handler.ssl.SslHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] HANDSHAKEN: protocol:TLSv1.3 cipher suite:TLS_AES_128_GCM_SHA256
19:50:26.476 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
19:50:26.479 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
19:50:26.491 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=100, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=65536}
19:50:26.492 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND SETTINGS: ack=true
19:50:26.493 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
19:50:26.494 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND SETTINGS: ack=true
19:54:26.502 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND GO_AWAY: lastStreamId=2147483647 errorCode=0 length=17 bytes=73657373696f6e5f74696d65645f6f7574
19:54:26.515 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND PING: ack=false bytes=0
19:54:26.515 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] OUTBOUND PING: ack=true bytes=0
19:54:26.525 [grpc-nio-worker-ELG-1-3] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyClientHandler - [id: 0x5e9a3940, L:/192.168.1.3:51891 - R:language.googleapis.com/142.250.183.74:443] INBOUND GO_AWAY: lastStreamId=0 errorCode=0 length=17 bytes=73657373696f6e5f74696d65645f6f7574
Exception in thread "main" com.google.api.gax.rpc.DeadlineExceededException: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 599.986342726s.
at com.google.api.gax.rpc.ApiExceptionFactory.createException(ApiExceptionFactory.java:51)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:72)
at com.google.api.gax.grpc.GrpcApiExceptionFactory.create(GrpcApiExceptionFactory.java:60)
at com.google.api.gax.grpc.GrpcExceptionCallable$ExceptionTransformingFuture.onFailure(GrpcExceptionCallable.java:97)
at com.google.api.core.ApiFutures$1.onFailure(ApiFutures.java:68)
at com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1041)
at com.google.common.util.concurrent.DirectExecutor.execute(DirectExecutor.java:30)
at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:1215)
at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:983)
at com.google.common.util.concurrent.AbstractFuture.setException(AbstractFuture.java:771)
at io.grpc.stub.ClientCalls$GrpcFuture.setException(ClientCalls.java:563)
at io.grpc.stub.ClientCalls$UnaryStreamToFuture.onClose(ClientCalls.java:533)
at io.grpc.internal.DelayedClientCall$CloseListenerRunnable.runInContext(DelayedClientCall.java:406)
at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Suppressed: com.google.api.gax.rpc.AsyncTaskException: Asynchronous task failed
at com.google.api.gax.rpc.ApiExceptions.callAndTranslateApiException(ApiExceptions.java:57)
at com.google.api.gax.rpc.UnaryCallable.call(UnaryCallable.java:112)
at com.google.cloud.language.v1.LanguageServiceClient.analyzeSentiment(LanguageServiceClient.java:217)
at com.google.cloud.language.v1.LanguageServiceClient.analyzeSentiment(LanguageServiceClient.java:164)
at com.mukul.pro.manasgateway.ManasGatewayApplication.main(ManasGatewayApplication.java:22)
Caused by: io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: Deadline exceeded after 599.986342726s.
at io.grpc.Status.asRuntimeException(Status.java:535)
... 10 more
I can see the first two System.out.println lines in the logs, but after that everything goes dark. Here is the code:
String text = "My Name is Mukul";
try (LanguageServiceClient language = LanguageServiceClient.create()) {
    System.out.println("Text Received: " + text);
    Document doc = Document.newBuilder()
            .setContent(text)
            .setType(Document.Type.PLAIN_TEXT)
            .build();
    System.out.println("Document Created: " + doc.getContent());
    // Detects the sentiment of the text
    Sentiment sentiment = language.analyzeSentiment(doc).getDocumentSentiment();
    System.out.println("Got Sentiment: " + sentiment.getScore());
    System.out.printf("Text: %s%n", text);
    System.out.printf("Sentiment: %s, %s%n", sentiment.getScore(), sentiment.getMagnitude());
}
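For reference, the debugData on the two INBOUND GO_AWAY frames hex-decodes to "session_timed_out", and the call only fails once the roughly 600 s total timeout shown in the exception message runs out. A minimal sketch (assuming the standard google-cloud-language settings API; the 30 s value is an arbitrary example, and this does not fix the underlying timeout) of shortening the analyzeSentiment deadline so the call fails fast instead of hanging for ten minutes:

import com.google.cloud.language.v1.LanguageServiceClient;
import com.google.cloud.language.v1.LanguageServiceSettings;
import org.threeten.bp.Duration;

// Sketch: tighten the total timeout for analyzeSentiment so a stuck
// connection surfaces quickly instead of after ~600 s.
LanguageServiceSettings.Builder settingsBuilder = LanguageServiceSettings.newBuilder();
settingsBuilder
        .analyzeSentimentSettings()
        .setRetrySettings(
                settingsBuilder
                        .analyzeSentimentSettings()
                        .getRetrySettings()
                        .toBuilder()
                        .setTotalTimeout(Duration.ofSeconds(30))
                        .build());
LanguageServiceClient language = LanguageServiceClient.create(settingsBuilder.build());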

Related

Exception while invoking gRPC method

I am invoking my Java gRPC service from a PHP client (the PHP client runs in CLI mode for the sake of debugging and will run in FPM mode in production). Yet the connection on the Java side seems to be refused for some reason:
14:28:57.679 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf - -Dio.grpc.netty.shaded.io.netty.buffer.checkAccessible: true
14:28:57.679 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.buffer.AbstractByteBuf - -Dio.grpc.netty.shaded.io.netty.buffer.checkBounds: true
14:28:57.681 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.grpc.netty.shaded.io.netty.util.ResourceLeakDetector@7663df67
14:28:57.777 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND SETTINGS: ack=false settings={MAX_CONCURRENT_STREAMS=2147483647, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
14:28:57.782 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.maxSharedCapacityFactor: 2
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.linkCapacity: 16
14:28:57.783 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.util.Recycler - -Dio.netty.recycler.ratio: 8
14:28:57.812 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=983041
14:28:57.821 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND SETTINGS: ack=false settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=4194303, MAX_FRAME_SIZE=4194303, MAX_HEADER_LIST_SIZE=8192, ︃=1}
14:28:57.823 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] OUTBOUND SETTINGS: ack=true
14:28:57.825 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND WINDOW_UPDATE: streamId=0 windowSizeIncrement=4128768
14:28:57.826 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND SETTINGS: ack=true
14:28:57.839 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND HEADERS: streamId=1 headers=GrpcHttp2RequestHeaders[:path: /microservices.bitrix.leads.BitrixLeadService/CreateLead, :authority: localhost:50051, :method: POST, :scheme: http, te: trailers, content-type: application/grpc, grpc-accept-encoding: identity, deflate, gzip, user-agent: grpc-c/29.0.0 (linux; chttp2)] padding=0 endStream=false
14:28:57.879 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND DATA: streamId=1 padding=0 endStream=true length=55 bytes=00000000320a0b3739323431303137313035120cd09cd0bed181d0bad0b2d0b018d7162212d0a1d182d0b0d0bdd0b8d181d0bbd0b0d0b2
14:28:57.881 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 - R:/127.0.0.1:57746] INBOUND RST_STREAM: streamId=1 errorCode=8
14:28:57.899 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] OUTBOUND RST_STREAM: streamId=1 errorCode=8
14:28:57.900 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] OUTBOUND GO_AWAY: lastStreamId=1 errorCode=2 length=0 bytes=
14:28:57.902 [grpc-default-worker-ELG-3-1] DEBUG io.grpc.netty.shaded.io.netty.handler.codec.http2.Http2ConnectionHandler - [id: 0xd418fb07, L:/127.0.0.1:50051 ! R:/127.0.0.1:57746] Sending GOAWAY failed: lastStreamId '1', errorCode '2', debugData ''. Forcing shutdown of the connection.
For some reason, another exception may be thrown, even though no changes were made to the code:
14:19:56.666 [grpc-default-worker-ELG-3-2] DEBUG io.grpc.netty.shaded.io.grpc.netty.NettyServerHandler - [id: 0xf3d688e2, L:/127.0.0.1:50051 - R:/127.0.0.1:52836] INBOUND RST_STREAM: streamId=1 errorCode=8

How do I recover a ManagedChannel that is wrapped by Resilience4j and is in half-open state?

I have a gRPC client proxy implementation that I wrapped in a Resilience4j CircuitBreaker for failure scenarios.
The circuit breaker works fine and it opens, but it never recovers. When it goes back to half-open and I hit the gRPC endpoint, I get the same error as I do when it first breaks.
2022-09-18 19:34:11.258 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND HEADERS: streamId=4835 headers=GrpcHttp2OutboundHeaders[:authority: 10.0.9.2:50000, :path: /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo, :method: POST, :scheme: http, content-type: application/grpc, te: trailers, user-agent: grpc-java-netty/1.48.1, x-b3-traceid: 632772b3bbb53b9c47dddb9e00629897, x-b3-spanid: 5ad57d4d348297cb, x-b3-parentspanid: 4caf63e791f1a5ce, x-b3-sampled: 1, grpc-accept-encoding: gzip, grpc-timeout: 998924u] streamDependency=0 weight=16 exclusive=false padding=0 endStream=false
2022-09-18 19:34:11.259 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND DATA: streamId=4835 padding=0 endStream=true length=8 bytes=00000000033a012a
2022-09-18 19:34:11.260 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND PING: ack=false bytes=1234
2022-09-18 19:34:11.261 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND PING: ack=true bytes=1234
2022-09-18 19:34:11.261 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND HEADERS: streamId=4835 headers=GrpcHttp2ResponseHeaders[:status: 200, content-type: application/grpc, grpc-encoding: identity, grpc-accept-encoding: gzip] padding=0 endStream=false
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND DATA: streamId=4835 padding=0 endStream=false length=96 bytes=000000005b12033a012a32540a2a0a28677270632e7265666c656374696f6e2e7631616c7068612e5365727665725265666c656374696f6e0a260a246e65742e...
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] OUTBOUND PING: ack=false bytes=1234
2022-09-18 19:34:11.262 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND HEADERS: streamId=4835 headers=GrpcHttp2ResponseHeaders[grpc-status: 0] padding=0 endStream=true
2022-09-18 19:34:11.263 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : [id: 0x13ba49c9, L:/10.0.9.172:53308 - R:10.0.9.2/10.0.9.2:50000] INBOUND PING: ack=true bytes=1234
2022-09-18 19:34:11.263 DEBUG [,,] 1 --- [-worker-ELG-6-4] io.grpc.netty.NettyClientHandler : Window: 1048576
2022-09-18 19:34:16.257 ERROR [,632772b3bbb53b9c47dddb9e00629897,47dddb9e00629897] 1 --- [or-http-epoll-5] request : POST http://localhost:28082/grpc/Echo/echo 503 5005.583ms
Restarting the client makes it work again, so I know the server is still working, but I am trying to avoid restarting the client.
I already put in enableRetry, deadline, and keepalive; none of them seem to make the channel reconfigure itself.
At least the HTTP service recovery still works, but the gRPC proxying service does not.
The weird part is that it is still getting pings, according to the logs.
In case you need the source, it's at https://github.com/trajano/spring-cloud-demo/tree/rework
Note that the only way I can trigger this problem is with an Artillery script that puts enough load on the system to trip the circuit breaker.
UPDATE: I also tried using a DirectExecutor on the client; still no luck.
UPDATE: Removing resilience4j basically kills the gateway server under load.
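For context, this is roughly what those options look like on a plain grpc-java ManagedChannelBuilder; the host, port, and values below are illustrative placeholders taken from the log, not the actual gateway configuration:

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import java.util.concurrent.TimeUnit;

// Placeholder target and values; illustrative only.
ManagedChannel channel = ManagedChannelBuilder
        .forAddress("10.0.9.2", 50000)
        .usePlaintext()
        .enableRetry()                         // allow transparent / service-config retries
        .keepAliveTime(30, TimeUnit.SECONDS)   // send HTTP/2 PINGs to detect dead connections
        .keepAliveWithoutCalls(true)
        .build();

// Deadlines are per call rather than per channel, e.g. on a generated stub:
// stub.withDeadlineAfter(5, TimeUnit.SECONDS).someCall(request);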

Blocked vertx threads

I must learn to develop microservices using:
Java 8
Vertx 3.9
Maven 3.6
MongoDB over Docker 19.03.1
And as an IDE I am using Intellij
The code I'm working on is this:
import io.vertx.core.AbstractVerticle;
import io.vertx.core.Future;
import io.vertx.core.Promise;
import io.vertx.core.Vertx;
import io.vertx.core.http.HttpServer;
import io.vertx.core.json.Json;
import io.vertx.core.json.JsonObject;
import io.vertx.ext.mongo.MongoClient;
import io.vertx.ext.web.Router;
import io.vertx.ext.web.RoutingContext;
import io.vertx.ext.web.handler.BodyHandler;

public class MainVerticle extends AbstractVerticle {

    MongoClient mongoSharedClient = null;

    private Future<Void> prepareDatabase() {
        Promise<Void> promises = Promise.promise();
        JsonObject config = Vertx.currentContext().config();
        String uri = config.getString("mongo_uri");
        if (uri == null) {
            uri = "mongodb://localhost:27017";
        }
        String database = config.getString("mongo_db");
        if (database == null) {
            database = "test";
        }
        JsonObject configMongo = new JsonObject();
        configMongo.put("connection_string", uri);
        configMongo.put("db_name", database);
        mongoSharedClient = MongoClient.create(vertx, configMongo);
        if (mongoSharedClient != null) {
            promises.complete();
        } else {
            promises.fail("Error in Database");
        }
        return promises.future();
    }

    @Override
    public void start(Promise<Void> startFuture) throws Exception {
        prepareDatabase().compose(as -> HttpServer()).onComplete(asyn -> {
            if (asyn.succeeded()) {
                startFuture.complete();
            } else {
                startFuture.fail("Error");
            }
        });
    }

    private Future<Void> HttpServer() {
        Promise<Void> promises = Promise.promise();
        HttpServer server = vertx.createHttpServer();
        Router router = Router.router(vertx);
        router.get("/test/").handler(this::pruebaRuta);
        router.post("/create/").handler(this::createPrueba);
        router.post().handler(BodyHandler.create());
        server.requestHandler(router).listen(9090, ar -> {
            if (ar.succeeded()) {
                promises.complete();
            } else {
                promises.fail(ar.cause());
            }
        });
        return promises.future();
    }

    private void createPrueba(RoutingContext routingContext) {
        JsonObject data = routingContext.getBodyAsJson();
        mongoSharedClient.insert("User", data, result -> {
            if (result.succeeded()) {
                routingContext.response().setStatusCode(200).putHeader("Content-Type", "text/html").end("Operation Successful");
            } else {
                routingContext.response().setStatusCode(400).putHeader("Content-Type", "text/html").end(result.cause().getMessage());
            }
        });
        routingContext.response().setStatusCode(200).putHeader("Content-Type", "Application/Json; charset=utf-8").end(Json.encodePrettily(data));
    }

    private void pruebaRuta(RoutingContext routingContext) {
        routingContext.response().setStatusCode(200).putHeader("Content-Type", "text/html").end("Success Execute!");
    }
}
At the moment I only want to do small tests with Postman; the problem is that when I run the project I get the following:
Connected to the target VM, address: '127.0.0.1:51951', transport: 'socket'
18:36:02.095 [main] DEBUG io.netty.util.internal.logging.InternalLoggerFactory - Using SLF4J as the default logging framework
18:36:02.108 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple
18:36:02.108 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4
18:36:02.393 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
18:36:02.393 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
18:36:02.578 [main] DEBUG io.netty.channel.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
18:36:02.824 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
18:36:02.824 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
18:36:02.874 [main] DEBUG io.netty.util.internal.PlatformDependent - Platform: Windows
18:36:02.878 [main] DEBUG io.netty.util.internal.PlatformDependent0 - -Dio.netty.noUnsafe: false
18:36:02.880 [main] DEBUG io.netty.util.internal.PlatformDependent0 - Java version: 8
18:36:02.883 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
18:36:02.886 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
18:36:02.888 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Buffer.address: available
18:36:02.889 [main] DEBUG io.netty.util.internal.PlatformDependent0 - direct buffer constructor: available
18:36:02.892 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: available, true
18:36:02.892 [main] DEBUG io.netty.util.internal.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
18:36:02.892 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): available
18:36:02.892 [main] DEBUG io.netty.util.internal.PlatformDependent - sun.misc.Unsafe: available
18:36:02.894 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.tmpdir: C:\Users\kathy\AppData\Local\Temp (java.io.tmpdir)
18:36:02.894 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
18:36:02.899 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.maxDirectMemory: 934281216 bytes
18:36:02.899 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1
18:36:02.902 [main] DEBUG io.netty.util.internal.CleanerJava6 - java.nio.ByteBuffer.cleaner(): available
18:36:02.902 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
18:36:02.924 [main] DEBUG io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
18:36:06.345 [main] DEBUG io.netty.resolver.dns.DefaultDnsServerAddressStreamProvider - Default DNS servers: [/1.1.1.1:53, /8.8.8.8:53] (sun.net.dns.ResolverConfiguration)
18:36:08.487 [vert.x-eventloop-thread-0] INFO org.mongodb.driver.cluster - Cluster created with settings {hosts=[localhost:27017], mode=SINGLE, requiredClusterType=UNKNOWN, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500}
18:36:08.619 [vert.x-eventloop-thread-0] DEBUG org.mongodb.driver.cluster - Updating cluster description to {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING}]
jul 07, 2020 6:36:09 PM io.vertx.core.impl.BlockedThreadChecker
ADVERTENCIA: Thread Thread[vert.x-eventloop-thread-0,5,main]=Thread[vert.x-eventloop-thread-0,5,main] has been blocked for 2459 ms, time limit is 2000 ms
18:36:10.012 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 8
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 8
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.tinyCacheSize: 512
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
18:36:10.013 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
18:36:10.014 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
18:36:10.014 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimIntervalMillis: 0
18:36:10.014 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: true
18:36:10.014 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
18:36:10.327 [vert.x-eventloop-thread-0] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
18:36:10.328 [vert.x-eventloop-thread-0] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
jul 07, 2020 6:36:10 PM io.vertx.core.impl.BlockedThreadChecker
ADVERTENCIA: Thread Thread[vert.x-eventloop-thread-0,5,main]=Thread[vert.x-eventloop-thread-0,5,main] has been blocked for 3485 ms, time limit is 2000 ms
18:36:11.188 [vert.x-eventloop-thread-0] DEBUG io.netty.util.NetUtil - Loopback interface: lo (Software Loopback Interface 1, 127.0.0.1)
18:36:11.189 [vert.x-eventloop-thread-0] DEBUG io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
jul 07, 2020 6:36:11 PM io.vertx.core.impl.BlockedThreadChecker
ADVERTENCIA: Thread Thread[vert.x-eventloop-thread-0,5,main]=Thread[vert.x-eventloop-thread-0,5,main] has been blocked for 4489 ms, time limit is 2000 ms
18:36:12.550 [vert.x-eventloop-thread-0] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.processId: 6252 (auto-detected)
jul 07, 2020 6:36:12 PM io.vertx.core.impl.BlockedThreadChecker
ADVERTENCIA: Thread Thread[vert.x-eventloop-thread-0,5,main]=Thread[vert.x-eventloop-thread-0,5,main] has been blocked for 5489 ms, time limit is 2000 ms
io.vertx.core.VertxException: Thread blocked
at java.net.NetworkInterface.getAll(Native Method)
at java.net.NetworkInterface.getNetworkInterfaces(NetworkInterface.java:355)
at io.netty.util.internal.MacAddressUtil.bestAvailableMac(MacAddressUtil.java:55)
at io.netty.util.internal.MacAddressUtil.defaultMachineId(MacAddressUtil.java:138)
at io.netty.channel.DefaultChannelId.<clinit>(DefaultChannelId.java:99)
at io.netty.channel.AbstractChannel.newId(AbstractChannel.java:101)
at io.netty.channel.AbstractChannel.<init>(AbstractChannel.java:73)
at io.netty.channel.nio.AbstractNioChannel.<init>(AbstractNioChannel.java:80)
at io.netty.channel.nio.AbstractNioMessageChannel.<init>(AbstractNioMessageChannel.java:42)
at io.netty.channel.socket.nio.NioDatagramChannel.<init>(NioDatagramChannel.java:150)
at io.netty.channel.socket.nio.NioDatagramChannel.<init>(NioDatagramChannel.java:118)
at io.vertx.core.net.impl.transport.Transport.datagramChannel(Transport.java:162)
at io.vertx.core.impl.resolver.DnsResolverProvider$1.lambda$newResolver$0(DnsResolverProvider.java:136)
at io.vertx.core.impl.resolver.DnsResolverProvider$1$$Lambda$43/1292567456.newChannel(Unknown Source)
at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:310)
at io.netty.bootstrap.AbstractBootstrap.register(AbstractBootstrap.java:227)
at io.netty.resolver.dns.DnsNameResolver.<init>(DnsNameResolver.java:451)
at io.netty.resolver.dns.DnsNameResolverBuilder.build(DnsNameResolverBuilder.java:473)
at io.vertx.core.impl.resolver.DnsResolverProvider$1$1.newNameResolver(DnsResolverProvider.java:186)
at io.netty.resolver.dns.DnsAddressResolverGroup.newResolver(DnsAddressResolverGroup.java:91)
at io.netty.resolver.dns.DnsAddressResolverGroup.newResolver(DnsAddressResolverGroup.java:76)
at io.netty.resolver.AddressResolverGroup.getResolver(AddressResolverGroup.java:70)
at io.vertx.core.impl.resolver.DnsResolverProvider$1.newResolver(DnsResolverProvider.java:190)
at io.netty.resolver.AddressResolverGroup.getResolver(AddressResolverGroup.java:70)
at io.vertx.core.impl.AddressResolver.resolveHostname(AddressResolver.java:82)
at io.vertx.core.impl.VertxImpl.resolveAddress(VertxImpl.java:810)
at io.vertx.core.net.impl.AsyncResolveConnectHelper.doBind(AsyncResolveConnectHelper.java:56)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:253)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:188)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:184)
at com.lakatuna.com.MainVerticle.HttpServer(MainVerticle.java:75)
at com.lakatuna.com.MainVerticle.lambda$start$0(MainVerticle.java:56)
at com.lakatuna.com.MainVerticle$$Lambda$30/752448968.apply(Unknown Source)
at io.vertx.core.Future.lambda$compose$3(Future.java:363)
at io.vertx.core.Future$$Lambda$32/767632927.handle(Unknown Source)
at io.vertx.core.impl.FutureImpl.dispatch(FutureImpl.java:105)
at io.vertx.core.impl.FutureImpl.onComplete(FutureImpl.java:83)
at io.vertx.core.Future.compose(Future.java:359)
at io.vertx.core.Future.compose(Future.java:331)
at com.lakatuna.com.MainVerticle.start(MainVerticle.java:56)
at io.vertx.core.impl.DeploymentManager.lambda$doDeploy$9(DeploymentManager.java:556)
at io.vertx.core.impl.DeploymentManager$$Lambda$9/726379593.handle(Unknown Source)
at io.vertx.core.impl.ContextImpl.executeTask(ContextImpl.java:369)
at io.vertx.core.impl.EventLoopContext.lambda$executeAsync$0(EventLoopContext.java:38)
at io.vertx.core.impl.EventLoopContext$$Lambda$10/1212772528.run(Unknown Source)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:164)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
18:36:13.149 [vert.x-eventloop-thread-0] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.machineId: 9c:ad:97:ff:fe:8b:00:df (auto-detected)
18:36:13.342 [cluster-ClusterId{value='5f0506e849515074214c3f60', description='null'}-localhost:27017] DEBUG org.mongodb.driver.connection - Closing connection connectionId{localValue:1}
18:36:13.602 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.allocator.type: pooled
18:36:13.602 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0
18:36:13.602 [vert.x-eventloop-thread-0] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384
18:36:13.635 [cluster-ClusterId{value='5f0506e849515074214c3f60', description='null'}-localhost:27017] INFO org.mongodb.driver.cluster - Exception in monitor thread while connecting to server localhost:27017
com.mongodb.MongoSocketOpenException: Exception opening socket
at com.mongodb.internal.connection.AsynchronousSocketChannelStream$OpenCompletionHandler.failed(AsynchronousSocketChannelStream.java:117)
at sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:128)
at sun.nio.ch.Invoker.invokeDirect(Invoker.java:157)
at sun.nio.ch.Invoker.invoke(Invoker.java:185)
at sun.nio.ch.Invoker.invoke(Invoker.java:297)
at sun.nio.ch.WindowsAsynchronousSocketChannelImpl$ConnectTask.failed(WindowsAsynchronousSocketChannelImpl.java:302)
at sun.nio.ch.Iocp$EventHandlerTask.run(Iocp.java:399)
at sun.nio.ch.AsynchronousChannelGroupImpl$1.run(AsynchronousChannelGroupImpl.java:112)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: El equipo remoto rechazó la conexión de red.
at sun.nio.ch.Iocp.translateErrorToIOException(Iocp.java:309)
at sun.nio.ch.Iocp.access$700(Iocp.java:46)
... 5 common frames omitted
jul 07, 2020 6:36:13 PM io.vertx.core.impl.BlockedThreadChecker
ADVERTENCIA: Thread Thread[vert.x-eventloop-thread-0,5,main]=Thread[vert.x-eventloop-thread-0,5,main] has been blocked for 6490 ms, time limit is 2000 ms
io.vertx.core.VertxException: Thread blocked
at java.net.DatagramSocket$1.run(DatagramSocket.java:312)
at java.net.DatagramSocket$1.run(DatagramSocket.java:309)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.DatagramSocket.checkOldImpl(DatagramSocket.java:308)
at java.net.DatagramSocket.<init>(DatagramSocket.java:211)
at sun.nio.ch.DatagramSocketAdaptor.<init>(DatagramSocketAdaptor.java:57)
at sun.nio.ch.DatagramSocketAdaptor.create(DatagramSocketAdaptor.java:63)
at sun.nio.ch.DatagramChannelImpl.socket(DatagramChannelImpl.java:173)
at io.netty.channel.socket.nio.NioDatagramChannelConfig.<init>(NioDatagramChannelConfig.java:117)
at io.netty.channel.socket.nio.NioDatagramChannel.<init>(NioDatagramChannel.java:151)
at io.netty.channel.socket.nio.NioDatagramChannel.<init>(NioDatagramChannel.java:118)
at io.vertx.core.net.impl.transport.Transport.datagramChannel(Transport.java:162)
at io.vertx.core.impl.resolver.DnsResolverProvider$1.lambda$newResolver$0(DnsResolverProvider.java:136)
at io.vertx.core.impl.resolver.DnsResolverProvider$1$$Lambda$43/1292567456.newChannel(Unknown Source)
at io.netty.bootstrap.AbstractBootstrap.initAndRegister(AbstractBootstrap.java:310)
at io.netty.bootstrap.AbstractBootstrap.register(AbstractBootstrap.java:227)
at io.netty.resolver.dns.DnsNameResolver.<init>(DnsNameResolver.java:451)
at io.netty.resolver.dns.DnsNameResolverBuilder.build(DnsNameResolverBuilder.java:473)
at io.vertx.core.impl.resolver.DnsResolverProvider$1$1.newNameResolver(DnsResolverProvider.java:186)
at io.netty.resolver.dns.DnsAddressResolverGroup.newResolver(DnsAddressResolverGroup.java:91)
at io.netty.resolver.dns.DnsAddressResolverGroup.newResolver(DnsAddressResolverGroup.java:76)
at io.netty.resolver.AddressResolverGroup.getResolver(AddressResolverGroup.java:70)
at io.vertx.core.impl.resolver.DnsResolverProvider$1.newResolver(DnsResolverProvider.java:190)
at io.netty.resolver.AddressResolverGroup.getResolver(AddressResolverGroup.java:70)
at io.vertx.core.impl.AddressResolver.resolveHostname(AddressResolver.java:82)
at io.vertx.core.impl.VertxImpl.resolveAddress(VertxImpl.java:810)
at io.vertx.core.net.impl.AsyncResolveConnectHelper.doBind(AsyncResolveConnectHelper.java:56)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:253)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:188)
at io.vertx.core.http.impl.HttpServerImpl.listen(HttpServerImpl.java:184)
at com.lakatuna.com.MainVerticle.HttpServer(MainVerticle.java:75)
at com.lakatuna.com.MainVerticle.lambda$start$0(MainVerticle.java:56)
at com.lakatuna.com.MainVerticle$$Lambda$30/752448968.apply(Unknown Source)
at io.vertx.core.Future.lambda$compose$3(Future.java:363)
at io.vertx.core.Future$$Lambda$32/767632927.handle(Unknown Source)
at io.vertx.core.impl.FutureImpl.dispatch(FutureImpl.java:105)
at io.vertx.core.impl.FutureImpl.onComplete(FutureImpl.java:83)
at io.vertx.core.Future.compose(Future.java:359)
at io.vertx.core.Future.compose(Future.java:331)
at com.lakatuna.com.MainVerticle.start(MainVerticle.java:56)
at io.vertx.core.impl.DeploymentManager.lambda$doDeploy$9(DeploymentManager.java:556)
at io.vertx.core.impl.DeploymentManager$$Lambda$9/726379593.handle(Unknown Source)
at io.vertx.core.impl.ContextImpl.executeTask(ContextImpl.java:369)
at io.vertx.core.impl.EventLoopContext.lambda$executeAsync$0(EventLoopContext.java:38)
at io.vertx.core.impl.EventLoopContext$$Lambda$10/1212772528.run(Unknown Source)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:164)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:472)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:500)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
18:36:13.660 [cluster-ClusterId{value='5f0506e849515074214c3f60', description='null'}-localhost:27017] DEBUG org.mongodb.driver.cluster - Updating cluster description to {type=UNKNOWN, servers=[{address=localhost:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.io.IOException: El equipo remoto rechazó la conexión de red.
}}]
The Vert.x threads are blocked.
In Postman I try to test the route localhost:9090/create/ and it returns a 500 error.
I know the question is long, but I seriously don't know what to do. I have looked for a solution, but the truth is I don't really understand what my mistake is or what is happening. I need your help. Thank you very much.
If you look at the middle stack trace, it is reporting that the host it is trying to reach to connect to the Mongo database rejected the connection. Make sure it is reachable.
I'll bet the 500 error you're seeing is a NullPointerException caused by trying to use the Mongo client even though it failed to initialize and the variable was never assigned.
The first and third stack traces are complaining that binding to the ports while starting up the server is taking too long. This might be an issue in the underlying Netty library.
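As a rough sketch of that second point (assuming the Vert.x Mongo client API), prepareDatabase() could complete its promise only after a real round trip to Mongo succeeds, instead of only checking that the client object is non-null:

// Sketch only: verify Mongo is actually reachable before completing the promise.
private Future<Void> prepareDatabase() {
    Promise<Void> promise = Promise.promise();
    JsonObject configMongo = new JsonObject()
            .put("connection_string", "mongodb://localhost:27017")  // placeholder URI
            .put("db_name", "test");
    mongoSharedClient = MongoClient.create(vertx, configMongo);
    // Issue a cheap server command; the promise only completes if Mongo answers.
    mongoSharedClient.runCommand("ping", new JsonObject().put("ping", 1), ar -> {
        if (ar.succeeded()) {
            promise.complete();
        } else {
            promise.fail(ar.cause());
        }
    });
    return promise.future();
}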

Why doesn't AsyncHttpClient close the thread after throwing an exception?

I'm using the Zendesk Java Client. When I supply the correct credentials, it works as expected. However, I stumbled on a scenario where, if the wrong credentials are passed in, an error is thrown but the thread just hangs. Whether I run the code with or without the debugger, the code just stays running; it doesn't exit or return control to the debugger.
Is this an issue with the library, or am I misunderstanding how the AsyncHttpClient works?
My code is below:
fun zdtestWrongCredentials() {
    val client = asyncHttpClient(
        config()
            .setRequestTimeout(5000)
            .setReadTimeout(5000)
            .setShutdownTimeout(3000)
            .setPooledConnectionIdleTimeout(5000)
            .setKeepAlive(false)
    )
    var zd = Zendesk.Builder("https://website.zendesk.com")
        .setClient(client)
        .setUsername("john.doe@website.com")
        .setPassword("abcd")
        .build()
    var ticket = Ticket(123, "a", Comment("abc"))
    // The code hangs here. It's unclear why it exhibits this behavior.
    var test = zd.createTicket(ticket)
    // The code does not reach this line.
    client.close()
    return
}
Although I'm using Kotlin, I tried replicating this issue in a simple Java project and the issue persists.
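For what it's worth, a minimal sketch (not the exact reproduction; class name and credentials are placeholders) of the same flow in plain Java, with client.close() in a finally block so it runs even when createTicket() throws. In the snippet above, close() is only reached on success, and the AsyncHttpClient worker threads (AsyncHttpClient-3-1 in the log) presumably keep the JVM alive until the client is closed:

import org.asynchttpclient.AsyncHttpClient;
import org.asynchttpclient.Dsl;
import org.zendesk.client.v2.Zendesk;
import org.zendesk.client.v2.model.Comment;
import org.zendesk.client.v2.model.Ticket;

public class ZendeskWrongCredentials {
    public static void main(String[] args) throws Exception {
        AsyncHttpClient client = Dsl.asyncHttpClient(Dsl.config()
                .setRequestTimeout(5000)
                .setReadTimeout(5000));
        try {
            Zendesk zd = new Zendesk.Builder("https://website.zendesk.com")
                    .setClient(client)
                    .setUsername("john.doe@website.com")   // placeholder credentials
                    .setPassword("abcd")
                    .build();
            zd.createTicket(new Ticket(123L, "a", new Comment("abc")));
        } finally {
            client.close();   // releases the worker threads even when createTicket throws
        }
    }
}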
The stack trace is below. There is an exception at the bottom, but even after this exception, the program does not exit/give control back to the debugger.
"C:\Program Files\Java\jdk-11.0.4\bin\java.exe"
09:47:01.463 [main] DEBUG io.netty.util.internal.logging.InternalLoggerFactory - Using SLF4J as the default logging framework
09:47:01.480 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.level: simple
09:47:01.480 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetection.targetRecords: 4
09:47:01.550 [main] DEBUG io.netty.util.internal.PlatformDependent - Platform: Windows
09:47:01.562 [main] DEBUG io.netty.util.internal.PlatformDependent0 - -Dio.netty.noUnsafe: false
09:47:01.562 [main] DEBUG io.netty.util.internal.PlatformDependent0 - Java version: 11
09:47:01.566 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
09:47:01.567 [main] DEBUG io.netty.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
09:47:01.568 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Buffer.address: available
09:47:01.575 [main] DEBUG io.netty.util.internal.PlatformDependent0 - direct buffer constructor: unavailable
java.lang.UnsupportedOperationException: Reflective setAccessible(true) disabled
at io.netty.util.internal.ReflectionUtil.trySetAccessible(ReflectionUtil.java:31)
at io.netty.util.internal.PlatformDependent0$4.run(PlatformDependent0.java:224)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:218)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:171)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:213)
at io.netty.util.ResourceLeakDetectorFactory$DefaultResourceLeakDetectorFactory.newResourceLeakDetector(ResourceLeakDetectorFactory.java:201)
at io.netty.util.HashedWheelTimer.<clinit>(HashedWheelTimer.java:89)
at org.asynchttpclient.DefaultAsyncHttpClient.newNettyTimer(DefaultAsyncHttpClient.java:96)
at org.asynchttpclient.DefaultAsyncHttpClient.<init>(DefaultAsyncHttpClient.java:87)
at org.asynchttpclient.Dsl.asyncHttpClient(Dsl.java:32)
at com.website.MainKt.test123(Main.kt:321)
at com.website.MainKt.main(Main.kt:288)
at com.website.MainKt.main(Main.kt)
09:47:01.577 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: available, true
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent0 - jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable
java.lang.IllegalAccessException: class io.netty.util.internal.PlatformDependent0$6 cannot access class jdk.internal.misc.Unsafe (in module java.base) because module java.base does not export jdk.internal.misc to unnamed module @4de5031f
at java.base/jdk.internal.reflect.Reflection.newIllegalAccessException(Reflection.java:361)
at java.base/java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:591)
at java.base/java.lang.reflect.Method.invoke(Method.java:558)
at io.netty.util.internal.PlatformDependent0$6.run(PlatformDependent0.java:334)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at io.netty.util.internal.PlatformDependent0.<clinit>(PlatformDependent0.java:325)
at io.netty.util.internal.PlatformDependent.isAndroid(PlatformDependent.java:212)
at io.netty.util.internal.PlatformDependent.<clinit>(PlatformDependent.java:80)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:171)
at io.netty.util.ResourceLeakDetector.<init>(ResourceLeakDetector.java:213)
at io.netty.util.ResourceLeakDetectorFactory$DefaultResourceLeakDetectorFactory.newResourceLeakDetector(ResourceLeakDetectorFactory.java:201)
at io.netty.util.HashedWheelTimer.<clinit>(HashedWheelTimer.java:89)
at org.asynchttpclient.DefaultAsyncHttpClient.newNettyTimer(DefaultAsyncHttpClient.java:96)
at org.asynchttpclient.DefaultAsyncHttpClient.<init>(DefaultAsyncHttpClient.java:87)
at org.asynchttpclient.Dsl.asyncHttpClient(Dsl.java:32)
at com.website.MainKt.test123(Main.kt:321)
at com.website.MainKt.main(Main.kt:288)
at com.website.MainKt.main(Main.kt)
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent0 - java.nio.DirectByteBuffer.<init>(long, int): unavailable
09:47:01.579 [main] DEBUG io.netty.util.internal.PlatformDependent - sun.misc.Unsafe: available
09:47:01.634 [main] DEBUG io.netty.util.internal.PlatformDependent - maxDirectMemory: 6404702208 bytes (maybe)
09:47:01.635 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.tmpdir: C:\Users\john~1.doe\AppData\Local\Temp (java.io.tmpdir)
09:47:01.635 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
09:47:01.639 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.maxDirectMemory: -1 bytes
09:47:01.639 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.uninitializedArrayAllocationThreshold: -1
09:47:01.649 [main] DEBUG io.netty.util.internal.CleanerJava9 - java.nio.ByteBuffer.cleaner(): available
09:47:01.649 [main] DEBUG io.netty.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
09:47:01.649 [main] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@13acb0d1
09:47:01.705 [main] DEBUG io.netty.util.internal.PlatformDependent - org.jctools-core.MpscChunkedArrayQueue: available
09:47:02.621 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default protocols (JDK): [TLSv1.2, TLSv1.1, TLSv1]
09:47:02.622 [main] DEBUG io.netty.handler.ssl.JdkSslContext - Default cipher suites (JDK): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_AES_128_GCM_SHA256, TLS_AES_256_GCM_SHA384]
09:47:02.651 [main] DEBUG io.netty.channel.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
09:47:02.674 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
09:47:02.675 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
09:47:02.714 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
09:47:02.714 [main] DEBUG io.netty.util.internal.InternalThreadLocalMap - -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numHeapArenas: 8
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.numDirectArenas: 8
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.pageSize: 8192
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxOrder: 11
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.chunkSize: 16777216
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.tinyCacheSize: 512
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.smallCacheSize: 256
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.normalCacheSize: 64
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedBufferCapacity: 32768
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.cacheTrimInterval: 8192
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.useCacheForAllThreads: true
09:47:02.728 [main] DEBUG io.netty.buffer.PooledByteBufAllocator - -Dio.netty.allocator.maxCachedByteBuffersPerChunk: 1023
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.allocator.type: pooled
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.threadLocalDirectBufferSize: 0
09:47:02.738 [main] DEBUG io.netty.buffer.ByteBufUtil - -Dio.netty.maxThreadLocalCharBufferSize: 16384
09:47:03.522 [main] DEBUG org.zendesk.client.v2.Zendesk - Request POST https://website.zendesk.com/api/v2/tickets.json application/json; charset=UTF-8 92 bytes
09:47:03.568 [main] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkAccessible: true
09:47:03.568 [main] DEBUG io.netty.buffer.AbstractByteBuf - -Dio.netty.buffer.checkBounds: true
09:47:03.569 [main] DEBUG io.netty.util.ResourceLeakDetectorFactory - Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@604c5de8
09:47:03.650 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.processId: 12648 (auto-detected)
09:47:03.653 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv4Stack: false
09:47:03.653 [main] DEBUG io.netty.util.NetUtil - -Djava.net.preferIPv6Addresses: false
09:47:04.352 [main] DEBUG io.netty.util.NetUtil - Loopback interface: [lo, Software Loopback Interface 1, 127.0.0.1] ([lo, Software Loopback Interface 1, 127.0.0.1], {})
09:47:04.353 [main] DEBUG io.netty.util.NetUtil - Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
09:47:05.043 [main] DEBUG io.netty.channel.DefaultChannelId - -Dio.netty.machineId: 00:50:b6:ff:fe:ae:6e:01 (auto-detected)
09:47:05.172 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxCapacityPerThread: 4096
09:47:05.172 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.maxSharedCapacityFactor: 2
09:47:05.173 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.linkCapacity: 16
09:47:05.173 [AsyncHttpClient-3-1] DEBUG io.netty.util.Recycler - -Dio.netty.recycler.ratio: 8
09:47:05.614 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.channel.NettyConnectListener - Using new Channel '[id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443]' for 'POST' to '/api/v2/tickets.json'
09:47:05.630 [AsyncHttpClient-3-1] DEBUG io.netty.handler.ssl.SslHandler - [id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443] HANDSHAKEN: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
09:47:05.810 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.handler.HttpHandler -
Request DefaultFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: UnpooledHeapByteBuf(freed))
POST /api/v2/tickets.json HTTP/1.1
Content-type: application/json; charset=UTF-8
content-length: 92
connection: close
host: website.zendesk.com
authorization: Basic RGVyZWsudG93cmlzc0BkZWxhd2FyZWxpZmUuY29tOlRlczE=
accept: */*
user-agent: AHC/2.1
Response DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
HTTP/1.1 401 Unauthorized
Date: Tue, 20 Aug 2019 13:47:06 GMT
Content-Type: application/json; charset=UTF-8
Content-Length: 37
Connection: close
Set-Cookie: __cfduid=d807076f1918856a9ecbded67e619ee901566308826; expires=Wed, 19-Aug-20 13:47:06 GMT; path=/; domain=.website.zendesk.com; HttpOnly
WWW-Authenticate: Basic realm="Web Password"
Strict-Transport-Security: max-age=31536000;
Cache-Control: no-cache
X-Zendesk-Origin-Server: app23.pod20.usw2.zdsys.com
X-Request-Id: 5094d137ce35e1fa-SEA
X-Runtime: 0.060433
X-Zendesk-Request-Id: 7d89cc0062f1b7e6f05c
Set-Cookie: __cfruid=b0a77a0d73109c7862b0ab39be944601c81b0353-1566308826; path=/; domain=.website.zendesk.com; HttpOnly
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Server: cloudflare
CF-RAY: 5094d137ce35e1fa-ORD
09:47:05.829 [AsyncHttpClient-3-1] INFO org.asynchttpclient.netty.handler.intercept.Unauthorized401Interceptor - Can't handle 401 as auth was already performed
09:47:05.839 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.channel.ChannelManager - Closing Channel [id: 0xf398d3e1, L:/192.168.108.56:64305 - R:website.zendesk.com/104.16.54.111:443]
09:47:05.844 [AsyncHttpClient-3-1] DEBUG org.zendesk.client.v2.Zendesk - Response HTTP/401 Unauthorized
{"error":"Couldn't authenticate you"}
09:47:05.846 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.AsyncCompletionHandler - HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk$BasicAsyncCompletionHandler.onCompleted(Zendesk.java:1997)
at org.asynchttpclient.AsyncCompletionHandler.onCompleted(AsyncCompletionHandler.java:66)
at org.asynchttpclient.netty.NettyResponseFuture.loadContent(NettyResponseFuture.java:223)
at org.asynchttpclient.netty.NettyResponseFuture.done(NettyResponseFuture.java:258)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.finishUpdate(AsyncHttpClientHandler.java:239)
at org.asynchttpclient.netty.handler.HttpHandler.handleChunk(HttpHandler.java:113)
at org.asynchttpclient.netty.handler.HttpHandler.handleRead(HttpHandler.java:142)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.channelRead(AsyncHttpClientHandler.java:76)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1436)
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1203)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1247)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Exception in thread "main" org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk.complete(Zendesk.java:2252)
at org.zendesk.client.v2.Zendesk.createTicket(Zendesk.java:307)
at com.website.MainKt.test123(Main.kt:349)
at com.website.MainKt.main(Main.kt:288)
09:47:05.849 [AsyncHttpClient-3-1] DEBUG org.asynchttpclient.netty.handler.HttpHandler - Channel Closed: [id: 0xf398d3e1, L:/192.168.108.56:64305 ! R:website.zendesk.com/104.16.54.111:443] with attribute DISCARD
at com.website.MainKt.main(Main.kt)
Caused by: org.zendesk.client.v2.ZendeskResponseException: HTTP/401: Unauthorized - {"error":"Couldn't authenticate you"}
at org.zendesk.client.v2.Zendesk$BasicAsyncCompletionHandler.onCompleted(Zendesk.java:1997)
at org.asynchttpclient.AsyncCompletionHandler.onCompleted(AsyncCompletionHandler.java:66)
at org.asynchttpclient.netty.NettyResponseFuture.loadContent(NettyResponseFuture.java:223)
at org.asynchttpclient.netty.NettyResponseFuture.done(NettyResponseFuture.java:258)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.finishUpdate(AsyncHttpClientHandler.java:239)
at org.asynchttpclient.netty.handler.HttpHandler.handleChunk(HttpHandler.java:113)
at org.asynchttpclient.netty.handler.HttpHandler.handleRead(HttpHandler.java:142)
at org.asynchttpclient.netty.handler.AsyncHttpClientHandler.channelRead(AsyncHttpClientHandler.java:76)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1436)
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1203)
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1247)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:834)
Depending on how the async client is implemented, it can throw an error in one thread/coroutine while another part of its code waits for a completion message or trigger.
Because the error was thrown, that trigger is never sent, and the waiting thread hangs in perpetuity.
I "built" a problem like this once, so yes, it is possible.
Obviously, I don't know whether this is actually the issue at hand.
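As a minimal, hypothetical sketch of that failure mode (none of these class or method names come from the project in question): the worker thread throws before it ever completes the future, so the caller blocks forever on get().

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class HangingFutureDemo {

    // Hypothetical work that fails before it can signal completion.
    private static String fetchSentiment() {
        throw new IllegalStateException("request failed on the worker thread");
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        CompletableFuture<String> result = new CompletableFuture<>();

        pool.submit(() -> {
            // The exception escapes here; complete()/completeExceptionally()
            // is never called, so 'result' is never resolved. The error is
            // swallowed by the Future returned from submit() and never printed.
            String value = fetchSentiment();
            result.complete(value);
        });

        // The caller waits for a completion signal that never arrives:
        // without a timeout, this get() blocks indefinitely.
        System.out.println(result.get());
    }
}

A more defensive pattern is to call completeExceptionally(t) from a catch/finally block on the worker side, or to give the waiting side a timeout, so the error surfaces instead of the thread hanging.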
Cheers and all the best!

SSL handshake times out randomly on local server NettyClientHandler

The SSL handler times out randomly: it works about 30% of the time and times out the other 70%. I am enabling SSL in gRPC hosted on Kubernetes. It works without timeouts outside the cluster, and inside the cluster as well, but it times out when calling from the container where the server is running.
Any help is greatly appreciated; I will post more details if needed.
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] SETTINGS: ack=false,
settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
------------------------------------
2017-10-16 04:26:10/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] WINDOW_UPDATE: streamId=0, windowSizeIncrement=983041
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x993a7494, L:/127.0.0.1:51082 - R:localhost/127.0.0.1:6868] GO_AWAY: lastStreamId=0, errorCode=0, length=0, bytes=
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x8146a138, L:/127.0.0.1:51160 - R:localhost/127.0.0.1:6868] SETTINGS: ack=false,
settings={ENABLE_PUSH=0, MAX_CONCURRENT_STREAMS=0, INITIAL_WINDOW_SIZE=1048576, MAX_HEADER_LIST_SIZE=8192}
------------------------------------
2017-10-16 04:26:22/GMT DEBUG io.grpc.netty.NettyClientHandler:68 -
----------------OUTBOUND--------------------
[id: 0x8146a138, L:/127.0.0.1:51160 - R:localhost/127.0.0.1:6868] WINDOW_UPDATE: streamId=0, windowSizeIncrement=983041
------------------------------------
Exception in thread "main" io.grpc.StatusRuntimeException: UNAVAILABLE
at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:227)
at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:208)
at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:141)
at com.x.x.x.commons.probes.TestProbesGrpc$TestProbesBlockingStub.liveness(TestProbesGrpc.java:147)
at com.x.x.x.client.TestClient.main(TestClient.java:69)
Caused by: javax.net.ssl.SSLException: handshake timed out
at io.netty.handler.ssl.SslHandler.handshake(...)(Unknown Source)
2017-10-16 04:26:23/GMT DEBUG io.netty.handler.codec.http2.Http2ConnectionHandler:83 -
[id: 0x993a7494, L:/127.0.0.1:51082 ! R:localhost/127.0.0.1:6868] Sending GOAWAY failed: lastStreamId '0', errorCode '0',
debugData ''. Forcing shutdown of the connection.
java.nio.channels.ClosedChannelException: null
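For context, a minimal client-side TLS configuration for grpc-java with the Netty transport looks roughly like the sketch below. The port 6868 is taken from the logs above; the host name, certificate path, and authority are placeholders, not values from the post.

import java.io.File;
import javax.net.ssl.SSLException;

import io.grpc.ManagedChannel;
import io.grpc.netty.GrpcSslContexts;
import io.grpc.netty.NettyChannelBuilder;
import io.netty.handler.ssl.SslContext;

public class TlsChannelSketch {

    public static ManagedChannel buildSecureChannel() throws SSLException {
        // Trust the CA (or self-signed certificate) that the server presents.
        SslContext sslContext = GrpcSslContexts.forClient()
                .trustManager(new File("/etc/grpc/ca.pem"))
                .build();

        return NettyChannelBuilder.forAddress("my-grpc-service", 6868)
                .sslContext(sslContext)
                // Validate the certificate against the name it was issued for,
                // even when dialing a different in-cluster address.
                .overrideAuthority("my-grpc-service.default.svc.cluster.local")
                .build();
    }
}

Note that overrideAuthority only changes which name is checked against the server certificate; it does not affect timeouts. If the handshake itself times out intermittently from one network location only, the path between that container and the server endpoint is also worth examining.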
