I have an application that sends requests to a server using Spring Remoting (HttpInvokerServiceExporter). Normally this works fine, but some requests cause an error on the server because of a corrupted byte stream. Then I get errors like these:
java.io.StreamCorruptedException: invalid handle value: 0050001A
at java.io.ObjectInputStream.readHandle(ObjectInputStream.java:1433)
java.lang.ClassNotFoundException: org.smitch.debol.service.bo.type.MinPoadAmount
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1387)
java.lang.ClassCastException: cannot assign instance of java.lang.Integer to field java.math.BigDecimal.intVal of type java.math.BigInteger in instance of java.math.BigDecimal
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2032)
org.springframework.web.servlet.DispatcherServlet: Could not complete request
java.io.StreamCorruptedException: invalid handle value: 005001DB
at java.io.ObjectInputStream.readHandle(ObjectInputStream.java:1433)
org.springframework.web.servlet.DispatcherServlet: Could not complete request
java.io.StreamCorruptedException: invalid type code: 50
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1356)
I get the ClassNotFoundException quite often, for class names like:
org.smitch.debol.service.bo.type.MinPoadAmount
org.smitch.Pebol.service.bo.type.CashAmount
org.smitch.debol.service.bo.type.CaPhAmount
org.smitch.debol.serviPe.bo.type.EmvProfile
There is always a 'P' somewhere in the class name that should not be there.
But there can be 100 requests in a row without a problem.
Any hints as to what's shaking my beautiful stream?
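For context, a Spring Remoting setup of this kind typically looks like the sketch below. This is only an illustration, not my actual configuration; the OrderService interface, bean names and URL are placeholders.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.remoting.httpinvoker.HttpInvokerProxyFactoryBean;
import org.springframework.remoting.httpinvoker.HttpInvokerServiceExporter;

@Configuration
public class RemotingConfig {

    // Server side: expose the service bean under /orderService; method calls arrive
    // as Java-serialized RemoteInvocation objects over HTTP.
    @Bean(name = "/orderService")
    public HttpInvokerServiceExporter orderServiceExporter(OrderService orderService) {
        HttpInvokerServiceExporter exporter = new HttpInvokerServiceExporter();
        exporter.setService(orderService);
        exporter.setServiceInterface(OrderService.class);   // OrderService is a placeholder
        return exporter;
    }

    // Client side: a proxy that serializes each call and POSTs it to the exporter URL.
    @Bean
    public HttpInvokerProxyFactoryBean orderServiceProxy() {
        HttpInvokerProxyFactoryBean proxy = new HttpInvokerProxyFactoryBean();
        proxy.setServiceUrl("http://server:8080/app/orderService");  // placeholder URL
        proxy.setServiceInterface(OrderService.class);
        return proxy;
    }
}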
Thanks,
Boskop
I developed a Java application which reads data from an Avro topic using Schema Registry, then makes simple transformations and prints the result to the console. By default I used the GenericAvroSerde class for keys and values. Everything worked fine, except that I had to additionally define configuration for each serde, like:
final Map<String, String> serdeConfig = Collections.singletonMap("schema.registry.url", kafkaStreamsConfig.getProperty("schema.registry.url"));
final Serde<GenericRecord> keyGenericAvroSerde = new GenericAvroSerde();
final Serde<GenericRecord> valueGenericAvroSerde = new GenericAvroSerde();
keyGenericAvroSerde.configure(serdeConfig, true);    // isKey = true
valueGenericAvroSerde.configure(serdeConfig, false); // isKey = false
Without that I always get an error like:
Exception in thread "NTB27821-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Failed to deserialize value for record. topic=CH-PGP-LP2_S20-002_agg, partition=0, offset=4482940
at org.apache.kafka.streams.processor.internals.SourceNodeRecordDeserializer.deserialize(SourceNodeRecordDeserializer.java:46)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:84)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:474)
at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:642)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:548)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:519)
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 69
Caused by: java.lang.NullPointerException
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:122)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:93)
at io.confluent.kafka.serializers.KafkaAvroDeserializer.deserialize(KafkaAvroDeserializer.java:55)
at io.confluent.kafka.streams.serdes.avro.GenericAvroDeserializer.deserialize(GenericAvroDeserializer.java:63)
at io.confluent.kafka.streams.serdes.avro.GenericAvroDeserializer.deserialize(GenericAvroDeserializer.java:39)
at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:65)
at org.apache.kafka.common.serialization.ExtendedDeserializer$Wrapper.deserialize(ExtendedDeserializer.java:55)
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:56)
at org.apache.kafka.streams.processor.internals.SourceNodeRecordDeserializer.deserialize(SourceNodeRecordDeserializer.java:44)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:84)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:474)
at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:642)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:548)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:519)
Well, it was unusual, but fine; after that (when I added the configuration calls as posted above) it worked, and my application was able to do all the operations and print out the result.
But!
When I tried to call through() - just to post data to a new topic - I faced the problem I am asking about: the topic was created without a schema.
How can that be?
The interesting fact is that the data is being written, but:
a) it is in binary format, so a simple consumer cannot read it;
b) it has no schema, so an Avro consumer can't read it either:
Processed a total of 1 messages
[2017-10-05 11:25:53,241] ERROR Unknown error when running consumer: (kafka.tools.ConsoleConsumer$:105)
org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 0
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema not found; error code: 40403
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:182)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:203)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:379)
at io.confluent.kafka.schemaregistry.client.rest.RestService.getId(RestService.java:372)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:65)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getBySubjectAndId(CachedSchemaRegistryClient.java:131)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:122)
at io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:93)
at io.confluent.kafka.formatter.AvroMessageFormatter.writeTo(AvroMessageFormatter.java:122)
at io.confluent.kafka.formatter.AvroMessageFormatter.writeTo(AvroMessageFormatter.java:114)
at kafka.tools.ConsoleConsumer$.process(ConsoleConsumer.scala:140)
at kafka.tools.ConsoleConsumer$.run(ConsoleConsumer.scala:78)
at kafka.tools.ConsoleConsumer$.main(ConsoleConsumer.scala:53)
at kafka.tools.ConsoleConsumer.main(ConsoleConsumer.scala)
Of course I checked the schema registry for the subject:
curl -X GET http://localhost:8081/subjects/agg_value_9-value/versions
{"error_code":40401,"message":"Subject not found."}
But the same call for another topic, written by the Java application that produces the initial data, shows that the schema exists:
curl -X GET http://localhost:8081/subjects/CH-PGP-LP2_S20-002_agg-value/versions
[1]
Both applications use an identical "schema.registry.url" configuration.
Just to summarize: the topic is created, data is written and can be read with a simple consumer, but it is binary and the schema doesn't exist.
I also tried to create a schema with Landoop to somehow match the data, but with no success - and in any case that is not a proper way to use Kafka Streams; everything should happen on the fly.
Help, please!
When through() is called, the default serde defined via StreamsConfig is used unless the user specifically overrides it. Which default serde did you use? To be correct, you should be using the AbstractKafkaAvroSerializer, which will automatically register the schema for that through topic.
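For illustration, something along these lines should work. This is only a sketch (assuming Kafka Streams 1.0+ with the Confluent GenericAvroSerde; the application id, bootstrap servers, registry URL and topic name are placeholders), not code from the original application:

import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

// Option 1: make the schema-registry-aware serde the application-wide default,
// so through() (which falls back to the defaults) registers the schema itself.
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "avro-transform-app");      // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");       // placeholder
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
props.put("schema.registry.url", "http://localhost:8081");                 // placeholder

// Option 2: keep the existing defaults and override the serdes only for this
// intermediate topic, reusing the serdes configure()d in the question:
KStream<GenericRecord, GenericRecord> rerouted =
        transformed.through("agg_value_9",
                Produced.with(keyGenericAvroSerde, valueGenericAvroSerde));
// ("transformed" stands for the stream built earlier in the topology.)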
I have used the Confluent HDFS Connector to move data from Kafka topics to an HDFS log file. But when I run these commands:
./bin/connect-standalone \
etc/schema-registry/connect-avro-standalone.properties \
etc/kafka-connect-hdfs/quickstart-hdfs.properties
I get the following error. How can I solve this problem? What is the reason for it?
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
[2017-06-03 13:44:41,895] ERROR Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:142)
This happens if you are trying to read data with the connector and have set key.converter and value.converter to the AvroConverter, but your input topic has data that was not serialized by the same AvroSerializer that uses the schema registry.
You have to match your converter to the input data. In other words, use a converter that can deserialize the input data, as sketched below.
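For example, the converter section of the worker properties used in the command above would typically look like this if the topic really does contain schema-registry Avro (a sketch; the registry URL is a placeholder):

# etc/schema-registry/connect-avro-standalone.properties (relevant part, sketch)
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

# If the topic instead holds data that was not produced with the schema-registry
# serializer (e.g. plain JSON), switch the converter to match the data, for example:
# value.converter=org.apache.kafka.connect.json.JsonConverter
# value.converter.schemas.enable=false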
Using Java, I am sending a JSON object to Kafka. Initially it worked for two days; now I am getting the following exception:
Exception in thread "main" java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.UnknownTopicOrPartitionException: This server does not host this topic-partition.
at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.valueOrError(FutureRecordMetadata.java:65)
at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:52)
at org.apache.kafka.clients.producer.internals.FutureRecordMetadata.get(FutureRecordMetadata.java:25)
at dummy.DummySyntheticManifestProducer.main(DummySyntheticManifestProducer.java:164)
Caused by: org.apache.kafka.common.errors.UnknownTopicOrPartitionException: This server does not host this topic-partition.
Try setting the retries property in the producer settings. You also need to set the property max.in.flight.requests.per.connection to 1; see the sketch below.
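A minimal sketch of those producer settings (my illustration, assuming String serializers and a placeholder broker address; not the asker's actual code):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");            // placeholder
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
// Retry transient errors such as partition metadata not yet being available.
props.put(ProducerConfig.RETRIES_CONFIG, 5);
// Limit in-flight requests to 1 so that retries cannot reorder records.
props.put(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, 1);

KafkaProducer<String, String> producer = new KafkaProducer<>(props);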
We are seeing an intermittent error in CXF. The response is fairly large (several hundred KB), MTOM is enabled, and enabling DEBUG for the CXF request/response logging interceptors fixes the issue, similarly to this post (which is unresolved). Our project is leveraging CXF version 2.2.9.
javax.xml.ws.soap.SOAPFaultException: Unmarshalling Error: [was class java.io.IOException] Strange I/O stream, returned 0 bytes on read
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:146)
at com.sun.proxy.$Proxy751.browseFiles(Unknown Source)
…
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: [was class java.io.IOException] Strange I/O stream, returned 0 bytes on read
at com.ctc.wstx.util.ExceptionUtil.throwRuntimeException(ExceptionUtil.java:18)
at com.ctc.wstx.sr.StreamScanner.throwLazyError(StreamScanner.java:731)
at com.ctc.wstx.sr.BasicStreamReader.safeFinishToken(BasicStreamReader.java:3657)
at com.ctc.wstx.sr.BasicStreamReader.getTextCharacters(BasicStreamReader.java:830)
at com.sun.xml.bind.v2.runtime.unmarshaller.StAXStreamConnector.handleCharacters(StAXStreamConnector.java:323)
…
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:124)
... 51 more
Caused by: java.io.IOException: Strange I/O stream, returned 0 bytes on read
at com.ctc.wstx.io.BaseReader.reportStrangeStream(BaseReader.java:148)
at com.ctc.wstx.io.UTF8Reader.loadMore(UTF8Reader.java:373)
…
I initially thought this was caused by a bad/invalid character (encoding?) in the response data; however, it now looks more like a network issue. It is very odd: the service has been operating for a long time (years) without issue prior to running into this problem.
Why does this error occur? Is there a way to resolve this without enabling debug logging?
An upgrade to a newer version of CXF will likely fix this. There were some bugs in the CXF MIME streams. In particular, this looks very similar to:
https://issues.apache.org/jira/browse/CXF-3068
Based on this, and considering the source code of com.ctc.wstx.io.UTF8Reader.loadMore, the problem can occur when the buffer passed to mBuffer through BaseReader(ReaderConfig, InputStream, byte[], int, int) has zero length.
Is the loadMore() method implementation correct? Should it behave that way if read() returns 0? An illustration of that zero-length read follows below.
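As an illustration of that reasoning (my own snippet, not code from CXF or Woodstox): InputStream.read(b, off, len) is defined to return 0 when len is 0, which is exactly the return value that loadMore() reports as a strange stream:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ZeroLengthReadDemo {
    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream("payload".getBytes(StandardCharsets.UTF_8));
        byte[] buffer = new byte[0];                // a zero-length buffer, as suspected above
        int n = in.read(buffer, 0, buffer.length);  // returns 0 by contract, not -1
        System.out.println("read returned: " + n);  // prints "read returned: 0"
    }
}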
Is there a maximum content length for text sent over CXF with SOAP 1.1? I'm running into an issue where one SOAP request is failing while the other succeeds. The only difference I have pinpointed between these requests so far has been the number of bytes of text I am sending.
I see an error like the following:
checkException (UnexpectedServiceExceptionCheckImpl.java:35) - An unexpected exception was found from source=[DesignService.generate] type=[class javax.xml.ws.soap.SOAPFaultException] message=[Unmarshalling Error: [was class java.io.IOException] Strange I/O stream, returned 0 bytes on read ]:
javax.xml.ws.soap.SOAPFaultException: Unmarshalling Error: [was class java.io.IOException] Strange I/O stream, returned 0 bytes on read
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:145)
at $Proxy146.generate(Unknown Source)
I don't believe there are any maximums except for memory usage. Are you seeing any errors? Can you trace the transaction using Wireshark or something similar?