Custom mapper for schema validation errors - java

I'm using the Camel validator component and I'm catching errors from the schema validation like:
org.xml.sax.SAXParseException: cvc-minLength-valid: Value '' with length = '0' is not facet-valid with respect to minLength '1' for type
Is there any tool that would map these errors to prettier statements? I can always iterate over the errors, split them and write a custom mapper, but maybe there is something better than that? :)

Saxon is really good at error reporting. Its validator gives you understandable messages in the first place.
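For illustration, a minimal sketch of schema validation through Saxon's s9api; this assumes Saxon-EE is on the classpath (schema validation is an EE-only feature), and the file names are taken from the question or invented:

import java.io.File;
import javax.xml.transform.stream.StreamSource;
import net.sf.saxon.s9api.Processor;
import net.sf.saxon.s9api.SaxonApiException;
import net.sf.saxon.s9api.SchemaManager;
import net.sf.saxon.s9api.SchemaValidator;

public class SaxonValidationSketch {
    public static void main(String[] args) throws SaxonApiException {
        Processor processor = new Processor(true); // true = enable licensed (EE) features
        SchemaManager schemaManager = processor.getSchemaManager();
        schemaManager.load(new StreamSource(new File("xsd/myValidator.xsd")));
        SchemaValidator validator = schemaManager.newSchemaValidator();
        try {
            validator.validate(new StreamSource(new File("request.xml")));
        } catch (SaxonApiException e) {
            // Saxon's wording is usually clearer than the raw cvc-* messages
            System.err.println(e.getMessage());
        }
    }
}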

That's a SAX error message, and it appears to be quite clearly stated, but see ErrorHandler and DefaultHandler to customize it however you'd prefer.
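For reference, a minimal sketch of that approach with the plain JAXP Validator, collecting the SAXParseExceptions and rewording them (the schema path matches the question; the "friendly" wording is invented for illustration):

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import org.xml.sax.ErrorHandler;
import org.xml.sax.SAXParseException;

public class FriendlyValidation {
    public static void main(String[] args) throws Exception {
        Validator validator = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new File("xsd/myValidator.xsd"))
                .newValidator();

        List<String> problems = new ArrayList<>();
        validator.setErrorHandler(new ErrorHandler() {
            @Override public void warning(SAXParseException e) { problems.add(pretty(e)); }
            @Override public void error(SAXParseException e) { problems.add(pretty(e)); }
            @Override public void fatalError(SAXParseException e) { problems.add(pretty(e)); }
        });

        validator.validate(new StreamSource(new File("request.xml")));
        problems.forEach(System.err::println);
    }

    private static String pretty(SAXParseException e) {
        // e.g. turn "cvc-minLength-valid: Value '' ..." into something shorter per your own rules
        return "line " + e.getLineNumber() + ": " + e.getMessage();
    }
}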

I've created XSD validation through the Camel validator component:
<to uri="validator:xsd/myValidator.xsd"/>
then I used doCatch inside a doTry block to catch the exception:
<doCatch>
<exception>org.apache.camel.ValidationException</exception>
<log message="catch exception ${body}" loggingLevel="ERROR" />
<process ref="schemaErrorHandler"/>
</doCatch>
After that I wrote a custom Camel Processor and it works great :)
public class SchemaErrorHandler implements Processor {

    private final String STATUS_CODE = "6103";
    private final String SEVERITY_CODE = "2";

    @Override
    public void process(Exchange exchange) throws Exception {
        Map<String, Object> map = exchange.getProperties();
        String statusDesc = "Unknown exception";
        if (map != null) {
            SchemaValidationException exception = (SchemaValidationException) map.get("CamelExceptionCaught");
            if (exception != null && !CollectionUtils.isEmpty(exception.getErrors())) {
                StringBuffer buffer = new StringBuffer();
                for (SAXParseException e : exception.getErrors()) {
                    statusDesc = e.getMessage();
                    buffer.append(statusDesc);
                }
                statusDesc = buffer.toString();
            }
        }
        Fault fault = new Fault(new Message(statusDesc, (ResourceBundle) null));
        fault.setDetail(ErrorUtils.createDetailSection(STATUS_CODE, statusDesc, exchange, SEVERITY_CODE));
        throw fault;
    }
}

Related

I am unable to test the below-mentioned method in a JUnit test case and am getting a ClassCastException

For the below method I am writing a JUnit test case for SonarQube coverage.
@Transformer
public Object errorUnWrapper(Message<?> message) {
    String value = "";
    try {
        if (!errorFlag) {
            value = errorTransform(ESBConstants.SYSTEMERRCODE, ESBConstants.SYSTEMERROR, ESBConstants.SYSTEMTEXT);
        } else {
            value = getbankholiday.getErrorMessage();
        }
    } catch (Exception e) {
        getbankholiday.getE2EObject().logMessage("3005", "Error Occurred in error unwrapper");
        value = ESBConstants.SYSTEMERRORXML;
    }
    MessageHeaders headers = ((MessagingException) message.getPayload()).getFailedMessage().getHeaders();
    return MessageBuilder.withPayload(value).copyHeaders(message.getHeaders())
            .copyHeadersIfAbsent(headers).setHeader(ESBConstants.CONTENTTYPE, ESBConstants.CONTENTVALUE)
            .build();
}
JUnit test case:
@Test
void testErrorUnWrapper() throws IOException {
    String xml = FileUtils.readFileToString(new File("src/test/resources/JunitTestCases/input/TC02_wltRequest.xml"),
            "UTF-8");
    test = MessageBuilder.withPayload(xml).build();
    errorTransform.errorUnWrapper(test);
    Assertions.assertTrue(true);
}
but I'm unable to mock or test the below line in the JUnit test case:
MessageHeaders headers = ((MessagingException) message.getPayload()).getFailedMessage().getHeaders();
Exception:
java.lang.ClassCastException: java.lang.String cannot be cast to org.springframework.messaging.MessagingException
at com.bt.or.esb.exceptions.ErrorTransform.errorUnWrapper(ErrorTransform.java:75)
at com.bt.or.esb.exceptions.ErrorTransformTest.testErrorUnWrapper(ErrorTransformTest.java:87)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Your production code
(MessagingException) message.getPayload()
means you expect message.payload to be a MessagingException. But in your test, you create
MessageBuilder.withPayload(xml).build()
so payload will be a string. You need to build a message with a MessagingException as payload.
This usually means you need to refactor.
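For example, a hedged sketch of a test that satisfies that cast (the file path, errorTransform and the assertion style are taken from the question; the wrapped RuntimeException is invented for illustration):

@Test
void testErrorUnWrapper() throws IOException {
    String xml = FileUtils.readFileToString(
            new File("src/test/resources/JunitTestCases/input/TC02_wltRequest.xml"), "UTF-8");
    // the "failed" message carries the original payload and headers
    Message<String> failedMessage = MessageBuilder.withPayload(xml).build();
    // wrap it in a MessagingException, because that is what errorUnWrapper casts the payload to
    MessagingException cause = new MessagingException(failedMessage, new RuntimeException("simulated failure"));
    Message<MessagingException> errorMessage = MessageBuilder.withPayload(cause).build();
    Object result = errorTransform.errorUnWrapper(errorMessage);
    Assertions.assertNotNull(result);
}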
Except for value, everything else is common to the if/else and catch blocks, so why don't you compute value first and then execute the two lines below?
MessageHeaders headers = ((MessagingException) message.getPayload()).getFailedMessage().getHeaders();
return MessageBuilder.withPayload(value).copyHeaders(message.getHeaders()).copyHeadersIfAbsent(headers)
.setHeader(ESBConstants.CONTENTTYPE, ESBConstants.CONTENTVALUE).build();

How to deserialize avro data using Apache Beam (KafkaIO)

I've only seen one thread containing information about the topic I've mentioned, which is:
How to Deserialising Kafka AVRO messages using Apache Beam
However, after trying a few variations of Kafka deserializers I still cannot deserialize the Kafka messages. Here's my code:
public class Readkafka {
    private static final Logger LOG = LoggerFactory.getLogger(Readkafka.class);

    public static void main(String[] args) throws IOException {
        // Create the Pipeline object with the options we defined above.
        Pipeline p = Pipeline.create(
                PipelineOptionsFactory.fromArgs(args).withValidation().create());
        PTransform<PBegin, PCollection<KV<action_states_pkey, String>>> kafka =
                KafkaIO.<action_states_pkey, String>read()
                        .withBootstrapServers("mybootstrapserver")
                        .withTopic("action_States")
                        .withKeyDeserializer(MyClassKafkaAvroDeserializer.class)
                        .withValueDeserializer(StringDeserializer.class)
                        .updateConsumerProperties(ImmutableMap.of("schema.registry.url", (Object) "schemaregistryurl"))
                        .withMaxNumRecords(5)
                        .withoutMetadata();
        p.apply(kafka)
                .apply(Keys.<action_states_pkey>create());
    }
}
where MyClassKafkaAvroDeserializer is
public class MyClassKafkaAvroDeserializer extends
        AbstractKafkaAvroDeserializer implements Deserializer<action_states_pkey> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        configure(new KafkaAvroDeserializerConfig(configs));
    }

    @Override
    public action_states_pkey deserialize(String s, byte[] bytes) {
        return (action_states_pkey) this.deserialize(bytes);
    }

    @Override
    public void close() {}
}
and the class action_states_pkey is code generated from avro tools using
java -jar pathtoavrotools/avro-tools-1.8.1.jar compile schema pathtoschema/action_states_pkey.avsc destination path
where the action_states_pkey.avsc is literally
{"type":"record","name":"action_states_pkey","namespace":"namespace","fields":[{"name":"ad_id","type":["null","int"]},{"name":"action_id","type":["null","int"]},{"name":"state_id","type":["null","int"]}]}
With this code I'm getting the error:
Caused by: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to my.mudah.beam.test.action_states_pkey
at my.mudah.beam.test.MyClassKafkaAvroDeserializer.deserialize(MyClassKafkaAvroDeserializer.java:20)
at my.mudah.beam.test.MyClassKafkaAvroDeserializer.deserialize(MyClassKafkaAvroDeserializer.java:1)
at org.apache.beam.sdk.io.kafka.KafkaUnboundedReader.advance(KafkaUnboundedReader.java:221)
at org.apache.beam.sdk.io.BoundedReadFromUnboundedSource$UnboundedToBoundedSourceAdapter$Reader.advanceWithBackoff(BoundedReadFromUnboundedSource.java:279)
at org.apache.beam.sdk.io.BoundedReadFromUnboundedSource$UnboundedToBoundedSourceAdapter$Reader.start(BoundedReadFromUnboundedSource.java:256)
at com.google.cloud.dataflow.worker.WorkerCustomSources$BoundedReaderIterator.start(WorkerCustomSources.java:592)
... 14 more
It seems there's an error in trying to map the Avro Data to my custom class ?
Alternatively, I've tried the following code :
PTransform<PBegin, PCollection<KV<action_states_pkey, String>>> kafka =
        KafkaIO.<action_states_pkey, String>read()
                .withBootstrapServers("bootstrapserver")
                .withTopic("action_states")
                .withKeyDeserializerAndCoder((Class) KafkaAvroDeserializer.class, AvroCoder.of(action_states_pkey.class))
                .withValueDeserializer(StringDeserializer.class)
                .updateConsumerProperties(ImmutableMap.of("schema.registry.url", (Object) "schemaregistry"))
                .withMaxNumRecords(5)
                .withoutMetadata();
p.apply(kafka)
        .apply(Keys.<action_states_pkey>create());
//      .apply("ExtractWords", ParDo.of(new DoFn<action_states_pkey, String>() {
//          @ProcessElement
//          public void processElement(ProcessContext c) {
//              action_states_pkey key = c.element();
//              c.output(key.getAdId().toString());
//          }
//      }));
which does not give me any error until I try to print out the data. I have to verify that I'm successfully reading the data one way or another, so my intent here is to log the data to the console. If I uncomment the commented section I get the same error once again:
SEVERE: 2019-09-13T07:53:56.168Z: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to my.mudah.beam.test.action_states_pkey
at my.mudah.beam.test.Readkafka$1.processElement(Readkafka.java:151)
Another thing to note is that if I specify:
.updateConsumerProperties(ImmutableMap.of("specific.avro.reader", (Object)"true"))
it always gives me an error of:
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id 443
Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class NAMESPACE.action_states_pkey specified in writer's schema whilst finding reader's schema for a SpecificRecord.
It seems there's something wrong with my approach?
If anyone has any experience reading AVRO data from Kafka Streams using Apache Beam, please do help me out. I greatly appreciate it.
Here's a snapshot of my package with the schema and class in it as well:
package/working path details
Thanks.
public class MyClassKafkaAvroDeserializer extends
AbstractKafkaAvroDeserializer
Your class extends AbstractKafkaAvroDeserializer, which returns a GenericRecord.
You need to convert that GenericRecord to your custom object, for example as in the sketch below.
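A hedged, untested sketch of that conversion inside the question's deserializer; it assumes action_states_pkey is the avro-tools generated class and relies on org.apache.avro.specific.SpecificData:

@Override
public action_states_pkey deserialize(String topic, byte[] bytes) {
    // AbstractKafkaAvroDeserializer#deserialize returns a GenericData.Record here
    Object genericRecord = this.deserialize(bytes);
    // deep-copy the generic record into the generated SpecificRecord class
    return (action_states_pkey) SpecificData.get()
            .deepCopy(action_states_pkey.getClassSchema(), genericRecord);
}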
Alternatively, use a SpecificRecord-aware deserializer, as stated in one of the following answers:
/**
 * Extends deserializer to support ReflectData.
 *
 * @param <V>
 *            value type
 */
public abstract class ReflectKafkaAvroDeserializer<V> extends KafkaAvroDeserializer {

    private Schema readerSchema;
    private DecoderFactory decoderFactory = DecoderFactory.get();

    protected ReflectKafkaAvroDeserializer(Class<V> type) {
        readerSchema = ReflectData.get().getSchema(type);
    }

    @Override
    protected Object deserialize(
            boolean includeSchemaAndVersion,
            String topic,
            Boolean isKey,
            byte[] payload,
            Schema readerSchemaIgnored) throws SerializationException {

        if (payload == null) {
            return null;
        }

        int schemaId = -1;
        try {
            ByteBuffer buffer = ByteBuffer.wrap(payload);
            if (buffer.get() != MAGIC_BYTE) {
                throw new SerializationException("Unknown magic byte!");
            }
            schemaId = buffer.getInt();
            Schema writerSchema = schemaRegistry.getByID(schemaId);

            int start = buffer.position() + buffer.arrayOffset();
            int length = buffer.limit() - 1 - idSize;
            DatumReader<Object> reader = new ReflectDatumReader(writerSchema, readerSchema);
            BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
            return reader.read(null, decoder);
        } catch (IOException e) {
            throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
        } catch (RestClientException e) {
            throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
        }
    }
}
The above is copied from https://stackoverflow.com/a/39617120/2534090; see also https://stackoverflow.com/a/42514352/2534090.

Spring Integration error is attaching complete payload

I have a JMS listener. Once I read the message, I convert it to my custom object:
public IntegrationFlow queueProcessorFlow() {
    return IntegrationFlows.from(Jms.inboundAdapter(jmsTemplate)
                    .destination("test_queue"),
            c -> c.poller(Pollers.fixedDelay(5000L)
                    .maxMessagesPerPoll(1)))
            // convert json to our custom object
            .transform(new JsonToQueueEventConverterTransformer(springBeanFactory))
            .transform(new CustomTransformer(springBeanFactory))
            .handle(o -> {
            }).get();
}
The transformer
public class CustomTransformer implements GenericTransformer<CustomPojo, CustomPojo> {

    private final QueueDataProcessorSpringBeanFactory factory;

    public CustomTransformer(QueueDataProcessorSpringBeanFactory factory) {
        this.factory = factory;
    }

    @Override
    public CustomPojo transform(CustomPojo customPojo) {
        try {
            // do something, e.g. a service call that fails
            throw new Exception("This failed mate !! SOS");
        } catch (Exception e) {
            // ISSUE here:
            // e contains the original payload in the stack trace
            throw new RuntimeException(e);
        }
    }
}
Now when I throw my custom exception, the stack trace contains everything, including the payload. I am not interested in the payload in case of an exception.
How do I change this so that the payload is not included?
Update:
After changing the code as per the answer I still see the issue:
org.springframework.integration.transformer.MessageTransformationException: Failed to transform Message; nested exception is org.springframework.messaging.MessageHandlingException: nested exception is org.springframework.integration.transformer.MessageTransformationException: Error initiliazing the :; nested exception is CustomException Error lab lab lab , failedMessage=GenericMessage [payload=
My error handler:
@Bean
public IntegrationFlow errorHandlingFlow() {
    return IntegrationFlows.from("errorChannel")
            .handle(message -> {
                try {
                    ErrorMessage e = (ErrorMessage) message;
                    if (e.getPayload() instanceof MessageTransformationException) {
                        String stackTrace = ExceptionUtils.getStackTrace(e.getPayload());
                        LOG.info("Exception trace {} ", stackTrace);
Not sure what the business purpose of losing the payload in the stack trace is, but you can achieve that by throwing a MessageTransformationException instead of that RuntimeException.
To avoid having the message with the mentioned payload in the stack trace, you need to use one of these constructors:
public MessageTransformationException(String description, Throwable cause) {
    super(description, cause);
}

public MessageTransformationException(String description) {
    super(description);
}
Instead of those based on the Message<?>.
This way the wrapping MessageTransformingHandler will apply the appropriate logic:
protected Object handleRequestMessage(Message<?> message) {
    try {
        return this.transformer.transform(message);
    }
    catch (Exception e) {
        if (e instanceof MessageTransformationException) {
            throw (MessageTransformationException) e;
        }
        throw new MessageTransformationException(message, "Failed to transform Message", e);
    }
}
UPDATE
It turned out that MessageTransformationException is not enough, since AbstractMessageHandler checks for a MessageHandlingException when wrapping in IntegrationUtils.wrapInHandlingExceptionIfNecessary(). Therefore I suggest throwing a MessageHandlingException from your code instead, using this constructor with null for the message argument:
MessageHandlingException(Message<?> failedMessage, Throwable cause)
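A minimal sketch of what the catch block in the question's transformer might look like with that suggestion (names taken from the question; passing null keeps the failed message, and with it the payload, out of the exception):

@Override
public CustomPojo transform(CustomPojo customPojo) {
    try {
        // service call etc.
        return customPojo;
    } catch (Exception e) {
        // no Message<?> attached, so no payload ends up in the stack trace
        throw new MessageHandlingException(null, e);
    }
}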
I had almost the same issue; maybe this can help you. If you use the default errorChannel bean, it has already been subscribed by a LoggingHandler, which prints the full message. If you want to avoid printing the payload, you can create your own errorChannel; this way you'll override the default behavior:
@Bean
@Qualifier(IntegrationContextUtils.ERROR_CHANNEL_BEAN_NAME)
public MessageChannel errorChannel() {
    return new PublishSubscribeChannel();
}
If your problem is with the .log() handler, you can always use a function to decide which part of the Message you want to show:
@Bean
public IntegrationFlow errorFlow(IntegrationFlow createOutFileInCaseErrorFlow) {
    return IntegrationFlows.from(IntegrationContextUtils.ERROR_CHANNEL_BEAN_NAME)
            .log(LoggingHandler.Level.ERROR, m -> m.getHeaders())
            .<MessagingException>log(Level.ERROR, p -> p.getPayload().getMessage())
            .get();
}

How to convert pubsub payload to LogEntry object in log export

I have enabled log exports to a Pub/Sub topic. I am using Dataflow to process these logs and store relevant columns in BigQuery. Can someone please help with the conversion of the Pub/Sub message payload to a LogEntry object?
I have tried the following code:
@ProcessElement
public void processElement(ProcessContext c) throws Exception {
    PubsubMessage pubsubMessage = c.element();
    ObjectMapper mapper = new ObjectMapper();
    byte[] payload = pubsubMessage.getPayload();
    String s = new String(payload, "UTF8");
    LogEntry logEntry = mapper.readValue(s, LogEntry.class);
}
But I got the following error:
com.fasterxml.jackson.databind.JsonMappingException: Can not find a (Map) Key deserializer for type [simple type, class com.google.protobuf.Descriptors$FieldDescriptor]
Edit:
I tried the following code:
try {
    ByteArrayInputStream stream = new ByteArrayInputStream(Base64.decodeBase64(pubsubMessage.getPayload()));
    LogEntry logEntry = LogEntry.parseDelimitedFrom(stream);
    System.out.println("Log Entry = " + logEntry);
} catch (InvalidProtocolBufferException e) {
    e.printStackTrace();
}
But I get the following error now:
com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag
The JSON format parser should be able to do this. Java's not my strength, but I think you're looking for something like:
@ProcessElement
public void processElement(ProcessContext c) throws Exception {
    LogEntry.Builder entryBuilder = LogEntry.newBuilder();
    JsonFormat.parser()
            .usingTypeRegistry(
                    JsonFormat.TypeRegistry.newBuilder()
                            .add(LogEntry.getDescriptor())
                            .build())
            .ignoringUnknownFields()
            // c.element() is assumed here to already be the exported entry's JSON string
            .merge(c.element(), entryBuilder);
    LogEntry entry = entryBuilder.build();
    ...
}
You might be able to get away without registering the type. I think in C++ the proto types are linked into a global registry.
You'll want "ignoringUnknownFields" in case the service adds new fields and exports them and you haven't updated your copy of the proto descriptor. Any "@type" fields in the exported JSON will cause problems too.
You may need special handling of the payload (i.e. strip it from the JSON and then parse it separately). If it's JSON I'd expect the parser to try populating sub-messages that don't exist. If it's proto... it actually might work if you register the Any type too.
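For that payload handling, a hedged sketch of stripping it before the merge (this assumes Jackson is available and that the exported entry carries a jsonPayload field; protoPayload/textPayload would need the same treatment):

ObjectMapper mapper = new ObjectMapper();
ObjectNode root = (ObjectNode) mapper.readTree(new String(pubsubMessage.getPayload(), StandardCharsets.UTF_8));
// remove the payload node so JsonFormat only sees fields it can map onto LogEntry
JsonNode payloadNode = root.remove("jsonPayload");

LogEntry.Builder entryBuilder = LogEntry.newBuilder();
JsonFormat.parser()
        .usingTypeRegistry(JsonFormat.TypeRegistry.newBuilder().add(LogEntry.getDescriptor()).build())
        .ignoringUnknownFields()
        .merge(mapper.writeValueAsString(root), entryBuilder);
LogEntry entry = entryBuilder.build();
// payloadNode can now be inspected or parsed separately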

Namespace error while validating a schema with StAXSource

I'm trying to validate an XML document using StAX and the javax.xml.validation Validator; however, I'm getting the following cast error:
org.xml.sax.SAXException: java.lang.ClassCastException: org.codehaus.stax2.ri.evt.NamespaceEventImpl cannot be cast to java.lang.String
javax.xml.transform.TransformerException: java.lang.ClassCastException: org.codehaus.stax2.ri.evt.NamespaceEventImpl cannot be cast to java.lang.String
The basic idea is that I need to parse the XML using StAX, and I'm attempting to reuse the event reader I'll be using for parsing by creating a StAXSource from it to perform the validation.
I was able to debug the error and trace the cast exception to the class com.sun.org.apache.xalan.internal.xsltc.trax.StAXEvent2SAX, line 341, where there is a loop over an iterator and a cast to String, when in fact the iterator contains NamespaceEventImpl objects (the relevant portion of code is below).
// end namespace bindings
for( Iterator i = event.getNamespaces(); i.hasNext();) {
String prefix = (String)i.next();
if( prefix == null ) { // true for default namespace
prefix = "";
}
_sax.endPrefixMapping(prefix);
}
The following is the content of the iterator "i" while performing the logic I'm referring to:
iterator content
Below is a snippet of code describing how I'm doing it.
public void validateRequest(RequestMessage message) {
    try {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLEventReader eventReader = factory.createXMLEventReader(new ByteArrayInputStream(message.getMessage().getBytes()));
        this.validateSchema(eventReader);
        if (this.isSchemaValid()) {
            // parse through XML
        }
    } catch (Exception e) {
        LOGGER.error(e.getMessage(), e);
    }
}

private void validateSchema(XMLEventReader eventReader) {
    try {
        StAXErrorHandler errorHandler = new StAXErrorHandler();
        this.validator.setErrorHandler(errorHandler);
        this.validator.validate(new StAXSource(eventReader));
    } catch (SAXException | IOException | XMLStreamException e) {
        LOGGER.error(e.getMessage(), e);
    }
}
I was wondering if someone faced this issue before and if it is a limitation of using StAXSource with the Validator itself.
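Not an authoritative answer, but since the cast failure happens in the event-reader-to-SAX bridge (StAXEvent2SAX), one thing that may be worth trying is building the StAXSource from the cursor-based XMLStreamReader instead of the XMLEventReader, which goes through a different bridge. A hedged sketch, reusing the names from the question:

private void validateSchema(String xml) {
    try {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        // cursor API instead of the event API
        XMLStreamReader streamReader =
                factory.createXMLStreamReader(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        this.validator.setErrorHandler(new StAXErrorHandler());
        this.validator.validate(new StAXSource(streamReader));
    } catch (SAXException | IOException | XMLStreamException e) {
        LOGGER.error(e.getMessage(), e);
    }
}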
