Where do I implement a converter in a Camel component? - java

My colleague and I want to develop a Camel component that not only takes care of the connectivity, but also converts the standard XML and/or JSON formats into the message format required by the target system.
Where should we implement that? In our opinion, we have two options:
Implementation directly in the producer
Implementation in a converter class which is used by the producer
Is there a standard, or is it up to developers how many helper classes they define for their Camel component?

There is no enforced standard; both options are valid, and it depends a bit on the use case. Type converters are more flexible and allow you to perform these conversions elsewhere, not only when sending via the producer.
For example, some components that support industry standards like HL7 provide type converters to offer that kind of flexibility.
Other components, where the data formats of the target system are very special/specific, do not use type converters but do the conversion directly in the producer.
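If you go the type-converter route, a minimal sketch could look like the following (TargetMessage and all other names here are illustrative, not from any real component):

import org.apache.camel.Converter;

// Hypothetical target-system message type, defined here only to keep the sketch self-contained.
class TargetMessage {
    final String payload;
    TargetMessage(String payload) { this.payload = payload; }
}

@Converter
public final class TargetMessageConverter {

    private TargetMessageConverter() {
    }

    // Camel discovers annotated static methods like this one (the class must also be
    // listed in META-INF/services/org/apache/camel/TypeConverter). Once registered,
    // the conversion is available anywhere in a route, not only inside the producer.
    @Converter
    public static TargetMessage fromXml(String xml) {
        // ... real parsing/mapping of the standard XML would go here ...
        return new TargetMessage(xml);
    }
}

With this in place, the producer can simply ask for a TargetMessage body, and any other route step can request the same conversion.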

Related

What did Martin Fowler mean by "avoid automatic deserialization" in a REST API?

Martin Fowler said to avoid automatic deserialization in an API:
I prefer to avoid automatic deserialization altogether. Automatic deserialization usually falls into the WSDL pitfall of coupling consumers and producers by duplicating a static class structure in both.
What does this mean?
Is it to receive all information as JSON in each Rest Service without any "converter" in the middle?
By "converter" I mean some type adapter, like in GsonBuilder.
By automatic deserialization he means that there is a predefined, hard structure for the JSON object which is used to reconstruct the object itself.
This is, however, appropriate for most use cases.
Examples of a predefined structure are a Java class or an XML XSD.
Automatic deserialization usually falls into the WSDL pitfall of coupling consumers and producers by duplicating a static class structure in both.
What he means here is that using classes for deserialization is the same as using WSDL to serialize or deserialize objects.
In contrast to the hard structure of classes and XSD documents, JSON is much more relaxed, as it is based on JavaScript, which allows the object definition to be modified at any point in its life cycle.
So the alternative would be to use a combination of HashMap and ArrayList in Java (or to parse the String itself) to deserialize the object: then, even if the server produces something different (such as new fields), no change is needed on the client side, and new clients can take advantage of the new fields.
With a hard structure, the producer and consumer are strongly coupled through the shared structure of the model classes, so any change in the producer has to be reflected in the consumer.
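For illustration, here is a minimal sketch of that tolerant, Map-based reading, using Jackson (the library choice and the nickname field are mine, not Fowler's):

import java.util.Map;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TolerantClient {
    public static void main(String[] args) throws Exception {
        // The server starts sending a new field (nickname); a Map-based reader
        // simply carries it along, whereas a fixed class would need a rebuild.
        String json = "{\"name\":\"Peter\",\"nickname\":\"Pete\"}";

        Map<String, Object> person = new ObjectMapper().readValue(json, Map.class);
        System.out.println(person.get("name"));     // Peter
        System.out.println(person.get("nickname")); // Pete, no client change needed
    }
}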
In some SOA projects where I worked, we used to add extra fields to all request/response objects for future use, so that clients running in production would not need to change to accommodate the needs of a new client. These fields had generic names like customParam1 through customParam5, and their meaning was published with the documentation. The names were not intuitive, all because we had coupled the producer and consumer through the shared structure of the models.

Java to XSD or XSD to Java

I know that, using JAXB, you can generate Java files from an XSD and that you can also generate the XSD from annotated POJOs. What are the advantages and disadvantages of each? Is one overall better than the other?
We basically want to serialize events to a log in XML format.
Ultimately it depends on where you want to focus:
If the XML Schema is the Most Important Thing
Then it is best to start from the XML schema and generate a JAXB model. There are details of an XML schema that a JAXB (JSR-222) implementation just can't generate:
A maxOccurs other than 0, 1, or unbounded
Many facets on a simple type
Model groups
If the Object Model is the Most Important Thing
If you will be using the Java model for more than just converting between objects and XML (e.g. using it with JPA for persistence), then I would recommend starting with Java objects. This will give you the greatest control.
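For the "serialize events to a log in XML format" case mentioned above, a Java-first sketch could look like this (LogEvent and its fields are made up; an XSD could later be derived from the class, e.g. with schemagen):

import java.io.StringWriter;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement
public class LogEvent {
    public String type;
    public long timestamp;

    public static void main(String[] args) throws Exception {
        LogEvent event = new LogEvent();
        event.type = "LOGIN";
        event.timestamp = System.currentTimeMillis();

        // Marshal one event to XML, e.g. one element per log entry.
        Marshaller marshaller = JAXBContext.newInstance(LogEvent.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        StringWriter xml = new StringWriter();
        marshaller.marshal(event, xml);
        System.out.println(xml);
    }
}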
It depends on your requirements and on which artifact is your starting point.
Given your requirement, generate Java files from an XSD, since you want to define the output (XML) format first and have Java support it.
Given that one of the main points of XML is to be a portable data-transfer format, usable regardless of platform or language, I would avoid generating XSD from any specific programming language, as a rule of thumb. This may not matter if you know you are only communicating between Java endpoints (but are you sure this will be true forever?).
It is better, all else being equal, to define the interface/schema in a programming-language neutral way.
There are lots of exceptions to this general principle, especially if you are integrating with existing or legacy code...
If you have the opportunity to design both the POJOs and the schema, it's a matter of design: do you design for a "perfect" schema or for a "perfect" Java class?
In some cases you don't have the luxury of choice: in system-integration scenarios you might be given a predefined XSD from another system that you need to adapt to, and then XSD -> Class is the only way.

Design decision: providing a single EJB component which invokes other specific EJB components

We have multiple business-format converter EJB components which do the following tasks:
if the data submitted to them is XML, convert it from XML to the corresponding business format;
otherwise, assume it is the business format and convert it to XML.
We now have a requirement for a single converter component which can pick the right converter component based on the format of the payload passed to it.
How would I decide which component to choose, given that the ability to understand the format lies within the specific component?
You could implement this using the Chain of Responsibility pattern. Inject all the possible EJBs into the single converter EJB and let the latter build up the chain. Each concrete converter implements an interface which provides some method like boolean canHandle(XML xml). Once one returns true, let it handle the payload and return.
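A minimal sketch of that idea (the interface, the bean names, and the String payload type are all assumptions for illustration):

import java.util.Arrays;
import java.util.List;
import javax.ejb.EJB;
import javax.ejb.Stateless;

// Implemented by every concrete converter EJB; only the concrete bean
// knows how to recognize its own business format.
interface FormatConverter {
    boolean canHandle(String payload);
    String convert(String payload);
}

// The single entry-point EJB that the callers use.
@Stateless
public class ConverterDispatcherBean {

    @EJB(beanName = "OrderFormatConverterBean")     // hypothetical bean name
    private FormatConverter orderConverter;

    @EJB(beanName = "InvoiceFormatConverterBean")   // hypothetical bean name
    private FormatConverter invoiceConverter;

    public String convert(String payload) {
        // Walk the chain until one converter recognizes the format.
        List<FormatConverter> chain = Arrays.asList(orderConverter, invoiceConverter);
        for (FormatConverter converter : chain) {
            if (converter.canHandle(payload)) {
                return converter.convert(payload);
            }
        }
        throw new IllegalArgumentException("No converter recognizes the submitted format");
    }
}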

Best practice for emitting JMX notifications

Looking for guidelines when defining an MBean that emits notifications, specifically on the type of the notifications. The JMX Best Practices document on Oracle's site says the following, but it's a bit old and pre-Java 6.
Notifications should be instances of javax.management.Notification or one of the subclasses from the javax.management namespace. Information that does not fit into one of those subclasses should be conveyed by attaching a CompositeData to the notification using the setUserData method.
Also on Oracle's site, I see that WebLogic defines some of its own subclasses, e.g. WebLogicLogNotification. Its Best Practices document states:
All JMX notification objects extend the javax.management.Notification object type. JMX and WebLogic Server define additional notification object types, such as javax.management.AttributeChangeNotification. The additional object types contain specialized sets of information that are appropriate for different types of events.
Our notifications don't fit any of the standard subclasses, so, like WLS, we are considering defining our own subclass with custom getters for the information we wish to convey with the notifications. Or would it be better to stick with the base javax.management.Notification and just attach our info with the generic setUserData(Object)? If we do the latter, I suppose the Object should be a JMX type such as CompositeData, which doesn't seem as nice. Thoughts on which would be better from a consumer's point of view?
EDIT: From the consumer's view, I guess the downside of a custom subclass is that they would have to include it in their application/classpath.
It's almost always a bad idea to use custom data types in JMX; it is very limiting. Stick to the open types and your data can be consumed by any JMX client (Java or otherwise).
Note that you can always provide some helper classes which do some sort of "custom bean" <-> "open type" conversion. Code that has access to the helper classes can use these convenience methods (e.g. ThreadInfo.from()), while external and non-Java code is still able to consume the data.
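As a rough sketch of such a helper (the notification type string, the field names, and OrderEventSupport itself are all made up for illustration):

import javax.management.Notification;
import javax.management.openmbean.CompositeDataSupport;
import javax.management.openmbean.CompositeType;
import javax.management.openmbean.OpenDataException;
import javax.management.openmbean.OpenType;
import javax.management.openmbean.SimpleType;

public final class OrderEventSupport {

    private static final String[] ITEMS = {"orderId", "amount"};

    // Builds a plain Notification and attaches the payload as CompositeData,
    // so any JMX client can read it without our classes on its classpath.
    public static Notification toNotification(Object source, long seq, String orderId,
                                              double amount) throws OpenDataException {
        CompositeType type = new CompositeType(
                "OrderEvent", "Details of an order event",
                ITEMS,
                new String[] {"Order identifier", "Order amount"},
                new OpenType<?>[] {SimpleType.STRING, SimpleType.DOUBLE});

        Notification n = new Notification("com.example.order", source, seq, "order processed");
        n.setUserData(new CompositeDataSupport(type, ITEMS, new Object[] {orderId, amount}));
        return n;
    }
}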

JSON "flat" serialization with Java

I am looking for a JSON library that supports defining the serialization depth. Ideally I would like a library that handles all primitive types and java.lang classes itself and sends the rest to a custom handler. In my case the custom handler should generate a Link object with the URL and class of the element (for my REST API).
Here is an example:
Person : String name, Car car
would be serialized to:
{
    "name": "Peter",
    "Link": {"class": "my.company.Car", "url": "http://www.mycompany/myapp/Car/5"}
}
Any ideas which library I could use (and enhance)?
Check out http://code.google.com/p/google-gson/
You want to have a look at Jackson.
Jackson Wiki
Jackson in 5 Minutes
As you can see, you can use simple and complex data-binding rules, and there's a streaming API which will allow you to limit the exploration depth.
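As a rough illustration of depth limiting with the streaming API (the method and its cut-off policy are my own sketch, not a built-in Jackson feature):

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class DepthLimitedReader {

    // Prints only the field names at or above the cut-off depth and
    // skips everything nested more deeply.
    public static void printShallow(String json, int maxDepth) throws Exception {
        JsonParser p = new JsonFactory().createParser(json);
        int depth = 0;
        for (JsonToken t = p.nextToken(); t != null; t = p.nextToken()) {
            if (t == JsonToken.START_OBJECT || t == JsonToken.START_ARRAY) {
                if (++depth > maxDepth) {
                    p.skipChildren(); // drop the whole subtree below the cut-off
                    depth--;
                }
            } else if (t == JsonToken.END_OBJECT || t == JsonToken.END_ARRAY) {
                depth--;
            } else if (t == JsonToken.FIELD_NAME) {
                System.out.println(p.getCurrentName());
            }
        }
    }
}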
If GSON does not fit your needs, I recommend JsonMarshaller. It is highly configurable and customizable yet strives to be simple. It also has a very active user and developer community.
I am not sure whether you want actual control over serialization depth (as mentioned) or not -- your explanation rather suggests you want to be able to add custom serializers.
Now: if you really need limits (like only shallow copy), you could check out FlexJson, which supposedly has explicit control over serialization depth.
Otherwise, Jackson and GSON at least have full bean serialization as well as allowing custom serializers.
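To illustrate the custom-serializer route with Jackson: the sketch below renders a Car as a link instead of descending into it (Car and the URL scheme come from the question; everything else is made up):

import java.io.IOException;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;

class Car {
    public long id = 5;
}

// Emits {"class": "...", "url": "..."} instead of the Car's own fields.
class CarAsLinkSerializer extends JsonSerializer<Car> {
    @Override
    public void serialize(Car car, JsonGenerator gen, SerializerProvider provider) throws IOException {
        gen.writeStartObject();
        gen.writeStringField("class", "my.company.Car");
        gen.writeStringField("url", "http://www.mycompany/myapp/Car/" + car.id);
        gen.writeEndObject();
    }
}

public class LinkSerializationDemo {

    public static class Person {
        public String name = "Peter";
        public Car car = new Car();
    }

    public static void main(String[] args) throws Exception {
        SimpleModule module = new SimpleModule().addSerializer(Car.class, new CarAsLinkSerializer());
        ObjectMapper mapper = new ObjectMapper().registerModule(module);
        // {"name":"Peter","car":{"class":"my.company.Car","url":"http://www.mycompany/myapp/Car/5"}}
        System.out.println(mapper.writeValueAsString(new Person()));
    }
}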
