I have XML based on the FpML schema.
I used the xjc command-line tool to generate the corresponding POJO classes.
Then I am using JAXB to unmarshal the XML into Java objects.
I converted to objects as an intermediate step because it makes it easy to read the values of some fields.
But the problem is that the FpML schema generated ~1200 classes,
so I am not sure whether this is the correct approach, as the jar size will also increase.
My problem statement: convert one XML document based on one schema to another XML document based on another schema. Both involve FpML. While populating the target XML I need to validate a few fields against the database.
Please give me suggestions.
Data binding technologies such as JAXB work fine for simple cases, but when the schema is large, complex, or frequently changing, they become quite unwieldy, as you have discovered.
This is a task for XSLT. Use schema-aware XSLT if possible, because it makes your stylesheet easier to debug.
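If the surrounding plumbing stays in Java, a transformation like that can be driven through the standard JAXP API. A minimal sketch, with made-up file names and a made-up stylesheet parameter for a database-derived value (a schema-aware processor such as Saxon-EE would be needed for the schema-aware part):

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class FpmlTransform {
    public static void main(String[] args) throws Exception {
        // The stylesheet holds the source-schema -> target-schema mapping rules.
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer =
                factory.newTransformer(new StreamSource("source-to-target.xsl"));

        // Values looked up in the database can be handed to the stylesheet as parameters.
        transformer.setParameter("counterpartyIdFromDb", "ABC123");

        transformer.transform(new StreamSource("input-fpml.xml"),
                              new StreamResult("output-fpml.xml"));
    }
}

The stylesheet itself carries the mapping between the two FpML-based schemas; the Java side only supplies the input, the output, and whatever values had to be checked against the database first.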
Related
I can receive one of 82 XML structures, each of which contains a root that is not in a namespace, and also contains several xmlns attributes: the first defines a URN for the schema for the object, and the rest (which define namespaces) contain the URNs for the common objects.
Schema-aware parsing in Java assumes you know the schema before you start parsing, but I do not know it until either I have loaded the XML without validation and extracted the root (at which point I can load it again with the right schema), or I find some way to get at the xmlns attributes in the root and select the right schema (I know how to map the URN to the correct schema, and all the schemas are held as resources on my classpath).
It seems a shame to load the XML twice; is there a way to do this in a single pass?
As an example I have a possible document which looks like:-
<?xml version="1.0" encoding="UTF-8"?>
<BusinessCard xmlns="urn:oasis:names:specification:ubl:schema:xsd:BusinessCard-2"
xmlns:cac="urn:oasis:names:specification:ubl:schema:xsd:CommonAggregateComponents-2"
xmlns:cbc="urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2">
</BusinessCard>
(there is obviously content inside the BusinessCard object, but I left it out as it is of no relevance here)
and the schema for this is in the resource "xsd/main/UBL-BusinessCard-2.2.xsd".
I have tried using an EntityResolver, but it does not get called before the parser complains that it cannot find the declaration of BusinessCard.
I'm not sure why you say the root isn't in a namespace, when the xmlns="urn:oasis:names:... declaration makes it clear that it is.
One way to do this is to load a single composite schema that contains all the different component schemas, and validate against that. If the union of the schemas is a valid schema (i.e. no conflicting type definitions) then this might be the best approach, especially if you are validating thousands of documents and most of the component schemas are going to be used in each run.
On the other hand, if you're only using a small number of the component schemas in a given run, then this would be expensive.
One approach would be to detect the namespace using an abortive parse of the document: write a SAX filter that captures the first namespace declaration and then aborts the parse by throwing an exception. You could also do this with a streaming XSLT 3.0 transformation.
Even smarter would be to write a little SAX pipeline that does some buffering. Capture the first startElement event, extract the namespace, load the schema, create a validator, feed it the SAX events that you've already consumed (the first startElement), then feed the rest of the SAX events from your preprocessor straight through to the validator.
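A rough sketch of that pipeline, assuming you supply your own URN-to-resource lookup (selectSchema and the hard-coded resource path below are placeholders; the rest is plain SAX plus javax.xml.validation):

import java.util.ArrayList;
import java.util.List;

import javax.xml.XMLConstants;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.ValidatorHandler;

import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;

public class DeferredValidationHandler extends DefaultHandler {

    private final List<String[]> bufferedMappings = new ArrayList<>();
    private ValidatorHandler validator; // created once the root namespace is known

    // Placeholder: map the root namespace URN to a schema resource on the classpath.
    private Schema selectSchema(String namespaceUri) throws SAXException {
        String resource = "xsd/main/UBL-BusinessCard-2.2.xsd"; // look this up from namespaceUri
        try {
            return SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                                .newSchema(getClass().getClassLoader().getResource(resource));
        } catch (Exception e) {
            throw new SAXException("Cannot load schema for " + namespaceUri, e);
        }
    }

    @Override
    public void startPrefixMapping(String prefix, String uri) throws SAXException {
        if (validator == null) {
            bufferedMappings.add(new String[] { prefix, uri }); // buffer until the schema is known
        } else {
            validator.startPrefixMapping(prefix, uri);
        }
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes atts)
            throws SAXException {
        if (validator == null) {
            // First start tag: choose the schema, then replay everything seen so far.
            validator = selectSchema(uri).newValidatorHandler();
            validator.startDocument();
            for (String[] mapping : bufferedMappings) {
                validator.startPrefixMapping(mapping[0], mapping[1]);
            }
        }
        validator.startElement(uri, localName, qName, atts);
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        if (validator != null) {
            validator.characters(ch, start, length);
        }
    }

    @Override
    public void endElement(String uri, String localName, String qName) throws SAXException {
        validator.endElement(uri, localName, qName);
    }

    @Override
    public void endPrefixMapping(String prefix) throws SAXException {
        if (validator != null) {
            validator.endPrefixMapping(prefix);
        }
    }

    @Override
    public void endDocument() throws SAXException {
        if (validator != null) {
            validator.endDocument();
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true);
        XMLReader reader = spf.newSAXParser().getXMLReader();
        reader.setContentHandler(new DeferredValidationHandler());
        reader.parse(new InputSource(args[0])); // single pass: detect the schema, then validate
    }
}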
I use JAXB to load an XML configuration file into a Java object (ConfigurationDTO). Is it good practice to add some logic code to this Java object (ConfigurationDTO), or should I create a different Java object with this logic code (i.e. Configuration)? When I say logic code, I mean some checks/constraints that the configuration file should satisfy. Should the ConfigurationDTO class contain only getters?
The question is: why do you need those constraints? Are you going to use your object for something other than marshalling/unmarshalling? If so, that is a bad idea. The rule of thumb is not to spread DTO objects across all levels of an application. If you follow this rule, you won't need additional constraints in your DTO.
The JAXB standard provides you with the ability to validate an object at marshal and unmarshal time. This means that if your XML schema requires a non-empty field but the corresponding Java object has a null value, then marshalling will fail, and vice versa.
Here is a quote from the JAXB documentation:
Validation is the process of verifying that an XML document meets all the constraints expressed in the schema. JAXB 1.0 provided validation at unmarshal time and also enabled on-demand validation on a JAXB content tree. JAXB 2.0 only allows validation at unmarshal and marshal time. A web service processing model is to be lax in reading in data and strict on writing it out. To meet that model, validation was added to marshal time so users could confirm that they did not invalidate an XML document when modifying the document in JAXB form.
Such an approach has its own drawbacks (if you spread the DTO throughout the application you will lose control of it), but the advantages outweigh them.
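For illustration, enabling that unmarshal-time validation is only a few lines. ConfigurationDTO comes from the question and "configuration.xsd" is a placeholder file name; a sketch only:

import java.io.File;

import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;

public class ConfigLoader {
    public static ConfigurationDTO load(File xmlFile) throws Exception {
        // Compile the XSD once; it carries the constraints instead of the DTO.
        Schema schema = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                                     .newSchema(new File("configuration.xsd"));

        Unmarshaller unmarshaller =
                JAXBContext.newInstance(ConfigurationDTO.class).createUnmarshaller();
        unmarshaller.setSchema(schema);               // validate while reading
        unmarshaller.setEventHandler(event -> false); // stop at the first violation

        return (ConfigurationDTO) unmarshaller.unmarshal(xmlFile);
    }
}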
I'm trying to find some tutorial examples on how to exchange data between databases and XML files using Java, from getting and setting specific data in a database to (if possible) changing how the database is structured.
I have done some research into this, but I'm unsure whether it's JDBC I should look into, XML:DB, JAXB, or whether any of them are even relevant to what I'm trying to do.
I plan to create a database example, and then see if I can exchange data to and from an XML file using Java, just to see how it works; what should I look into in order to accomplish this?
Many thanks.
You can do this in many ways, but here is how I do it (a rough sketch follows these steps):
Get data from Databases
Convert it to HashMap
Create a JAXB detail class matching your schema
Create a constructor in the JAXB class which accepts the HashMap and assigns the data to the variables in the JAXB class
Convert the JAXB object to XML/JSON by marshalling
Write to a file if you want
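A rough sketch of those steps, assuming a made-up PERSON table, a placeholder JDBC connection string, and a hand-written Person class standing in for a schema-generated one:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashMap;
import java.util.Map;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;

public class DbToXml {

    @XmlRootElement(name = "Person")
    public static class Person {
        public String name;
        public int age;

        public Person() { }                        // JAXB needs a no-arg constructor

        public Person(Map<String, Object> row) {   // step 3: constructor that accepts the HashMap
            this.name = (String) row.get("NAME");
            this.age = (Integer) row.get("AGE");
        }
    }

    public static void main(String[] args) throws Exception {
        Marshaller marshaller = JAXBContext.newInstance(Person.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);

        // Step 1: get data from the database (the connection string is a placeholder).
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo", "sa", "");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT name, age FROM person")) {

            while (rs.next()) {
                // Step 2: convert the row to a HashMap.
                Map<String, Object> row = new HashMap<>();
                row.put("NAME", rs.getString("name"));
                row.put("AGE", rs.getInt("age"));

                // Steps 4-5: marshal the JAXB object to XML (swap System.out for a file writer).
                marshaller.marshal(new Person(row), System.out);
            }
        }
    }
}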
If you're new to JAXB, view this tutorial here!
You could do the following:
Use a JPA implementation (EclipseLink, Hibernate, Open JPA, etc) to convert the database data to/from Java objects.
Use a JAXB implementation (EclipseLink MOXy, reference implementation, etc) to convert the Java objects to/from XML.
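A compressed sketch of that combination; the Customer class, the persistence unit name "demo-pu", and the primary key value are invented for illustration:

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.Id;
import javax.persistence.Persistence;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlRootElement;

@Entity                                // JPA: maps the class to a database table
@XmlRootElement                        // JAXB: maps the same class to an XML element
@XmlAccessorType(XmlAccessType.FIELD)
public class Customer {

    @Id
    private Long id;
    private String name;

    public static void main(String[] args) throws Exception {
        // "demo-pu" is a placeholder persistence unit defined in persistence.xml.
        EntityManager em =
                Persistence.createEntityManagerFactory("demo-pu").createEntityManager();
        Customer customer = em.find(Customer.class, 1L);   // JPA: database -> object

        Marshaller marshaller = JAXBContext.newInstance(Customer.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
        marshaller.marshal(customer, System.out);          // JAXB: object -> XML
    }
}

Because one class carries both sets of annotations, the same object graph moves between the database and the XML document.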
After further research into my query, I found JDBC (for the database manipulation) and XStream (for the XML conversion) to be my preferred solutions.
For JDBC, I referred to this link
For XStream, I referred to this link
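For anyone finding this later, the XStream half can be as small as the following sketch (the Person class is made up):

import com.thoughtworks.xstream.XStream;

public class XStreamDemo {

    static class Person {              // made-up class for the demo
        String name = "John";
        int age = 32;
    }

    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.allowTypes(new Class[] { Person.class }); // needed for deserialization on recent XStream versions
        xstream.alias("person", Person.class);            // <person> instead of the full class name

        String xml = xstream.toXML(new Person());         // object -> XML
        Person back = (Person) xstream.fromXML(xml);      // XML -> object

        System.out.println(xml);
        System.out.println(back.name + " is " + back.age);
    }
}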
Thank you for your replies.
I'm currently working on a Java application, and can't figure out the best way to solve my issue.
I need to store data in MongoDB (with the actual data type when it is supported by the BSON format). I get the data in an XML file, together with its schema (both are created dynamically at runtime, so I have no idea what is in them).
To be more specific, I don't have any information in advance about the fields or the names of the data.
A user can dynamically create a new "object" (for which there is no Java class in the application).
When a user creates a new object, I receive an XML schema which describes the object.
So when a user tries to add an object of this type (the data for the new entity arrive in XML format), I validate it against the XML schema, and then I need to store the object in MongoDB.
So I need to be able to transform my XML into BSON (or into basic Java objects with the Mongo Java driver) and back into XML after a query.
Example:
If a user wants to manage people, he will define the People schema:
<People>
<Name>...</Name>
<Lastname>...</Lastname>
<Age>...</Age>
...
</People>
Here I get the XSD (a valid XSD with all the information). Then when a user adds a People, I get the data like this:
<People>
<Name>John</Name>
<Lastname>Smith</Lastname>
<Age>32</Age>
...
</People>
So I wonder whether the best approach would be something like Jackson (XML -> POJO -> BSON), XSLT (XML -> JSON/BSON, with encoding for the data types), or simply reading the XML file and building basic Java objects manually.
Does anyone have advice on how to implement one of these solutions, or a better one?
I believe you can use MongoJack to magically turn your XML into something that MongoDB understands (and vice-versa)
The best approach would seem to be to go XML <-> JSON.
See: Quickest way to convert XML to JSON in Java
Then you can go JSON <-> BSON using com.mongodb.util.JSON parse and serialize.
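A sketch of that round trip, using org.json for the XML <-> JSON leg and the legacy driver classes named above; the database and collection names are made up, and the _id that MongoDB generates will show up when you convert back to XML:

import org.json.JSONObject;
import org.json.XML;

import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;
import com.mongodb.util.JSON;

public class XmlToMongo {
    public static void main(String[] args) {
        String xml = "<People><Name>John</Name><Lastname>Smith</Lastname><Age>32</Age></People>";

        // XML -> JSON (org.json), then JSON -> BSON-ready DBObject (com.mongodb.util.JSON)
        JSONObject json = XML.toJSONObject(xml);
        DBObject doc = (DBObject) JSON.parse(json.toString());

        MongoClient client = new MongoClient();          // localhost:27017
        DB db = client.getDB("demo");
        DBCollection people = db.getCollection("people");
        people.insert(doc);

        // And back again: BSON -> JSON -> XML (the generated _id field comes along too).
        String jsonOut = JSON.serialize(people.findOne());
        String xmlOut = XML.toString(new JSONObject(jsonOut));
        System.out.println(xmlOut);

        client.close();
    }
}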
I have
@XmlAttribute(required=true)
in hundreds of places in a project.
Can I make this the default, so that I then only need to specify
@XmlAttribute(required=false)
when needed?
No, that behaviour is hard-wired. However, the required attribute is really a lightweight alternative to a proper XML schema. If you need better control over document validation, then I suggest you define an XML Schema for your documents, and inject the schema into the JAXBContext. The documents will then be checked on marshalling and unmarshalling, and you won't have to rely on the annotations for validation.
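As an illustration of that suggestion (the XSD file name and the small Trade class below are placeholders), the Schema object is attached to the Marshaller or Unmarshaller created from the JAXBContext:

import java.io.File;

import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;

public class SchemaBackedMarshalling {

    @XmlRootElement
    @XmlAccessorType(XmlAccessType.FIELD)
    static class Trade {              // stand-in for one of your annotated classes
        @XmlAttribute                 // note: no required=true anywhere
        String id = "T-1";
    }

    public static void main(String[] args) throws Exception {
        // The XSD, not the annotations, now decides which attributes are required.
        Schema schema = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                                     .newSchema(new File("my-schema.xsd"));

        Marshaller marshaller = JAXBContext.newInstance(Trade.class).createMarshaller();
        marshaller.setSchema(schema); // a document violating the XSD now fails on write
        marshaller.marshal(new Trade(), new File("trade.xml"));

        // The same setSchema(...) call is available on Unmarshaller for input documents.
    }
}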