I'm able to create Java objects from an XML schema, and creating a new XML document is also working.
Now, using my Java object, how can I search for a particular tag and update it back into the XML?
You can use an instance of Marshaller to write the object back to XML. If you want to apply changes back to an existing DOM then you can use an instance of Binder. Binder is useful when there is unmapped content in your document that you wish to preserve.
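A minimal sketch of the Binder round trip, assuming a schema-generated class Customer with a name property and an input file customer.xml (both are placeholders for whatever your schema produces):

import java.io.File;
import javax.xml.bind.Binder;
import javax.xml.bind.JAXBContext;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Node;

public class BinderDemo {
    public static void main(String[] args) throws Exception {
        // Parse the existing document into a DOM (namespace awareness is required for JAXB).
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true);
        Document document = dbf.newDocumentBuilder().parse(new File("customer.xml"));

        // Bind the DOM to the generated object model.
        JAXBContext context = JAXBContext.newInstance(Customer.class);
        Binder<Node> binder = context.createBinder();
        Customer customer = (Customer) binder.unmarshal(document);

        // Update the element you care about through the Java object...
        customer.setName("New Name");

        // ...and apply the change back onto the same DOM, preserving unmapped content.
        binder.updateXML(customer);

        // Write the updated DOM back out.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes");
        t.transform(new DOMSource(document), new StreamResult(new File("customer.xml")));
    }
}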
For More Information
http://blog.bdoughan.com/2010/09/jaxb-xml-infoset-preservation.html
Related
I can receive one of 82 XML structures, each of which contains a root that is not in a namespace but does contain several xmlns attributes: the first defines a URN for the object's schema, and the rest (which define namespaces) contain the URNs for the common objects.
Schema-aware parsing in Java assumes you know the schema before you start parsing, but I do not know it until either I have loaded the XML without validation and extracted the root (at which point I can load it again with the right schema), or I find some way to get at the xmlns attributes in the root and select the right schema (I know how to map the URN to the correct schema, and all the schemas are held as resources on my classpath).
It seems a shame to load the XML twice, is there a way to do this in a single pass?
As an example, I have a possible document which looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<BusinessCard xmlns="urn:oasis:names:specification:ubl:schema:xsd:BusinessCard-2"
xmlns:cac="urn:oasis:names:specification:ubl:schema:xsd:CommonAggregateComponents-2"
xmlns:cbc="urn:oasis:names:specification:ubl:schema:xsd:CommonBasicComponents-2">
</BusinessCard>
(There is obviously content inside the BusinessCard object, but I left it out as it is of no relevance here.)
The schema for this is in the resource "xsd/main/UBL-BusinessCard-2.2.xsd".
I have tried using an EntityResolver, but it does not get called before the parser complains that it cannot find the declaration of BusinessCard.
I'm not sure why you say the root isn't in a namespace, when the xmlns="urn:oasis:names:..." declaration makes it clear that it is.
One way to do this is to load a single composite schema that contains all the different component schemas, and validate against that. If the union of the schemas is a valid schema (i.e. no conflicting type definitions) then this might be the best approach, especially if you are validating thousands of documents and most of the component schemas are going to be used in each run.
On the other hand, if you're only using a small number of the component schemas in a given run, then this would be expensive.
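If the composite route does fit, a minimal sketch of building one Schema from several classpath XSDs might look like this (only the BusinessCard resource path is taken from the question; the rest is assumed):

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

public class CompositeSchemaValidation {
    public static void main(String[] args) throws Exception {
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);

        // One StreamSource per component schema; in practice you would list all 82.
        // If the XSDs import each other by relative path, give each StreamSource a systemId.
        Source[] schemas = {
            new StreamSource(CompositeSchemaValidation.class
                    .getResourceAsStream("/xsd/main/UBL-BusinessCard-2.2.xsd")),
            // new StreamSource(... the other component schemas ...)
        };

        // newSchema(Source[]) builds one composite schema from all the sources.
        Schema composite = factory.newSchema(schemas);

        Validator validator = composite.newValidator();
        validator.validate(new StreamSource(new File("document.xml")));
    }
}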
One approach would be to detect the namespace using an abortive parse of the document. Write a SAX filter that captures the first namespace declaration and then aborts the parse by throwing an exception. Or you could also do this with a streaming XSLT 3.0 transformation.
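A rough sketch of the abortive parse, using a DefaultHandler that throws as soon as the document element's namespace is known:

import java.io.File;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

public class RootNamespaceSniffer extends DefaultHandler {

    /** Thrown to abort the parse once the root namespace is known. */
    static class StopParsingException extends SAXException {
        final String namespaceUri;
        StopParsingException(String namespaceUri) {
            super("root namespace found");
            this.namespaceUri = namespaceUri;
        }
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes atts)
            throws SAXException {
        // The first startElement is the document element; its namespace tells us which schema to load.
        throw new StopParsingException(uri);
    }

    public static String sniff(File xml) throws Exception {
        SAXParserFactory factory = SAXParserFactory.newInstance();
        factory.setNamespaceAware(true);
        SAXParser parser = factory.newSAXParser();
        try {
            parser.parse(xml, new RootNamespaceSniffer());
            throw new IllegalStateException("document has no root element");
        } catch (StopParsingException stop) {
            return stop.namespaceUri;  // e.g. urn:oasis:...:BusinessCard-2
        }
    }
}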
Even smarter would be to write a little SAX pipeline that does some buffering. Capture the first startElement event, extract the namespace, load the schema, create a validator, feed it the SAX events that you've already consumed (the first startElement), then feed the rest of the SAX events from your preprocessor straight through to the validator.
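A sketch of that buffering pipeline, assuming a ValidatorHandler as the downstream consumer; the URN-to-XSD mapping is a placeholder for the mapping you say you already have:

import javax.xml.XMLConstants;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.ValidatorHandler;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;
import java.util.ArrayList;
import java.util.List;

/**
 * Buffers the leading SAX events until the document element is seen, picks the schema
 * from its namespace, then replays the buffered events into a ValidatorHandler and
 * forwards everything else straight through. Wire it in via SAXParser.parse(file, handler).
 */
public class SchemaSelectingHandler extends DefaultHandler {

    private ValidatorHandler validator;                       // created on first startElement
    private final List<String[]> pendingPrefixMappings = new ArrayList<>();

    @Override
    public void startPrefixMapping(String prefix, String uri) throws SAXException {
        if (validator == null) {
            pendingPrefixMappings.add(new String[] {prefix, uri});   // buffer until we know the schema
        } else {
            validator.startPrefixMapping(prefix, uri);
        }
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes atts)
            throws SAXException {
        if (validator == null) {
            validator = loadSchemaFor(uri).newValidatorHandler();
            // Replay the events the validator missed while we were sniffing the namespace.
            validator.startDocument();
            for (String[] mapping : pendingPrefixMappings) {
                validator.startPrefixMapping(mapping[0], mapping[1]);
            }
        }
        validator.startElement(uri, localName, qName, atts);
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        validator.characters(ch, start, length);
    }

    @Override
    public void endElement(String uri, String localName, String qName) throws SAXException {
        validator.endElement(uri, localName, qName);
    }

    @Override
    public void endDocument() throws SAXException {
        validator.endDocument();
    }

    private Schema loadSchemaFor(String namespaceUri) throws SAXException {
        SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        return factory.newSchema(SchemaSelectingHandler.class.getResource(mapUrnToResource(namespaceUri)));
    }

    private String mapUrnToResource(String urn) {
        // Placeholder: you already have a URN -> XSD mapping; one entry is hard-coded as an illustration.
        if (urn.endsWith("BusinessCard-2")) {
            return "/xsd/main/UBL-BusinessCard-2.2.xsd";
        }
        throw new IllegalArgumentException("No schema registered for " + urn);
    }
}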
I have XML based on the FpML schema.
I used the xjc command-line tool to generate the corresponding POJO classes.
Then I use JAXB to unmarshal the XML into Java objects.
I converted to objects as an intermediate step because it makes it easy to read the values of some fields.
The problem is that the FpML schema generated ~1200 classes,
so I am not sure this is the correct approach, as the jar size will also increase.
My problem statement: convert XML based on one schema to XML based on another schema; both involve FpML. While populating the target XML I need to validate a few fields against a database.
Please give me suggestions.
Data binding technologies such as JAXB work fine for simple cases, but when the schema is large, complex, or frequently changing, they become quite unwieldy, as you have discovered.
This is a task for XSLT. Use schema-aware XSLT if possible, because it makes your stylesheet easier to debug.
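A minimal JAXP sketch of driving such a transformation from Java; the stylesheet and file names are placeholders, and for schema-aware XSLT you would plug in a processor such as Saxon-EE rather than the JDK's built-in XSLT 1.0 engine:

import java.io.File;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class FpmlTransform {
    public static void main(String[] args) throws Exception {
        // The default factory gives you XSLT 1.0; for XSLT 2.0/3.0 or schema awareness,
        // configure a different TransformerFactory implementation (e.g. Saxon).
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer = factory.newTransformer(
                new StreamSource(new File("source-to-target.xsl")));   // placeholder stylesheet

        // Values looked up from the database can be passed in as stylesheet parameters.
        transformer.setParameter("partyIdFromDb", "PARTY-123");

        transformer.transform(
                new StreamSource(new File("input-fpml.xml")),
                new StreamResult(new File("output-fpml.xml")));
    }
}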
I have a JSON file which I am using as fabricated data to create a POJO with GSON. This is the expected data for my tests. Next, I have data from the Cucumber feature file which would override some of the fields in the already-created POJO, i.e. update the expected-data object. I was wondering if anybody has done this before or is aware of a pattern I can use to achieve this. In terms of approach, does it make more sense to create an updated JSON first from both data sources, or to create the POJO from the first JSON and then apply the mutation?
I have found a way around this: using an Apache Velocity template instead of vanilla JSON files simplifies the implementation and provides flexibility.
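For illustration, a rough sketch of that idea: keep the expected JSON as a Velocity template, merge in the values that come from the feature file, then deserialize with GSON as before (the template text, field names, and Person class are made up):

import java.io.StringWriter;
import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.VelocityEngine;
import com.google.gson.Gson;

public class ExpectedDataBuilder {

    static class Person { String name; int age; }   // placeholder expected-data POJO

    public static void main(String[] args) {
        // A JSON template with Velocity placeholders; in practice this would live in a .vm file.
        String template = "{\"name\": \"$name\", \"age\": $age}";

        VelocityEngine engine = new VelocityEngine();
        engine.init();

        // Values coming from the Cucumber step override the fabricated defaults.
        VelocityContext context = new VelocityContext();
        context.put("name", "John");
        context.put("age", 32);

        StringWriter json = new StringWriter();
        engine.evaluate(context, json, "expected-data", template);

        // Deserialize the merged JSON into the expected POJO as before.
        Person expected = new Gson().fromJson(json.toString(), Person.class);
    }
}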
I use JAXB to load an XML configuration file into a Java object (ConfigurationDTO). Is it good practice to add some logic to this Java object (ConfigurationDTO), or should I create a different Java object (e.g. Configuration) for that logic? By logic I mean some checks/constraints that the configuration file should satisfy. Should the Java class ConfigurationDTO contain only getters?
The question is why you need those constraints. Are you going to use your object for anything other than marshalling/unmarshalling? If so, that is a bad idea. The rule of thumb is not to spread DTO objects across all levels of an application. If you follow this rule you will not need additional constraints in your DTO.
The JAXB standard gives you the ability to validate an object at marshal and unmarshal time. This means that if your XML schema requires a non-empty field but the corresponding Java object has a null value, the marshal will fail, and vice versa.
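For example, a minimal sketch of switching that validation on, assuming a configuration.xsd alongside the ConfigurationDTO from the question:

import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;

public class ValidatingLoad {
    public static void main(String[] args) throws Exception {
        SchemaFactory sf = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = sf.newSchema(new File("configuration.xsd"));   // placeholder schema file

        JAXBContext context = JAXBContext.newInstance(ConfigurationDTO.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();
        unmarshaller.setSchema(schema);   // violations now fail the unmarshal with an exception

        ConfigurationDTO config =
                (ConfigurationDTO) unmarshaller.unmarshal(new File("configuration.xml"));

        // Marshaller.setSchema(schema) works the same way in the other direction.
    }
}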
Here is a quote from the JAXB documentation:
Validation is the process of verifying that an XML document meets all the constraints expressed in the schema. JAXB 1.0 provided validation at unmarshal time and also enabled on-demand validation on a JAXB content tree. JAXB 2.0 only allows validation at unmarshal and marshal time. A web service processing model is to be lax in reading in data and strict on writing it out. To meet that model, validation was added to marshal time so users could confirm that they did not invalidate an XML document when modifying the document in JAXB form.
Such an approach has its own drawbacks (if you spread the DTO across the application you will lose control of it), but the advantages are more valuable.
I'm currently working on a Java application and can't figure out the best way to solve my issue.
I need to store data in MongoDB (with the actual data types, where the BSON format supports them). I receive the data in an XML file together with its schema (both are created dynamically at runtime, so I have no idea what's in them).
To be more specific, I don't have any information about the fields or names of the data in advance.
A user can dynamically create a new "object" (for which there is no Java class in the application).
When a user creates a new object, I receive an XML schema which describes it.
So when a user tries to add an object of this type (the data for the new entity is in XML format), I validate it against the XML schema, and then I need to store the object in MongoDB.
So I need to be able to transform my XML into BSON (or into basic Java objects with the Mongo Java driver) and back into XML after a query.
Example:
If a user wants to manage people, he will define the People schema:
<People>
<Name>...</Name>
<Lastname>...</Lastname>
<Age>...</Age>
...
</People>
Here I get the XSD (a valid XSD with all the information). Then, when a user adds a People entry, I get data like this:
<People>
<Name>John</Name>
<Lastname>Smith</Lastname>
<Age>32</Age>
...
</People>
So I wonder whether the best approach would be something like Jackson (XML -> POJO -> BSON), XSLT (XML -> JSON/BSON, with encoding for data types), or simply reading the XML file and building my basic Java objects manually.
Does anyone have advice on how to implement one of these solutions, or a better solution?
I believe you can use MongoJack to magically turn your XML into something that MongoDB understands (and vice versa).
The best approach would seem to be to go XML <-> JSON.
See: Quickest way to convert XML to JSON in Java
Then you can go JSON <-> BSON using com.mongodb.util.JSON's parse and serialize methods.
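A rough sketch of that round trip, using org.json for the XML <-> JSON leg (note that com.mongodb.util.JSON is a legacy helper; newer driver versions would use Document.parse instead):

import org.json.JSONObject;
import org.json.XML;
import com.mongodb.DBObject;
import com.mongodb.util.JSON;

public class XmlMongoRoundTrip {
    public static void main(String[] args) throws Exception {
        String xml = "<People><Name>John</Name><Lastname>Smith</Lastname><Age>32</Age></People>";

        // XML -> JSON with org.json (the library suggested in the linked question).
        JSONObject json = XML.toJSONObject(xml);

        // JSON -> a DBObject the legacy Mongo Java driver can store directly.
        DBObject document = (DBObject) JSON.parse(json.toString());
        // collection.insert(document);

        // And back after a query: DBObject -> JSON -> XML.
        String roundTrippedXml = XML.toString(new JSONObject(JSON.serialize(document)));
        System.out.println(roundTrippedXml);
    }
}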