I have created a library which includes some 200 Java classes generated from an existing XSD with JAXB like this:
xjc -no-header -d schemas -b xsd/binding.xml xsd
Alas, JAXB is not supported on Android, and the general suggestion seems to be to use a different library. Castor seems to be a suitable alternative, especially as it offers conversion of .xsd files into Java classes. However, doing so seems to be more complex than it was with xjc, and I have no idea how much the result differs from the xjc output.
My use case is unmarshaling and reading the unmarshaled data (changing data or marshaling is not needed). Still, there is a vast amount of code which relies on the structure of the resulting Java classes, so any difference between xjc-generated classes and their Castor counterparts would mean a lot of refactoring.
Is there a simple recipe on how to generate Java classes from .xsd in Castor and get a result that is as close as possible to what xjc produces?
The following approach is as close as I got, though some refactoring is still required:
Place a file named castorbuilder.properties on your classpath, with the following content:
org.exolab.castor.builder.javaclassmapping=type
org.exolab.castor.builder.javaVersion=5.0
# Replace the following lines with your schema(s)
org.exolab.castor.builder.nspackages=\
http://example.com/schema/foo=com.example.schema.foo,\
http://example.org/schema/bar=org.example.schema.bar
org.exolab.castor.builder.primitivetowrapper=true
Then run the following command line:
java -cp "*" org.exolab.castor.builder.SourceGeneratorMain -i schema.xsd -types j2
In the above, replace the classpath with the path to the Castor JAR files and castorbuilder.properties, and replace schema.xsd with the path to your XSD file.
Of course, you can invoke Castor via Ant or Maven—in this case, be sure to use the respective equivalent to the command line options above and make sure the properties file is picked up.
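Once the classes are generated, reading a document for the read-only use case might look roughly like the sketch below. This is only a sketch: org.exolab.castor.xml.Unmarshaller is Castor's unmarshaller class, but FooRoot and its package are made-up placeholders for one of your generated root types.
import java.io.FileReader;
import org.exolab.castor.xml.Unmarshaller;
import com.example.schema.foo.FooRoot; // hypothetical generated root type

public class CastorReadExample {
    public static void main(String[] args) throws Exception {
        // Bind the unmarshaller to the generated root class.
        Unmarshaller unmarshaller = new Unmarshaller(FooRoot.class);
        // Read-only use case: unmarshal, then just navigate the object graph.
        FooRoot root = (FooRoot) unmarshaller.unmarshal(new FileReader("foo.xml"));
        System.out.println(root);
    }
}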
Differences which require refactoring:
Enum types are now in their own types sub-package
Where XML Enum literals are in camelCase, they now become CAMELCASE rather than CAMEL_CASE
Castor generates array properties where JAXB has Collection properties (see the sketch after this list)
Some class names get an underscore as a prefix; apparently JAXB drops those while Castor preserves them
Date fields are now Date instances
In one instance a property changed its name from value to content
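To give an idea of the array-vs-Collection refactoring, here is a small before/after sketch; Order and Item are hypothetical generated types, and the accessor names are assumptions rather than guaranteed generator output:
// Before, with xjc-generated classes: repeated elements are exposed as a List.
Item firstJaxb = order.getItem().get(0);
int countJaxb = order.getItem().size();

// After, with Castor-generated classes (assumed accessors): repeated elements
// are exposed as an array, usually with indexed accessors and a count method.
Item firstCastor = order.getItem(0);        // or order.getItem()[0]
int countCastor = order.getItemCount();     // or order.getItem().length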
However, there seems to be a bug in Castor: some of the generated classes invoke a non-existent constructor. This has stopped me from successfully trying it out; I only got as far as refactoring my existing code to the point of being free from syntax and compiler errors.
Related
Is there a way to configure a ClassLoader or a JVM to load annotations with CLASS retention policy, so I can access them using reflection?
This is useful for compile-time post-processing, as mentioned here.
I annotate some of my classes in order to generate an antlib.xml file automatically. I would prefer if my annotation could have CLASS retention policy, so that it does not create runtime dependencies.
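For reference, the annotation in question is declared roughly like this (the package and names are made up for the example):
package com.example;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// CLASS retention: recorded in the .class file, but not available via
// reflection at runtime, so it adds no runtime dependency.
@Retention(RetentionPolicy.CLASS)
@Target(ElementType.TYPE)
public @interface AntTask {
    String name();
}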
javac can process source- and class-level annotations with the -processor option. See javax.annotation.processing.AbstractProcessor (since Java 1.6).
I started using it while compiling .java files. Apparently it can also be used to process CLASS annotations with .class input files. I haven't tried this because I'm using Ant to compile, and Ant does not seem to pass .class files to the compiler.
I have to do a full compile when I want to process all the annotations in my project.
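As a rough sketch (class and package names are made up, matching the hypothetical AntTask annotation above), such a processor could look like this; generating antlib.xml would go where the comment is:
package com.example;

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

// Register with javac -processor com.example.AntTaskProcessor, or via
// META-INF/services/javax.annotation.processing.Processor.
@SupportedAnnotationTypes("com.example.AntTask")
@SupportedSourceVersion(SourceVersion.RELEASE_6)
public class AntTaskProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (Element e : roundEnv.getElementsAnnotatedWith(AntTask.class)) {
            // Collect the annotated types here and write antlib.xml,
            // e.g. through processingEnv.getFiler().
            processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE, "Found " + e);
        }
        return false; // don't claim the annotation, so other processors still see it
    }
}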
I think you might want to have a look at this tutorial.
It explains how to create your own annotation processor and how to use it to generate code. It doesn't handle bytecode manipulation though.
He also gave a presentation available on YouTube. In case you're too lazy to read... ;-)
I'm trying to parse an XML file based on an XSD in Matlab.
As recommended in this thread MATLAB parse and store XML with XSD, I'm trying to use JAXB to do this, but I got stuck.
I was able to create the .java class files representing my XSD levels via
xjc myFile.xsd -d myDirectory
(within the Windows command line).
I've read the linked articles in the other thread, but now I don't know how to go on.
What are the next steps before I can go on in Matlab?
Thanks in advance
Edit:
Matlab Version is 2010b (Java 1.6.0_17)
Edit2:
I tried now JAXB.unmarshal(input, MyClass.class) as I managed to get the MyClass.class by using an eclipse JAXB-Project (only using the command line input mentioned above just gave me the MyClass.java files).
But Matlab can't find the method JAXB.unmarshal (probably because of the Java version of Matlab?).
After adding the folder containing my package (D:\PathToMyPackage\my\Package\MyClass.class) to the dynamic Java path in Matlab with
javaaddpath('D:\PathToMyPackage')
and after importing 'my.package', I was also trying:
jc = JAXBContext.newInstance('my.package')
jc = JAXBContext.newInstance('ObjectFactory.class')
jc = JAXBContext.newInstance('my.package.ObjectFactory.class')
But each time I get a JAXBException, e.g.
javax.xml.bind.JAXBException: "my.package" doesnt contain
ObjectFactory.class or jaxb.index
although they're inside the folder.
So is it even possible to perform the whole JAXB-thing only in Matlab? Or do I first have to wrap around some classes in eclipse to export the whole thing as a .jar-file, which I then import in Matlab?
If it is possible, what mistakes could I be making, or what could have an impact?
A newer Java version in Eclipse, or the dynamic Java path in Matlab?
If you use Java 8 and don't need XML validation, converting an .xml source into a Java object is quite easy:
MyClass myObject = JAXB.unmarshal(input, MyClass.class);
Here, input is an InputStream, File, URL, etc. pointing to the .xml source, and MyClass is the class that represents the root element, i.e. the one with the @XmlRootElement annotation. It can be found among the classes that were generated by xjc.
See more:
https://docs.oracle.com/javase/8/docs/api/javax/xml/bind/JAXB.html
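If JAXB.unmarshal is not available in the Java runtime Matlab uses, the more explicit JAXBContext route does the same job. This is only a sketch: "my.package" and myFile.xml are placeholders taken from the question for the xjc-generated package and the input file.
import java.io.File;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBElement;
import javax.xml.bind.Unmarshaller;

public class JaxbReadExample {
    public static void main(String[] args) throws Exception {
        // "my.package" must contain the xjc-generated ObjectFactory (or a jaxb.index
        // file) and must be visible on the classpath.
        JAXBContext jc = JAXBContext.newInstance("my.package");
        Unmarshaller u = jc.createUnmarshaller();
        Object root = u.unmarshal(new File("myFile.xml"));
        // Depending on the schema, the result is either the root class itself
        // or a JAXBElement wrapping it.
        if (root instanceof JAXBElement) {
            root = ((JAXBElement<?>) root).getValue();
        }
        System.out.println(root);
    }
}
From Matlab, the same calls can be made on the Java classes directly; when JAXBContext cannot find the ObjectFactory, packaging the generated classes into a jar and putting it on Matlab's static Java class path (rather than the dynamic path) is often suggested.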
I am using several different schemas in my project. They are each compiled into a separate jar, each using a separate package, using the xmlbeans ant task. I can only seem to successfully parse xml (using the .Factory.parse(String xml) method) for the schema jar that is first in the classpath, otherwise I get a ClassCastException as described in this bug. If I change the jar ordering, a different schema will be able to parse successfully, and the ClassCastException will be thrown for a different class.
I've done some debugging, and I'm coming to the conclusion that the structure of the schemaorg_apache_xmlbeans.namespace package is probably responsible. Since my schemas do not have namespaces, each of the jars I've built shares some files that are identically named in identical packages. Specifically, I've seen that each jar has a schemaorg_apache_xmlbeans.namespace._nons.xmlns.xsb file that seems to point to the actual schema for that jar. If the factory uses this file to determine some of the classes it will use to parse the XML it has, this may explain the ClassCastException, as it's only looking at the first file on the classpath and not the correct one for the XML it has.
Is there any option to specify the namespaces for the generated schemas (like Java namespaces) in the WSDLs or XSDs, or in the "wsdl to java" Ant task compilation?
I think the problem is that XMLBeans uses some kind of internal schema cache that mixes them up.
Have you tried giving your schemas (xsds) different namespaces?
This can be resolved by using the XmlOptions argument to the parse method.
Example:
XmlOptions opts = new XmlOptions();
opts.setDocumentType(YourDocument.Factory.newInstance().schemaType());
YourDocument doc = YourDocument.Factory.parse(xml, opts);
See XmlOptions.setDocumentType.
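Applied to the multi-schema situation from the question, that could look roughly like this; FooDocument and BarDocument are placeholders for document types generated from two different schemas packaged in separate jars:
XmlOptions fooOpts = new XmlOptions();
// Pin the schema type so XMLBeans does not rely on whichever
// schemaorg_apache_xmlbeans metadata comes first on the classpath.
fooOpts.setDocumentType(FooDocument.Factory.newInstance().schemaType());
FooDocument foo = FooDocument.Factory.parse(fooXml, fooOpts);

XmlOptions barOpts = new XmlOptions();
barOpts.setDocumentType(BarDocument.Factory.newInstance().schemaType());
BarDocument bar = BarDocument.Factory.parse(barXml, barOpts);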
When I'm using the wsdl2java tool to generate Java classes based on a WSDL file, I get two files for each class: the first is a purely abstract ("virtual") class file, the second is a class file with the implementation, which has an Impl suffix in its class name.
So, for example, if I specify a message named ServerMessage in the WSDL, then ServerMessage.java will be the abstract one and ServerMessageImpl.java will contain the needed implementation.
How should I use the resulting files in non-generated code? I just want to use the classes as specified in my WSDL file, but with such generation I'm forced to write the Impl suffix after each class name. Am I misunderstanding something?
Solved this issue. Command line argument -uw did the trick.
unwrap -
This will select between wrapped and unwrapped during code generation. Default is set to false. Maps to the -uw option of the command line tool.
http://axis.apache.org/axis2/java/core/tools/CodegenToolReference.html
I have a couple of huge XML schema definition (XSD) files and I want to generate only for a subset of the defined types the corresponding Java classes.
More precisely I have a list of "root" types that I want to transform into Java classes including all types needed by these root types.
Is it possible to define some "root" types in a JAXB bindings file and tell JAXB to transform only them with all their dependent types into Java classes and ignore all the other unnecessary types?
Thanks in advance.
There may be a more straightforward way, but one way is to make copies of the XSDs and remove all XML types from the copies except the root types you want and their dependencies. Then apply xjc to the copies instead of the original ones.
You can automate this process with XSLT and build automation tools like Maven, Gradle, Ant, etc. You first write the XSLT stylesheet that transforms the XSDs to copy only the root types with dependencies, and save the result to a temporary location (e.g. target/generated-sources folder with Maven). Then, with Maven for example, you automate the process with build plugins in your pom.xml:
Run the XSLT transformation with the Maven XML plugin, preferably using Saxon as the XSLT processor.
Run the JAXB2 Maven plugin to generate the Java classes from the new XSD copies (with the schemaDirectory/schemaIncludes parameters).