I am using several different schemas in my project. They are each compiled into a separate jar, each using a separate package, using the xmlbeans ant task. I can only seem to successfully parse xml (using the .Factory.parse(String xml) method) for the schema jar that is first in the classpath, otherwise I get a ClassCastException as described in this bug. If I change the jar ordering, a different schema will be able to parse successfully, and the ClassCastException will be thrown for a different class.
I've done some debugging, and I'm coming to the conclusion that the structure of the schemaorg_apache_xmlbeans.namespace package is probably responsible. Since my schemas do not have namespaces, each of the jars I've built shares some identically named files in identical packages. Specifically, I've seen that each jar has a schemaorg_apache_xmlbeans.namespace._nons.xmlns.xsb file that seems to point to the actual schema for that jar. If the factory uses this file to determine which classes to use when parsing the XML it's given, that would explain the ClassCastException: it only looks at the first such file on the classpath, not at the correct one for the XML at hand.
Is there any option to specify namespaces for the generated schemas (like Java packages), either in the WSDLs or XSDs, or in the Ant "wsdl to java" compilation task?
I think the problem is that XMLBeans uses some kind of internal schema cache that mixes them up.
Have you tried giving your schemas (xsds) different namespaces?
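For example, giving each XSD its own targetNamespace keeps the compiled metadata and generated types apart, so the jars no longer collide on the no-namespace .xsb files (the URI here is a placeholder; pick one per schema):

```xml
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/schema/orders"
           xmlns="http://example.com/schema/orders"
           elementFormDefault="qualified">
  <!-- existing schema content unchanged -->
</xs:schema>
```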
This can be resolved by passing an XmlOptions argument to the parse method.
Example:
XmlOptions opts = new XmlOptions();
opts.setDocumentType(YourDocument.Factory.newInstance().schemaType());
YourDocument doc = YourDocument.Factory.parse(xml, opts);
See XmlOptions.setDocumentType.
I have created a library which includes some 200 Java classes generated from an existing XSD with JAXB like this:
xjc -no-header -d schemas -b xsd/binding.xml xsd
Alas, JAXB is not supported on Android, and the general suggestion seems to be to use a different library. Castor seems to be a suitable alternative, especially as it offers conversion of .xsd files into Java classes. However, doing so seems to be more involved than running xjc, and I have no idea how much the result differs from xjc output.
My use case is unmarshaling and reading the unmarshaled data (changing data or marshaling are not needed). That is, there is a vast amount of code which relies on the resulting Java class schema, therefore any difference between xjc-generated classes and their Castor counterparts would mean a lot of refactoring.
Is there a simple recipe on how to generate Java classes from .xsd in Castor and get a result that is as close as possible to what xjc produces?
The following approach is as close as I got, though some refactoring is still required:
Place a file named castorbuilder.properties on your classpath, with the following content:
org.exolab.castor.builder.javaclassmapping=type
org.exolab.castor.builder.javaVersion=5.0
# Replace the following lines with your schema(s)
org.exolab.castor.builder.nspackages=\
http://example.com/schema/foo=com.example.schema.foo,\
http://example.org/schema/bar=org.example.schema.bar
org.exolab.castor.builder.primitivetowrapper=true
Then run the following command line:
java -cp "*" org.exolab.castor.builder.SourceGeneratorMain -i schema.xsd -types j2
In the above, replace the classpath with the path to the Castor JAR files and castorbuilder.properties, and replace schema.xsd with the path to your XSD file.
Of course, you can invoke Castor via Ant or Maven—in this case, be sure to use the respective equivalent to the command line options above and make sure the properties file is picked up.
Differences which require refactoring:
Enum types are now in their own types sub-package
Where XML Enum literals are in camelCase, they now become CAMELCASE rather than CAMEL_CASE
Castor generates array properties where JAXB has Collection properties
Some class names get an underscore as a prefix; apparently JAXB drops those while Castor preserves them
Date fields are now java.util.Date instances (rather than JAXB's XMLGregorianCalendar)
In one instance a property changed its name from value to content
However, there seems to be a bug in Castor: some of the generated classes invoke a non-existent constructor. This has stopped me from trying this out successfully; I only got as far as refactoring my existing code to the point of being free of syntax and compiler errors.
I'm trying to parse an XML file based on an XSD in Matlab.
As recommended in this thread MATLAB parse and store XML with XSD, I'm trying to use JAXB to do this, but I got stuck.
I was able to create the .java class files representing my xsd-levels via
xjc myFile.xsd -d myDirectory (within the Windows command line).
I've read the linked articles in the other thread, but now I don't know how to go on.
What are the next steps before I can go on in Matlab?
Thanks in advance
Edit:
Matlab Version is 2010b (Java 1.6.0_17)
Edit2:
I have now tried JAXB.unmarshal(input, MyClass.class), as I managed to get MyClass.class by using an Eclipse JAXB project (using only the command-line input mentioned above just gave me the MyClass.java files).
But Matlab can't find the method JAXB.unmarshal (probably because of Matlab's Java version?).
After adding the folder containing my package (D:\PathToMyPackage\my\Package\MyClass.class) to the dynamic Java path in Matlab via
javaaddpath('D:\PathToMyPackage'), and after importing 'my.package', I was also trying:
jc = JAXBContext.newInstance('my.package')
jc = JAXBContext.newInstance('ObjectFactory.class')
jc = JAXBContext.newInstance('my.package.ObjectFactory.class')
But each time I get a JAXBException, e.g.
javax.xml.bind.JAXBException: "my.package" doesnt contain
ObjectFactory.class or jaxb.index
although they're inside the folder.
So is it even possible to perform the whole JAXB process in Matlab alone? Or do I first have to wrap things up in Eclipse and export them as a .jar file, which I then import into Matlab?
If it is possible, what mistakes could I be making? What else could have an impact, e.g. a newer Java version in Eclipse, or the dynamic Java path in Matlab?
If you use Java 8 and don't need XML validation, converting an .xml source into a Java object is quite easy:
MyClass myObject = JAXB.unmarshal(input, MyClass.class);
Here, input is an InputStream, File, URL, etc. pointing to the .xml source, and MyClass is the class that represents the root element, i.e. the one carrying the @XmlRootElement annotation. It can be found among the classes generated by xjc.
See more:
https://docs.oracle.com/javase/8/docs/api/javax/xml/bind/JAXB.html
I have an android project that relies on two jar files. Each jar file contains org.slf4j.impl.StaticLoggerBinder. The implementation of this class is different in each file. When I try to build this is causing the following exception:
com.android.dex.DexException: Multiple dex files define Lorg/slf4j/impl/StaticLoggerBinder;
One of these libraries is logback-android, the other is closed source.
Is there any way to get these both working?
Having two jars with the same class inside is not forbidden in Java, but is dangerous and that's why Android is being conservative and raising an error.
What can happen is that with two different versions of the class (say 1.0 and 1.1) on the classpath, one or the other gets loaded in no really predictable way. So if the compiler let you call a method that only exists in version 1.1, the JVM may not find that method at runtime because it loaded version 1.0. Replace method with anything else (constructor, field, etc.), and consider that this usually happens with whole packages rather than single classes, so you can end up with a lot of version-1.1 classes failing to find methods on version-1.0 classes, and so on.
Java itself does not have a standard solution for this. However, jar files are nothing more than zip files, and unless they are signed they can be opened, modified, and re-jarred.
You could open the closed-source .jar, remove its org/slf4j folder, re-jar it, and see if it works with the other version of org.slf4j.
Or better yet, tell those guys that having a "single jar" with every kind of stuff inside is not cooler than having the jars separated.
I had to add
<prefer-web-inf-classes>true</prefer-web-inf-classes>
to weblogic.xml to resolve a Hibernate/antlr compatibility issue with WebLogic. After adding that, I was getting all kinds of ClassCastExceptions related to XML parsers.
I understood from reading other threads that WebLogic is trying to use a different class than what the application is expecting.
I spent all day researching and tried different solutions, like removing the "xml-apis......." jar files, but every time I get a ClassCastException. The "from" class of the cast changes when I remove jar files, but I always get
ClassCastException: "some xml parser related class" can not be cast to javax.xml.parsers.DocumentBuilderFactory
Is there a way to know which XML parser jars are really causing the issue?
I'm using Maven 2 to manage dependencies.
Answering my own question:
I removed all jars that contain classes from the javax.xml.* package, by doing a Java search for the package with "application libraries" checked in the search scope. Then I had to remove the sax..jar file. Everything worked as expected after that.
Good answer to your own question. I also found this method (if you use Maven), which generates a really nice dependency chart. I had this same problem, and using the chart I was able to determine that the offending library, xml-apis*.jar, was pulled in by Maven as a dependency of jasperreports. Adding an exclusion element to the pom.xml entry for the jasperreports module fixed the problem.
http://maven.40175.n5.nabble.com/where-is-xml-apis-1-0-b2-jar-coming-from-td88057.html
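For reference, the exclusion described above looks roughly like this in pom.xml (the coordinates shown are the usual jasperreports ones; the version property is a placeholder for whatever your build uses):

```xml
<dependency>
  <groupId>net.sf.jasperreports</groupId>
  <artifactId>jasperreports</artifactId>
  <version>${jasperreports.version}</version>
  <exclusions>
    <!-- Let the JDK/container provide the javax.xml parser API instead -->
    <exclusion>
      <groupId>xml-apis</groupId>
      <artifactId>xml-apis</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```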
I have a couple of huge XML schema definition (XSD) files and I want to generate only for a subset of the defined types the corresponding Java classes.
More precisely I have a list of "root" types that I want to transform into Java classes including all types needed by these root types.
Is it possible to define some "root" types in a JAXB bindings file and tell JAXB to transform only them with all their dependent types into Java classes and ignore all the other unnecessary types?
Thanks in advance.
There may be a more straightforward way, but one approach is to make copies of the XSDs and remove all XML types from the copies except the root types you want and their dependencies. Then run xjc on the copies instead of the originals.
You can automate this process with XSLT and build automation tools like Maven, Gradle, Ant, etc. You first write the XSLT stylesheet that transforms the XSDs to copy only the root types with dependencies, and save the result to a temporary location (e.g. target/generated-sources folder with Maven). Then, with Maven for example, you automate the process with build plugins in your pom.xml:
Run the XSLT transformation with Maven XML plugin, using preferably Saxon as XSLT processor.
Run JAXB2 Maven plugin to generate the Java classes from the new XSD copies (with schemaDirectory/schemaIncludes parameters).
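Sketched as a pom.xml fragment (plugin coordinates are the common ones, but versions, paths, and the stylesheet name are placeholders; the schemaDirectory/schemaIncludes parameters shown are those of the org.jvnet maven-jaxb2-plugin, so adjust if you use a different JAXB2 plugin):

```xml
<build>
  <plugins>
    <!-- Step 1: XSLT-filter the original XSDs into target/generated-sources -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>xml-maven-plugin</artifactId>
      <version>${xml.plugin.version}</version>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals><goal>transform</goal></goals>
        </execution>
      </executions>
      <configuration>
        <transformationSets>
          <transformationSet>
            <dir>src/main/xsd</dir>
            <stylesheet>src/main/xslt/extract-root-types.xsl</stylesheet>
            <outputDir>${project.build.directory}/generated-sources/xsd</outputDir>
          </transformationSet>
        </transformationSets>
      </configuration>
      <dependencies>
        <!-- Use Saxon as the XSLT processor -->
        <dependency>
          <groupId>net.sf.saxon</groupId>
          <artifactId>Saxon-HE</artifactId>
          <version>${saxon.version}</version>
        </dependency>
      </dependencies>
    </plugin>

    <!-- Step 2: run xjc on the filtered copies instead of the originals -->
    <plugin>
      <groupId>org.jvnet.jaxb2.maven2</groupId>
      <artifactId>maven-jaxb2-plugin</artifactId>
      <version>${jaxb2.plugin.version}</version>
      <executions>
        <execution>
          <goals><goal>generate</goal></goals>
        </execution>
      </executions>
      <configuration>
        <schemaDirectory>${project.build.directory}/generated-sources/xsd</schemaDirectory>
        <schemaIncludes>
          <include>*.xsd</include>
        </schemaIncludes>
      </configuration>
    </plugin>
  </plugins>
</build>
```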