I am using JAXB (xjc version "2.0-b26-ea3").
I have been able to generate the classes for the schema (.xsd) using xjc, but when I try to compile the generated classes I get errors saying "package javax.xml.bind.annotation does not exist."
I am using JDK 1.5.0_14 and compiling from the command prompt.
Any help will be appreciated.
Thanks in advance!!
XJC-generated Java source files use annotations from the JAXB API. In order to compile them, those annotation types must be on the classpath.
To use JAXB (for marshalling and unmarshalling to/from XML documents) you'll need these things:
The JAXB API definitions.
A JAXB implementation.
Any libraries the implementation depends on.
An implementation is separate from the API and interchangeable. It will be located via the Java service provider mechanism. I won't go into the details of that here, but let's say it should be sufficient to have a jar with an implementation on your classpath. Normally you'll only make calls to the JAXB API classes. For example, JAXBContext.newInstance("my.sample.pack");.
The actual implementation is located at runtime and loaded through your API calls. This means that in order to compile JAXB code, a jar with the API should suffice. An implementation and its dependencies are only required at runtime.
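To make that concrete, here is a minimal sketch of code that only touches the JAXB API (the package name reuses the example above; the class name and file names are made up, so adjust them to your generated classes):

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
import java.io.File;

public class JaxbRoundTrip {
    public static void main(String[] args) throws Exception {
        // Only JAXB API types are referenced here; the implementation is discovered at runtime.
        JAXBContext context = JAXBContext.newInstance("my.sample.pack"); // package of the XJC-generated classes
        Unmarshaller unmarshaller = context.createUnmarshaller();
        Object root = unmarshaller.unmarshal(new File("input.xml"));    // instance of a generated class
        Marshaller marshaller = context.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        marshaller.marshal(root, System.out);
    }
}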
For JDK 1.6.x, you won't need to include anything additional on your classpath. Starting with Java SE 6, the JAXB API was included in the standard Java SE API. The Java Runtime Environment also includes an implementation of that API: the reference implementation available on the java.net JAXB site.
For JDK 1.5.x, things are a bit different. The JAXB API was not yet included as a standard Java API, so you'll need to make it available manually. At the very least you'll need the API; which implementation you use is up to you, although I don't know of any besides the reference implementation off the top of my head. It is probably the best one to start with.
Click the "download now" button on the JAXB site linked above. You'll see a link to download a jar file. Get that and open it either by double-clicking it in your file system or running it via the command line. This will extract some content to a folder in the same location as the jar. You'll see a number of folders. The bin folder contains runnables for xjc and schemagen. There's also documentation and sample folders. The lib folder is the one interesting to us. Here's a rundown:
jaxb-api.jar: this is the JAXB API; you'll need this for compiling your generated code
jaxb-impl.jar: the reference implementation; not needed for compiling but you'll want this at runtime
jaxb-xjc.jar: useful for invoking xjc programmatically or in Ant
jaxb1-impl.jar: the reference implementation of the JAXB 1 API; I assume you'll stick to JAXB 2 so ignore it
activation.jar: a dependency, not needed for compiling but might be needed at runtime
jsr173_1.0_api.jar: this is in fact the Java Streaming API for XML (StAX); it is used by the JAXB reference implementation
That last one works like JAXB in terms of implementations: it is an API with interchangeable implementations. And just like JAXB, it's not available by default in Java 5 but was included in the Java 6 API. You'll probably need an implementation for this as well in Java 5. I'll leave you to find and use one; the steps will closely resemble what I described for JAXB.
So, to wrap it all up in a concise overview... If you can use Java 6, pretty much all dependencies are available out of the box and you won't need any additional stuff on your classpath to compile and run JAXB-related code. For Java 5, you'll need at least the JAXB API for compilation, and the API plus an implementation at runtime. The implementation might have some dependencies of its own, so if you're still getting ClassNotFound errors, try to find out which project the missing class is a part of.
The jarfinder site suggested by Pangea can be very useful for this. But don't skip the step of checking the actual project site to make sure you get all dependencies and the latest version, and to see what the licensing terms are.
Good luck!
That package is part of the JAXB 2.0 API; you can download it here. These kinds of questions can easily be answered yourself with the help of http://www.jarfinder.com
I need help with the following problem: I use WebSphere Liberty 19.0.0.9 with Oracle and IBM Java 1.8 and run an older application (EAR) containing an EJB which serializes XML with JAXB. The application needs to control XML namespace definitions and prefixes, and does this by providing an implementation of com.sun.xml.bind.marshaller.NamespacePrefixMapper to javax.xml.bind.Marshaller.setProperty with the property "com.sun.xml.bind.namespacePrefixMapper".
At runtime the error java.lang.NoClassDefFoundError: com/sun/xml/bind/marshaller/NamespacePrefixMapper occurs when loading the implementation class.
The server.xml contains the feature javaee-8.0, and Liberty's JAXB implementation wlp-19.0.0.9\lib\com.ibm.ws.jaxb.tools.2.2.10_1.0.32.jar contains the class com.sun.xml.bind.marshaller.NamespacePrefixMapper.
I tried to solve it by putting jaxb-impl-2.2.4.jar into EAR/lib (which is the wrong way, because JAXB is provided by Java EE), but then an error occurred in com.sun.xml.bind.v2.runtime.MarshallerImpl.setProperty(MarshallerImpl.java:511): the check if(!(value instanceof NamespacePrefixMapper)) failed, because the classloader of the implementation (AppClassLoader) provided a different class object for NamespacePrefixMapper than the MarshallerImpl's classloader (org.eclipse.osgi.internal.loader.EquinoxClassLoader). But this showed that Liberty can access the NamespacePrefixMapper.
I made several attempts to use the same classloader for the implementation and the MarshallerImpl when loading them, and I tried to solve it via classloader settings in server.xml. No success.
I know that it is not recommended to use such JAXB implementation specific classes, but the application was developed this way and cannot be changed easily.
Any help is appreciated that tells me how to convince Liberty either to provide the NamespacePrefixMapper class to the application classloader, or to use the application classloader's NamespacePrefixMapper in the MarshallerImpl as well.
Thank you.
// The implementation class looks, for example, like this:
public class MyNamespacePrefixMapperImpl extends com.sun.xml.bind.marshaller.NamespacePrefixMapper {...}

// Usage (SomeMappedClass stands for one of the JAXB-mapped classes):
JAXBContext c = JAXBContext.newInstance(SomeMappedClass.class);
Marshaller m = c.createMarshaller();
com.sun.xml.bind.marshaller.NamespacePrefixMapper mapper = new MyNamespacePrefixMapperImpl(); // Here the NoClassDefFoundError occurs.
m.setProperty("com.sun.xml.bind.namespacePrefixMapper", mapper); // Here the instanceof check fails if jaxb-impl.jar is in EAR/lib.
This is a precarious situation without an easy solution. Liberty attempts to "hide" internal packages to avoid scenarios where users want a slightly different version of the implementation than what the framework provides. The most glaring example of this problem was in traditional WAS, where users wanted to use a different version of Jakarta Commons Logging than what was shipped with WAS; this required users to provide their own, either in an isolated shared library, or to use other parent-last classloading hacks to make it work. Liberty avoids those issues by isolating the internal implementations from the user applications.
So that works great when a user wants to use a different version of a third party library than what Liberty provides, but as you have discovered, that doesn't work so great when your legacy application depends on those hidden/isolated third party libraries.
The most ideal solution would be to refactor the application code so as not to depend on internal JAXB classes; somebody with more JAXB expertise may be able to help with this... But it sounds like that may not be feasible, so another alternative would be to create a user feature. A user feature is essentially an extension to Liberty's runtime, so it has access to packages that user applications do not. It also allows you to add packages as APIs for the user applications, so you could use a user feature to add the com.sun.xml.bind.marshaller package as a public API; your user application could then extend it freely. You could also include your MyNamespacePrefixMapperImpl class in your user feature and register it there so that it would automatically apply to all applications in your server.
You can find more information on user features here:
https://www.ibm.com/support/knowledgecenter/en/SSEQTP_liberty/com.ibm.websphere.wlp.doc/ae/twlp_feat_example.html
Hope this helps, Andy
Platform: Windows 7, MinGW, MSYS, Java 1.5
I have the Thrift 0.9.1 compiler (prebuilt for Windows) and the source. I use Ant to build the Java library.
I created a Thrift IDL and compiled it with the compiler. No problems generating the code files.
I added these files to my project, along with slf4j (downloaded from their site) and libthrift.
Most of the errors that I had previously (imports etc.) are gone, except for errors related to overriding methods.
So basically it complains like:
The method clear() of type Server must override a superclass method
and similarly for compareTo, write, read etc. In short, it complains about all methods that are overridden. This is all Thrift-compiler-generated code and I haven't changed anything.
Is there an incompatibility? I cannot really find any mention of one. I have tried removing and re-adding the libraries, and I have also tried cleaning, refreshing and validating the project, but the errors are still there.
I have also tried to compile the Thrift source code itself, but MinGW is also a huge headache: it cannot find configure even though I have installed it. And if I run the MSYS console, it is able to configure but cannot make, complaining that inttypes.h is not present (it is not in the MSYS include directory but is present in the MinGW include directory).
Any suggestion would be appreciated.
Are you using Java 5? With Java 5, @Override only accepts methods that override a superclass method; it does not accept methods that merely implement an interface method.
If you are using a Java 5 compiler, try using a more recent javac (preferably 7 or 8) and see if that works.
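For illustration, here is a stripped-down reproduction of what the compiler complains about (the class names are made up, but the Thrift-generated code follows the same pattern):

// Under javac 1.5 (or an IDE set to 1.5 compliance) this does not compile,
// because in Java 5 @Override may only mark methods that override a superclass method.
// From Java 6 onwards it is also allowed on interface-method implementations.
interface ClearableStruct {
    void clear();
}

class Server implements ClearableStruct {
    @Override // Java 5 error: "The method clear() of type Server must override a superclass method"
    public void clear() {
    }
}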
EDIT:
Not sure if this is in your version of Thrift, but in mine it looks like there is a flag called java5 that you can specify when generating code to indicate that the generated code should be Java 5 compliant (with something like thrift --gen java:java5 yourfile.thrift):
java (Java):
beans: Members will be private, and setter methods will return void.
private-members: Members will be private, but setter methods will return 'this' like usual.
nocamel: Do not use CamelCase field accessors with beans.
fullcamel: Convert underscored_accessor_or_service_names to camelCase.
android: Generated structures are Parcelable.
android_legacy: Do not use java.io.IOException(throwable) (available for Android 2.3 and above).
java5: Generate Java 1.5 compliant code (includes android_legacy flag).
reuse-objects: Data objects will not be allocated, but existing instances will be used (read and write).
sorted_containers: Use TreeSet/TreeMap instead of HashSet/HashMap as an implementation of set/map.
I'm trying to get my head around some concepts in Java:
JSRs describe specifications, but carry no actual implementations. E.g. http://jsr311.java.net/ is the "home" for the "Java™ API for RESTful Web Services" and serves as a common reference for all implementations of JSR-311.
One can download the interfaces (?) of JSR-311 from http://mvnrepository.com/artifact/javax.ws.rs/jsr311-api; however, unless you are implementing JSR-311 yourself, these have no particular value?
JSRs will usually/always have a reference implementation. To find it you'll have to google "JSR XXX reference implementation" or check the specification's home page (e.g. http://jsr311.java.net/).
For JSR-311 this reference implementation is Jersey. Using Maven you can get the Jersey server from http://mvnrepository.com/artifact/com.sun.jersey/jersey-server/1.9. Since Jersey provides an implementation according to the interfaces found in http://mvnrepository.com/artifact/javax.ws.rs/jsr311-api, you only need to add Jersey as a dependency in your project and not the jsr311-api itself. (Does this apply to all JSR technologies?)
Putting both http://mvnrepository.com/artifact/javax.ws.rs/jsr311-api and http://mvnrepository.com/artifact/com.sun.jersey/jersey-server/1.9 as dependencies in your project will possibly cause classpath problems?
Am I completely off, or onto something?
Yes, this isn't anything new. Think about JDBC: Java provides the interfaces (Connection, Statement, ResultSet etc.) but it is up to database vendors to provide implementations.
If you're using a JSR-311 implementation like Jersey or Apache CXF, then you'll annotate your classes with the javax.ws.rs annotations, such as @Path, @GET, @Produces etc. This is why you need to explicitly have JSR-311 as a Maven dependency.
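For example, a minimal resource class that uses only the javax.ws.rs API might look like this (the path and class name are made up):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;

@Path("/hello")
public class HelloResource {

    // Jersey (or any other JSR-311 implementation) dispatches requests to this method;
    // the class itself only depends on the javax.ws.rs API types.
    @GET
    @Produces("text/plain")
    public String sayHello() {
        return "Hello, world";
    }
}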
Yes, usually. Have a look at the JSR list on Wikipedia.
You need both the JSR and the implementation. The annotations are in the JSR, the implementation provides supporting classes, such as com.sun.jersey.spi.container.servlet.ServletContainer.
No, it is necessary to have both as dependencies (see point 4); you won't get classpath conflicts.
—
One can download files from a variety of sources. To get the most official version of the JSR-311 specification go to its JCP download page. It's quite possible that you can't get a JAR file (with all the interfaces and stuff) from JCP pages, but still, this is the official source. (There are always nice PDFs of public drafts also!)
—
You're right that Jersey contains the API defined by JSR-311; however, I would add the jsr311-api JAR as a compile dependency and add Jersey as a runtime dependency. This creates a nice separation between API and implementation, and you can swap out your JSR-311 implementation at any time. If you intend to use Jersey all the way, include only Jersey: one less dependency in your POM.
As for classpath problems: if Jersey packages the same API as the jsr311-api JAR contains, it won't cause any. If it packages something different, well, that would be awful! Maven will probably bark at compile time if you have a corrupt JSR-311 API on your classpath (I've already seen lots of java.lang.ClassFormatError: Absent Code attribute in method that ... errors, so it won't go unnoticed, that's for sure).
Other than these, you're right.
SLF4J has a nice mechanism where the implementation is chosen at runtime, depending on what is available on the classpath. I would like to use such a feature in several projects, for example to choose the communication layer or to choose a mock implementation.
I had a look at the slf4j source to see how it's done, and I could just write something similar. Before I start, I would like to know whether some lightweight FOSS library already exists for this kind of injection.
Unless you need specific configuration abilities as provided by Pico or Guice, you may get what you need from java.util.ServiceLoader.
Basically, all you have to do is package your service implementation in a JAR file, include a text file under "META-INF/services/" listing all implementation classes, and off you go.
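As a minimal sketch (the interface, implementation and package names are made up): each provider JAR would contain a file META-INF/services/com.example.CommLayer whose content is the fully qualified name of its implementation class, e.g. com.example.TcpCommLayer.

package com.example;

import java.util.ServiceLoader;

// Hypothetical service interface; implementations ship in separate provider JARs.
interface CommLayer {
    void send(String message);
}

public class CommLayerLocator {
    public static CommLayer locate() {
        // ServiceLoader scans the META-INF/services entries on the classpath.
        for (CommLayer impl : ServiceLoader.load(CommLayer.class)) {
            return impl; // use the first implementation found
        }
        throw new IllegalStateException("No CommLayer implementation on the classpath");
    }
}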
Have you looked at Weld? CDI is part of the EE6 spec, but the Weld implementation also supports running in a Java SE environment. It has exactly what you are looking for; here are links to the relevant documentation:
http://seamframework.org/Weld - one Maven dependency for your SE app.
http://docs.jboss.org/weld/reference/1.1.0.Final/en-US/html/environments.html#d0e5333 - bootstrapping the Weld container in SE.
Producer methods to vary implementation at runtime:
http://docs.jboss.org/weld/reference/1.1.0.Final/en-US/html/producermethods.html
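As a rough sketch of what a producer method looks like (all names below are made up; see the linked documentation for the details):

import javax.enterprise.inject.Produces;

// Hypothetical service interface and implementations.
interface CommLayer {
    void send(String message);
}

class TcpCommLayer implements CommLayer {
    public void send(String message) { /* real transport */ }
}

class MockCommLayer implements CommLayer {
    public void send(String message) { /* test double */ }
}

public class CommLayerProducer {

    // CDI calls this producer whenever a CommLayer needs to be injected,
    // so the implementation can be chosen from runtime configuration.
    @Produces
    public CommLayer createCommLayer() {
        return Boolean.getBoolean("use.mock") ? new MockCommLayer() : new TcpCommLayer();
    }
}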
Plus (in my very biased opinion) Weld rocks ;)
SLF4J's "mechanism" is simply that its API jar is compiled with code that refers to a class that is only provided by one of its "implementation" jars. No framework or library of any kind is needed for this. Simply write one module which is compiled against a class not in that module. Then your "implementation" modules provide that class when included in the project.
Edit: Oh, and this is basically OSGi writ small (very, very small). If you're going to use this kind of thing on a large scale, look to an OSGi container or Eclipse Virgo.
Every Java programmer should know how to use Spring.
I have a scenario where I have code written against version 1 of a library but I want to ship version 2 of the library instead. The code has shipped and is therefore not changeable. I'm concerned that it might try to access classes or members of the library that existed in v1 but have been removed in v2.
I figured it would be possible to write a tool to do a simple check to see if the code will link against the newer version of the library. I appreciate that the code may still be very broken even if the code links. I am thinking about this from the other side - if the code won't link then I can be sure there is a problem.
As far as I can see, I need to run through the bytecode checking for references, method calls and field accesses to library classes then use reflection to check whether the class/member exists.
I have a three-fold question:
(1) Does such a tool exist already?
(2) I have a niggling feeling it is much more complicated than I imagine and that I have missed something major - is that the case?
(3) Do you know of a handy library that would allow me to inspect the bytecode such that I can find the method calls, references etc.?
Thanks!
I think that Clirr - a binary compatibility checker - can help here:
Clirr is a tool that checks Java libraries for binary and source compatibility with older releases. Basically you give it two sets of jar files and Clirr dumps out a list of changes in the public api. The Clirr Ant task can be configured to break the build if it detects incompatible api changes. In a continuous integration process Clirr can automatically prevent accidental introduction of binary or source compatibility problems.
Changing the library in your IDE will result in all possible compile-time errors.
You don't need anything else, unless your code uses another library, which in turn uses the updated library.
Be especially wary of Spring configuration files. Class names are configured as text and don't show up as missing until runtime.
If you have access to the source code, you could just compile the source against the new library. If it doesn't compile, you definitely have a problem. If it compiles, you may still have a problem if the program uses reflection, some kind of IoC mechanism like Spring, etc.
If you have unit tests, then you have a better chance of catching any linking errors.
If you only have the .class files of the program, then I don't know of any tools that would help, besides decompiling the class files to source and compiling that source again against the new library, but that doesn't sound too healthy.
The checks you mentioned are done by the JVM/Java class loader, see e.g. Linking of Classes and Interfaces.
So "attempting to link" can be simply achieved by trying to run the application. Of course you could hoist the checks to run them yourself on your collection of .class/.jar files. I guess a bunch of 3rd party byte code manipulators like BCEL will also do similar checks for you.
I notice that you mention reflection in the tags. If you load classes/invoke methods through reflection, there's no way to analyse this in general.
Good luck!