JBoss Modules outside of JBoss AS

I first found a reference to JBoss Modules when I stumbled upon the Ceylon language, which uses JBoss Modules as its module system. I immediately wanted to try this system in some toy project and maybe even embed it in a real project (I was writing a project with plugin support at the time), but I couldn't find any documentation on JBoss Modules as a standalone library. The only available documentation source seems to be the official wiki, but it looks abandoned and unsupported. I couldn't even find Javadocs for it (except, maybe, this, but it seems to be very old and not really related to JBoss Modules, given the "osgi" in the link).
It seems that JBoss Modules is usable outside of JBoss AS, since the Ceylon language uses it, but the near-total lack of documentation on the subject is disappointing.
So, here are my questions:
Is it possible to use JBoss Modules as a standalone library at all? Are there any artifacts in some public Maven repository?
If it is (and there are), is there any documentation for it? The wiki I mentioned does not, for example, have any instructions on embedding JBoss Modules.

If you'd like to try out JBoss Modules directly, you can grab the dependencies from the JBoss Nexus repository: https://repository.jboss.org/nexus/content/repositories/public/org/jboss/modules/jboss-modules/
Unfortunately, there isn't much documentation on JBoss Modules, but if you want to try it out, you probably don't want to be hand-writing module.xml files yourself (maybe you like pain, I don't know).
If you'd like to try out "Furnace" the modular container based on JBoss Modules and Maven that serves as the core module system for JBoss Forge, it give you the ability to write Maven projects that can be loaded directly as Modules. This is what we are using for our entire Forge 2 architecture.
You can find some docs on Furnace here:
https://github.com/forge/furnace#furnace
https://github.com/forge/core#developing-an-addon
Note that Furnace addons require a Maven classifier; you can choose which classifier is used if you want to. This is done via the Furnace Manager (see the Furnace docs above).

Yes. JBoss itself uses it that way as well: the JBoss application server actually runs inside the JBoss Modules system.
I'm not aware of such documentation. Usually you wouldn't embed JBoss Modules but rather run your application with it; I don't know whether it can be embedded.
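If you do want to experiment with embedding it, the jboss-modules jar exposes a public API in the org.jboss.modules package. Below is a minimal sketch of loading a module programmatically, assuming a module repository on disk laid out like the AS "modules" directory; the repository path and module name are made up for this example.

    import java.io.File;

    import org.jboss.modules.LocalModuleLoader;
    import org.jboss.modules.Module;
    import org.jboss.modules.ModuleIdentifier;

    public class EmbeddedModulesExample {
        public static void main(String[] args) throws Exception {
            // A repository root laid out like JBoss AS's "modules" directory, e.g.
            // modules/com/example/plugin/main/module.xml plus the jars it references.
            LocalModuleLoader loader = new LocalModuleLoader(new File[] { new File("modules") });

            // "com.example.plugin" is a hypothetical module name for this sketch.
            Module module = loader.loadModule(ModuleIdentifier.create("com.example.plugin"));

            // Each module gets its own class loader; load the plugin's classes through it.
            ClassLoader cl = module.getClassLoader();
            Class<?> entryPoint = cl.loadClass("com.example.plugin.Main");
            System.out.println("Loaded " + entryPoint + " via " + module);
        }
    }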
I got most of my information from this presentation on Vimeo, Modular Class Loading with JBoss Modules. There also seems to be a Zen of Modules video there.

Related

What exactly does "Dynamic Web Module" facet add to an eclipse project

Everybody in the world except me seems to know exactly what the "Dynamic Web Module" facet adds to a project. A web search reveals tons of responses about how to recover from various errors caused by more or less unwanted changes to the version of this facet, but there is hardly any information about what the facet actually does.
So my questions are:
What exactly does the "Dynamic Web Module" facet add to my eclipse project?
Why should I want this to happen?
Why do my colleagues using IntelliJ, Visual Studio Code, etc. (where this concept does not seem to exist) have no problem?
Keep in mind that I didn't decide on these names (and I grumbled about them myself at the time)... The term dates back to the early J2EE tutorials, like https://docs.oracle.com/cd/E17802_01/j2ee/j2ee/1.4/docs/tutorial-update2/doc/WebApp3.html. The tutorial would explain that J2EE web modules are the web applications from the then-current Java Servlet specification. J2EE loved to subsume the other specs and use its own naming for things that already had names, and the docs and tools often followed suit.
It also mentions that they can contain static web resources, and in fact you can run web modules that contain only static resources. So WTP has the concept of a static web module and a dynamic web module, represented as static web projects and dynamic web projects in Eclipse. The facet designates a project as being one of the two and, for dynamic modules, which API level it supports and requires at runtime.
Server adapters then have to state which API versions the server types they provide can support. The Server Tools and validation can then help you avoid deploying to an incompatible server, as well as build against a valid server. You want to build against a valid server in the same way you want to compile against your intended Java runtime. It's the most straightforward way to, for example, avoid calling classes and methods that didn't exist at the time.
There's also a Module Core nature that gets added, which supports APIs for describing the deployment details: that your Java class output folder contents go into WEB-INF/classes, that the jars you select go into WEB-INF/lib, and that those static resources go into the application root, all to comply with the layout expected at runtime. That API is meant to be pluggable, so it can be fulfilled by, e.g., Maven integration.
The terms in the UI can be changed, but that has its own pain points. The community tends not to update old videos and tutorials that have been correct for years.

WildFly RestEasy Version confusion

I want to build a REST API using RESTEasy. The generated file should be deployed to a WildFly application server.
I face the issue described in the following SO-question:
AsynchronousDispatcher error
The marked solution tells me to set the dependency scope to "provided", which, as far as I understand, means that the library is not included in my WAR file but taken directly from the app server...
Isn't that just wrong?
My idea would be to build a self-contained WAR file which contains all the needed libraries in the versions I need.
When it is provided by the app server, I get whatever version is currently available there. I don't really have a clue which version that is... and if someone decides to update the RESTEasy library on the server, it might break my app.
Am I missing something, or am I doing something completely wrong?
One of the big advantages of Java EE is developing against the API and not having to worry about the implementation. Java EE containers provide the APIs and their implementations. If you include implementation dependencies, one of two things is likely to happen.
Your dependencies will be ignored, making it pointless to include them in your deployment.
You'll get conflicts between the dependencies you included and what the server is expecting. This could manifest as things like:
ClassCastException, because two copies of the same class end up on the class path.
NoSuchMethodError, because there is a version mismatch.
Various other conflict-related issues.
Developing against the API instead of the implementation also allows you to switch easily between Java EE compliant containers with no or minimal changes to your deployment. The APIs are generally backwards compatible as well, making version upgrades less of an issue.
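To make that concrete, here is a minimal sketch of a resource written purely against the javax.ws.rs API (the class names and paths are invented for the example). At compile time you only need the JAX-RS API with provided scope; WildFly's bundled RESTEasy supplies the implementation at runtime.

    import javax.ws.rs.ApplicationPath;
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.Application;
    import javax.ws.rs.core.MediaType;

    // No org.jboss.resteasy imports anywhere: the code only knows the standard API.
    @Path("/hello")
    public class HelloResource {

        @GET
        @Produces(MediaType.TEXT_PLAIN)
        public String hello() {
            return "served by whatever JAX-RS implementation the container provides";
        }
    }

    // Activates JAX-RS under /api; the container discovers @Path resources itself.
    // (Package-private here only to keep the sketch in one file.)
    @ApplicationPath("/api")
    class RestApplication extends Application {
    }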
If you want to use a fat WAR (including implementations) instead of a skinny WAR (not including the implementations), then a servlet container is probably a better solution. WildFly does have a servlet-only download. I'd encourage you, though, to trust the container to do the right thing with the implementation dependencies :). Usually the only time there is an issue with upgrading is when you're upgrading Java EE versions, and even then it's usually pretty safe.

Spring Dynamic Modules - is it an active project?

Is Spring Dynamic Modules still an active project?
For example, here there is information that "Spring will NOT support any further releases as OSGi bundles." But here there is the Spring Dynamic Modules Reference Guide, with no information about the project being discontinued.
Although the project has moved to Eclipse, it is more dead than alive. Pivotal has abandoned it, which makes the Eclipse move more of a code dump than a serious attempt to create an open source project. I would not advise building on top of it.
It's now Eclipse Gemini Blueprint; the change is briefly described here:
http://www.eclipse.org/gemini/blueprint/documentation/reference/1.0.2.RELEASE/html/eclipse-migration.html
The fact that this is never mentioned in the reference guide is strange to me as well, especially considering: "While the project name has changed (to Eclipse Gemini Blueprint) and significant efforts have been made to reflect this in the project documentation and resources, there might be places that we have missed; if you find any, please report them to us."
Working migration guide link: http://www.eclipse.org/gemini/blueprint/documentation/migration/

Project with Guava, GWT and AppEngine

Is it possible to use the Guava libraries on a project done with both GWT and Google AppEngine?
I see that the individual jars (the standard Java one and the GWT-compatible one) have the same package naming hierarchy. How do these integrate into a GWT + AppEngine project?
Yes, it is possible. A few Guava classes won't be usable on App Engine because of the restricted sandbox your app runs in, especially those in the .io package like Files (you will be able to read stuff but not write it).
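For illustration, here is a small sketch of that read/write asymmetry under the classic App Engine sandbox; the file paths are made up, only the general behavior is the point.

    import java.io.File;

    import com.google.common.base.Charsets;
    import com.google.common.io.Files;

    public class GuavaOnAppEngine {

        // Reading a file that ships inside your deployed WAR works fine.
        public String readTemplate() throws Exception {
            return Files.toString(new File("WEB-INF/template.txt"), Charsets.UTF_8);
        }

        // This compiles, but the App Engine sandbox forbids writing to the local
        // filesystem, so expect a security-related exception at runtime.
        public void writeSomewhere() throws Exception {
            Files.write("some data", new File("/tmp/out.txt"), Charsets.UTF_8);
        }
    }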
Are you worried about deploying both jar files and having a conflict? If so, I think it will be fine: when you compile your GWT application, it turns into JavaScript, so you wouldn't necessarily be deploying the GWT-compatible jar, just the normal one.
There won't be any conflict: the GWT one will be used client-side in DevMode and by the GWT compiler, while the "normal" one will live in your WEB-INF/lib and be loaded (in DevMode) in a different classloader. It thus depends entirely on your project and build setup.
That being said, I have never tried it within the same Eclipse project. I always use distinct client and server projects, and -noserver in DevMode.

Extending/Inheriting Tomcat Projects

We are developing webapps with Eclipse and the Tomcat plugin. We recently started a new app which will run on Facebook and StudiVZ (a Facebook competitor in Germany). Since the functionality of the apps will be 95% the same, we split the code into separate Eclipse projects (app-core, app-facebook, app-vz). The -core project is source-linked into the -facebook and -vz projects in Eclipse. We also use Hudson for CI and wrote Ant scripts that import the code from the -core project before building. So basically we tried to inherit at the project level.
The described method has some flaws:
Versioning is complicated
The -core project does not run standalone, which makes automatic testing partly impossible
We need to modify some models that the -core project's classes depend on
Other problems that make me think this is not the best solution
Does anyone have suggestions for a better solution?
There is a wealth of build tools available for Java that address dependency management and versioning specifically. Many of these integrate with Hudson and Eclipse.
I'd suggest looking at Maven and how it does dependency management as a good starting point. Even if you don't use Maven itself, many of the solutions out there build on Maven's dependency management mechanism. Something like Apache Ivy allows you to use Maven dependency management while still using your own custom Ant scripts, whereas something like Gradle is a wholesale replacement.
You should be able to split your project into three or more parts and then establish dependencies via the Java Build Path. You will need to clean up the dependencies between the projects. If you need to configure your core components differently depending on whether it is a -facebook or a -vz project, you might need to separate the configuration, and maybe even use Spring or a similar dependency injection framework.
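A minimal sketch of that idea (all names below are hypothetical, not taken from your projects): the core project depends only on an interface, and each platform project contributes its own implementation, wired in by Spring or whatever DI mechanism you prefer, so core never references Facebook- or VZ-specific classes.

    // app-core: the only abstraction core code ever depends on (SocialPlatform.java).
    public interface SocialPlatform {
        String platformName();
        void postMessage(String userId, String message);
    }

    // app-facebook: platform-specific implementation (FacebookPlatform.java),
    // registered as the SocialPlatform bean in that project's Spring configuration.
    public class FacebookPlatform implements SocialPlatform {
        @Override
        public String platformName() {
            return "facebook";
        }

        @Override
        public void postMessage(String userId, String message) {
            // call the Facebook API here
        }
    }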
When trying to introduce reuse into web-based Java projects, usually the problems arise in the UI code. Not many frameworks were built with this approach in mind.
I don't use (or hate) Eclipse [1], but I can point to how we deal with a similar problem.
We use Maven with IntelliJ. In particular, both of these support modules which have defined internal dependencies. In your case it could be -fb and -vz modules depending on core, or you can split core into smaller parts (such as DAO, business logic, etc.).
When compiling, deliverables of "upper" modules would be used to build "lower" modules.
Let's go over the points/flaws you have raised:
Versioning is no longer a problem, as everything sits under the same root in Subversion/Git/the VCS of your choice.
Why is that a problem? Certainly this shouldn't be an issue for unit tests; as I understand TDD, these should not require complex environments. For automated tests, you would have to test the core API (as this is the interface between core and everything else, right?), hence this shouldn't require any frontend stuff.
You would need to explain your other points to say why you don't like the current setup.
[1] It is against the Geneva Convention to ask a developer to use anything other than the IDE of his/her choice.
