After getting Counterclockwise working in my Eclipse setup and the GAE development server running in interactive mode, the following things are still unclear to me:
1) How can I start the server and the application without typing commands at the REPL?
2) When I deploy the application to Google's servers, how and where do I define the application's entry point? That is, how will Google know which application, handlers and routes to use?
3) Can I combine Java classes and Clojure files in the same project so that both are compiled automatically as I create and edit them in my src folder?
4) Which files and jars are actually needed for the final upload to GAE? I'm used to deploying PHP apps to GAE, but here I don't know whether I should build jars or include compiled .clj files. I might also want to organize files differently than Counterclockwise or appengine-magic does, so where do I specify the paths to resources and classes?
5) Finally, is it possible to connect to the Google production server with an Emacs/SLIME/Swank combination? That would be a dream come true.
I'm using appengine-magic with Jetty, Compojure, Ring and Hiccup.
I'm going to suggest a lein/appengine-magic/Eclipse hybrid approach. Create your GAE project with appengine-magic and then set it up in Eclipse.
Create a Clojure "Run Configuration" and check the source files you need evaluated to bring the server up. You will get a REPL to it when it starts.
Your GAE entry point is the servlet-class in your web.xml, which refers to the ahead-of-time compiled servlet in app_servlet.clj (assuming you used lein appengine-new to create the project originally). Look in app_servlet.clj for the call to make-servlet-service-method: its argument is your appengine-magic app (see def-appengine-app in core.clj), which in turn refers to your Compojure handler and routes. See https://github.com/gcv/appengine-magic for the details.
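If it helps to see that wiring in plain servlet terms: the class named in web.xml is an AOT-compiled servlet whose service method just hands each request to your Ring/Compojure handler. Below is a rough Java analogue, purely illustrative; the class and handler names are made up here, since the real ones are generated by appengine-magic from app_servlet.clj.

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Stand-in for the AOT-compiled servlet that <servlet-class> in web.xml points to.
public class AppServlet extends HttpServlet {

    // Stand-in for the Ring/Compojure handler wrapped by def-appengine-app.
    interface RingHandler {
        void handle(HttpServletRequest req, HttpServletResponse resp) throws IOException;
    }

    private final RingHandler handler = (req, resp) -> {
        resp.setContentType("text/plain");
        resp.getWriter().println("Hello from the app handler");
    };

    @Override
    protected void service(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // In app_servlet.clj this delegation is what make-servlet-service-method
        // produces from the app defined with def-appengine-app.
        handler.handle(req, resp);
    }
}
```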
I have not done this, so cannot comment.
Let appengine-magic take care of this: run its Leiningen prepare task, then deploy the resulting war directory with appcfg.sh (which you can find in the GAE Java SDK). You may also be able to use the GAE Eclipse plugins to achieve this.
You cannot use sockets with GAE. Swank depends on sockets, so a REPL to your live application is not possible. You can REPL all you like with the dev server however.
Q1 & Q2 were eventually solved and are now clear.
Q3: I wasn't able to do it, because the Java and Clojure classes overwrote each other and I couldn't set separate target directories for them.
Q4: After the first successful deployment I now know which core jars need to be included. It depends on what you happen to use in your project; I think I have transferred far too many unnecessary files in my PHP deployments.
Q5: That's what I thought, but I didn't get Swank working on the dev App Engine server either. It reports illegal access to some App Engine SDK file; maybe I need to include it in the project libs...
I am coding a java web app.
When I started, every time I needed an external package I would manually download the jars and all of their dependencies and place them in the Libraries folder (in NetBeans).
As time went on, I started using a dependency manager (Ant).
Now, I would like to use my dependency manager for all of my external libraries.
If, after executing this change I run my application and it successfully deploys (no ClassNotFoundExceptions and no NoClassDefFoundErrors), is it safe to assume that I have not missed anything and that my application will run smoothly as far as the external packages go?
Or, do I need to individually test out each functionality in my web app to confirm that the changes I made to the libraries didn't change how the application runs?
It actually depends on the code inside these libraries. Only part of the classes are loaded at startup, so you can miss something. There is also the possibility that you're loading some classes at runtime manually, e.g. via Class.forName(String), and that code was not triggered at startup. So I would say you can't be 100% sure.
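Here is a minimal sketch of the kind of runtime loading that a clean startup will not verify; the class name in the string is made up for illustration and stands in for something that lives in an optional library.

```java
// A class referenced only by name is resolved at runtime, not at deployment.
// If the jar containing it was dropped during the dependency cleanup, the
// failure only shows up when this code path is actually exercised.
public class ReportExporter {

    public Object createExcelWriter() throws Exception {
        // "com.example.poi.ExcelWriter" is a hypothetical name standing in for
        // a class from an optional library (e.g. a POI-based helper).
        Class<?> writerClass = Class.forName("com.example.poi.ExcelWriter");
        return writerClass.getDeclaredConstructor().newInstance();
    }
}
```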
Generally, in Java there are 3 build approaches:
Imperative - you say how to assemble your code. The typical example of this is Apache Ant.
Declarative - you say what you want assembled. The typical example of this is Apache Maven.
Mixed - takes the benefits of both previous systems. This is Gradle.
Hope it helps!
I have a webapp in a war archive which is deployed on cloudfoundry.
One of the libraries ("somelib.jar") used by the app is made by another developer.
I would like a way for him to upload several different versions of somelib.jar and test the behaviour of the app.
I have managed to get the jar uploaded to WEB-INF/lib directory of the deployment. I have also managed to unpack the jar into WEB-INF/classes. However, I have not managed to get the new version of the jar to be used. I tried various hacks such as those described in this question and this question without any luck.
Every time, the classes/jars that get loaded the first time are used from then on, even if we replace the actual .class or .jar file in the above directories.
Is there any easy way to achieve what I want?
Note: since I don't have control of Tomcat (where it runs), I cannot configure Tomcat or make any changes to the server. I only have control over my war file, so everything needs to be done programmatically.
EDIT: the reason I want this is to reduce our testing time. Currently someone gives me a new version of somelib.jar, I repackage it into my application, upload it to CF, send him a notification, and then he tests the behaviour of the new jar. What I would prefer is that he uploads his jar directly to CF and does the testing whenever he has a new version, without the unnecessary intermediate delay.
In Tomcat 7 you can version your WAR file, and the new version will gradually take over as existing sessions expire (parallel deployment):
http://www.tomcatexpert.com/blog/2011/05/31/parallel-deployment-tomcat-7
In order for you to control the application server yourself, you would need to deploy a standalone app into Cloud Foundry.
This blog should help you out with that:
http://blog.cloudfoundry.com/2012/05/11/running-standalone-web-applications-on-cloud-foundry/
This way you can custom configure your tomcat.
Every time, the classes/jars that get loaded the first time are used from then on, even if we replace the actual .class or .jar file in the above directories.
That's the way that normal Tomcat (Java EE) classloading works. Your classes are loaded when first deployed, and any changes will be ignored (JSPs are managed slightly differently, but only in a development environment).
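If you really need to pick up a replaced jar without redeploying, one workaround that stays entirely inside your own code (no Tomcat configuration) is to load the volatile classes through a throwaway URLClassLoader that you recreate whenever the file changes. This is a minimal sketch under some assumptions: the jar is dropped at a hypothetical known path, and the classes you load implement interfaces that live in your WAR.

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Path;
import java.nio.file.Paths;

public class SomelibLoader {

    // Hypothetical location where the replaceable jar is dropped.
    private static final Path JAR_PATH = Paths.get("/tmp/somelib/somelib.jar");

    /**
     * Loads a class from the jar through a fresh classloader, so a newly
     * uploaded somelib.jar is picked up on the next call instead of the copy
     * the webapp classloader cached at startup.
     */
    public static Class<?> loadFresh(String className) throws Exception {
        URL[] urls = { JAR_PATH.toUri().toURL() };
        // Parent is the webapp classloader so shared interfaces resolve to the
        // same Class objects as the rest of the application.
        URLClassLoader loader = new URLClassLoader(urls,
                Thread.currentThread().getContextClassLoader());
        return Class.forName(className, true, loader);
    }
}
```

Note that this only works if the volatile classes are not also packaged in WEB-INF/lib, otherwise parent-first delegation will keep returning the old copies; and since each reload creates a new classloader, treat it strictly as a testing convenience rather than a production mechanism.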
You should be able to solve this problem by using the Equinox OSGi bridge servlet. I haven't done this myself, but here's a writeup by a person that I respect.
I am developing a WebSphere portlet in IDEA 11. The portlet is using some methods defined on portal. I don't have the production environment compiled classes or jars on my PC but I have the source code.
Can I somehow "attach" the .java files to my projects in order to build a war file that will be deployed into the production environment? Or do I have to build the production sources first (this seems to be harder since there are lots of dependencies)?
If this is just to test something while you await the JARs/compiled classes, you can likely do this by only bringing over the API (e.g., referenced interfaces that hopefully don't have external dependencies). Then, open up the compiled WAR and remove those .class files manually to avoid collisions with the real code on the server.
The biggest problem is that you will definitely run into issues trying to limit the exposure to the real code, unless the rest of the code was set up nicely to expose an API with very limited dependencies.
I am just investigating the idea of this so have no example code. I've been digging around on the web but I'm not really sure what I should actually be looking for so I'm not finding much. So I thought I'd ask and see if anyone knew if a) This is possible. b) How I should do it. (Or at least what I should be looking to learn about to do it.)
I'm building a web app using JSP pages on the client side with a JBoss server running J2EE; in the middle there is a Tomcat web server.
Basically the app contains different sections that will be rolled out over time as they're developed, and customers may be using different combinations of sections. The tidiest way of deploying this I can think of is to build each section into its own jar. Then, depending on the combination of sections that are relevant for the customer, install only the required jars on the JBoss server for deployment.
To support this, I'd like the client navigation menu to only show the available sections based on what is deployed on the JBoss server. Is it possible for my servlet to find out what is deployed on the server? I'd like the solution to be as 'dumb' as possible, i.e. I don't want to tell it what to look for (other than a prefix to identify our jars), as I don't yet know everything we might build.
My current solution is to have a table in the database to hold a list of the installed sections. But this is going to require update scripts etc during install and I'm sure we should be able to do this by just deploying jars on the server.
Thanks in advance.
You could add this information to each jar's MANIFEST.MF file and then enumerate all the manifests visible to your webapp when you start (see this answer for how to list all manifests).
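A rough sketch of that enumeration, assuming each section jar declares an illustrative custom attribute such as Section-Name in its manifest (the attribute name is made up, not a standard one):

```java
import java.io.InputStream;
import java.net.URL;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.Manifest;

public class SectionDiscovery {

    /** Returns the Section-Name value of every jar on the classpath that declares it. */
    public static List<String> findSections() throws Exception {
        List<String> sections = new ArrayList<>();
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        Enumeration<URL> manifests = cl.getResources("META-INF/MANIFEST.MF");
        while (manifests.hasMoreElements()) {
            try (InputStream in = manifests.nextElement().openStream()) {
                Manifest mf = new Manifest(in);
                // "Section-Name" is an illustrative custom attribute.
                String section = mf.getMainAttributes().getValue("Section-Name");
                if (section != null) {
                    sections.add(section);
                }
            }
        }
        return sections;
    }
}
```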
I do something similar by configuring a "Plugin Directory" setting for my application. It then scans that directory regularly for Jars. It looks for specific metadata in the manifest to determine whether the Jar is actually a valid plugin, and what class to load from it (the static initializer of that class registers the plugin with the application).
Then all you need to do is place a new Jar into that directory to add its functionality.
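A minimal sketch of that kind of scan, assuming a made-up Plugin-Class manifest attribute that names the class whose static initializer registers the plugin:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class PluginScanner {

    /** Scans a directory for jars and loads the class named by a Plugin-Class attribute. */
    public static void scan(File pluginDir) throws Exception {
        File[] jars = pluginDir.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars == null) {
            return;
        }
        for (File jar : jars) {
            String pluginClass;
            try (JarFile jarFile = new JarFile(jar)) {
                Manifest mf = jarFile.getManifest();
                // "Plugin-Class" is an illustrative custom manifest attribute.
                pluginClass = (mf == null)
                        ? null
                        : mf.getMainAttributes().getValue("Plugin-Class");
            }
            if (pluginClass == null) {
                continue; // not a plugin jar
            }
            // The loader is deliberately kept open, since the plugin's classes
            // stay loaded for the lifetime of the application.
            URLClassLoader loader = new URLClassLoader(
                    new URL[] { jar.toURI().toURL() },
                    PluginScanner.class.getClassLoader());
            // Initializing the class runs its static initializer, which is
            // where the plugin registers itself with the application.
            Class.forName(pluginClass, true, loader);
        }
    }
}
```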
At our shop, we are maintaining roughly 20 Java EE web applications. Most of these applications are fairly CRUD-like in their architecture, with a few of them being pretty processor intensive calculation applications.
For the deployment of these applications we have been using Hudson set up to monitor our CVS repository. When we have a check-in, the projects are set to be compiled and deployed to our Tomcat 6.0 server (Solaris 10, sparc Dual-core 1.6 GHz processor, 2 GB RAM...not the beefiest machine by any stretch of the imagination...) and, if any unit-tests exist for the project, those are executed and the project is only deployed if the unit-tests pass. This works great.
Now, over time, I've noticed that a lot of the projects I create use the same .jar files over and over again (Hibernate, POI (Excel output), SQL Server JDBC driver, JSF, ICEFaces, business logic .jar files, etc.). Our practice has been to keep a folder on our network drive stocked with all the default .jar files we have been using; when a new project is started we copy this set of .jar files into the new project and go from there. I feel so dirty every time this happens that it has started to keep me up at night. I have been told by my co-workers that it is "extremely difficult" to set up a .jar repository on the Tomcat server, which I don't buy for a second; I attribute it to pure laziness and, probably, no desire to learn the best practice. I could be wrong, however; I am just stating my feelings on the matter. This also seems to bloat the size of the .war files that get deployed to the server.
From my understanding, Tomcat itself has a set of .jar files that are accessible to all applications deployed to it, so I would think we would be able to consolidate all of these duplicate .jar files in all our projects and move them onto the tomcat server. This would involve only updating one .jar file on the server if, for example, we need to update the ICEFaces .jar files to a new version.
Another part of me says that by keeping only one copy of the .jar files on the server, I might need to keep a copy of the server's lib directory in my development environment as well (i.e. include those .jar files as Eclipse dependencies).
My gut instinct tells me that I want to move those duplicated .jar files onto the server...will this work?
I think Maven and Ivy were born to help manage JAR dependencies. Maybe you'll find that those are helpful.
As far as the debate about duplicating the JARs in every project versus putting them in the server/lib, I think it hinges on one point: How likely is it that you'll want to upgrade every single application deployed on Tomcat at the same time? Can you ever envision a time where you might have N apps running on that server, and the (N+1)th app could want or require a newer version of a particular JAR?
If you don't mind keeping all the apps in synch, by all means have them use a common library base.
Personally, I think that disk space is cheap. My preference is to duplicate JARs for each app and put them in the WAR file. I like the partitioning. I'd like to see more of it when OSGi becomes more mainstream.
It works most of the time, but you can get into annoying situations where a jar that you have moved into Tomcat tries to create an instance of a class that lives in one of your web application's jars, leading to ClassNotFoundExceptions being thrown. I used to do this, but stopped because of these problems.
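The underlying issue is classloader visibility: a jar in the server's shared lib is loaded by a parent classloader that cannot see classes in WEB-INF/classes or WEB-INF/lib. Here is a hedged sketch of the failure and the usual workaround; the class names are made up for illustration.

```java
public class SharedFactory { // imagine this class shipped in tomcat/lib

    public static Object create(String implName) throws Exception {
        // Fails with ClassNotFoundException if implName refers to a class in
        // WEB-INF/classes or WEB-INF/lib, because this class was loaded by the
        // common classloader, which cannot see the webapp's classes:
        // return Class.forName(implName).getDeclaredConstructor().newInstance();

        // Usual workaround: use the thread context classloader, which during a
        // request is the webapp classloader and therefore can see those classes.
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        return Class.forName(implName, true, cl).getDeclaredConstructor().newInstance();
    }
}
```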
I really don't think putting libraries in common/lib is a good idea. The idea behind deploying WAR files as applications in a servlet container is to have real isolation between your webapps. You could face errors like deploying some third-party WAR (with its own libraries inside WEB-INF/lib) and having it behave unexpectedly because it picked up a different version of one of its libraries from the common classloader (the exact lookup order depends on the container's classloader delegation model). Not to mention how painful it can be to move such an application to another servlet container or an application server.
As mentioned before, you could use Maven to deal with jar dependencies, and if you want libraries used consistently, define a parent POM (Maven jargon) shared across all your applications.
In my experience you should be very careful with sharing libraries between web applications by moving them into the web container itself.
Let them live in WEB-INF/lib so your wars are self contained (you WILL be glad you did one day).
What you might consider instead is using Maven or Ivy (with Ant) to pull in library jars from a common repository. This is very useful and should not be a problem in your scenario.
Edit: A notable exception is the Metro library - web service layer from Glassfish - which needs to be in the web container and not in the web application.