I am trying to create an integration test, which requires a running PostgreSQL server. Is it possible to start the server in maven build and stop it when tests are completed (in a separate process, I think)? Assuming the PostgreSQL server is not installed on the machine.
You are trying to push Maven far beyond its intended envelope, so you'll be in for a fair amount of hurt before it works.
Luckily, PostgreSQL can be downloaded as a ZIP archive.
As already mentioned above, Maven can use Ant tasks to extend its reach. Ant has a large set of tasks to unzip files and run commands. The sequence would be as follows:
unzip postgresql-xxx.zip into a well-known directory --> INSTALL_DIR
create a data directory --> DATA_DIR
INSTALL_DIR/bin/initdb -D DATA_DIR
INSTALL_DIR/bin/postgres -D DATA_DIR
INSTALL_DIR/bin/createdb -E UNICODE test
This should give you a running server with a test database.
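A minimal sketch of that sequence with the maven-antrun-plugin, bound to the integration-test phases. The archive name, the assumption that the binaries ZIP unpacks to a pgsql/ folder, the sleep duration and the test database name are all placeholders to adapt:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-antrun-plugin</artifactId>
      <executions>
        <execution>
          <id>start-postgresql</id>
          <phase>pre-integration-test</phase>
          <goals><goal>run</goal></goals>
          <configuration>
            <target>
              <!-- assumption: the binaries ZIP unpacks to a pgsql/ folder -->
              <unzip src="postgresql-xxx.zip" dest="${project.build.directory}"/>
              <!-- initdb creates the data directory if it does not exist -->
              <exec executable="${project.build.directory}/pgsql/bin/initdb">
                <arg value="-D"/>
                <arg value="${project.build.directory}/pgdata"/>
              </exec>
              <!-- spawn the server so the build can continue while it runs -->
              <exec executable="${project.build.directory}/pgsql/bin/postgres" spawn="true">
                <arg value="-D"/>
                <arg value="${project.build.directory}/pgdata"/>
              </exec>
              <!-- crude: give the server a moment to come up, then create the test db -->
              <sleep seconds="5"/>
              <exec executable="${project.build.directory}/pgsql/bin/createdb">
                <arg line="-E UNICODE test"/>
              </exec>
            </target>
          </configuration>
        </execution>
      </executions>
    </plugin>

A matching execution bound to post-integration-test (e.g. running pg_ctl stop) would shut the server down again after the tests.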
Further issues: creating a user, and security (you likely want to connect via TCP/IP, but if I recall correctly this is disabled by default, which requires editing a config file before starting the database).
...
Good Luck.
I started writing a plugin for this purpose:
https://github.com/adrianboimvaser/postgresql-maven-plugin
It's in a very early stage and lacks documentation, but mostly works.
I already released version 0.1 to Maven Central.
I'm also releasing PostgreSQL binary distributions for all platforms as Maven artifacts.
You can find the usage pattern in the plugin's integration tests.
Cheers!
Not to my knowledge. However, you could run a remote command that starts the server.
I think the usual scenario is to have a running integration test db, and not to shut it down/restart it between builds.
But if you really want to, you could set up your continuous integration server to start/stop the db.
You sound like you are trying to build a full continuous integration environment. You should probably look into using a full CI tool such as CruiseControl or Bamboo.
How I've done it before is to set up a dedicated CI db that is accessible from the CI server, and then have a series of bash/python/whatever scripts run as an "After Successful Build" step, which can then run whatever extra integration tasks you like. Pair that with something like Liquibase and you could wipe out the CI db and make sure it is up to the latest schema on every build (a sketch of that step follows below).
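A hedged sketch of that Liquibase step as a Maven plugin execution, assuming a PostgreSQL CI database; the URL, credentials and changelog path are placeholders:

    <plugin>
      <groupId>org.liquibase</groupId>
      <artifactId>liquibase-maven-plugin</artifactId>
      <configuration>
        <changeLogFile>src/main/resources/db/changelog.xml</changeLogFile>
        <driver>org.postgresql.Driver</driver>
        <url>jdbc:postgresql://ci-db-host:5432/ci_db</url>
        <username>ci_user</username>
        <password>${env.CI_DB_PASSWORD}</password>
      </configuration>
      <executions>
        <execution>
          <!-- wipe the schema, then re-apply every changeset -->
          <phase>process-test-resources</phase>
          <goals>
            <goal>dropAll</goal>
            <goal>update</goal>
          </goals>
        </execution>
      </executions>
    </plugin>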
Just to bring some fresh perspective into this matter:
You could also start the PostgreSQL database as a Docker container.
The plugin ecosystem for Docker still seems to be in flux, so you may need to decide for yourself which one fits. Here are a few links to speed up your search:
https://github.com/fabric8io/docker-maven-plugin
http://heidloff.net/article/23.09.2015102508NHEBVR.htm
https://dzone.com/articles/build-images-and-run-docker-containers-in-maven
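To give an idea, a sketch with the fabric8 docker-maven-plugin; the image tag, port mapping, credentials and wait timeout are assumptions to adapt:

    <plugin>
      <groupId>io.fabric8</groupId>
      <artifactId>docker-maven-plugin</artifactId>
      <configuration>
        <images>
          <image>
            <name>postgres:9.6</name>
            <run>
              <ports>
                <port>5432:5432</port>
              </ports>
              <env>
                <POSTGRES_DB>test</POSTGRES_DB>
                <POSTGRES_PASSWORD>secret</POSTGRES_PASSWORD>
              </env>
              <wait>
                <!-- block until the server reports readiness in its log -->
                <log>database system is ready to accept connections</log>
                <time>20000</time>
              </wait>
            </run>
          </image>
        </images>
      </configuration>
      <executions>
        <execution>
          <id>start</id>
          <phase>pre-integration-test</phase>
          <goals><goal>start</goal></goals>
        </execution>
        <execution>
          <id>stop</id>
          <phase>post-integration-test</phase>
          <goals><goal>stop</goal></goals>
        </execution>
      </executions>
    </plugin>

The start goal blocks until the wait condition is met, so the integration tests only run once the database is ready, and the container is torn down again after them.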
Related
I manage a large proprietary system that's composed of about a dozen services in Java. We have a core set of Java libs that these all share, and all the components/apps are built using Maven. Outside of the core SDK jars, though, each app has its own unique set of dependencies. I can't figure out the best approach to both building and deploying inside Docker. Ideally I want the entire lifecycle in Docker, using a multi-stage build approach. But I can't see how to optimize this with the huge number of dependencies.
It looks like I have two possible approaches.
Build as we have before, using Maven and a common cache on the CI server (Jenkins) so that dependencies are fetched once and cached, and accessible to all the apps. Then have a Dockerfile for each app that just copies the product jar and its dependencies (or a fat jar) into the container, and set it up to execute. The downside of this approach is that the build itself could differ between developers and the CI server. Potentially use a local Maven repository manager like Nexus just to avoid pulling deps from the internet every time? But that still doesn't solve the problem that a dev build won't necessarily match the CI build environment.
Use a multi-stage Dockerfile for each project. I've tried this, and it does work, and I managed to get the Maven dependencies layer to cache so that it doesn't fetch too often. Unfortunately that intermediate build layer was hitting 1-2 GB per application, and I can't remove the 'dangling' intermediates from the daemon without blowing away all the caching. It also means there's a tremendous amount of duplication in the jars that have to be downloaded for each application if something changes in the poms (i.e. they all use JUnit and Log4j and many other shared libraries).
Is there a way to solve this optimally that I'm not seeing? All the blogs I've found basically focus on the two approaches above (with some that focus on running Maven itself in a container, which really doesn't solve anything for me). I'll probably have to go with option 1 if there aren't any other good solutions.
I've checked around on Stack Overflow and blogs, and everything I can find seems to assume that you're really just building a single app, not a suite of them, where it becomes important not to repeat the dependency downloads.
I think it is OK to use the .m2/repository filesystem cache as long as you set the --update-snapshots option in your Maven build. It scales better, because you cache each .jar only once per build environment and not once per application. Additionally, a change in a single dependency does not invalidate the entire cache, which would be the case with Docker layer caching.
Unfortunately that cannot be combined well with multi-stage builds at the moment, but you are not the only one asking for it.
This issue requests adding a --volume option to the docker build command. This one asks for allowing instructions like this in the Dockerfile: RUN --mount=m2repo=/var/mvn/repo mvn install.
Both features would allow you to use the local maven filesystem cache during your multistage build.
For the moment I would advise keeping your option 1 as the solution, unless you are facing many issues caused by differing build environments.
We are generating a Web Service for deployment to Azure. This includes four pipeline stages for Dev, Test, Full UAT and production. On initial deployment to Dev I want to perform a set of Selenium smoke tests. Then when deployed to UAT, a full set of automated tests should be triggered.
Our test team are happier using Selenium through its Java route. After a couple of days it became clear that the process was to: generate a UI agent (really important for anyone who hasn't done this yet, as ChromeDriver does run without a session, but will just hang, making you think it must be close to running); assign a SELENIUM_TEST agent property and set this flag as a build dependency (this helps it find the correct agent); ensure that you set the required Java and Maven variables in the VSTS settings (apart from the path), rather than in the local machine environment; and finally use the clean, update and -X parameters to force the environment to be configured as part of the test process.
Now I have the problem of how to trigger these tests from the deployment pipeline. I have searched articles on a large number of sites and can't find anything on how this may be achieved using the Maven/Java/Selenium combination.
Can anyone help?
To build and deploy Java Selenium tests in VSTS, you can refer to the document Testing Java applications with VSTS for detailed steps.
Besides, you can also refer to the blog Continuous Testing of a Java Web App in VSTS using Selenium for building and deploying a Java web app in VSTS.
I am not posting this as a full answer, but I wanted to respond to the kind input from Michele and Marina. I am not sure there isn't a better way of approaching this, but with their assistance I was able to at least get closer to an answer. I had prepared images, but apparently you need reputation to post them.
So this is what I actually ended up doing.
Step 1 – the MVC web app was generated and appropriate deployment slots set up to receive the web build artefacts.
Step 2 – Created a CI process purely to generate code I could deploy into the WebApp CD pipeline.
Step 3 – Generated an empty “Smoke Test” environment in the WebApp Deployment pipeline, and added the new Artefact from step 1 into this.
Configured the Smoke Tests item
Configured it to only receive the _AutoTest-CI artefact
Set it to use the “default” pipeline
Added the “Demand” that specifies the machine configured for Selenium tests.
Added the Maven task, and pointed it at the Maven POM
At this stage it succeeded in running through the configured tests. The Maven deployment step seems to think it can generate test results, but the output gives warnings that no test results were generated. It runs and reports success or failure, so this is a semi-success. The missing last piece is reporting the full test results, which I have yet to achieve (see the Surefire note below).
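One assumption worth checking (I can't see the POM, so this is a guess, not a confirmed fix): the VSTS Maven task publishes the JUnit XML that the Surefire plugin writes under target/surefire-reports, so the tests need to run through Surefire for any results to appear. A minimal sketch:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <!-- JUnit XML lands in target/surefire-reports by default -->
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <includes>
          <include>**/*Test.java</include>
        </includes>
      </configuration>
    </plugin>

If the reports do land there, the Maven task's JUnit test results option (with its default **/TEST-*.xml pattern, if I remember it right) should pick them up.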
You can trigger the tasks inside an environment by configuring the triggers in the Release Management UI.
If the trigger conditions are met, the process will start automatically. Inside your process you can run whatever task you need.
Reference
Microsoft VSTS Docs
I'm getting started with SonarQube for JSF page static analysis[1] in Maven. I'm only really interested in using it from Maven, since I don't like the idea of introducing another build command.
After going through Analyzing the source code and the specific Maven guide, I gained the impression that the plugin can only be used after downloading, installing/unpacking and starting a SonarQube instance at localhost and specifying the connection information in the plugin declaration in the POM. The plugin's configuration parameters confirm that.
While this workflow might have advantages, it is painful to use on CI services, and the necessity to start a service manually in order to be able to build seems not very user-friendly (given that other development tools like Selenium or Arquillian pull entire browsers, drivers and servers in the background without a single line of configuration). Am I missing a separate plugin or configuration which manages an embedded or otherwise temporary instance to perform the analysis with a single plugin declaration?
[1] I'm aware that there are other tools based on XML validation which could do the job, but setting up a much more powerful tool like SonarQube seems to be a more flexible approach which will probably pay off.
You don't have to install SonarQube on your build server, but a running server is necessary to execute analysis (the results are pushed to it). That means you need a working server somewhere, and then you set the required parameters:
sonar.host.url (http://localhost:9000 is a default value)
sonar.login and sonar.password (if your SonarQube server is secured)
See all Analysis Parameters.
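These can go on the command line (-Dsonar.host.url=...) or, for example, as POM properties. A minimal sketch, assuming a server at the default address (the credentials are placeholders):

    <properties>
      <!-- points the analysis at an existing SonarQube server -->
      <sonar.host.url>http://localhost:9000</sonar.host.url>
      <!-- only needed if the server is secured -->
      <sonar.login>my-user</sonar.login>
      <sonar.password>my-password</sonar.password>
    </properties>

The analysis itself is then triggered with mvn sonar:sonar.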
I have my Java-based custom application. I am very new to Jenkins, and I have a requirement where Jenkins should perform build activities whenever files are checked in/committed to SVN for my application.
What are the steps or processes that need to be followed to get Jenkins and SVN working together on a custom Java-based application?
Please guide.
You can have Jenkins poll for changes and build when one or more commits have been detected. By decreasing the polling interval you can trigger a build soon after a new commit.
You can set this in the following location:
Go to your project > Configure
Scroll down to 'Build Triggers'
select 'Poll SCM'
Here you can set your polling schedule to:
H/5 * * * *
This setting makes Jenkins poll your configured source repository every 5 minutes (the H spreads the exact polling minute across jobs). The '?' button behind the input field explains the syntax in more detail.
Edit: for setting up Jenkins in general I recommend the same tutorial by vogella mentioned by henriquedsg89, or this tutorial, which also gives more information about other settings that may be of interest when using SVN.
In Jenkins, click New Job, type the project name and choose an initial configuration; free-style is fine. On the configuration page, select Subversion under Source Code Management and enter the SVN settings.
There you can add build steps, where you can choose to execute a shell script or whatever, and move the build steps up or down.
Here is a good article about configuring Jenkins for an Android app:
http://www.vogella.com/tutorials/Jenkins/article.html
For your requirements there are three high-level steps, and Jenkins has pretty helpful tooltips for each of them:
Configure your source code repository (i.e. Git, etc.). This can be done globally or per job. Do this as one job and test it before moving on.
Configure how you would like Jenkins to be notified of new commits to your repo (Poll SCM, or you can look at this plugin).
Configure Gradle within Jenkins to do your Java builds.
At a lower level, whether you are hosting Jenkins locally or in a hosted environment can affect how you need to set up your Jenkins environment.
I know it's a super late answer; nevertheless, it might be useful for someone who checks this out later. There are mechanisms to do away with polling: there is a plugin which triggers Jenkins on commits pushed to the repo. It is available for Git and Subversion. However, do note that polling must still be enabled for this commit trigger to work.
Click on the New Job link in the upper-left corner:
It will bring up a page where you have to give the job a name and also choose a style.
You should choose either a free-style project, or a Maven project if you are using Maven to build your project.
As you said, you are using SVN, so you should choose Subversion under the Source Code Management options. Copy your SVN URL there; it might ask you to enter credentials.
You might have one generic account provided or your own.
Use the build tools to build the application. Depending on what unit tests or other code metrics you want to track, you can utilize some plugins in your post-build process. You can use JUnit to keep a plot of your unit test results over time. Also, you can choose the person you want to send an email to if the build fails.
Hope this is helpful.
Video reference: https://www.youtube.com/watch?v=RR0LabeUQ88
Is there such a thing in a standard manner, including:
Java source code and test code
Ant or Maven
JUnit
Continuous Integration (possibly Cruise Control)
ClearCase Versioning Tool
Deploy to Application Server
In the end I'd like to have an automatic build and integration environment.
Sounds like a job for Hudson.
There is no end of possible solutions. Take a look at the continuous integration feature matrix, which details common solutions and their associated features. Hopefully you will be able to make a decision based on that.
Take a look at Apache Continuum.
A tool set could be:
IntelliJ
Ant
The Ant files can be generated by IntelliJ.
Then you need to write some Ant to run your tests (a minimal sketch follows below)!
Then you need to write some Ant to package your application.
Then you need to write some Ant to deploy your application, including configuring queues, databases etc. (dbdeploy might work for this).
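As a sketch of the test-running part, a minimal Ant target; the classpath reference, directory names and the compile-tests dependency are placeholder assumptions, and the JUnit jar must be on Ant's classpath:

    <target name="test" depends="compile-tests">
      <junit haltonfailure="true" printsummary="true">
        <classpath refid="test.classpath"/>
        <formatter type="plain"/>
        <!-- one report per test class lands in build/test-reports -->
        <batchtest todir="build/test-reports">
          <fileset dir="build/test-classes" includes="**/*Test.class"/>
        </batchtest>
      </junit>
    </target>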
TeamCity
Subversion - it's nice and easy. ClearCase is shocking; don't go near it.
If you're doing Java EE, then you'll probably have a web site; if so, you might want to use WebDriver, possibly in conjunction with WindowLicker.
If you have a database or JMS broker, make sure each developer HAS THEIR OWN! This is very important - make sure everybody has a copy that they can do what they like with, and obviously the continuous integration (CI) environment must have its own copy too!
The payback for such an environment can be huge. On my current project, we have two-click-to-production automation coming straight out of TeamCity.
I would suggest the set below:
Spring Tool Suite (www.springsource.org) - IDE with a Maven plugin for development and an AccuRev plugin for repo management.
AccuRev (www.accurev.com) - for source code repository/version management.
Maven (maven.apache.org) - for the build process.
Hudson (hudson-ci.org) - to automate build and integration.
JIRA (www.atlassian.com/JIRA) - for bug/issue tracking.
Rally (www.rallydev.com) - for project management.