I'm getting started with SonarQube for static analysis of JSF pages[1] in Maven. I'm only really interested in using it through Maven, since I don't like the idea of introducing another build command.
After going through Analyzing the source code and the specific Maven guide, I got the impression that the plugin can only be used after downloading, installing/unpacking, and starting a SonarQube instance at localhost and specifying the connection information in the plugin declaration in the POM. The plugin's configuration parameters confirm that.
While this workflow might have advantages, it is painful to use on CI services, and the necessity to start a service manually in order to be able to build seems not very user friendly (given that other development tools like Selenium or Arquillian pull an entire browser, driver, and server in the background without a single line of configuration). Am I missing a separate plugin or configuration which manages an embedded or otherwise temporary instance so the analysis can be performed with a single plugin declaration?
[1] I'm aware that there are other tools based on XML validation which could do the job, but setting up a much more powerful tool like SonarQube seems to be a more flexible approach which will probably pay off.
You don't have to install SonarQube on your build server, but a running instance is necessary to execute an analysis (the results are pushed to it). This means you need a working server somewhere; then you set the required parameters:
sonar.host.url (http://localhost:9000 is the default value)
sonar.login and sonar.password (if your SonarQube server is secured)
See all Analysis Parameters.
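For example, these can be set as properties in the POM so that every developer and the CI build point at the same instance (a minimal sketch; the host name and the credentials are placeholders):

<properties>
  <!-- Existing SonarQube server that receives the analysis results -->
  <sonar.host.url>http://sonar.example.com:9000</sonar.host.url>
  <!-- Only needed when the server is secured -->
  <sonar.login>ci-user</sonar.login>
  <sonar.password>ci-password</sonar.password>
</properties>

With those in place, mvn sonar:sonar runs the analysis and pushes the results to that server.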
We are generating a Web Service for deployment to Azure. This includes four pipeline stages: Dev, Test, Full UAT, and Production. On initial deployment to Dev I want to perform a set of Selenium smoke tests. Then, when deployed to UAT, a full set of automated tests should be triggered.
Our test team is happier using Selenium through its Java route. After a couple of days it became clear that the process was to: generate a UI agent (really important for anyone who hasn't done this yet, as ChromeDriver does run without a session but will just hang, making you think it must be close to running); assign a SELENIUM_TEST agent capability and set it as a demand on the build (this helps it find the correct agent); and ensure that you set the required Java and Maven variables in the VSTS settings (apart from the path) rather than in the local machine environment. Finally, use the clean and update options and the -X parameter to force the environment to be configured as part of the test process.
Now I have the problem of how to trigger these tests from the deployment pipeline. I have searched a large number of sites and can't find anything on how this might be achieved using the Maven/Java/Selenium combination.
Can anyone help?
To build and deploy Java Selenium tests in VSTS, you can refer to the document Testing Java applications with VSTS for detailed steps:
Besides, you can also refer to the blog Continuous Testing of a Java Web App in VSTS using Selenium, which covers building and deploying a Java web app in VSTS.
I am not posting this as a full answer, but I wanted to respond to the kind input from Michele and Marina. I am not sure there isn't a better way of approaching this, but with the assistance of both I was able to at least get closer to an answer. I did prepare images, but apparently you need reputation to post them.
So this is what I actually ended up doing.
Step 1 – the MVC web app was generated and appropriate deployment slots set up to receive the web build artefacts.
Step 2 – Created a CI process purely to generate code I could deploy into the WebApp CD pipeline.
Step 3 – Generated an empty “Smoke Test” environment in the WebApp Deployment pipeline, and added the new Artefact from step 1 into this.
Configured the Smoke Tests item
Configured it to only receive the _AutoTest-CI artefact
Set it to use the “default” pipeline
Added the “Demand” that specifies the machine configured for Selenium tests.
Added the Maven task, and pointed it at the Maven POM
At this stage it succeeded in running through the configured tests. The Maven step seems to expect that it can generate test results, but the output gives warnings that no test results were generated. It runs the build and reports a success or failure, so this is a semi-success. The missing last piece is reporting the full test results, which I have yet to achieve; one possible fix is sketched below.
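One guess (an assumption on my part, not something verified in this pipeline) is that the Selenium tests are not producing JUnit XML where the Maven task looks for it. A sketch of running them via the maven-failsafe-plugin, which writes JUnit XML reports under target/failsafe-reports that a publish-test-results step can pick up with a pattern like **/TEST-*.xml:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.20</version>
  <executions>
    <execution>
      <goals>
        <!-- Runs *IT.java test classes during integration-test and writes JUnit XML -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>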
You can trigger the tasks inside an environment by configuring the triggers in the Release Management UI.
If the trigger conditions are met, the process will start automatically. Inside your process you can run whatever tasks you need.
Reference
Microsoft VSTS Docs
I have a Java-based custom application. I am very new to Jenkins, and I have a requirement where Jenkins should perform build activities when files are checked in/committed to SVN for my application.
What are the steps or processes that need to be followed to get Jenkins working with SVN for a custom Java-based application?
Please guide.
You can have Jenkins poll for changes and build when one or more commits have been detected. By decreasing the wait time between polls you can trigger a build shortly after a new commit.
You can set this in the following location:
Go to your project > Configure
Scroll down to 'Build Triggers'
Select 'Poll SCM'
Here you can set your polling schedule to:
H/5 * * * *
This setting makes Jenkins poll your configured source repository every five minutes. The '?' button behind the input field explains the syntax in more detail.
Edit: for setting up Jenkins in general I recommend the same tutorial by vogella mentioned by henriquedsg89, or this tutorial, which also gives more information about other settings that may be of interest when working with SVN.
In Jenkins, click New Job, type the project name, and choose an initial configuration (free-style will do). On the configuration page, select Subversion under Source Code Management and enter your SVN settings.
There you can add build steps, where you can choose to execute a shell script (or whatever else) and move build steps up or down.
Here is a good article about configuring Jenkins for an Android app:
http://www.vogella.com/tutorials/Jenkins/article.html
For your requirements there are three high-level steps, and Jenkins has pretty helpful tooltips for each of them:
Configure your source code repository. This can be done globally or per job (e.g. Git). Do this as one job and test it before moving on.
Configure how you would like Jenkins to be notified of new commits to your repo (Poll SCM, or you can look at this plugin).
Configure Gradle within Jenkins to do your Java builds.
A lower-level concern is whether you are hosting Jenkins locally or in a hosted environment, which could affect how you need to set up your Jenkins environment.
I know it's a super late answer; nevertheless it might be useful for someone who checks this out later. There are mechanisms to do away with polling: there is a plugin which triggers Jenkins on commits pushed to the repo. It is available for Git and Subversion. However, do note that polling must still be enabled for this commit trigger to work.
Click on the New Job link in the upper-left corner:
It will bring up a page where you have to give the job a name and also choose a style.
You should choose either a free-style project or a Maven project if you are using Maven to build your project.
As you said, you are using SVN, so you should choose Subversion under the Source Code Management options. Copy your SVN URL there; it might ask you to enter credentials.
You might have one generic account provided or your own.
Use the build tools to build the application. Depending on what unit tests or other code metrics you want to track, you may be able to utilize some plugins in your post-build process. For example, you can use the JUnit plugin to keep a plot of your unit test results. You can also choose whom to email if the build fails.
Hope this is helpful.
Video reference: https://www.youtube.com/watch?v=RR0LabeUQ88
I work on a product composed of many bundles running as features on top of karaf. Typically our developers work on one bundle at a time. Our normal development goes something like: code, compile, copy bundle to deploy folder, test. We've also found that hotdeploy just refuses to override certain bundles that are installed as features without a server restart or a feature uninstall/reinstall, so sometimes the cycle is longer.
My question is: does anyone in the community have a better way? The way we do things works, but I feel like it's pretty slow and inefficient and I'm betting someone has come up with something better!
EDIT: I realize that I was pretty unclear in my question... We are using Equinox underneath Karaf. We also use Eclipse and Maven, although I don't know that using Maven is relevant.
Sounds like you want the dev:watch command. From the documentation:
The watch command can be used to help at development time. It allows you to configure a set of URLs that will be monitored. All bundles with a location matching the given URLs will be automatically updated. This avoids the need for manually updating the bundles or even copying the bundle to the system folder if needed. Note that only Maven-based URLs and Maven snapshots will actually be updated automatically, so if you run
dev:watch *
It will actually monitor all bundles that have a location matching mvn:* and '-SNAPSHOT' in their URL.
Doing "dev:watch --help" from the Karaf shell will list its available flags and args.
Something similar is the PAX plugin
Either of these will work quite nicely if you've got the m2 maven plugin for Eclipse.
UPDATED: In my company we strive to be as TDD as possible, so a lot of development is done without explicitly starting Karaf. In the normal mix of unit tests we're also using Pax Exam, which is largely fantastic even when run from within Eclipse =)
This helps ensure we're not too tied to any Karaf specifics, as it runs with Equinox/Felix/Concierge (so I mock out various Karaf specifics we depend on, like JAAS authentication). Along with many other cool tools and functionality, it's capable of provisioning Karaf features, and using TinyBundles you can even create bundles on the fly (again useful for mocking/stubbing).
Pax Exam hooks into the JUnit framework by providing a JUnit runner; the latest version (2) is much faster and has a DSL-based API, so the tests are quite concise and readable.
Using Pax Exam gives us good test coverage and short development times. Where tests are less practical or we're hunting bugs that don't surface in tests, the dev:watch command is invaluable.
In summary: IMO you should definitely drive your development with tests (Pax Exam will slot into your existing build nicely, and once you get used to it you'll find development quicker). You can start using the dev:watch command immediately; it will certainly speed up your current situation.
UPDATE 2: In answering another question I've added a Maven example of Pax Exam testing a ComponentFactory. Test-driven development is arguably the most efficient workflow available to developers today. Link to the question: osgi: Using ServiceFactories? and link to the source code: http://dl.dropbox.com/u/2465717/net.earcam.example.servicecomponent_2011-08-16_15-52.tgz
I've had excellent results using Equinox in Eclipse - even hot code replace works properly. Granted, the target platform is small and we have only on the order of 50 bundles of our own, but the workflow goes like this:
First, we have a target platform that contains all third-party and Eclipse bundles; Eclipse takes care of downloading and managing them. Then, the workspace has all the bundles of the project, grouped in 3-4 working sets. Compilation happens as usual on save; sometimes GWT needs to be recompiled, but even then the changes get picked up immediately because no deployment needs to happen - the running Equinox system uses the unpacked project folders as bundles. Running this from within Eclipse gives us hot code replace and on-the-fly changes to template files; only MANIFEST.MF/plugin.xml changes require refreshing the bundle - and even then it's usually faster to just restart the framework than to type in the console.
If you use Eclipse, Eclipse Libra may be useful for you. Libra can start Felix, Equinox, and Knopflerfish inside Eclipse like any other server with WST. They have some YouTube videos showing how to use it.
I also wrote some tools that can help:
An OSGi bundle that picks up OSGi services matching the filter (osgitest=junit4). With that, you do not write JUnit classes; instead you can provide pre-configured objects (e.g. with OSGi Blueprint). JUnit then runs based on the annotations provided in the interface your service implements.
A Maven plugin that has the following useful goals:
Start an OSGi container and deploy the bundle Maven project with all of its dependencies (which are OSGi bundles, of course). The OSGi container is started with the help of Pax Exam, but the JUnit tests are started with the help of the OSGi bundle I wrote (which runs the OSGi services you provide).
Create a folder that contains shortcuts to all dependencies of the project (located in the Maven repo or in the target directory of the project).
If the projects are deployed onto the server (Eclipse Libra), I only have to type update X, where X is the id of the bundle, and everything is refreshed rapidly. You do not have to re-compile the projects that are published to the server if you run Equinox in Libra, as it points to the target classes folder, which is refreshed as soon as you save your class or pom.xml.
If you do not publish your project onto the server but instead add it as a bundle in the container pointing to the shortcut folder, you can also run the update command in the OSGi console after running mvn install (without restarting the server).
A step-by-step guide is available at http://cookbook.everit.org/
With the method above it is possible to write tests TDD-style and run them as part of the Maven build on the CI server.
I hope you will find these tools as useful as I do!
It depends on the platform under Karaf: Felix or Equinox.
Equinox
Eclipse has excellent (or almost excellent) support for launching Equinox with bundles of your choice. The two things you need to prepare are:
Bundles, being developed, available in the workspace as Plug-in projects
Target platform, containing the remaining bundles of the application
Such a setup will allow you to easily make changes to your bundles, even at runtime, and to easily restart the runtime when required. I see Karaf as more suitable when you are developing on a remote system, where the bundles are deployed via SSH or FTP, or when you are using external build tools like Maven, which can automatically copy the bundle into the runtime after it is built.
If you are using Equinox, this gives you some extra edge, as the runtime will execute the code directly from the workspace.
Felix
Felix doesn't seem to have such support for launching from Eclipse (although there is work toward this, tracked in this Jira issue). You can also launch it as a normal Java application, but this is far from convenient. In this case, using Maven will be a much better alternative. You can still set up Eclipse to take full advantage of the other PDE features; only the launching will be done externally.
Summary
In summary, you can always automate everything through Maven, and Karaf will greatly help you in this regard. Eclipse will give you a little edge if you are using Equinox. You should be able to have hot code replace regardless of the method you use, because hot code replace doesn't consider OSGi at all (except in the one case where you reload your bundle and a fresh class loader is created).
I am trying to create an integration test which requires a running PostgreSQL server. Is it possible to start the server during the Maven build and stop it when the tests are completed (in a separate process, I think)? Assume the PostgreSQL server is not installed on the machine.
You are trying to push Maven far beyond the intended envelope, so you'll be in for a fair amount of hurt before it works.
Luckily, PostgreSQL can be downloaded as a ZIP archive.
As already mentioned above, Maven can use Ant tasks to extend its reach. Ant has a large set of tasks to unzip files and run commands. The sequence would be as follows:
unzip postgresql-xxx.zip into a well-known directory --> INSTALL_DIR
create a data directory --> DATA_DIR
INSTALL_DIR/bin/initdb -D DATA_DIR
INSTALL_DIR/bin/postgres -D DATA_DIR
INSTALL_DIR/bin/createdb -E UNICODE test
This should give you a running server with a test database.
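A rough maven-antrun-plugin sketch of those steps (untested; the archive name and directories are placeholders, and a real build would also need to wait for the server to accept connections, create the test database, and stop the server in post-integration-test):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>start-postgres</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <!-- INSTALL_DIR and DATA_DIR from the steps above -->
          <unzip src="postgresql-xxx.zip" dest="${project.build.directory}/pgsql"/>
          <exec executable="${project.build.directory}/pgsql/bin/initdb">
            <arg line="-D ${project.build.directory}/pgdata"/>
          </exec>
          <!-- spawn keeps the server running while the tests execute -->
          <exec executable="${project.build.directory}/pgsql/bin/postgres" spawn="true">
            <arg line="-D ${project.build.directory}/pgdata"/>
          </exec>
        </target>
      </configuration>
    </execution>
  </executions>
</plugin>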
Further issues: creating a user, and security (you likely want to connect via TCP/IP, but this is disabled by default if I recall correctly, which requires editing a config file before starting the database)
...
Good Luck.
I started writing a plugin for this purpose:
https://github.com/adrianboimvaser/postgresql-maven-plugin
It's in a very early stage and lacks documentation, but mostly works.
I already released version 0.1 to Maven Central.
I'm also releasing PostgreSQL binary distributions for all platforms as maven artifacts.
You can find the usage pattern in the plugin's integration tests.
Cheers!
Not to my knowledge. However, you could run a remote command that starts the server.
I think the usual scenario is to have a running integration test db, and not to shut it down/ restart it between builds.
But if you really want to you could set up your continuous integration server to start/ stop the db.
You sound like you are trying to build a full continuous integration environment. You should probably look into using a full CI tool such as Cruise Control or Bamboo.
How I've done it before is to set up a dedicated CI DB that is accessible from the CI server, and then have a series of bash/Python/whatever scripts run as an 'After Successful Build' step, which can then run whatever extra integration tasks you like. Pair that with something like Liquibase and you can wipe out the CI DB and make sure it is at the latest schema on every build.
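If you go the Liquibase route, its Maven plugin can do the wipe-and-migrate inside the build; a sketch with placeholder URL, credentials, and changelog path:

<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.5.3</version>
  <configuration>
    <changeLogFile>src/main/resources/db/changelog.xml</changeLogFile>
    <url>jdbc:postgresql://ci-db.example.com:5432/ci</url>
    <username>ci</username>
    <password>secret</password>
  </configuration>
  <dependencies>
    <!-- JDBC driver the plugin uses to reach the CI database -->
    <dependency>
      <groupId>org.postgresql</groupId>
      <artifactId>postgresql</artifactId>
      <version>9.4.1212</version>
    </dependency>
  </dependencies>
</plugin>

Then mvn liquibase:dropAll liquibase:update wipes the CI DB and rebuilds it to the latest schema.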
Just to bring some fresh perspective to this matter:
You could also start the PostgreSQL database as a Docker instance.
The plugin ecosystem for Docker still seems to be in flux, so you might need to decide for yourself which one fits. Here are a few links to speed up your search:
https://github.com/fabric8io/docker-maven-plugin
http://heidloff.net/article/23.09.2015102508NHEBVR.htm
https://dzone.com/articles/build-images-and-run-docker-containers-in-maven
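For instance, with the fabric8io plugin from the first link, something along these lines (a sketch based on its image/run configuration; the version and password are placeholders) starts a postgres container before the integration tests and stops it afterwards:

<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>docker-maven-plugin</artifactId>
  <version>0.21.0</version>
  <configuration>
    <images>
      <image>
        <name>postgres:9.6</name>
        <run>
          <env>
            <POSTGRES_PASSWORD>test</POSTGRES_PASSWORD>
          </env>
          <ports>
            <!-- host:container -->
            <port>5432:5432</port>
          </ports>
        </run>
      </image>
    </images>
  </configuration>
  <executions>
    <execution>
      <id>start-db</id>
      <phase>pre-integration-test</phase>
      <goals>
        <goal>start</goal>
      </goals>
    </execution>
    <execution>
      <id>stop-db</id>
      <phase>post-integration-test</phase>
      <goals>
        <goal>stop</goal>
      </goals>
    </execution>
  </executions>
</plugin>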
Background. My org uses Maven, Bamboo and Artifactory to support a continuous integration process. We rely on Maven's SNAPSHOT qualifier to help manage storage in Artifactory (rotate out old SNAPSHOT builds) and also to help keep cross-team integrations current (Maven checks for updates to SNAPSHOT dependencies automatically on each build).
Problem. One of the challenges we're having is around correctly promoting builds from environment to environment while continuing to use SNAPSHOT. Say that a tester deploys version 1.8.2-SNAPSHOT to a functional test environment, and it's at rev 1400 in Subversion. Let's say also that it passes functional test. By the time a tester decides to pull 1.8.2-SNAPSHOT from Artifactory into the performance testing environment, a developer could have committed a change to Subversion, so the actual binary in Artifactory is at a different rev. How do we ensure that the rev doesn't change out from under us when using SNAPSHOT builds?
Constraints. We obviously don't want to deploy different builds unknowingly. We also don't want to rebuild from source as we want to test the exact binary in performance test that we tested in functional test.
Approaches we've considered. The thought is that we want to stamp the versions with a fourth component, like 1.8.2.1400, where the fourth component is a Subversion rev. (As a side question, is there a Maven plugin or something else that does that automatically?) But if we do that, then essentially we lose the SNAPSHOT feature since Maven and Artifactory think that these are different versions.
We are using Scrum, so we deploy to the test environments very early (like day two or so). I don't think it makes sense to remove the SNAPSHOT qualifier that early in the dev cycle because we lose the SNAPSHOT benefits again.
Would appreciate knowing how other orgs solve this issue.
Just to circle back on this one, I wanted to share what we are doing.
Basically we deploy snapshot builds like 1.8.2-SNAPSHOT into the development environment. No other teams need to use these builds, so it is fine to leave -SNAPSHOT on them.
But any build that we deploy to a test environment (e.g. functional test, system test) or else production must include the revision; e.g., 1.8.2.1400. We call these "quads". The reason for insisting upon quads in test is that we can attach issues (features, bugfixes, etc.) to specific revisions so the testers know what to test. For production it's really just because we want to deploy exactly the same artifact that we tested, so that means we're deploying a quad.
Anyway hope that information is useful to somebody.
If you enable uniqueVersion for your snapshot builds, every snapshot deployed will have a unique id. You can use that to ensure you are deploying the correctly promoted build across environments.
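For reference, uniqueVersion is a flag on the snapshot repository in distributionManagement (a Maven 2 setting; Maven 3 always deploys timestamped snapshots). A sketch with a placeholder repository URL:

<distributionManagement>
  <snapshotRepository>
    <id>snapshots</id>
    <url>http://artifactory.example.com/artifactory/libs-snapshot-local</url>
    <!-- Maven 2 only; in Maven 3 snapshots are always timestamped -->
    <uniqueVersion>true</uniqueVersion>
  </snapshotRepository>
</distributionManagement>

Each deployment then gets a timestamped version like 1.8.2-20110912.143212-5, which pins the exact binary you promote.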
And, as a side note, you can use the buildnumber-maven-plugin to add Subversion build numbers to artifacts.
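A minimal sketch of that plugin; it reads the revision from the <scm> section of the POM and exposes it as ${buildNumber}, which you can then stamp into the final name or the manifest:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>buildnumber-maven-plugin</artifactId>
  <version>1.4</version>
  <executions>
    <execution>
      <phase>validate</phase>
      <goals>
        <!-- "create" resolves the SCM revision into ${buildNumber} -->
        <goal>create</goal>
      </goals>
    </execution>
  </executions>
</plugin>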
Rather than embed the build number or VCS revision in the artifact's version, we embed the CI build number in the META-INF/MANIFEST.MF file.
See for instance Using Hudson environment variables to identify your builds. Although the article is about Jenkins/Hudson, I believe it is trivial to port to Bamboo.
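A sketch of the same idea with the maven-jar-plugin; the environment variable name is an assumption and depends on the CI server:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <configuration>
    <archive>
      <manifestEntries>
        <!-- BUILD_NUMBER is the Jenkins/Hudson variable; Bamboo exposes its own equivalent -->
        <Implementation-Build>${env.BUILD_NUMBER}</Implementation-Build>
      </manifestEntries>
    </archive>
  </configuration>
</plugin>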