Applying security updates to dependencies installed with Maven - java

We have a web application that is implemented in Java. It uses Maven to install various dependencies such as JavaMail, Gson, and so on.
Unfortunately, deploying and maintaining the project is a nuisance. We need to be aware that any of those dependencies might issue a security update, which means checking regularly for new versions. To make matters worse, we can't see any way that Maven can distinguish security fixes from other new releases. This means that we end up doing needless updates, which is a waste of time and could break something.
The server itself runs Ubuntu, and the situation there is far better. Apt installs urgent updates, but everything else waits until the next Ubuntu release. That's ideal because it gives us a stable but secure platform that we can build on.
Is there any way of making Maven more like Apt, so we can install security fixes but nothing else? If not, I'd be interested to know what strategies other people use for updating deployed web applications.
(We know about the maven-dependency-plugin. This plugin helps, because it can automatically find and update any dependencies which have newer versions. Unfortunately it can't distinguish security updates from normal feature releases, so we end up updating when we don't strictly need to.)
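(For concreteness, the kind of plugin setup involved looks roughly like the sketch below. It uses the org.codehaus.mojo versions-maven-plugin, which is the plugin that actually provides an update-report goal such as display-dependency-updates; the version number shown is purely illustrative.)

<build>
  <plugins>
    <!-- Reports which declared dependencies have newer releases available -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>versions-maven-plugin</artifactId>
      <version>2.16.2</version> <!-- illustrative version -->
    </plugin>
  </plugins>
</build>

Running mvn versions:display-dependency-updates then lists every dependency with a newer version available, but it says nothing about whether a given release is a security fix.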

I did come up with a partial solution to this problem, but it doesn't use Maven directly. I implemented a script which scans the NVD database for new security exposures relating to products we are using. Every morning I get any new ones emailed to me, and I can decide whether they justify an update to our web application.
The downside of this approach is that smaller projects don't always issue CVE numbers for their vulnerabilities. We have to restrict ourselves to products that have a significant following, are backed by a large vendor, or have previously demonstrated a willingness to take part in the CVE process.
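If you want roughly the same NVD check to run inside the Maven build itself (rather than via an external script like mine), the OWASP dependency-check-maven plugin scans the dependencies declared in the POM against the NVD. A minimal sketch, with an illustrative plugin version and CVSS threshold:

<build>
  <plugins>
    <plugin>
      <groupId>org.owasp</groupId>
      <artifactId>dependency-check-maven</artifactId>
      <version>8.4.3</version> <!-- illustrative version -->
      <configuration>
        <!-- Fail the build if a dependency has a known CVE at or above this CVSS score -->
        <failBuildOnCVSS>7</failBuildOnCVSS>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>check</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

It shares the same limitation, of course: it only knows about vulnerabilities that have been assigned CVE identifiers.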

Related

JHipster application upgrade

I want to upgrade an app that was built with JHipster 4.14.4 to 7.9.2, the latest version right now.
Do you have any ideas how I should proceed (the steps to follow, or the best way to do it)?
I have already read this: https://www.jhipster.tech/upgrading-an-application/ but I am still confused.
I already tried the automatic upgrade, which ended with a lot of conflicts to resolve. I don't know whether I should continue in that direction or change my strategy, because some other developers told me to create another project with 7.9.2, copy the old code over step by step, and then resolve the problems.
Thanks.
Before asking how to upgrade, the first question to ask is why?
What are the benefits you expect from upgrading JHipster versus upgrading your app manually to make it run under latest versions of Java and Spring Boot and other libs?
Usually when you want to upgrade an application, it's to get support for patching security vulnerabilities or to be able to use more modern libs before starting a major functional evolution.
You can get all these benefits by updating your code manually at a fraction of the cost, especially when your app is several major versions behind JHipster.
Unless your project is trivial, its value resides in your custom code, not in the code generated by JHipster.
So it usually does not make sense to keep a dependency on the code generator; it just adds technical debt when you are unlikely to use it anymore.
In my projects, I usually cut off all dependencies on JHipster after a few months; it makes it so much easier to upgrade libs and frameworks.
As an analogy, you would not keep scaffolding after your house is built.
That said, if you still want to do it, there are 2 alternatives:
Using jhipster upgrade
In your case the effort is multiplied by 3 because your app is 3 major versions behind current JHipster.
Each major version introduces its share of breaking changes, including dropping some components, so in some rare cases you simply cannot upgrade, depending on the options you chose for the initial project generation.
So the first step is to review the breaking changes introduced by each major release; see https://www.jhipster.tech/releases/
jhipster upgrade will likely fail if you want to go directly from 4 to 7, so you should probably go from 4 to 5, 5 to 6, and 6 to 7. It will take a long time and effort, and there's a risk that you will fail.
Starting from fresh project and importing custom code
Generating a new app in 7 and then importing your custom business code is probably a better idea.
If you have kept the JDL file used for initial generation then it's easy.
If you don't have this JDL file, or the app was built by answering questions, you can use jhipster export-jdl.
Now that you have the JDL, you can create an empty directory and generate your app with the latest version using jhipster import-jdl.
Then you should start importing your custom code into the new app.
Whichever way you choose, you must be knowledgeable about the technology stack used by JHipster. So to be clear: giving this task to a junior dev does not make any sense.

How do I upgrade to jlink (JDK 9+) from Java Web Start (JDK 8) for an auto-updating application?

Java 8 and prior versions have Java Web Start, which auto-updates the application when we change it. Oracle has recommended that users migrate to jlink, as that is the new Oracle technology. So far, this sounds good. This comes with a host of benefits:
Native code on Windows, Mac and Linux
Modularization of the code (although Proguard does this as well)
The use of new, supported technology.
The problem: I can't find the canonical Java solution to auto-update with jlink.
One would think that Java Web Start could continue to be used, especially if one casually reads this document. Notice the fact that Java Web Start continues to be prominently listed. But there's a fly in the ointment: Oracle is deprecating Java Web Start. It's slated for removal in JDK 11. So, what's the official path forward? Failing that, is there a standard way that people proceed?
For the purposes of this question the following are out of scope:
Paying huge amounts of money yearly to someone with a feature-packed enterprise solution. The application to be distributed is already packaged into a single jar that is smaller than 50MB.
Forcing users to run an InstallShield style app to reinstall the new version, and then manually uninstall the old version every time an update is pushed. That's sooo 1990's.
Porting the entire app to be a webapp, rewriting the UI and client side logic to fit in a browser and dealing with all the incompatibilities that entails. The authors of the application worked on GWT and know exactly what web browsers are capable of. Unfortunately, they also know the level of effort required.
Allowing users to continue to run old versions of the application. That, too, is sooo 1980's. Modern apps update quickly, and supporting every version of the application ever released is not tenable. That's what my father's COBOL application had to deal with, and he didn't enjoy it. I'm hoping technology has progressed.
Continuing to use Java Web Start. Until/unless Oracle changes its mind, Java Web Start is a doomed technology.
In May 2019 I commented to watch the OpenWebStart project.
Now (October 2019) it is time to give OpenWebStart serious consideration. While not yet feature complete, a beta release of OpenWebStart is now available for download under a "GPL with Classpath exception" license.
The OpenWebStart Technical Details page states:
OpenWebStart is based on Iced-Tea-Web and the JNLP-specification defined in JSR-56. It will implement the most commonly used features of Java Web Start and it will be able to handle any typical JWS-based application. We plan to support all future versions of Java, starting with Java 11. In addition to Java 11, the first release of OpenWebStart will also support Java 8.
The page goes on to state that OpenWebStart will support interactive installers with auto-update, and non-interactive installers. Some JNLP features will be supported, and it will include a replacement for the Java Control Panel. A more comprehensive list of planned features1 and their implementation status is provided in the feature table.
1 - If you have a requirement that is not on their feature list (e.g. jlink support), you could contact the OpenWebStart team, and offer a suitable incentive (e.g. money to pay developers) to implement the feature for you. They also offer commercial versions of the software for paying customers.
Disclaimer: I have no connection with the OpenWebStart project, the company (Karakun) or the project sponsors. This is not a recommendation.
I had a similar problem in a past project. We needed to migrate from Webstart to another technology.
The first approach was to install IcedTea-Web. It is bundled directly with the AdoptOpenJDK project.
But as far as I understood the problem, Java wasn't meant to be installed on the client side like this anymore, and we didn't want problems with all of our customers.
Our solution was then to build our own specific executable, which connects to the server, asks for environment settings from the server side, and then downloads and extracts the jlink-built Java runtime. So we could keep using the old technologies and just wrap them in an executable.
The last thing we did was redirect calls to the JNLP URL to the download location of the executable.
Do you use Maven?
I resolved my similar problem with Maven (I needed to update an EAR).
My main app (the EAR package) has a pom.xml that lists the dependencies and repositories.
The dependencies have the <version> tag with a range (documentation) as in this example
<version>[1.0.0,)</version>
That means: get version 1.0.0 or newer of the dependency. (You can also put an upper bound on the version, [1.0.0,2.0.0), so that if you develop a new major version it is not used in the old app.)
In the repositories section I added my personal repository.
Now, on the remote machine I only need to rebuild my EAR package with Maven: the build downloads the newer versions of my JARs and puts them together.
You need a system that checks whether there are newer dependency versions, warns the user to update the app, and also locks their work (you can't work if you don't update). Maybe you need a little app to make the rebuild process easy for users. It's 1990s-style, but a lot of desktop apps work this way.
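Put together, the relevant pieces of the pom.xml look roughly like this sketch (the group/artifact IDs and the repository URL are made up for illustration):

<repositories>
  <repository>
    <id>my-company-repo</id>
    <url>https://repo.example.com/maven2</url> <!-- illustrative repository URL -->
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>my-shared-lib</artifactId>
    <!-- any release from 1.0.0 (inclusive) up to, but excluding, 2.0.0 -->
    <version>[1.0.0,2.0.0)</version>
  </dependency>
</dependencies>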
PROS
This scheme can be used in a lot of different projects.
CONS
You need to build the app on the remote machine, so the client must have a JDK and access to your repository (like Artifactory).
You must write the code in separate JARs and add them as dependencies in the main archive.
You must change the JAR version each time and publish it to the repository (this could be a good practice anyway).

Creating new class versions

I'm currently working on a project. I've been running into a few issues with the e-mail functionality; I've made multiple attempts to fix the issue, and the latest seems to have worked.
For each rewrite I've created a new class to hold the new code. This doesn't seem to be the best solution, as each time I have to go through the code, track down the references to the class, and update them. With each rewrite only the code for the actual sending of the e-mail has changed; function inputs and names have stayed consistent.
I've looked up versioning, but this hasn't been particularly helpful in providing a solution either, granted most likely due to my own lack of knowledge on the subject. So here is what I'm looking for: one instance of the class with multiple versions, preferably without all the old code in it, to aid readability. But I want access to the old versions, so that if a function/feature was there previously and wasn't built into the current version, I can see how it was implemented.
Versioning is exactly what you need here.
Have a look here, which gives you a brief introduction to Subversion, one of the most popular versioning systems. You can either set up / use your own private Subversion server, or, if your project is open source, use one of a number of free providers (such as Google Code) who will provide versioning for you.
Other versioning systems exist besides Subversion, such as Git, Mercurial, etc., but Subversion is arguably the most popular and a good starting point.
Are you using any IDE? Eclipse and NetBeans store the history of your file updates, and you can always compare with or restore from that history.
Note: this is not a replacement for version control in any way, and I would highly recommend that you explore open-source version control solutions. This will help you in the long run.
Mercurial is the way to go. Seamless merging and integration with Java and popular IDEs like NetBeans. You can't go wrong. Even at the very beginning of my programming experience, I learned how to use Mercurial in a day.
Use a version management tool such as SVN or CVS.

Is it considered a good/bad practice to configure tomcat for deploying certain apps?

Disclaimer: I've never used the technique described below. That's why there may be some mistakes or misunderstandings in its description.
I heard that some teams (developers) use a 'pre-configured' Tomcat. As I understand it, they add various JARs to Tomcat's lib folder and do other things as well.
Once I read a thread in a Java forum where one developer wrote something about recompiling (or reassembling?) Tomcat for certain needs.
Just yesterday I overheard a conversation in which one developer said that his team-mates were not able to deploy the project until he gave them his configured Tomcat version.
So, I wonder, what is it all about and why do they do it? What benefits can they gain from that?
Open source projects have always been a space for customization (I believe that's part of their charm), and I think it's acceptable to modify Tomcat for very specific in-house requirements.
But in general I would recommend avoiding a solution that requires heavy modification of open source tools; there is probably another way to do what you want using what already exists ; ) (This does not apply to generally accepted changes, i.e. community add-ons, bug fixes, and all the stuff you publish in the project spaces that gets accepted and made part of the final solution.)
As for external libs, I would mention them in the project README as platform requirements. So having a pre-configured server is not that crazy; in fact it can save you some time, but it's a bonus. You should mention your dependencies somewhere anyway : )
Hope it helps.
Using a customized version of Tomcat could make upgrading very difficult. The benefit of having an application that does not require a specially configured server is that you can easily move to a new version or even move to an entirely different app server (e.g. Jetty, GlassFish)
I'd also point out that you do not specify the context of the changes. The special configuration may not have been application specific, but required for security settings, compatibility with the web server being used, etc. You should talk to the developers in question and learn more about why they require the specialized configuration.
This is the mechanism needed to provide e.g. JDBC pools and other objects over JNDI, since that requires the relevant classes to be on the Tomcat classloader. That is a necessity.
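For illustration, the kind of container-managed resource this enables looks roughly like the following context.xml entry (the datasource name, credentials, and the PostgreSQL driver are made-up examples; the point is that the driver JAR has to live in Tomcat's lib directory):

<Context>
  <!-- Container-managed JDBC pool exposed to webapps via JNDI;
       the JDBC driver must be on Tomcat's common classloader (i.e. in lib/) -->
  <Resource name="jdbc/AppDB"
            auth="Container"
            type="javax.sql.DataSource"
            driverClassName="org.postgresql.Driver"
            url="jdbc:postgresql://localhost:5432/appdb"
            username="app"
            password="secret"/>
</Context>

A webapp then looks up java:comp/env/jdbc/AppDB via JNDI without bundling the driver or the pool configuration itself.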
It may also be used to allow multiple deployments to share the same single jar file instead of having it in each WAR file. That is in my opinion generally a bad idea which should be avoided unless absolutely necessary. Keep to standard mechanisms if at all possible.

What's with all the Java Build tools?

What's the point of using Ant, Maven, and Buildr? Won't using the build in Eclipse or NetBeans work fine? I'm just curious what the purpose and benefit of external build tools are.
Dependency Management: The build tools follow a component model that provides hints on where to look for dependencies. In Eclipse / NetBeans, you have to depend on a JAR, and you don't really know whether that JAR has been updated or not. With these build tools, they 'know' about updates to dependencies (generally because of good integration with your artifact repository), recalculate transitive dependencies, and ensure that everything is always built with the latest versions.
Access Control: Java, apart from class-level access control, has no higher abstraction. With these build tools you can specify exactly which projects may depend on yours and control visibility and access at a higher level of granularity.
Custom Control: The Eclipse / Netbeans build always builds JAR files. With custom build mechanisms, you could build your own custom (company-internal) archive with extra metadata information, if you so wish.
Plugins: There are a variety of plugins that come with build tools which can do various things during build. From something basic like generating Javadocs to something more non-trivial like running tests and getting code coverage, static analysis, generation of reports, etc.
Transport: Some build systems also manage transport of archives - from a development system to a deployment or production system. So, you can configure transport routes, schedules and such.
Take a look at some continuous integration servers like CruiseControl or Hudson. Also, the features page of Maven provides some insight into what you want to know.
On top of all the other answers: the primary reason I keep my projects buildable without being forced to use NetBeans or Eclipse is that it makes it so much easier to set up automated (and continuous) builds.
It would be rather complicated (in comparison) to set up a server that somehow starts Eclipse, updates the source from the repository, builds it all, sends a mail with the result, and copies the output to somewhere on a disk where the last 50 builds are stored.
If you are a single developer or a very small group, it can seem that a build system is just an overhead. As the number of developers increases though it quickly becomes difficult to track all changes and ensure developers are keeping in sync. A build system reduces the rate of increase of those overheads as your team grows. Consider the issues of building all the code in Eclipse once you have 100+ developers working on the project.
One compelling reason to have a separate build system is to ensure that what has been delivered to your customers is compiled from a specific version of the code checked into your SCM. This eliminates a whole class of "works on my box" issues and in my opinion this benefit is worth the effort on its own in reduced support time. Isolated builds (say on a CI server) also highlight issues in development, e.g. where partial or breaking changes have been committed, so you have a chance to catch issues early.
A build in an IDE builds whatever happens to be on the box, whereas a standalone build system will produce a reproducible build directly from the SCM. Of course this could be done within an IDE, but AFAIK only by invoking something like Ant or Maven to handle all the build steps.
Then of course there are also the direct benefits of build systems. A modular build system reduces copy-paste issues and handles dependency resolution and other build-related issues. This should allow developers to focus on delivering code. Of course every new tool introduces its own issues, and the learning curve involved can make it seem that a build system is a needless overhead (just Google "I hate Maven" to get some idea).
The problem with building from the IDE is that there are tons of settings affecting the build. When you use a build tool, all the settings are condensed, in a more or less readable form, into a small set of scripts or configuration files. In the ideal case this allows anybody to execute a build with hardly any manual setup.
Without the build tool it might become next to impossible to even compile your code in, let's say, a year, because you'd have to reverse engineer all the settings.
Different features. For example, Maven can scan your dependencies and go download them, and their dependencies, so you don't have to. Even for a medium-sized project there may be a very large number of dependencies. I don't think Eclipse can do that.
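As a rough illustration (the artifact and version here are just an example), declaring a single dependency in the pom.xml is enough for Maven to download it and everything it depends on from the central repository:

<dependencies>
  <dependency>
    <!-- Maven fetches this JAR plus its transitive dependencies automatically -->
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>32.1.3-jre</version> <!-- illustrative version -->
  </dependency>
</dependencies>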
#anonymous,
Why do you assume that I, a member of your team, am using an IDE all the time? I might want to build the code on a headless build server, is that OK?
Would you also deny me the right to use a continuous integration engine?
May I fetch dependencies from a central repository please? How can I do that?
Would you tie me to a specific IDE? I can't run Eclipse easily on my very old laptop, but sure, I'll buy a new one.
Maybe I should also uninstall Subversion and use patches or just zip folders on an SFTP/FTP/Samba share.
The build tools allow you to do a build automatically, without human intervention, which is essential if you have a code base from which many applications can be built (like we do).
We want to be certain that each and every one of our applications can build correctly after any code base change. The best way to check that is to let a computer do it automatically using a continuous integration tool. We just check in code, and the CI server picks up that there has been a change and rebuilds all modules influenced by that change. If anything breaks, the responsible person is mailed directly.
It is extremely handy being able to automate things.
To expand on Jens Schauder's answer, a lot of those build options end up in some sort of .project file. One of the evils of Eclipse is that it stores absolute path names in all of its project files, so you can't copy a project file from one machine to another, which might have its workspace in a different directory.
The strongest reason for me, is automated builds.
IDEs just work on a higher abstraction layer.
NetBeans natively uses Ant as its underlying build tool, and recent versions can directly open Maven projects. Hence, your typical NetBeans project can be compiled with Ant, and your Maven project already is a NetBeans project.
As with every GUI vs CLI discussion, IDEs seem easier for beginners but once you get the idea it becomes cumbersome to do complex things.
Changing the configuration with an IDE means clicking somewhere, which is easy for basic things, but for complex stuff you need to find the right place to click. Furthermore, IDEs tend to hide the important information. Clicking a button to add a library is easy, but you may still not know where the library is, etc.
In contrast, using a CLI isn't easy to start with but quickly becomes so. It allows you to do complex things more easily.
Using Ant or Maven means that everyone can choose his or her own IDE to work on the code. Telling someone to install IDE X to compile it is much more overhead than telling them to "run <build command> in your shell". And of course you can't explain the former to an external tool.
To sum up, the IDE uses a build tool itself. In the case of NetBeans, Ant (or Maven) is used, so you get all the advantages and disadvantages of those. Eclipse uses its own thing (as far as I know) but can also integrate Ant scripts.
As for the build tools themselves Maven is significantly different from Ant. It can download specified dependencies up to the point of downloading a web server to run your project.
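As a sketch of that last point, a plugin declaration along these lines (the Jetty plugin coordinates are real, the version is illustrative) is enough for mvn jetty:run to download an embedded Jetty server and start your webapp in it:

<build>
  <plugins>
    <!-- Lets Maven download and launch an embedded Jetty to run the project -->
    <plugin>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>jetty-maven-plugin</artifactId>
      <version>9.4.51.v20230217</version> <!-- illustrative version -->
    </plugin>
  </plugins>
</build>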
In all projects, developers will often manually invoke the build process, but this is not suitable for large projects, where it is very difficult to keep track of what needs to be built, in what sequence, and what dependencies there are in the build process. Hence we use build tools for our projects.
Build tools handle a variety of the tasks that developers would otherwise do by hand in their daily work.
They are:
1. Downloading dependencies.
2. Compiling source code into binary code.
3. Packaging that binary code.
4. Running tests.
5. Deployment to production systems.
