Apache Common Utils package got changed to 4 - java

We have used CollectionUtils.isEmpty in many places throughout the application, in around 1000 files.
Now the package name has changed in the recent version, from org.apache.commons.collections to org.apache.commons.collections4.
Since the changes during the upgrade are drastic, should we have our own utility class that wraps the above API? If we had our own utility, the package name change would be in one single place.
What would be the best practice?
How do enterprise applications handle this?

We are facing a lot of merge issues and conflicts due to several people working on and merging different projects.
There's really only one way to do this:
Go back to the prior version of Commons Utils for now.
Plan this upgrade as a separate, major project that can be done when your code base is in a stable state. Make a new feature branch, off of a release, and do nothing else on that branch EXCEPT the upgrade.
Make sure you thoroughly regression-test all the modules that use Commons Utils and are affected by the upgrade.
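A minimal sketch of the wrapper idea from the question, assuming you only need a handful of methods such as isEmpty. The class name is illustrative, and the delegation target is inlined here to keep the sketch self-contained; a real facade would simply forward to org.apache.commons.collections4.CollectionUtils:

```java
import java.util.Collection;

// In-house facade: the ~1000 call sites depend on this class instead of
// the third-party package, so a future package rename is a one-file change.
final class CollectionHelper {
    private CollectionHelper() {}

    // Mirrors CollectionUtils.isEmpty: true for null or empty collections.
    static boolean isEmpty(Collection<?> c) {
        return c == null || c.isEmpty();
    }
}
```

Call sites then use CollectionHelper.isEmpty(list); when the next breaking upgrade of the library arrives, only the facade changes.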

Related

Maven versioning for multiprofile project

I need to support two builds with some differences in library versions, so I made two build profiles, and this works fine, but now I have a versioning problem while preparing a release.
I use the major.minor.revision-qualifier versioning schema, where:
major - a breaking change
minor - backward-compatible changes (new features)
revision - a bug fix
qualifier - the only qualifier I use now is SNAPSHOT, to mark unreleased versions.
But since I now have two builds, I need to add qualifiers to the release versions, e.g. 1.8.0-v1 and 1.8.0-v2, but then I won't be able to have two SNAPSHOT versions. Or I need to break the "rules" about major/minor version usage and make two "branches", e.g. release 1.8.0 and 1.9.0, and then increase only the last number, whether I'm fixing a bug or adding new features.
I have a feeling I am doing something that is an antipattern; could anyone please give me some advice?
P.S. I already have a heavily reworked 2.x version, so I can't use separate "branches" like 2.x and 1.x, unless I change this new version to 3.0.
Update:
I guess I can't make this story short, so here we go.
In my project I used to have the ojdbc6 and aqapi jars (Oracle libraries); my app was running on Java 7 and Apache ServiceMix 5 against an Oracle 11 database. But then some clients updated to Oracle 12 and I needed new libraries for that, which only work on Java 8, while the ActiveMQ I am using as part of ServiceMix 5 doesn't work on Java 8. So I updated to ServiceMix 7, and after some changes it works fine. The rest of the differences between the build profiles are the versions of ServiceMix-provided libraries (a complete list is redundant here, I guess).
In the end, despite the fact that the new JDBC driver is fully compatible with the old database (I'm not completely sure about aqapi and the client side of ActiveMQ, but they should also be compatible), I can't force every client to update and reinstall Java/ServiceMix at the same time, but I still want to be able to fix and add stuff for all of them.
So I need to support two builds for different versions of ServiceMix, at least for now (it's a temporary solution, but as the proverb says, there is nothing more permanent than the temporary, so I want to do it in the most correct way possible).
P.S.
I decided to use profiles instead of a separate branch in the VCS because it looked like a much easier solution, but that doesn't matter in terms of the versioning problem.
So, as #Software Engineer said, after thinking about the reasons and writing the post update, I realised it's not a multiprofile problem, it's purely a versioning problem; it would be absolutely the same if I made a branch in the VCS.
So in the end I decided to make 1.x.x and 2.x.x versions, despite the fact that the changes are not that "breaking"; they are not fully backward-compatible (even though the new version can work with the old database, it still needs the new ServiceMix).
This multiprofile workaround doesn't look pretty, but I left it there: it allows me to build both versions in one go (I run the mvn versions:set -DnewVersion command after the first build), and this way I don't need to maintain two branches, so it saves some time.
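The "build both versions in one go" workflow described above can be sketched as a short command sequence; the profile ids and version numbers here are illustrative, not from the post:

```shell
# Build and deploy the ServiceMix 5 variant, then retag the working
# copy and build the ServiceMix 7 variant from the same sources.
mvn clean deploy -P smx5
mvn versions:set -DnewVersion=2.0.0-SNAPSHOT
mvn clean deploy -P smx7
```

The versions:set goal from the Maven Versions plugin rewrites the version in every module's pom, so the second build produces a distinctly-versioned artifact.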

Is there a way to import a specific version of a java sdk in a project with multiple versions of the same sdk?

Because of reasons, I cannot just update the old version of the AWS SDK I'm working with, but I also need some new things that are in a more recent version.
The problem is that if I put both versions of the SDK in the project, I get a java.lang.NoSuchMethodError, because I think it's trying to use the old version of the SDK. If I delete the old one and just use the updated one, it works fine. Is there a way to keep both versions of the SDK and tell my Java class which one to exclusively import?
There are a couple of ways, but they're pretty nasty.
The arguably "more correct" approach is to use a custom classloader - see this answer for details. However, that's not exactly simple, and it can lead to weird outcomes.
A simpler, but somewhat nastier, approach is to get the source code of both SDKs (if available) and rename the packages. For example, if we have sdk_v1 and sdk_v2, we can rename the packages to com.example.sdk.v1 and com.example.sdk.v2.
Once there's no package name collision, there's no problem using two different SDKs, even in the same class - just use fully qualified class names (see this answer):
com.example.sdk.v1.SomeClass.someFunc() will not collide with com.example.sdk.v2.SomeClass.someFunc()
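If rebuilding the SDK sources is not practical, the same package rename can often be done at the bytecode level with the Maven Shade plugin's relocation feature: wrap the old SDK in its own small Maven module and shade it under a new package prefix. A sketch, with illustrative coordinates and version:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.5.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- every class under com.amazonaws is rewritten to com.example.sdk.v1 -->
            <pattern>com.amazonaws</pattern>
            <shadedPattern>com.example.sdk.v1</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Your application then depends on the shaded artifact for the old calls and on the normal, recent SDK for the new ones; the relocated classes no longer collide.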

How do big companies tackle the package dependency conflict problem?

As shown in the picture, one (Java) app references two third-party jars, packageA and packageB, which reference packageC-0.1 and packageC-0.2 respectively. This would work well if packageC-0.2 were compatible with packageC-0.1. However, sometimes packageA uses something that is no longer supported in packageC-0.2, and Maven will resolve only one version of a jar. This issue is also known as "Jar Hell".
In practice it would be difficult to rewrite packageA or force its developers to update packageC to 0.2.
How do you tackle these problems? They often arise in large-scale companies.
I have to point out that this problem mostly occurs in big companies, because a big company has a lot of departments, and it would be very expensive to make the whole company update a dependency every time certain developers want new features from a new version of some dependency jar. This is not a big deal in small companies.
Any response will be highly appreciated.
Let me throw away a brick in order to get a gem first.
Alibaba is one of the largest e-commerce companies in the world, and we tackle these problems by creating an isolation container named Pandora. Its principle is simple: package those middlewares together and load them with different ClassLoaders, so that they can work well together even when they reference the same packages at different versions. But this needs a runtime environment provided by Pandora, which runs as a Tomcat process. I have to admit that this is a heavyweight plan. Pandora builds on the fact that the JVM identifies a class by its classloader plus its class name.
If you know someone who may know the answer, share the link with them.
We are a large company and we have this problem a lot. We have large dependency trees that span several developer groups. What we do:
We manage versions with BOMs (Maven dependencyManagement lists) of "recommended versions", published by the maintainers of the jars. This way, we make sure that recent versions of the artifacts are used.
We try to reduce the large dependency trees by separating the functionality that is used inside a developer group from the one that they offer to other groups.
But I admit that we are still trying to find better strategies. Let me also mention that using "microservices" is a strategy against this problem, but in many cases it is not a valid strategy for us (mainly because we could not have global transactions on databases any more).
This is a common problem in the java world.
Your best option is to regularly maintain and update the dependencies of both packageA and packageB.
If you have control over those applications - make time to do it. If you don't have control, demand that the vendor or author make regular updates.
If both packageA and packageB are used internally, you can use the following practice: have all internal projects in your company refer to a parent pom.xml that defines "up to date" versions of commonly used third-party libraries.
For example:
<framework.jersey>2.27</framework.jersey>
<framework.spring>4.3.18.RELEASE</framework.spring>
<framework.spring.security>4.2.7.RELEASE</framework.spring.security>
Therefore, if your projects "A" and "B" both use Spring and both use the latest version of your company's "parent" pom, they will both get 4.3.18.RELEASE.
When a new version of spring is released and desirable, you update your company's parent pom, and force all other projects to use that latest version.
This will solve many of these dependency mismatch issues.
Don't worry, it's common in the java world, you're not alone. Just google "jar hell" and you can understand the issue in the broader context.
By the way, mvn dependency:tree is your friend for isolating these dependency problems.
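For example, to see every path through which a conflicting artifact is pulled in (the coordinates below are illustrative):

```shell
# Show only the paths in the dependency tree that lead to packageC.
mvn dependency:tree -Dincludes=com.example:packageC
# -Dverbose also prints the conflicting versions Maven omitted.
mvn dependency:tree -Dverbose
```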
I agree with the answer of #JF Meier. In a Maven multi-module project, a dependencyManagement node is usually defined in the parent POM file to do unified version management: the dependencies declared inside it only pin the versions, and modules that then declare those dependencies directly do not need to specify a version themselves. It looks like this:
In the parent pom:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.devzuz.mvnbook.proficio</groupId>
      <artifactId>proficio-model</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
In your module, you do not need to set the version:
<dependencies>
  <dependency>
    <groupId>com.devzuz.mvnbook.proficio</groupId>
    <artifactId>proficio-model</artifactId>
  </dependency>
</dependencies>
This will avoid the problem of inconsistency.
This question can't be answered in general.
In the past we usually just didn't use different versions of a dependency. If the version was changed, team- or company-wide refactoring was necessary. I doubt mixing versions is even possible with most build tools.
But to answer your question:
Simple answer: Don't use two versions of one dependency within one compilation unit (usually a module)
But if you really have to do this, you could write a wrapper module that references the legacy version of the library.
But my personal opinion is that within one module there should not be a need for these constructs, because "one module" should be small enough to be manageable. Otherwise it might be a strong indicator that the project could use some modularization refactoring. However, I know very well that some projects in "large-scale companies" can be a huge mess where no good option is available. I guess you are talking about a situation where packageA is owned by a different team than packageB... and that is generally a very bad design decision, due to the lack of separation and the inherent dependency problems.
First of all, try to avoid the problem. As mentioned in #Henry's comment, don't use 3rd party libraries for trivial tasks.
However, we all use libraries. And sometimes we end up with the problem you describe, where we need two different versions of the same library. If library 'C' has removed and added some APIs between the two versions, and the removed APIs are needed by 'A', while 'B' needs the new ones, you have an issue.
In my company, we run our Java code inside an OSGi container. Using OSGi, you can modularize your code into "bundles", which are jar files with some special directives in their manifest file. Each bundle jar has its own classloader, so two bundles can use different versions of the same library. In your example, you could split the application code that uses packageA into one bundle, and the code that uses packageB into another. The two bundles can call each other's APIs, and it will all work fine as long as neither bundle uses packageC classes in the signatures of the methods used by the other bundle (known as API leakage).
To get started with OSGi, you can e.g. take a look at OSGi enRoute.
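In OSGi, the version pinning lives in each bundle's MANIFEST.MF. A sketch of what the bundle wrapping packageA's code might declare (bundle and package names are illustrative):

```
Bundle-SymbolicName: com.example.bundle.a
Bundle-Version: 1.0.0
Import-Package: com.thirdparty.packagec;version="[0.1,0.2)"
```

The bundle holding packageB's code would instead declare version="[0.2,0.3)", and the OSGi framework wires each bundle to the matching packageC provider.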

How to avoid heavily changing Maven projects when new versions come out

Specific Background
I have just switched from Spring Data Neo4j 4.1.3 to 5.0.0, and this issue has arisen since I changed my pom file.
Maven install fails because "cannot find symbol ... class GraphRepository".
I am newer to Java Maven projects as a whole.
Broad Question:
If I update the Maven dependencies on a given project from one version of something to another, and a class that I have been using heavily now produces around 100 errors saying that the whole class is missing... how do I keep this from happening?
Specifically, where I think I'm at:
I am going to have to remove every reference to GraphRepository and change it to Neo4jRepository, since "Also note that GraphRepository is deprecated and replaced by Neo4jRepository" - Neo4j 4.2 graph repo save method is now ambiguous
But, this just doesn't seem right. Do I really have to go through an entire project and change all that code just to update?
One full line of error:
[ERROR] /.../service/SupportModelServiceImpl.java:[10,49] cannot find symbol
symbol: class GraphRepository
location: package org.springframework.data.neo4j.repository
You cannot prevent external dependencies from introducing breaking changes. However you could write your code so that it takes minimal effort to update external dependencies.
I have observed that in practice not much care is given to dependencies as if they are free. Initially they are as good as free, but once you start stacking your dependencies and have transitive dependencies that conflict or you upgrade to a new version with breaking changes there comes a maintenance cost. I have seen projects where the web of dependencies is so complex that they should be rewritten completely from scratch, if not for management not understanding the concept of technical debt, living in the illusion that maintaining an existing (bad) version of the software is cheaper than writing a new one.
The only option you have to guard against external dependencies is to encapsulate them in one way or another. This may involve some boilerplate code, though if that boilerplate is minimal it may well be worth the effort.
Because I have seen projects with horrible dependencies, I have given it some thought how I could prevent such a dependency mess and made the following image:
External code, over which you have no control, is in red. If you do not think about structuring your code, your code (in orange) will depend directly on the external code and is at risk from external changes. You can instead write code (in green) that has no dependencies on external code: you define the external functionality that you need in your own interfaces. You then have some code (in orange) that implements these interfaces and carries the external dependencies, and you inject that code through a dependency injection framework.
This approach limits the impact of external changes to only the code in orange. However it requires more planning than directly using dependencies everywhere in your code. And because more planning means more effort, it is often not put in practice.
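The green/orange split can be sketched in a few lines of plain Java. The interface is yours; the implementing class is the only place allowed to touch the external library. Here the external call is simulated with an in-memory map so the sketch is self-contained, and all names are illustrative:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// "Green" code: your own interface, with no external types in it.
interface ModelStore {
    void save(String id, String model);
    Optional<String> find(String id);
}

// "Orange" code: the single adapter that would wrap the external
// dependency (e.g. a Spring Data repository). If the library's API
// breaks on upgrade, only this class changes.
final class InMemoryModelStore implements ModelStore {
    private final Map<String, String> store = new HashMap<>();

    public void save(String id, String model) {
        store.put(id, model);
    }

    public Optional<String> find(String id) {
        return Optional.ofNullable(store.get(id));
    }
}
```

The rest of the application codes against ModelStore only, so a breaking change like GraphRepository disappearing is confined to one adapter class.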
This is not specific to Maven; you will have this issue regardless of whatever build system you use.
But I do not understand why you would want to avoid this: a major version change (e.g. 4.xx to 5.xx) means something is going to break, and you will have to make changes to your code.

Multiple versions in a web application: duplication or messy code?

I was used to managing versions with a tag in Git. But that was a long time ago, for stand-alone applications. Now the problem is that I have a web application, and clients that expect to communicate with different versions of the application might connect to the same application.
So I added a path variable for the version to the input, in this way:
@PathParam("version") String version
And the client can specify the version in the URL:
https://whatever.com/v.2/show
Then across the code I added conditions like this:
if (version.equals("v.2")) {
    // Do something
} else if (version.equals("v.3")) {
    // Do something else
} else {
    // Or something different
}
The problem is that my code is becoming very messy. So I decided to do it a different way: I added this condition at only one point in the code, and from there I call different classes according to the version:
MyClassVersion2.java
MyClassVersion3.java
MyClassVersion4.java
The problem now is that I have a lot of duplication.
And I want to solve this problem as well. How can I now have a web application that:
1) deals with multiple versions,
2) is not messy (with a lot of conditions), and
3) doesn't have much duplication?
Normally, when we speak of an old version of an application, we mean that the behavior and appearance of that version is cast in stone and does not change. If you make even the slightest modification to the source files of that application, then its behavior and/or appearance may change, (and according to Murphy's law it will change,) which is unacceptable.
So, if I were you, I would lock all the source files of the old version in the source code repository, so that nobody can commit to them, ever. This approach solves the problem and dictates how you have to go about everything else: Every version would have to have its own set of source files which would be completely unrelated to the source files of all other versions.
Now, if the old versions of the application must have something in common with the newest version, and this thing changes, (say, the database,) then we are not exactly talking about different versions of the application, we have something more akin to different skins: The core of the application evolves, but users who picked a skin some time ago are allowed to stick with that skin. In this case, the polymorphism solution which has already been suggested by others might be a better approach.
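The polymorphism approach suggested above can be sketched as a single dispatch table that maps a version string to a handler, so version checks live in exactly one place. All names and return values here are illustrative:

```java
import java.util.Map;

// One handler interface per endpoint; each version supplies its own
// implementation (here as lambdas, since ShowHandler is functional).
interface ShowHandler {
    String show();
}

final class VersionDispatcher {
    // The only place in the code that knows which versions exist.
    private static final Map<String, ShowHandler> HANDLERS = Map.of(
            "v.2", () -> "rendered by v2",
            "v.3", () -> "rendered by v3");

    static String show(String version) {
        ShowHandler handler = HANDLERS.get(version);
        if (handler == null) {
            throw new IllegalArgumentException("Unknown version: " + version);
        }
        return handler.show();
    }
}
```

Adding version 4 then means registering one new handler, with no new if/else branches scattered through the code.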
Your version number is in the part of the URL named the 'context root'.
You could release multiple different WAR files, each of which is configured to respond on a different context root.
So one WAR for version 1, one WAR for version 2, etc.
This leaves you with code duplication.
So what you are really asking is, "how do I efficiently modularise Java web applications?".
This is a big question, and leads you into "Enterprise Java".
Essentially you need to solve it by abstracting your common code into a different application. Usually this is called an 'n-tier' design.
So you'd create an 'integration tier' application which your 'presentation tier' WAR files speak to.
The integration tier contains all the common code so that it isn't repeated.
Your integration tier could be EJBs or web services, etc.
Or you could investigate using OSGi.
