I have a multi-module Maven project, and in the dao module I added the JSON-IO dependency. When I try to deserialize my object, it gives me:
Exception in thread "main" com.cedarsoftware.util.io.JsonIoException: Class listed in #type [hu.kleatech.projekt.model.Employee] is not found
The class name is correct, the Employee class is public, and the dao module has a dependency on the module that contains it. What could have gone wrong?
Edit: Since this is an old question and it was answered long ago, I'm deleting the GitHub repository that I made specifically for this question. The solution to the problem is in the accepted answer; the exact code is not relevant.
Please try adding an empty constructor to the Employee class.
Edit: Actually, while adding an empty constructor solves the problem, it is not necessarily required. Json-IO "will make a valiant effort to instantiate passed in Class, including calling all of its constructors until successful. The order they tried are public with the fewest arguments first to private with the most arguments."
(copied from MetaUtils.java javadoc)
Also, when calling a multi-argument constructor, the library fills the arguments with nulls (and defaults for primitives), and any exception thrown during the constructor call is ignored. In your case a NullPointerException was thrown, because the constructor is not null-safe. So either modify the constructor so that it can handle nulls, or add an empty constructor.
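For illustration, a minimal sketch of both options, with a made-up field (your real Employee obviously has different members):

public class Employee {
    private String name;

    // Option 1: an empty constructor that json-io can always call safely.
    public Employee() {
    }

    // Option 2: keep the existing constructor but make it null-safe,
    // since json-io may call it with null arguments while probing constructors.
    public Employee(String name) {
        // Guard instead of calling name.trim() unconditionally, which
        // would throw a NullPointerException when json-io passes null.
        this.name = (name == null) ? null : name.trim();
    }
}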
Maven dependency configuration is inherited through the <parent> element, not through the <modules> element.
It means that the pom.xml in which you declared the JSON-IO dependency does not declare a dependency on your dao project, or on whichever module contains that class.
<modules> only defines which projects to build. The order in which the modules are listed does not matter, since Maven works out the build order from the required dependencies.
So, you can define the dependency in the <parent> pom.xml either in
the <dependencies> element, in which case all children will inherit it,
or in <dependencyManagement>, in which case children that need it can include it in their own <dependencies> without the common configuration such as version, scope, etc. (see the sketch below).
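A minimal sketch of the second option (the json-io version here is only an example, use whatever you actually need):

<!-- parent pom.xml: manage the version in one place -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.cedarsoftware</groupId>
      <artifactId>json-io</artifactId>
      <version>4.14.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- dao/pom.xml: use it without repeating the version -->
<dependencies>
  <dependency>
    <groupId>com.cedarsoftware</groupId>
    <artifactId>json-io</artifactId>
  </dependency>
</dependencies>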
Have a look at a quite similar answer here:
How to minimize maven pom.xml
As per your project and module POMs, your main POM should list the modules in the following order:
<modules>
<module>core</module>
<module>controller</module>
<module>service</module>
<module>dao</module>
</modules>
service depends on core, so core should be built before service.
dao depends on both core and service, so dao should come after core and service.
The Employee class is in core, so it should be available in the core jar.
You should add a <dependencyManagement> section to the main POM and then add all the modules as dependencies there, so whoever adds your main POM as a dependency will be able to access all your jars; a sketch follows.
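A minimal sketch of that section, assuming the group id suggested by your Employee package (adjust to your real coordinates):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>hu.kleatech.projekt</groupId>
      <artifactId>core</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>hu.kleatech.projekt</groupId>
      <artifactId>service</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>hu.kleatech.projekt</groupId>
      <artifactId>dao</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>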
Once you change the order, build your project again and then update your Maven project.
If this code is used in another project, make sure that you have uploaded the jars to a repository (mvn deploy) so whoever uses it can download them when building their project.
One way to verify whether the jar has been downloaded into the main project that uses it is to look in the project explorer: there should be a Maven Dependencies section listing all dependency jars, where you can check whether core is present.
I am not sure what the controller module is doing in the main POM, as I couldn't find a module by that name in your project, so you should either remove it or add a module (folder) for it.
Related
I have two services, A and B, placed in a monorepo as different Maven modules; they also have an aggregator pom.xml which contains the following modules:
<modules>
<module>A</module>
<module>B</module>
</modules>
Both services talk through gRPC and share a common protocol described in proto files.
The grpc-java manual says that I must put my proto files into the src/main/resources/proto folder.
It means I have to copy the same proto files between the two services:
A/src/main/resources/proto/somefile.proto
B/src/main/resources/proto/somefile.proto
which is effectively code duplication.
The main question: how can I share and compile proto files between two Maven modules in a monorepo?
I have done the following:
Created a separate library that contains only the proto files; let's call it C.
Added the C dependency to the A and B modules.
The aggregator pom.xml looks like:
<modules>
<module>C</module>
<module>A</module>
<module>B</module>
</modules>
This approach seems quite heavy for this case, and I don't want to have a separate Maven module just for that.
Moreover, I will definitely face a problem if I use a different language for service B (something other than Java and Maven).
Is there a known solution for this problem? Can I share proto files without a separate library/module? Any examples appreciated.
I was wrong about this:
The grpc-java manual says that I must put my proto files into the src/main/resources/proto folder.
We can set the protoSourceRoot configuration of the protobuf/grpc-java Maven plugin and point it at any proto source folder we need, as follows:
<protoSourceRoot>${basedir}/../proto</protoSourceRoot>
This means there is no need for a separate Maven module or library.
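For context, a sketch of how that setting fits into the plugin configuration, assuming the commonly used org.xolstice protobuf-maven-plugin (versions are only examples, and ${os.detected.classifier} requires the os-maven-plugin extension):

<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.6.1</version>
  <configuration>
    <!-- Point both A and B at the shared proto folder next to the modules -->
    <protoSourceRoot>${basedir}/../proto</protoSourceRoot>
    <protocArtifact>com.google.protobuf:protoc:3.21.7:exe:${os.detected.classifier}</protocArtifact>
    <pluginId>grpc-java</pluginId>
    <pluginArtifact>io.grpc:protoc-gen-grpc-java:1.45.1:exe:${os.detected.classifier}</pluginArtifact>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>compile-custom</goal>
      </goals>
    </execution>
  </executions>
</plugin>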
I have added two projects as modules in an empty IntelliJ project.
Then, in the pom of module B, I added the following dependency on the first project (module A):
<dependency>
<groupId>Tests</groupId>
<artifactId>Group</artifactId>
<version>1.0-SNAPSHOT</version>
<scope>compile</scope>
</dependency>
This allows me to import classes from module A into module B.
But I can't see any methods from that module (it looks like the classes are empty, or they have only private fields/methods).
What am I missing? What should I do to see all public methods/fields from A module?
Thanks
Kamil
If you are adding one of them as a dependency, you don't need to join them as modules. For local purposes, you can build one of them (mvn clean package) and add it as a dependency to the other. You can check the relevant .class file to see the access levels of the class members.
For multi-module projects, please, see: https://www.jetbrains.com/help/idea/creating-and-managing-modules.html
Let's say that there are two local Maven artifacts with the same groupId but different artifactIds.
The different artifactId should make each Maven artifact unique.
However, if both of these unique artifacts contain a class with the same name, that class will not be unique: when it is imported into Java code it is referenced in the groupId.ClassName form, and in the discussed case neither the groupId nor the class name is unique.
This results in ambiguity when determining which class will be used.
Upon testing, it seems that the class from the dependency declared first in the pom.xml file is the one that gets used.
The questions are:
What is the best practice to solve/avoid this issue?
Why does Maven's artifactId coordinate contribute to the uniqueness of a Maven artifact within the repository but not inside the Java code?
Example Code:
Maven - Same Class Name Same GroupId Different ArtifactId
Project1 is the first artifact.
Project2 is the second artifact.
"Projects User" is the artifact/project that will depend on both Project1 & Project2.
Project1 & Project2 both have a class named Utilities.
The Utilities class has a static method, public static String getDescription(), that returns a string containing the current project's artifact coordinates as well as the project name.
The String returned by Utilities.getDescription() is printed to see whether an error occurs anywhere, and to see how the ambiguity is resolved.
The output depends on which dependency was declared first in the pom.xml file of the "Projects User" artifact.
Edit: follow-up question
Is there an archetype that will create the Java package using both the groupId and artifactId, instead of having to do it manually every time?
What is the best practice to solve/avoid this issue?
We include the groupId and artifactId as the base package in the module. This way it is not possible to have the same class in two modules as the packages would be different.
e.g.
<groupId>net.openhft</groupId>
<artifactId>chronicle-bytes</artifactId>
has everything under the package
package net.openhft.chronicle.bytes;
Also, if you know the package of a class, you know which JAR it must be in.
If you have a class that two JARs need, I suggest creating a common module that they both depend on.
Note: it is general practice to use your company's domain name (and a notional division as well) as the base of your package. Maven recommends using your domain name as your groupId, and if you release to Maven Central this is now a requirement. The above strategy supports both recommendations.
Why does Maven's artifactId coordinate contribute to the uniqueness of a Maven artifact within the repository but not inside the Java code?
Maven doesn't take any notice of the contents of the JAR.
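To make the effect concrete, a small hypothetical sketch (package and class names invented): once the base packages differ, a consuming project can use both same-named classes, importing one and fully qualifying the other.

// Hypothetical consumer ("Projects User") of Project1 and Project2.
import com.example.project1.Utilities;

public class Demo {
    public static void main(String[] args) {
        // The imported Utilities comes unambiguously from Project1.
        System.out.println(Utilities.getDescription());

        // The same-named class from Project2 is referenced by its fully
        // qualified name, since only one Utilities can be imported.
        System.out.println(com.example.project2.Utilities.getDescription());
    }
}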
@Peter, following your lead on suggesting best practices to avoid this issue:
groupId: it is required to uniquely identify your project; use the reverse of your domain name, e.g.:
com.github.dibyaranjan
The artifactId is the name of the jar without the version.
To distinguish two classes from different JARs, create the package as groupId.artifactId.
For example, if I create a project TestDummy and I want the name of the JAR to be TestDummy-1.1, then my package would look like:
com.github.dibyaranjan.testdummy
The class would look like - com.github.dibyaranjan.testdummy.MyClass
For reference visit : https://maven.apache.org/guides/mini/guide-naming-conventions.html
I'm currently writing a custom Maven plugin for generating an XML file in a multi-module Maven project.
My Maven structure is pretty standard: one parent project, and one module per project component inside the parent project folder:
-- Parent
-- module A
-- module B
-- module C
I need to list, by module, a set of classes flagged by a custom annotation.
I already wrote a set of custom annotations and an annotation processor to create an XML file at compile time in the corresponding module's output directory (${project.build.outputDirectory}).
Now I need to merge each module's XML into one file, but I don't know how to access the other modules from within my Maven plugin, other than having each path passed in as a parameter (I don't like that approach).
Any idea how to do this?
Can Maven plugins traverse the project's modules?
Thank you in advance.
To get the list of all projects you can use:
List<MavenProject> projectList = session.getProjectDependencyGraph().getSortedProjects();
where session is the injected MavenSession (a sketch follows). If one of your goals is executed correctly, you will get everything you need. Every MavenProject provides getBasedir(), etc.
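A minimal sketch of a Mojo doing that, assuming the standard maven-plugin-annotations; the goal name and log output are placeholders:

import java.util.List;

import org.apache.maven.execution.MavenSession;
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;
import org.apache.maven.project.MavenProject;

@Mojo(name = "merge-xml", defaultPhase = LifecyclePhase.PACKAGE, aggregator = true)
public class MergeXmlMojo extends AbstractMojo {

    // The current Maven session, injected by Maven.
    @Parameter(defaultValue = "${session}", readonly = true, required = true)
    private MavenSession session;

    @Override
    public void execute() {
        // All reactor projects, in build order.
        List<MavenProject> projects = session.getProjectDependencyGraph().getSortedProjects();
        for (MavenProject project : projects) {
            // Each module's base directory and output directory are available here,
            // e.g. to locate the XML file generated by the annotation processor.
            getLog().info(project.getArtifactId() + " -> "
                    + project.getBuild().getOutputDirectory());
        }
    }
}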
After some research, it seems that MavenProject.getCollectedProjects() will return the list of projects being handled by a goal execution in a multi-module project.
I am working on a Maven-based project. I have a parent pom file for the parent module and a child pom file for the child module. In the parent module I am using a custom property (databaseType), and it is declared in the parent pom:
<properties>
<databaseType>${databaseType}</databaseType>
</properties>
While building the application I pass it as a -D argument, and the build succeeds. However, when I import it as a Maven project in Eclipse, I get the error below in the child pom (even though the Maven build works fine):
Project build error: Resolving expression: '${databaseType}': Detected
the following recursive expression cycle in 'databaseType': [databaseType]
What could be the issue? Any help is appreciated.
The problem is that the argument you pass with -D and the property have the same name. If you provide the argument, it works, because when Maven resolves the expression it first finds the value provided via -DdatabaseType and assigns that value to the <databaseType> property. If the argument is missing, Maven tries to resolve the expression but only finds the definition of the <databaseType> property in the same pom, which creates a cycle.
Maven and Eclipse are either using different approaches to resolve variables here (which might be a bug in Eclipse), or it's caused by some misconfiguration. My guess is that passing the variable with -D is not working in Eclipse for some reason.
The example doesn't really do anything anyway: if ${databaseType} is available, you shouldn't need to define the property again explicitly. Alternatively, use a different property name in the parent pom if that makes sense, like this:
<properties>
<databaseType>${defaultDatabaseType}</databaseType>
</properties>
This doesn't help if the argument is missing, though. I would use the Maven Enforcer Plugin to make sure the property is defined; a sketch follows.
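A sketch of that check with the Maven Enforcer Plugin's requireProperty rule (the plugin version is just an example):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>3.0.0</version>
  <executions>
    <execution>
      <id>require-database-type</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireProperty>
            <property>databaseType</property>
            <message>You must pass -DdatabaseType=... to the build.</message>
          </requireProperty>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>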