Getting "NoSuchMethodError: org.hamcrest.Matcher.describeMismatch" when running integration tests - java

I'm using JUnit 4.11 to run integration tests on my Spring Boot application. I have the following test case:
@Test
public void testClassementMarcheActeTranche() throws IOException, PlanDeClassementException {
NodeRef nodeRefRepertoireSasGda = servicePlanDeClassementMarche.getSpace(SpaceUtil.SAS_GED_MARCHE.name());
NodeRef nodeRefPiece = setUpCreateActe(nodeRefRepertoireSasGda, "1 (Tranche 1 Marché 2019-19M0059001) (Tranche 1 Marché 2019-19M58785858)", "Courrier de reconduction");
classementDocumentsActionJobExecuter.executeImpl(null, nodeRefPiece);
String siteName = serviceRegistry.getSiteService().getSite(nodeRefPiece).getShortName();
QName typePiece = serviceRegistry.getNodeService().getType(nodeRefPiece);
assertThat(typePiece, equalTo(TYPE_PIECE.getQName()));
NodeRef parentFolder = serviceRegistry.getNodeService().getPrimaryParent(nodeRefPiece).getParentRef();
String parentFolderName = serviceRegistry.getFileFolderService().getFileInfo(parentFolder).getName();
NodeRef marcheFolder = serviceRegistry.getNodeService().getPrimaryParent(parentFolder).getParentRef();
QName typeMarche = serviceRegistry.getNodeService().getType(marcheFolder);
assertThat(typeMarche, equalTo(TYPE_MARCHE.getQName()));
String marcheFolderName = serviceRegistry.getFileFolderService().getFileInfo(marcheFolder).getName();
NodeRef millesimeFolder = serviceRegistry.getNodeService().getPrimaryParent(marcheFolder).getParentRef();
String millesimeFolderName = serviceRegistry.getFileFolderService().getFileInfo(millesimeFolder).getName();
NodeRef cdrFolder = serviceRegistry.getNodeService().getPrimaryParent(millesimeFolder).getParentRef();
String cdrFolderName = serviceRegistry.getFileFolderService().getFileInfo(cdrFolder).getName();
String numeroMarche = (String)nodeService.getProperty(nodeRefPiece, PROP_NUMERO_DU_MARCHE.getQName());
String dateNotificationMarche = nodeService.getProperty(nodeRefPiece, PROP_DATE_DE_NOTIFICATION_DU_MARCHE.getQName()).toString();
String categorieMarche = (String)nodeService.getProperty(nodeRefPiece, PROP_CATEGORIE_DU_MARCHE.getQName());
String etatMarche = (String)nodeService.getProperty(nodeRefPiece, PROP_ETAT_DU_MARCHE.getQName());
List<String> codeTitulaire = (ArrayList<String>)nodeService.getProperty(nodeRefPiece, PROP_CODE_TITULAIRE.getQName());
List<String> titulaireMarche = (ArrayList<String>)nodeService.getProperty(nodeRefPiece, PROP_NOM_TITULAIRE.getQName());
List<String> codeSousTraitant = (ArrayList<String>)nodeService.getProperty(nodeRefPiece, PROP_CODE_SOUS_TRAITANTS.getQName());
List<String> nomSousTraitant = (ArrayList<String>)nodeService.getProperty(nodeRefPiece, PROP_NOM_SOUS_TRAITANTS.getQName());
String typeDeProcedureMarche = (String)nodeService.getProperty(nodeRefPiece, PROP_TYPE_DE_PROCEDURE.getQName());
String objetMarche = (String)nodeService.getProperty(nodeRefPiece, PROP_OBJET_DU_MARCHE.getQName());
List<String> axe = (ArrayList<String>)nodeService.getProperty(marcheFolder, PROP_AXE_OPERATION_ORYSON_MARCHE.getQName());
List<String> libEtablissement = (ArrayList<String>)nodeService.getProperty(marcheFolder, PROP_LIBELLE_ETABLISSEMENT_ENSEIGNEMENT_MARCHE.getQName());
List<String> codeEtablissement = (ArrayList<String>)nodeService.getProperty(marcheFolder, PROP_CODE_ETABLISSEMENT_ENSEIGNEMENT_MARCHE.getQName());
List<String> cdrMarche = (ArrayList<String>)nodeService.getProperty(marcheFolder, PROP_CDR_MARCHE.getQName());
assertThat(marcheFolderName, equalTo("2019-19M58785858 - Lot 1 - Chantier des collections"));
nodeService.deleteNode(marcheFolder);
}
And I get the following error message:
java.lang.NoSuchMethodError: org.hamcrest.Matcher.describeMismatch(Ljava/lang/Object;Lorg/hamcrest/Description;)V
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:8)
at fr.package.alfresco.marche.actions.ClassementActeActionJobIT.testClassementMarcheActeTranche(ClassementActeActionJobIT.java:203)
I have followed the answers in this question: Getting "NoSuchMethodError: org.hamcrest.Matcher.describeMismatch" when running test in IntelliJ 10.5
So I ended up with this POM:
<dependencies>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-core</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hamcrest</groupId>
<artifactId>hamcrest-library</artifactId>
<version>1.3</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<exclusions>
<exclusion>
<artifactId>hamcrest-core</artifactId>
<groupId>org.hamcrest</groupId>
</exclusion>
</exclusions>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<exclusions>
<exclusion>
<artifactId>hamcrest-core</artifactId>
<groupId>org.hamcrest</groupId>
</exclusion>
</exclusions>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jdbi</groupId>
<artifactId>jdbi</artifactId>
<version>2.78</version>
</dependency>
</dependencies>
However, this hasn't fixed the problem. Does anyone have an idea of how this persistent problem could be fixed?
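This error usually means an older Hamcrest (e.g. 1.1, pulled in transitively by another dependency despite the exclusions above) is winning on the test classpath: `Matcher.describeMismatch` only exists from Hamcrest 1.2 onward. As a hedged diagnostic, the small utility below prints which jar actually supplied a given class at runtime; run it (or paste its body into the failing test) with `org.hamcrest.Matcher` as the argument to see which copy the JVM loaded. The class name passed on the command line is just an example.

```java
import java.security.CodeSource;

public class WhichJar {
    public static void main(String[] args) throws Exception {
        // Pass e.g. "org.hamcrest.Matcher" to see which jar the JVM loaded it from.
        String name = args.length > 0 ? args[0] : "java.lang.String";
        Class<?> c = Class.forName(name);
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // JDK bootstrap classes report a null CodeSource; application classes
        // report the jar or directory they came from.
        System.out.println(name + " loaded from: "
                + (src == null ? "bootstrap class path (JDK)" : src.getLocation()));
    }
}
```

Combined with `mvn dependency:tree -Dincludes=org.hamcrest`, this should reveal which dependency still drags in the old Hamcrest so it can be excluded there as well.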

Related

NoClassDefFoundError in Azure Function App for Digital Twin

I am developing an Azure Function App. A function in the app receives messages from Azure Event Hubs and should update an Azure Digital Twin. I am creating the Azure Digital Twins client instance like below:
@FunctionName("eventGridMonitorString")
public void eventHubProcessor(
@EventHubTrigger(name = "msg", eventHubName = "", connection = "EventHubConnectionString") String message,
final ExecutionContext context) {
// context.getLogger().info(message);
String adtUrl = System.getenv("ADT_SERVICE_URL");
context.getLogger().info("ADTURl : " + adtUrl);
DigitalTwinsClient client = new DigitalTwinsClientBuilder().credential(new ClientSecretCredentialBuilder()
.tenantId("my_tenant_id").clientId("my_client_id")
.clientSecret("my_client_secret").build()).endpoint(adtUrl).buildClient();
Iterable<DigitalTwinsModelData> modelList = client.listModels();
Iterator<DigitalTwinsModelData> it = modelList.iterator();
while (it.hasNext()) {
DigitalTwinsModelData model = it.next();
context.getLogger().info("" + model.getDtdlModel());
}
for (DigitalTwinsModelData model : modelList) {
context.getLogger().info("Created model: " + model.getModelId());
}
}
This code works fine in my local Java application, but when I deploy it to the Azure Function App, it gives me the error below:
2021-05-27T07:42:35.173 [Error] Executed 'Functions.eventGridMonitorString' (Failed, Id=12a87102-78a3-4e2e-8715-b4401091d753, Duration=116ms)
Result: Failure
Exception: NoClassDefFoundError: Could not initialize class reactor.netty.http.client.HttpClientConfig
Stack: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.microsoft.azure.functions.worker.broker.JavaMethodInvokeInfo.invoke(JavaMethodInvokeInfo.java:22)
at com.microsoft.azure.functions.worker.broker.JavaMethodExecutorImpl.execute(JavaMethodExecutorImpl.java:54)
at com.microsoft.azure.functions.worker.broker.JavaFunctionBroker.invokeMethod(JavaFunctionBroker.java:57)
at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:33)
at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:10)
at com.microsoft.azure.functions.worker.handler.MessageHandler.handle(MessageHandler.java:45)
at com.microsoft.azure.functions.worker.JavaWorkerClient$StreamingMessagePeer.lambda$onNext$0(JavaWorkerClient.java:92)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class reactor.netty.http.client.HttpClientConfig
at reactor.netty.http.client.HttpClientConnect.<init>(HttpClientConnect.java:84)
at reactor.netty.http.client.HttpClient.create(HttpClient.java:393)
at com.azure.core.http.netty.NettyAsyncHttpClientBuilder.build(NettyAsyncHttpClientBuilder.java:91)
at com.azure.core.http.netty.implementation.ReactorNettyClientProvider.createInstance(ReactorNettyClientProvider.java:14)
at com.azure.core.implementation.http.HttpClientProviders.createInstance(HttpClientProviders.java:58)
at com.azure.core.http.HttpClient.createDefault(HttpClient.java:50)
at com.azure.core.http.HttpClient.createDefault(HttpClient.java:40)
at com.azure.core.http.HttpPipelineBuilder.build(HttpPipelineBuilder.java:62)
at com.azure.digitaltwins.core.DigitalTwinsClientBuilder.buildPipeline(DigitalTwinsClientBuilder.java:151)
at com.azure.digitaltwins.core.DigitalTwinsClientBuilder.buildAsyncClient(DigitalTwinsClientBuilder.java:193)
at com.azure.digitaltwins.core.DigitalTwinsClientBuilder.buildClient(DigitalTwinsClientBuilder.java:160)
at com.ey.azurefunctions.PolarDelightFunctionApp.Function.eventHubProcessor(Function.java:32)
... 16 more
Am I missing something, or is there an issue with the above code?
Edit 1: I have added the Maven dependencies below to my project:
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-digitaltwins-core</artifactId>
<version>1.1.0</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-identity</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-core-http-netty</artifactId>
<version>1.7.1</version> <!-- {x-version-update;com.azure:azure-core-http-netty;dependency} -->
</dependency>
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.49.Final</version>
</dependency>
<dependency>
<groupId>io.projectreactor.netty</groupId>
<artifactId>reactor-netty</artifactId>
<version>1.0.7</version>
</dependency>
<dependency>
<groupId>io.projectreactor.netty</groupId>
<artifactId>reactor-netty-http</artifactId>
<version>1.0.7</version>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-bom</artifactId>
<version>Dysprosium-SR20</version>
<type>pom</type>
</dependency>
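A likely culprit is version skew among the hand-pinned networking jars: reactor-netty 1.0.7 belongs to a newer Reactor line than the one azure-core-http-netty 1.7.1 was built against, and the reactor-bom entry has no effect as written because it sits in the plain dependency list without being imported into dependencyManagement. A hedged sketch of a cleaner POM, assuming the Azure SDK BOM manages these artifacts (the BOM version shown is an example; check for the current one) and dropping the manual netty/reactor pins entirely:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.azure</groupId>
      <artifactId>azure-sdk-bom</artifactId>
      <version>1.0.3</version> <!-- example version; use a current release -->
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
<dependencies>
  <!-- versions come from the BOM, so they stay mutually compatible -->
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-digitaltwins-core</artifactId>
  </dependency>
  <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-identity</artifactId>
  </dependency>
  <!-- remove the explicit netty-all, reactor-netty and reactor-netty-http
       entries so the SDK's own transitive versions win -->
</dependencies>
```

This is a configuration sketch rather than a verified fix; the key idea is to let one BOM pick mutually consistent Reactor Netty and azure-core versions instead of pinning them by hand.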

java.lang.LinkageError: ClassCastException:attempting to castjar: javax.ws.rs-api-2.0.1.jar

I am calling the REST API below in my code and it gives me the following error. I am not sure whether this is an issue with the jar.
java.lang.LinkageError: ClassCastException: attempting to castjar:file:/C:/apache-tomcat-8.5.9/wtpwebapps/searchextractweb/WEB-INF/lib/javax.ws.rs-api-2.0.1.jar!/javax/ws/rs/ext/RuntimeDelegate.class to jar:file:/C:/apache-tomcat-8.5.9/wtpwebapps/searchextractweb/WEB-INF/lib/javax.ws.rs-api-2.0.1.jar!/javax/ws/rs/ext/RuntimeDelegate.class
javax.ws.rs.ext.RuntimeDelegate.findDelegate(RuntimeDelegate.java:146)
javax.ws.rs.ext.RuntimeDelegate.getInstance(RuntimeDelegate.java:120)
javax.ws.rs.core.MediaType.valueOf(MediaType.java:179)
com.sun.jersey.api.client.PartialRequestBuilder.type(PartialRequestBuilder.java:92)
com.sun.jersey.api.client.WebResource.type(WebResource.java:343)
com.tlr.searchextract.workflow.Workflow.retrieveSearchInfo(Workflow.java:1208)
com.tlr.searchextract.workflow.Workflow.createWorkflowRequest(Workflow.java:275)
com.tlr.searchextract.messages.SearchExtractEventHandler.createNewWorkflowRequest(SearchExtractEventHandler.java:675)
com.tlr.searchextract.messages.SearchExtractEventHandler.processRequest(SearchExtractEventHandler.java:134)
com.tlr.searchextract.messages.SearchExtractEventHandler.processMessage(SearchExtractEventHandler.java:65)
com.tlr.searchextract.messages.MessageHandler.routeMessage(MessageHandler.java:92)
com.tlr.searchextract.messages.MessageHandler.processMessages(MessageHandler.java:64)
com.tlr.searchextract.servlet.RequestModel.insertCurrentRequest(RequestModel.java:190)
com.tlr.searchextract.servlet.SEControllerServlet.insertRequestTemplate(SEControllerServlet.java:1344)
com.tlr.searchextract.servlet.SEControllerServlet.performTask(SEControllerServlet.java:1941)
com.tlr.searchextract.servlet.SEControllerServlet.doPost(SEControllerServlet.java:90)
javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
I have the below code for executing the REST API:
private void retrieveSearchInfo() {
// find out what type of workflow to create
searchType =
document
.getElementsByTagName("search.type")
.item(0)
.getFirstChild()
.getNodeValue();
try {
excludeMetaDoc =
document
.getElementsByTagName("exclude.metadoc")
.item(0)
.getFirstChild()
.getNodeValue();
} catch (Exception e) {
excludeMetaDoc = "";
}
try {
searchGroup =
document
.getElementsByTagName("search.group")
.item(0)
.getFirstChild()
.getNodeValue();
} catch (Exception e) {
searchGroup = "";
}
try{
imageDoc =
document
.getElementsByTagName("search.imagedoc")
.item(0)
.getFirstChild()
.getNodeValue();
}catch (Exception e) {
imageDoc = "";
}
//add the term "search" to the value of the searchLevel
//vaiable in order to fit the needs of the LTC request
searchLevel =
document
.getElementsByTagName("search.level")
.item(0)
.getFirstChild()
.getNodeValue();
if (searchLevel.equalsIgnoreCase("collection set")) {
searchLevel = "collectionset";
}
//collection or collection set name
searchName =
document
.getElementsByTagName("search.name")
.item(0)
.getFirstChild()
.getNodeValue();
searchNovusVersion =
document
.getElementsByTagName("search.novus.version")
.item(0)
.getFirstChild()
.getNodeValue();
searchNovusEnvironment =
document
.getElementsByTagName("search.novus.environment")
.item(0)
.getFirstChild()
.getNodeValue();
//check to see if the user wants either all of the guids
//for a collection or a collection set
if (searchType.equalsIgnoreCase("all guids")
|| searchType.equalsIgnoreCase("document count")) {
if("Norm".equalsIgnoreCase(searchGroup))
queryText = "=n-relbase";
else
queryText = "=n-document";
queryType = "boolean";
} else {
queryText =
document
.getElementsByTagName("search.query.text")
.item(0)
.getFirstChild()
.getNodeValue();
// escape any reserved/special characters
// Problem using an ampersand (&) in the search query: Maestro translates it
// to an entity in the relevant data, so use the word "and" instead.
queryText = escapeXML(queryText, "&", "and");
// queryText = escapeXML(queryText, "&", "&amp;");
queryText = escapeXML(queryText, "<", "&lt;");
queryText = escapeXML(queryText, ">", "&gt;");
queryText = escapeXML(queryText, "'", "&apos;");
queryText = escapeXML(queryText, "\"", "&quot;");
//find the search type boolean or natural
queryType =
document
.getElementsByTagName("search.query.type")
.item(0)
.getFirstChild()
.getNodeValue();
}
try {
searchOutputResource =
document
.getElementsByTagName("search.output.resource")
.item(0)
.getFirstChild()
.getNodeValue();
searchOutputPath =
document
.getElementsByTagName("search.output.path")
.item(0)
.getFirstChild()
.getNodeValue();
searchOutputPrefix =
document
.getElementsByTagName("search.output.file.prefix")
.item(0)
.getFirstChild()
.getNodeValue();
} catch (Exception e) {
searchOutputResource = "";
searchOutputPath = "";
searchOutputPrefix = "";
if (searchOutputPrefix != null) {
if (searchOutputPrefix.length() == 0) {
searchOutputPrefix = "se";
}
} else {
searchOutputPrefix = "se";
}
//e.printStackTrace();
}
//now get the resource signon and password
String output="";
try {
Client client = Client.create();
System.out.println("resourceName: "+searchOutputResource);
WebResource webResource = client.resource("http://localhost:8080/searchextract/webapi/resource?isGroupAndResource=true&groupId="
+ requestGroup + "&resourceName=" + searchOutputResource);
ClientResponse response = webResource.type("application/json").get(ClientResponse.class);
if (response.getStatus() != 200)
{
throw new RuntimeException("Failed : HTTP error code : "
+ response.getStatus());
}
output = response.getEntity(String.class);
ResultSetIterator rsi = new ResultSetIterator(output);
searchOutputResourceUser = rsi.getFieldValue("resource_signon");
searchOutputResourcePass = rsi.getFieldValue("resource_password");
} catch (RemoteException re) {
System.err.println(re.detail.getMessage());
if (re.detail.getMessage().indexOf("DuplicateKeyException") > -1) {
//throw new Exception("Duplicate record");
}
} catch (NamingException e) {
System.err.println("NamingException: " + e.getMessage());
System.err.println("Root cause: " + e.getRootCause());
System.err.println("Explanation: " + e.getExplanation());
}
catch (Exception e) {
System.err.println(e.getMessage());
}
}
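The `escapeXML` helper used throughout `retrieveSearchInfo` is not shown in the question. A plausible minimal sketch of it, assuming it simply substitutes one literal string for another (the class name `XmlEscape` and the demo values are illustrative, not from the original code):

```java
public final class XmlEscape {
    // Hypothetical minimal version of the escapeXML helper used above:
    // replaces every occurrence of `target` in `input` with `replacement`.
    public static String escapeXML(String input, String target, String replacement) {
        return input.replace(target, replacement);
    }

    public static void main(String[] args) {
        String q = "a < b & c";
        q = escapeXML(q, "&", "and");
        q = escapeXML(q, "<", "&lt;");
        System.out.println(q); // a &lt; b and c
    }
}
```

Note that replacement order matters: escaping `&` first avoids re-escaping the ampersands introduced by the later `&lt;`/`&gt;` substitutions.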
Below is the pom.xml file
<?xml version="1.0"?>
<project
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.tlr.searchextractproject</groupId>
<artifactId>parent-project</artifactId>
<version>0.0.1-SNAPSHOT</version>
</parent>
<groupId>com.tlr.searchextractproject</groupId>
<artifactId>searchextract</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>searchextract</name>
<properties>
<jersey.version>2.16</jersey.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.glassfish.jersey</groupId>
<artifactId>jersey-bom</artifactId>
<version>${jersey.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet</artifactId>
<version>2.16</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.sun.jersey/jersey-bundle -->
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-server</artifactId>
<version>2.16</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet-core</artifactId>
<version>2.16</version>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>jsp-api</artifactId>
<version>2.1</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>commons-fileupload</groupId>
<artifactId>commons-fileupload</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>taglibs</groupId>
<artifactId>standard</artifactId>
<version>1.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/javax.ws.rs/jsr311-api -->
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>jsr311-api</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>com.owlike</groupId>
<artifactId>genson</artifactId>
<version>1.6</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.containers</groupId>
<artifactId>jersey-container-servlet-core</artifactId>
<!-- use the following artifactId if you don't need servlet 2.x compatibility -->
<!-- artifactId>jersey-container-servlet</artifactId -->
</dependency>
<dependency>
<groupId>org.glassfish.jersey.bundles.repackaged</groupId>
<artifactId>jersey-guava</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
<version>2.16</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-client</artifactId>
<version>1.9.1</version>
</dependency>
<dependency>
<groupId>com.sun.jersey</groupId>
<artifactId>jersey-core</artifactId>
<version>1.9.1</version>
</dependency>
<!-- uncomment this to get JSON support -->
<dependency>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-moxy</artifactId>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>javax.ws.rs-api</artifactId>
<version>2.0.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-all</artifactId>
<version>5.11.1</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-dbcp</artifactId>
<version>8.5.9</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-jdbc</artifactId>
<version>8.5.9</version>
</dependency>
<dependency>
<groupId>org.apache.tomcat</groupId>
<artifactId>tomcat-coyote</artifactId>
<version>8.5.9</version>
</dependency>
<dependency>
<groupId>xerces</groupId>
<artifactId>xercesImpl</artifactId>
<version>2.11.0.1</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.39</version>
</dependency>
<dependency>
<groupId>xml-apis</groupId>
<artifactId>xml-apis</artifactId>
<version>1.4.01</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.ibm</groupId>
<artifactId>com.ibm.mq</artifactId>
<version>6.0.2.4</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.ibm/com.ibm.mqjms -->
<dependency>
<groupId>com.ibm</groupId>
<artifactId>com.ibm.mqjms</artifactId>
<version>6.0.2.4</version>
</dependency>
<!-- https://mvnrepository.com/artifact/javax.resource/connector -->
<dependency>
<groupId>javax.resource </groupId>
<artifactId>connector</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>com.ibm</groupId>
<artifactId>com.ibm.dhbcore </artifactId>
<version>7.1.0.0</version>
<scope>runtime</scope>
</dependency>
</dependencies>
<build>
<finalName>searchextract</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.5.1</version>
<inherited>true</inherited>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
This exception is thrown while executing the line below:
ClientResponse response = webResource.type("application/json").get(ClientResponse.class);
I am using Java 8 and Tomcat 8.
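One plausible reading of this LinkageError: the POM ships two copies of the JAX-RS API classes (javax.ws.rs:jsr311-api 1.1.1 and javax.ws.rs:javax.ws.rs-api 2.0.1) plus two Jersey generations (com.sun.jersey 1.9.1 and org.glassfish.jersey 2.16), so `RuntimeDelegate` can be loaded through two different class loaders and the cast to "itself" fails. A hedged POM cleanup sketch, keeping a single API jar and a single Jersey line (versions mirror those already in the POM; this is a direction to try, not a verified fix):

```xml
<!-- Keep exactly one JAX-RS API provider... -->
<dependency>
  <groupId>javax.ws.rs</groupId>
  <artifactId>javax.ws.rs-api</artifactId>
  <version>2.0.1</version>
</dependency>
<!-- ...and one Jersey generation. Remove com.sun.jersey:jersey-client,
     com.sun.jersey:jersey-core (Jersey 1.x) and javax.ws.rs:jsr311-api,
     which duplicate the classes above, and port Client.create()/WebResource
     calls to the JAX-RS 2.x ClientBuilder API. -->
<dependency>
  <groupId>org.glassfish.jersey.core</groupId>
  <artifactId>jersey-client</artifactId>
  <version>2.16</version>
</dependency>
```

Since Tomcat does not provide the JAX-RS API itself, the `provided` scope on `javax.ws.rs-api` is also worth revisiting; with that scope the jar must come from somewhere else at runtime.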
I have faced the same error in my code and resolved it by using Jersey dependency version 2.34.
The POM dependencies that produced the error:
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-common</artifactId>
<version>3.0.0</version>
</dependency>
My error message:
2021-09-22 08:32:23.839 INFO 10 --- [http-nio-9070-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.LinkageError: ClassCastException: attempting to castjar:file:/faw-qa-api/target/faw-qa-api-1.0-SNAPSHOT.jar!/javax/ws/rs/client/ClientBuilder.class to jar:file:/faw-qa-api/target/faw-qa-api-1.0-SNAPSHOT.jar!/javax/ws/rs/client/ClientBuilder.class] with root cause
java.lang.LinkageError: ClassCastException: attempting to castjar:file:/faw-qa-api/target/faw-qa-api-1.0-SNAPSHOT.jar!/javax/ws/rs/client/ClientBuilder.class to jar:file:/faw-qa-api/target/faw-qa-api-1.0-SNAPSHOT.jar!/javax/ws/rs/client/ClientBuilder.class
at javax.ws.rs.client.ClientBuilder.newBuilder(ClientBuilder.java:105)
I updated the dependency version of jersey-client and jersey-common to 2.34 and the issue was resolved.
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-client</artifactId>
<version>2.34</version>
</dependency>
<dependency>
<groupId>org.glassfish.jersey.core</groupId>
<artifactId>jersey-common</artifactId>
<version>2.34</version>
</dependency>

Maven dependency hell for Spark MLlib ALS algorithm [duplicate]

This question already has answers here:
Resolving dependency problems in Apache Spark
(7 answers)
Closed 4 years ago.
I have this small piece of Java code to get Apache Spark recommendations:
public class Main {
public static class Rating implements Serializable {
private int userId;
private int movieId;
private float rating;
private long timestamp;
public Rating() {}
public Rating(int userId, int movieId, float rating, long timestamp) {
this.userId = userId;
this.movieId = movieId;
this.rating = rating;
this.timestamp = timestamp;
}
public int getUserId() {
return userId;
}
public int getMovieId() {
return movieId;
}
public float getRating() {
return rating;
}
public long getTimestamp() {
return timestamp;
}
public static Rating parseRating(String str) {
String[] fields = str.split(",");
if (fields.length != 4) {
throw new IllegalArgumentException("Each line must contain 4 fields");
}
int userId = Integer.parseInt(fields[0]);
int movieId = Integer.parseInt(fields[1]);
float rating = Float.parseFloat(fields[2]);
long timestamp = Long.parseLong(fields[3]);
return new Rating(userId, movieId, rating, timestamp);
}
}
static String parse(String str) {
Pattern pat = Pattern.compile("\\[[0-9.]*,[0-9.]*]");
Matcher matcher = pat.matcher(str);
int count = 0;
StringBuilder sb = new StringBuilder();
while (matcher.find()) {
count++;
String substring = str.substring(matcher.start(), matcher.end());
String itstr = substring.split(",")[0].substring(1);
sb.append(itstr + " ");
}
return sb.toString().trim();
}
static TreeMap<Long, String> res = new TreeMap<>();
public static void add(long k, String v) {
res.put(k, v);
}
public static void main(String[] args) throws IOException {
Logger.getLogger("org").setLevel(Level.OFF);
Logger.getLogger("akka").setLevel(Level.OFF);
SparkSession spark = SparkSession
.builder()
.appName("SomeAppName")
.config("spark.master", "local")
.getOrCreate();
JavaRDD<Rating> ratingsRDD = spark
.read().textFile(args[0]).javaRDD()
.map(Rating::parseRating);
Dataset<Row> ratings = spark.createDataFrame(ratingsRDD, Rating.class);
ALS als = new ALS()
.setMaxIter(1)
.setRegParam(0.01)
.setUserCol("userId")
.setItemCol("movieId")
.setRatingCol("rating");
ALSModel model = als.fit(ratings);
model.setColdStartStrategy("drop");
Dataset<Row> rowDataset = model.recommendForAllUsers(50);
rowDataset.foreach((ForeachFunction<Row>) row -> {
String str = row.toString();
long l = Long.parseLong(str.substring(1).split(",")[0]);
add(l, parse(str));
});
BufferedWriter bw = new BufferedWriter(new FileWriter(args[1]));
for (long l = 0; l < res.lastKey(); l++) {
if (!res.containsKey(l)) {
bw.write("\n");
continue;
}
String str = res.get(l);
bw.write(str);
}
bw.close();
}
}
I am trying different dependencies in my pom.xml to get it running, but all variants fail. This one:
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.8.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.12</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.6.4</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.6.4</version>
</dependency>
fails with java.lang.ClassNotFoundException: text.DefaultSource. To fix it, I add:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.10</artifactId>
<version>2.0.2</version>
</dependency>
Now it crashes with ClassNotFoundException: org.apache.spark.internal.Logging$class. To fix it, I add more:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.2.2</version>
</dependency>
Now it fails with java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce. To fix it, I tried a dozen other combinations; all of them failed. The last one is:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.2.2</version>
</dependency>
which again gives me ClassNotFoundException: text.DefaultSource. How can I fix it? Was there any logic behind implementing runtime linking in Spark?
UPD: I also tried:
<dependencies>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.bahir</groupId>
<artifactId>spark-streaming-twitter_2.11</artifactId>
<version>2.0.1</version>
</dependency>
</dependencies>
(this still gives me java.lang.ClassNotFoundException: text.DefaultSource)
I also tried dependencies published in this question, but they also fail: Resolving dependency problems in Apache Spark
Source code is available here, so you can try various maven settings yourself: https://github.com/stiv-yakovenko/sparkrec
Finally I was able to make it work:
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
</dependencies>
You have to use these exact versions; otherwise it will crash in various ways.
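The crashes above all trace back to mixing Scala binary versions (`_2.10`, `_2.11`, `_2.12`) and Spark versions on one classpath; Spark artifacts are only binary-compatible when both match across every module. One way to keep that consistent, sketched here as a POM convention rather than a prescribed setup, is to factor both into properties so a mismatch can't hide in an individual dependency:

```xml
<properties>
  <scala.binary.version>2.11</scala.binary.version>
  <spark.version>2.4.0</spark.version>
</properties>
<dependencies>
  <!-- every Spark module shares the same Scala suffix and Spark version -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Note also that com.sparkjava:spark-core in the first attempt is the unrelated Spark web framework, not Apache Spark, and can be dropped.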

Lucene exception NoSuchMethodError with version 6.4.0

I am relatively new to Lucene and am playing with the latest version, 6.4.0.
I have written a custom analyzer class for handling synonyms:
public class MySynonymAnalyzer extends Analyzer {
@Override
protected TokenStreamComponents createComponents(String fieldName) {
Tokenizer source = new ClassicTokenizer();
TokenStream filter = new StandardFilter(source);
filter = new LowerCaseFilter(filter);
filter = new SynonymGraphFilter(filter, getSynonymsMap(), false);
filter = new FlattenGraphFilter(filter);
return new TokenStreamComponents(source, filter);
}
private SynonymMap getSynonymsMap() {
try {
SynonymMap.Builder builder = new SynonymMap.Builder(true);
builder.add(new CharsRef("work"), new CharsRef("labor"), true);
builder.add(new CharsRef("work"), new CharsRef("effort"), true);
SynonymMap mySynonymMap = builder.build();
return mySynonymMap;
} catch (Exception ex) {
return null;
}
}
}
In the line where I call getSynonymsMap(), I get the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.lucene.util.UnicodeUtil.UTF16toUTF8WithHash([CIILorg/apache/lucene/util/BytesRef;)I
at org.apache.lucene.analysis.synonym.SynonymMap$Builder.add(SynonymMap.java:192)
at org.apache.lucene.analysis.synonym.SynonymMap$Builder.add(SynonymMap.java:239)
at m2_lab4.MySynonymAnalyzer.getSynonymsMap(MySynonymAnalyzer.java:37)
at m2_lab4.MySynonymAnalyzer.createComponents(MySynonymAnalyzer.java:28)
at org.apache.lucene.analysis.Analyzer.tokenStream(Analyzer.java:162)
at org.apache.lucene.document.Field.tokenStream(Field.java:568)
Version 6.4.0 doesn't seem to have the method UTF16toUTF8WithHash in class UnicodeUtil. I am using everything from Lucene 6.4.0 and there doesn't seem to be any old versioned jar in my classpath. This is what my Maven dependencies look like:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<lucene.version>6.4.0</lucene.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-core</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers-common</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queryparser</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-queries</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-analyzers</artifactId>
<version>3.6.2</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-facet</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-spatial</artifactId>
<version>${lucene.version}</version>
</dependency>
<dependency>
<groupId>com.spatial4j</groupId>
<artifactId>spatial4j</artifactId>
<version>0.4.1</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20090211</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>1.3.2</version>
</dependency>
</dependencies>
Any idea what's going on? I am especially puzzled by the [CIILorg/apache/lucene/util/BytesRef text inside the exception description.
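As an aside: the bracketed text is the JVM method descriptor of the missing method — ([CIILorg/apache/lucene/util/BytesRef;)I reads as "takes (char[], int, int, BytesRef), returns int". A NoSuchMethodError at runtime (after a clean compile) almost always means two different versions of a library are on the classpath. A generic way to find the offending jar is to print where a class was actually loaded from (a self-contained sketch using stdlib classes; for this question one would pass org.apache.lucene.util.UnicodeUtil.class and org.apache.lucene.analysis.synonym.SynonymMap.class):

```java
import java.security.CodeSource;

class WhichJar {
    // Report where a class was loaded from; a surprising location here is
    // the usual culprit behind a runtime NoSuchMethodError.
    static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        // Bootstrap-loaded JDK classes have no code source.
        return src == null ? "bootstrap/JDK" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(locationOf(String.class));   // JDK class: bootstrap loader
        System.out.println(locationOf(WhichJar.class)); // this class's classpath entry
    }
}
```

If the printed jar paths for UnicodeUtil and SynonymMap disagree (e.g. one resolves to the old lucene-analyzers 3.6.2 jar listed in the pom), that mismatch is the cause.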

PowerMockito fails to mock javax.faces.context.FacesContext

I get a build error when I run mvn clean install; otherwise the test case works fine and returns the expected result. Is there a Mockito/PowerMock version compatibility issue?
java.lang.IllegalStateException: Failed to transform class with name javax.faces.context.FacesContext. Reason: javax.el.ELContext
at javassist.ClassPool.get(ClassPool.java:450)
at javassist.bytecode.Descriptor.toCtClass(Descriptor.java:592)
at javassist.bytecode.Descriptor.getReturnType(Descriptor.java:489)
at javassist.CtBehavior.getReturnType0(CtBehavior.java:306)
at javassist.CtMethod.getReturnType(CtMethod.java:217)
at org.powermock.core.transformers.impl.MainMockTransformer.modifyMethod(MainMockTransformer.java:172)
at org.powermock.core.transformers.impl.MainMockTransformer.allowMockingOfStaticAndFinalAndNativeMethods(MainMockTransformer.java:142)
at org.powermock.core.transformers.impl.MainMockTransformer.transform(MainMockTransformer.java:65)
at org.powermock.core.classloader.MockClassLoader.loadMockClass(MockClassLoader.java:243)
at org.powermock.core.classloader.MockClassLoader.loadModifiedClass(MockClassLoader.java:177)
at org.powermock.core.classloader.DeferSupportingClassLoader.loadClass(DeferSupportingClassLoader.java:68)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at sun.reflect.generics.factory.CoreReflectionFactory.makeNamedType(CoreReflectionFactory.java:95)
at sun.reflect.generics.visitor.Reifier.visitClassTypeSignature(Reifier.java:107)
at sun.reflect.generics.tree.ClassTypeSignature.accept(ClassTypeSignature.java:31)
at sun.reflect.annotation.AnnotationParser.parseSig(AnnotationParser.java:370)
at sun.reflect.annotation.AnnotationParser.parseClassValue(AnnotationParser.java:351)
at sun.reflect.annotation.AnnotationParser.parseClassArray(AnnotationParser.java:653)
at sun.reflect.annotation.AnnotationParser.parseArray(AnnotationParser.java:460)
at sun.reflect.annotation.AnnotationParser.parseMemberValue(AnnotationParser.java:286)
at sun.reflect.annotation.AnnotationParser.parseAnnotation(AnnotationParser.java:222)
at sun.reflect.annotation.AnnotationParser.parseAnnotations2(AnnotationParser.java:69)
at sun.reflect.annotation.AnnotationParser.parseAnnotations(AnnotationParser.java:52)
at java.lang.Class.initAnnotationsIfNecessary(Class.java:3070)
at java.lang.Class.getAnnotations(Class.java:3050)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.classAnnotations(PowerMockJUnit44RunnerDelegateImpl.java:163)
at org.powermock.modules.junit4.internal.impl.PowerMockJUnit44RunnerDelegateImpl.getDescription(PowerMockJUnit44RunnerDelegateImpl.java:155)
at org.powermock.modules.junit4.common.internal.impl.JUnit4TestSuiteChunkerImpl.getDescription(JUnit4TestSuiteChunkerImpl.java:177)
at org.powermock.modules.junit4.common.internal.impl.AbstractCommonPowerMockRunner.getDescription(AbstractCommonPowerMockRunner.java:47)
at org.powermock.modules.junit4.PowerMockRunner.run(PowerMockRunner.java:51)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
pom.xml
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>1.10.8</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-core</artifactId>
<version>1.5.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-module-junit4</artifactId>
<version>1.5.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-api-mockito</artifactId>
<version>1.5.5</version>
<scope>test</scope>
</dependency>
Test
@Before
public void setup() throws Exception {
PowerMockito.mockStatic(Util.class);
PowerMockito.mockStatic(Service.class);
mockPageCodeBase = Mockito.mock(PageCodeBase.class);
testWorkbenchDetails = new WorkbenchDetails();
PowerMockito.mockStatic(FacesContext.class);
PowerMockito.mockStatic(ExternalContext.class);
facesContext = mock(FacesContext.class);
extContext = mock(ExternalContext.class);
when(FacesContext.getCurrentInstance()).thenReturn(facesContext);
when(facesContext.getExternalContext()).thenReturn(extContext);
facesMessage = Mockito.spy(new FacesMessage());
PowerMockito.whenNew(FacesMessage.class).withAnyArguments().thenReturn(facesMessage);
Iterator<FacesMessage> mockIterator = mock(Iterator.class);
when(facesContext.getMessages()).thenReturn(mockIterator);
sessionScope = mock(HashMap.class);
sessionScope.put("awrId", "12345");
sessionScope.put("userId", "wpsadmin");
requestParameterMap = mock(HashMap.class);
requestParameterMap.put("selectedApplicantId", "selectedApplicantId");
testWorkbenchDetails.setApplicants(MockDataPotalClient.getApplicantWorkbenchDetailsList());
BusinessInformation businessInformation = new BusinessInformation();
businessInformation.setFax("fax");
testWorkbenchDetails.setBusinessInformation(businessInformation);
when(extContext.getSessionMap()).thenReturn(sessionScope);
when(extContext.getRequestParameterMap()).thenReturn(requestParameterMap);
when(requestParameterMap.get("selectedApplicantId")).thenReturn("selectedApplicantId");
Mockito.when(Util.extractApplicant(any(String.class), any(List.class))).thenReturn(MockDataPotalClient.getApplicantWorkbenchDetails());
suppress(field(WorkbenchDetails.class, "applicants"));
PowerMockito.doNothing().when(Service.class, "initiateCCHCheck" ,any(WorkbenchDetails.class),any(ApplicantWorkbenchDetails.class));
workbenchDetails = new WorkbenchDetails() {
private static final long serialVersionUID = 1L;
};
applicant = new ApplicantWorkbenchDetails();
}
@Test
public void initiateCheckTest() throws Exception{
String check = testWorkbenchDetails.initiateCheck();
Assert.assertEquals("SAME_PAGE", check);
}
It looks like you don't have the EL API defined in your dependencies; try adding it:
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>2.2</version>
</dependency>
You don't have to define both Mockito and PowerMock dependencies in your pom.xml:
PowerMock pulls in compatible Mockito jars transitively.
Keep only the two dependencies below in your pom.
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-module-junit4</artifactId>
<version>1.5.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-api-mockito</artifactId>
<version>1.5.5</version>
<scope>test</scope>
</dependency>
