Oracle with Spring Roo - Java

I've been trying to get Spring Roo to generate entities from an Oracle database.
However, I keep getting the error: JDBC driver not available for oracle.jdbc.OracleDriver
I've installed my ojdbc14.jar into my local Maven repository:
mvn install:install-file -Dfile=ojdbc14.jar -DgroupId=com.oracle -DartifactId=ojdbc14 -Dversion=10.2.0.2 -Dpackaging=jar -DgeneratePom=true
I've checked my pom.xml and it points to the correct version, and when I run mvn clean install it compiles correctly.
I'm not sure what I'm missing. I've had it working with MySQL, but no joy with Oracle.
What am I doing wrong?
Roo version 1.2.2.RELEASE
<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc14</artifactId>
  <version>10.2.0.2</version>
  <classifier />
</dependency>
....
<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc14</artifactId>
  <version>10.2.0.2</version>
</dependency>
#Updated at Tue Nov 13 22:43:01 NZDT 2012
#Tue Nov 13 22:43:01 NZDT 2012
database.driverClassName=oracle.jdbc.OracleDriver
database.url=jdbc\:oracle\:thin\:@[server.ip]\:1521\:orcl
database.username=MYUSER
database.password=MYPASSWORD
Where [server.ip] is the database's IP address.
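As a quick sanity check (this snippet is illustrative and not from the original post), you can confirm that the driver class itself loads and connects outside of Roo; if this works but Roo still reports the driver as unavailable, the problem is on Roo's OSGi side rather than in the jar itself:
// Illustrative check that ojdbc14.jar is on the plain Java classpath and the thin URL is valid.
// The host, SID, and credentials below are placeholders; adjust them to your environment.
import java.sql.Connection;
import java.sql.DriverManager;

public class OracleDriverCheck {
    public static void main(String[] args) throws Exception {
        // Throws ClassNotFoundException if ojdbc14.jar is missing from the classpath
        Class.forName("oracle.jdbc.OracleDriver");
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@server.ip:1521:orcl", "MYUSER", "MYPASSWORD")) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}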
Hi,
I didn't do anything with OSGi.
After downloading the jar you mentioned (I downloaded it from here: http://www.java2s.com/Code/Jar/b/DownloadbizaQutebndjar.htm),
I ran the command you gave me, but it produced an error.
oracle/sql/converter_xcharset
lx20001.glb
lx20002.glb
lx2001f.glb
lx200b2.glb
One error
1 : Unresolved references to [javax.naming, javax.naming.directory, javax.naming.spi, javax.net, javax.net.ssl, javax.resource, javax.resource.spi, javax.resource.spi.endpoint, javax.resource.spi.security, javax.security.auth, javax.security.cert, javax.sql, javax.sql.rowset, javax.sql.rowset.spi, javax.transaction.xa, javax.xml.parsers, oracle.i18n.text.converter, oracle.ons, oracle.security.pki, org.w3c.dom, org.xml.sax, org.xml.sax.helpers] by class(es) on the Bundle-Classpath[Jar:ojdbc14.jar]: [oracle/jdbc/pool/OracleConnectionCacheManager$1.class, oracle/jdbc/rowset/OracleCachedRowSetReader.class, oracle/jdbc/pool/OracleRuntimeLoadBalancingEventHandlerThread$1.class, oracle/jdbc/rowset/OracleWebRowSet.class, oracle/jdbc/internal/OracleConnection.class, oracle/jdbc/connector/OracleManagedConnectionMetaData.class, oracle/jdbc/xa/OracleXAConnection.class, oracle/jdbc/xa/OracleXAException.class, oracle/jdbc/xa/OracleXADataSource.class, oracle/jdbc/pool/OracleOCIConnectionPool.class, oracle/jdbc/connector/OracleConnectionManager.class, oracle/jdbc/rowset/OracleJDBCRowSet.class, oracle/net/nt/TcpsConfigure.class, oracle/jdbc/rowset/OraclePredicate.class, oracle/jdbc/pool/OracleRuntimeLoadBalancingEventHandlerThread.class, oracle/jdbc/connector/OracleResourceAdapter.class, oracle/jdbc/pool/OracleConnectionCacheTimeOutThread.class, oracle/net/jndi/CustomSSLSocketFactory.class, oracle/jdbc/rowset/OracleRowSetMetaData.class, oracle/jdbc/driver/T4CXAResource.class, oracle/jdbc/rowset/OracleFilteredRowSet.class, oracle/jdbc/xa/OracleXid.class, oracle/jdbc/rowset/OracleWebRowSetXmlWriter.class, oracle/jdbc/rowset/OracleJoinable.class, oracle/jdbc/pool/OracleConnectionCacheImpl.class, oracle/jdbc/connector/OracleManagedConnectionFactory.class, oracle/jdbc/pool/OracleFailoverEventHandlerThread$1.class, oracle/sql/converter/CharacterConverterFactoryOGS.class, oracle/jdbc/pool/OracleConnectionEventListener.class, oracle/jdbc/rowset/OracleWebRowSetXmlReaderImpl.class, oracle/jdbc/pool/OracleXAConnectionCacheImpl.class, oracle/jdbc/rowset/OracleWebRowSetXmlWriterImpl.class, oracle/jdbc/driver/PhysicalConnection.class, oracle/jdbc/xa/client/OracleXADataSource.class, oracle/net/jndi/TrustManagerSSLSocketFactory.class, oracle/jdbc/xa/client/OracleXAResource.class, oracle/jdbc/xa/OracleXAResource.class, oracle/jdbc/pool/OracleConnectionCacheEventListener.class, oracle/jdbc/rowset/OracleRowSetListenerAdapter.class, oracle/jdbc/rowset/OracleWebRowSetXmlReaderDomHandler.class, oracle/jdbc/connector/OracleConnectionRequestInfo.class, oracle/jdbc/pool/OracleDataSourceFactory.class, oracle/net/jndi/JndiAttrs.class, oracle/jdbc/rowset/OracleJoinRowSet.class, oracle/jdbc/rowset/OracleWebRowSetXmlReader.class, oracle/jdbc/xa/client/OracleXAHeteroConnection.class, oracle/jdbc/driver/T4CXAConnection.class, oracle/net/nt/CustomSSLSocketFactory.class, oracle/net/nt/TcpsNTAdapter.class, oracle/jdbc/rowset/OracleCachedRowSetWriter.class, oracle/jdbc/rowset/OracleCachedRowSet.class, oracle/jdbc/pool/OraclePooledConnection.class, oracle/jdbc/pool/OracleFailoverEventHandlerThread.class, oracle/jdbc/xa/client/OracleXAConnection.class, oracle/jdbc/pool/OracleConnectionPoolDataSource.class, oracle/jdbc/driver/LogicalConnection.class, oracle/jdbc/pool/OracleConnectionCacheManager.class, oracle/jdbc/rowset/OracleRowSet.class, oracle/jdbc/pool/OracleImplicitConnectionCache.class, oracle/jdbc/connector/OracleManagedConnection.class, oracle/jdbc/pool/OracleConnectionCache.class, oracle/jdbc/xa/client/OracleXAHeteroResource.class, 
oracle/jdbc/driver/OracleDriver.class, oracle/jdbc/rowset/OracleWebRowSetXmlReaderContHandler.class, oracle/jdbc/connector/OracleLocalTransaction.class, oracle/net/jndi/TrustManager.class, oracle/jdbc/pool/OracleDataSource.class]
So I didn't get the expected result.
Any idea on this?
Thanks for the help.

Did you create an OSGi bundle of the Oracle JDBC driver (ojdbc)? From the Roo docs:
currently there are no open-source JDBC drivers for Oracle or DB2 and
Roo does not provide OSGi drivers for these databases.
You can follow the instructions in those docs. Otherwise, biz.aQute.bnd.jar helps you create an OSGi bundle version of the ojdbc driver.
To do that, download this zip file and unzip it. Then put your ojdbc14.jar in the same folder and run the command:
java -jar biz.aQute.bnd.jar wrap ojdbc14.jar
I get one warning (Superfluous export-package instructions: [oracle.net, oracle, oracle.jpub, oracle.security, oracle.core]) which I ignore.
As a result of this step you should get a new file: ojdbc14.bar
Rename it to whatever you want, but with a .jar extension, e.g. 'ojdbc14-osgi.jar'.
Install the jar in roo with
roo> osgi start --url file:///tmp/ojdbc14-osgi.jar
roo> database reverse engineer ... and so on
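For reference, the reverse engineer step might look something like this (the schema and package names here are placeholders, not taken from the original answer):
roo> database reverse engineer --schema MYSCHEMA --package ~.domain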
Just one thing to note: remember to edit the version of the ojdbc14 artifact in the Roo-generated pom.xml if necessary.
Hope it helps. I have done it with 3 projects and 3 databases without problems.

If you are getting the error while running mvn jetty:run or mvn tomcat:run, you can try adding the dependency to the Tomcat and Jetty plugin configurations:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>tomcat-maven-plugin</artifactId>
  <version>1.1</version>
  <configuration>
    <httpsPort>9443</httpsPort>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>com.oracle</groupId>
      <artifactId>ojdbc14</artifactId>
      <version>10.2.0.2</version>
    </dependency>
  </dependencies>
</plugin>
<plugin>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-maven-plugin</artifactId>
  <version>8.1.4.v20120524</version>
  <configuration>
    <webAppConfig>
      <contextPath>/${project.name}</contextPath>
    </webAppConfig>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>com.oracle</groupId>
      <artifactId>ojdbc14</artifactId>
      <version>10.2.0.2</version>
    </dependency>
  </dependencies>
</plugin>
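With the driver declared as a plugin dependency like this, it should be on the plugin's classpath when you start the embedded container, for example:
mvn tomcat:run
mvn jetty:run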


JcaPEMKeyConverter is provided by BouncyCastle, an optional dependency. To use support for EC Keys you must explicitly add dependency to classpath

I have a simple Flink streaming app. It runs well in a cluster created by start-cluster.sh command.
Now based on the Flink tutorial, I hope to deploy it in application mode natively in a Kubernetes cluster created by k3d on macOS.
First, I created a cluster by k3d cluster create dev.
Here is my Dockerfile:
FROM flink
RUN mkdir -p $FLINK_HOME/usrlib
COPY target/streaming-0.1.jar $FLINK_HOME/usrlib/streaming-0.1.jar
I built and pushed it to Docker Hub.
My cluster name is k3d-dev, so I ran
flink run-application \
--target kubernetes-application \
-Dkubernetes.cluster-id=k3d-dev \
-Dkubernetes.container.image=hongbomiao/my-flink-xxx:latest \
local:///opt/flink/usrlib/streaming-0.1.jar
However, I got error:
The program finished with the following exception:
io.fabric8.kubernetes.client.KubernetesClientException: JcaPEMKeyConverter is provided by BouncyCastle, an optional dependency. To use support for EC Keys you must explicitly add this dependency to classpath.
at io.fabric8.kubernetes.client.internal.CertUtils.handleECKey(CertUtils.java:161)
at io.fabric8.kubernetes.client.internal.CertUtils.loadKey(CertUtils.java:131)
at io.fabric8.kubernetes.client.internal.CertUtils.createKeyStore(CertUtils.java:111)
at io.fabric8.kubernetes.client.internal.CertUtils.createKeyStore(CertUtils.java:243)
at io.fabric8.kubernetes.client.internal.SSLUtils.keyManagers(SSLUtils.java:128)
at io.fabric8.kubernetes.client.internal.SSLUtils.keyManagers(SSLUtils.java:122)
at io.fabric8.kubernetes.client.utils.HttpClientUtils.createHttpClient(HttpClientUtils.java:82)
at io.fabric8.kubernetes.client.utils.HttpClientUtils.createHttpClient(HttpClientUtils.java:62)
at io.fabric8.kubernetes.client.BaseClient.<init>(BaseClient.java:51)
at io.fabric8.kubernetes.client.DefaultKubernetesClient.<init>(DefaultKubernetesClient.java:105)
at org.apache.flink.kubernetes.kubeclient.FlinkKubeClientFactory.fromConfiguration(FlinkKubeClientFactory.java:102)
at org.apache.flink.kubernetes.KubernetesClusterClientFactory.createClusterDescriptor(KubernetesClusterClientFactory.java:61)
at org.apache.flink.kubernetes.KubernetesClusterClientFactory.createClusterDescriptor(KubernetesClusterClientFactory.java:39)
at org.apache.flink.client.deployment.application.cli.ApplicationClusterDeployer.run(ApplicationClusterDeployer.java:63)
at org.apache.flink.client.cli.CliFrontend.runApplication(CliFrontend.java:213)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1057)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
After reading
https://www.mail-archive.com/search?l=dev#spark.apache.org&q=subject:"Should+we+add+built+in+support+for+bouncy+castle+EC+w\%2FKube"&o=newest&f=1
https://github.com/de-jcup/ekube/issues/63#issuecomment-753508790
I added
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcpkix-jdk15on</artifactId>
  <version>1.69</version>
</dependency>
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-jdk15on</artifactId>
  <version>1.69</version>
</dependency>
<dependency>
  <groupId>org.bouncycastle</groupId>
  <artifactId>bcprov-ext-jdk15on</artifactId>
  <version>1.69</version>
</dependency>
to my pom.xml file. I built and pushed to Docker Hub again.
When I ran the Flink command above, I still got the same error. Any ideas? Thanks!
UPDATE 1:
Besides the pom.xml change above, I manually downloaded those 3 jars and changed my Dockerfile to
FROM flink
COPY lib/* $FLINK_HOME/lib
RUN mkdir -p $FLINK_HOME/usrlib
COPY target/streaming-0.1.jar $FLINK_HOME/usrlib/streaming-0.1.jar
and tried again, but still the same error.
I can confirm the 3 jar files bcpkix-jdk15on-1.69.jar, bcprov-ext-jdk15on-1.69.jar, bcprov-jdk15on-1.69.jar are in the Docker image:
➜ docker run -it 6c48af48db55c334003a307d1ef7a5fc5181f389613284b66b5cb97588b9708d sh
$ cd lib && ls
bcpkix-jdk15on-1.69.jar flink-dist_2.12-1.13.2.jar flink-table_2.12-1.13.2.jar log4j-slf4j-impl-2.12.1.jar
bcprov-ext-jdk15on-1.69.jar flink-json-1.13.2.jar log4j-1.2-api-2.12.1.jar
bcprov-jdk15on-1.69.jar flink-shaded-zookeeper-3.4.14.jar log4j-api-2.12.1.jar
flink-csv-1.13.2.jar flink-table-blink_2.12-1.13.2.jar log4j-core-2.12.1.jar
$ cd ../usrlib && ls
streaming-0.1.jar
UPDATE 2:
I tried to start the session mode by
/usr/local/Cellar/apache-flink/1.13.1/libexec/bin/kubernetes-session.sh
but still got the same error. So now I can confirm that the issue I had when using application mode before is not related to my Docker image.
Those jars are located in ~/.m2 on my machine:
Did I miss any other jars?
Also, I found the error only happens to the cluster created by k3d/k3s, but not minikube.
After checking the code of
/usr/local/Cellar/apache-flink/1.13.1/libexec/bin/kubernetes-session.sh
/usr/local/Cellar/apache-flink/1.13.1/libexec/libexec/kubernetes-session.sh
The first script points to the second script, and the second script has
# ...
CC_CLASSPATH=`manglePathList $(constructFlinkClassPath):$INTERNAL_HADOOP_CLASSPATHS`
# ...
"$JAVA_RUN" $JVM_ARGS -classpath "$CC_CLASSPATH" $log_setting org.apache.flink.kubernetes.cli.KubernetesSessionCli "$#"
I added echo $CC_CLASSPATH, and printed out the classpath.
In my case, it is at /usr/local/Cellar/apache-flink/1.13.1/libexec/lib.
After I put bcprov-jdk15on-1.69.jar and bcpkix-jdk15on-1.69.jar in the folder above, Flink can be deployed to k3s (k3d) now in both session and application modes.
Summary
Download bcprov-jdk15on and bcpkix-jdk15on jar files at
https://mvnrepository.com/artifact/org.bouncycastle/bcprov-jdk15on
https://mvnrepository.com/artifact/org.bouncycastle/bcpkix-jdk15on
Then move them to the folder
/usr/local/Cellar/apache-flink/{version}/libexec/lib
Then you are good to go.
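As a concrete sketch of those steps (the URLs follow the standard Maven Central layout, and the destination path assumes the Homebrew Flink 1.13.1 install from this post; adjust to your version):
# Download the BouncyCastle jars from Maven Central
curl -LO https://repo1.maven.org/maven2/org/bouncycastle/bcprov-jdk15on/1.69/bcprov-jdk15on-1.69.jar
curl -LO https://repo1.maven.org/maven2/org/bouncycastle/bcpkix-jdk15on/1.69/bcpkix-jdk15on-1.69.jar
# Put them on the Flink client's classpath
cp bcprov-jdk15on-1.69.jar bcpkix-jdk15on-1.69.jar /usr/local/Cellar/apache-flink/1.13.1/libexec/lib/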
I stumbled upon this problem today on my M1 Mac, using Docker and k3d to create a k3s cluster.
I solved it by actually doing what the error message says: adding BouncyCastle to the classpath.
But you need to add it to the plugin's classpath in the pom.xml, like so:
...
<plugin>
  <groupId>${quarkus.platform.group-id}</groupId>
  <artifactId>quarkus-maven-plugin</artifactId>
  <version>${quarkus.platform.version}</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <goals>
        <goal>build</goal>
        <goal>generate-code</goal>
        <goal>generate-code-tests</goal>
      </goals>
    </execution>
  </executions>
  <!-- add bouncy castle to class path, so that elliptic curve keys work -->
  <dependencies>
    <dependency>
      <groupId>org.bouncycastle</groupId>
      <artifactId>bcpkix-jdk15on</artifactId>
      <version>1.69</version>
    </dependency>
    <dependency>
      <groupId>org.bouncycastle</groupId>
      <artifactId>bcprov-jdk15on</artifactId>
      <version>1.69</version>
    </dependency>
    <dependency>
      <groupId>org.bouncycastle</groupId>
      <artifactId>bcprov-ext-jdk15on</artifactId>
      <version>1.69</version>
    </dependency>
  </dependencies>
</plugin>
...
I don't know for sure, but it seems to me that k3d always generates EC keys, as I had this exact problem on a totally different machine (Win10/WSL2 + k3d) as well.
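If you want to verify that your cluster's client credentials really are EC keys (which is what triggers the BouncyCastle requirement), one way is to decode the client key from your kubeconfig and look at its PEM header; this sketch assumes the first user entry in the kubeconfig is the one your current context uses:
kubectl config view --raw -o jsonpath='{.users[0].user.client-key-data}' | base64 -d | head -n 1
# For a k3d/k3s cluster this typically prints: -----BEGIN EC PRIVATE KEY-----
# (use base64 -D on older macOS versions)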

Eclipse generates the next error: The type com.google.common.collect.ImmutableList$Builder [duplicate]

I'm writing Selenium JUnit tests with IntelliJ. The tests run OK if I trigger them from the test directly. However, if I trigger the tests from TestRunnerSuite with JUnitCore, I encounter the following weird error, for which I did not find a solution after researching on Google. There are similar questions about DriverService$Builder, but not with my error type.
[main] ERROR sire.responseOrg.TestIncidents - java.lang.AbstractMethodError: org.openqa.selenium.remote.service.DriverService$Builder.createArgs()Lcom/google/common/collect/ImmutableList;
at org.openqa.selenium.remote.service.DriverService$Builder.build(DriverService.java:332)
at org.openqa.selenium.chrome.ChromeDriverService.createDefaultService(ChromeDriverService.java:88)
at org.openqa.selenium.chrome.ChromeDriver.<init>(ChromeDriver.java:123)
at sire.responseOrg.WebDrivers.getInstance(WebDrivers.java:15)
at sire.responseOrg.util.util1.setupChromeDriver(util1.java:51)
at sire.responseOrg.Test1.setUp(Test1.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at ......Omitted
at org.junit.runner.JUnitCore.run(JUnitCore.java:127)
at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:76)
at sire.responseOrg.TestSuiteRunner.main(TestSuiteRunner.java:24)
I'm using Selenium 3.5.3 and Chrome 76. ---> Updated to Selenium 3.141.59, and with main scope.
Now I'm getting this error:
java.lang.NoClassDefFoundError: org/apache/http/auth/Credentials
at org.openqa.selenium.remote.HttpCommandExecutor.getDefaultClientFactory(HttpCommandExecutor.java:93)
at org.openqa.selenium.remote.HttpCommandExecutor.<init>(HttpCommandExecutor.java:72)
at org.openqa.selenium.remote.service.DriverCommandExecutor.<init>(DriverCommandExecutor.java:63)
at org.openqa.selenium.chrome.ChromeDriverCommandExecutor.<init>(ChromeDriverCommandExecutor.java:36)
at org.openqa.selenium.chrome.ChromeDriver.<init>(ChromeDriver.java:181)
at org.openqa.selenium.chrome.ChromeDriver.<init>(ChromeDriver.java:168)
at org.openqa.selenium.chrome.ChromeDriver.<init>(ChromeDriver.java:123)
at sire.responseOrg.WebDrivers.getInstance(WebDrivers.java:15)
at sire.responseOrg.util.SeleniumUtil.setupChromeDriver(SeleniumUtil.java:62)
at sire.responseOrg.TestIncidents.setUp(TestIncidents.java:29)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:27)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
at org.junit.runners.ParentRunner.run(ParentRunner.java:292)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:24)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:231)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:60)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:229)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:50)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:222)
at org.junit.runners.ParentRunner.run(ParentRunner.java:292)
at org.junit.runner.JUnitCore.run(JUnitCore.java:157)
at org.junit.runner.JUnitCore.run(JUnitCore.java:136)
at org.junit.runner.JUnitCore.run(JUnitCore.java:127)
at org.junit.runner.JUnitCore.runClasses(JUnitCore.java:76)
at sire.responseOrg.TestSuiteRunner.main(TestSuiteRunner.java:24)
Caused by: java.lang.ClassNotFoundException: org.apache.http.auth.Credentials
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 33 more
Full pom.xml dependencies
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>myGroupId</groupId>
  <artifactId>myArtifactId</artifactId>
  <version>1.0-SNAPSHOT</version>
  <description>My description</description>
  <dependencies>
    <!-- https://mvnrepository.com/artifact/junit/junit -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.9</version>
      <scope>main</scope>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-api</artifactId>
      <version>3.141.59</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>3.141.59</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-chrome-driver</artifactId>
      <version>3.141.59</version>
      <scope>main</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.6</version>
      <scope>main</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-simple</artifactId>
      <version>1.7.6</version>
      <scope>main</scope>
    </dependency>
    <dependency>
      <groupId>com.salesforce.seti</groupId>
      <artifactId>selenium-dependencies</artifactId>
      <version>1.0.3</version>
    </dependency>
  </dependencies>
  <build>
    <pluginManagement>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.5.1</version>
          <configuration>
            <source>1.8</source>
            <target>1.8</target>
          </configuration>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-jar-plugin</artifactId>
          <version>3.1.2</version>
          <executions>
            <execution>
              <goals>
                <goal>test-jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
  <packaging>pom</packaging>
</project>
My project folder structure is
.src
...main
.....java
.......projectname
.........constantsFolder
.........utilFolder
...........util1.java
...........util2.java
.........Test1.java
.........TestRunnerSuite.java
.........WebDrivers.java
If I start the test from Test1.java, the test runs normally, though with warnings:
[main] INFO projectname.util.util1 - Set up chrome driver.
Starting ChromeDriver 75.0.3770.90 (a6dcaf7e3ec6f70a194cc25e8149475c6590e025-refs/branch-heads/3770#{#1003}) on port 28755
Only local connections are allowed.
Please protect ports used by ChromeDriver and related test frameworks to prevent access by malicious code.
[1566609934.853][WARNING]: This version of ChromeDriver has not been tested with Chrome version 76.
Aug 23, 2019 6:25:34 PM org.openqa.selenium.remote.ProtocolHandshake createSession
INFO: Detected dialect: W3C
[main] INFO projectname.util.util1 - Navigating to https://mytest.com/
However, after adding a TestSuiteRunner as below:
@RunWith(Suite.class)
@Suite.SuiteClasses({ Test1.class })
public class TestSuiteRunner {
    public static void main(String[] args) {
        Result result = JUnitCore.runClasses(Test1.class);
        // print errors, exit etc. omitted
    }
}
Now I get the weird error and cannot start the ChromeDriver.
The WebDriver I have is a singleton:
public class WebDrivers {
    private static WebDriver driver = null;

    public static WebDriver getInstance() {
        if (driver == null) {
            driver = new ChromeDriver();
        }
        return driver;
    }
}
It's my first time setting everything up from the ground up. I'm not sure if it's a POM dependency issue, a singleton WebDriver issue, or something else. Could anyone take a look at this and give some clues? Much appreciated.
This error message...
java.lang.AbstractMethodError: org.openqa.selenium.remote.service.DriverService$Builder.createArgs()Lcom/google/common/collect/ImmutableList;
...implies that there is some incompatibility between the versions of the binaries you are using, specifically with the Guava dependency.
You are using Chrome 76.0.
You are using the following:
<dependency>
  <groupId>org.seleniumhq.selenium</groupId>
  <artifactId>selenium-chrome-driver</artifactId>
  <version>3.5.3</version>
  <scope>test</scope>
</dependency>
Your Selenium client version is 3.5.3, which is more than 2 years old.
Your JDK version is unknown to us.
So there is a clear mismatch between Selenium client v3.5.3 and Chrome browser v76.0.
However as per the discussions in:
java.lang.NoSuchMethodError: com.google.common.collect.ImmutableList.builderWithExpectedSize
NoSuchMethodError: com.google.common.collect.ImmutableList.toImmutableList()Ljava/util/stream/Collector; after upgrade to 2.0.16
These issues crop up due to an incompatible Guava dependency.
The current guava version used within selenium-java-3.141.59 is guava-25.0-jre
Solution
Ensure that:
JDK is upgraded to current levels (JDK 8u222).
Selenium is upgraded to current levels (version 3.141.59).
Clean your Project Workspace through your IDE and Rebuild your project with required dependencies only.
If your base Web Client version is too old, then uninstall it and install a recent GA and released version of Web Client.
Take a System Reboot.
Execute your @Test as a non-root user.
Always invoke driver.quit() within the tearDown(){} method to close and destroy the WebDriver and Web Client instances gracefully (a minimal sketch follows this list).
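For reference, a minimal JUnit 4 sketch of that setUp/tearDown pattern (class and field names here are illustrative, not taken from the original post):
import org.junit.After;
import org.junit.Before;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class ExampleTest {
    private WebDriver driver;

    @Before
    public void setUp() {
        // Start a fresh ChromeDriver session for each test
        driver = new ChromeDriver();
    }

    @After
    public void tearDown() {
        // quit() closes the browser windows and ends the ChromeDriver process
        if (driver != null) {
            driver.quit();
        }
    }
}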
Update
So presumably your main question with respect to the error:
java.lang.AbstractMethodError: org.openqa.selenium.remote.service.DriverService$Builder.createArgs()Lcom/google/common/collect/ImmutableList;
is solved. Congratulations.
Now, as per your question update, you are seeing the error:
java.lang.NoClassDefFoundError: org/apache/http/auth/Credentials
There are two aspects.
NoClassDefFoundError: NoClassDefFoundError in Java occurs when the Java Virtual Machine is not able to find a particular class at runtime which was available at compile time. You can find a detailed discussion in Exception in thread “main” java.lang.NoClassDefFoundError: org/openqa/selenium/WebDriver
http/auth: Traces of http/auth imply the Apache HttpClient is still in use, whereas the CHANGELOG reflects:
The HttpClient implementation details were moved out of HttpCommandExecutor right from Selenium v2.45.0.
With the availability of Selenium v3.11, Selenium Grid was switched to use OkHttp rather than the Apache HttpClient.
Further, with the release of Selenium v3.141.0, Apache HttpClient was removed from selenium-server-standalone, which drastically reduced the size of the Selenium server distribution package.
Even the Apache-backed httpclient was also removed.
You can find a detailed discussion in org.openqa.selenium.remote.internal.ApacheHttpClient is deprecated in selenium 3.14.0 - What should be used instead?
Remove the scope from your POM. Neither test nor main is needed for your tests to run.
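In other words, a dependency such as the selenium-chrome-driver one from the question would simply become:
<dependency>
  <groupId>org.seleniumhq.selenium</groupId>
  <artifactId>selenium-chrome-driver</artifactId>
  <version>3.141.59</version>
</dependency>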
I used this Guava jar with the latest TestNG 7.3, which resolved this error. Also, don't configure TestNG separately; remove that configuration if it was added in pom.xml.
I was facing the same issue. The following steps helped resolve it:
Go to your POM file and comment out/ remove the following dependency:
--> org.seleniumhq.selenium
--> selenium-java
--> 2.48.2
Check what version you have.
Then copy the latest Maven dependency, which is 4.1.1 as of now (see the snippet after these steps).
Perform mvn clean install (up to this point it should be sufficient).
Perform reload all maven projects
Perform download sources
Build your project again
Play/ perform mvn clean install again
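For example, the replacement dependency would look something like this (4.1.1 being the version mentioned above):
<dependency>
  <groupId>org.seleniumhq.selenium</groupId>
  <artifactId>selenium-java</artifactId>
  <version>4.1.1</version>
</dependency>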
FYI: as far as I know, this issue occurs when you execute code against dependencies that differ from those it was compiled with, for example when you run code in a new project that was compiled earlier with different dependencies and versions.

Spark and Kafka issue - Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.spark.streaming.kafka010.LocationStrategies

I will start by mentioning that I have tried all the suggestions in similar topics and nothing worked for me, so please, this is not a duplicate question.
The issue that I am having is as follows -
I am trying to run a sample Java application on Spark, using Spark Streaming and Kafka. I have added all the required dependencies:
<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.12</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
</dependencies>
After deploying the jar on a server where I would like to run my application (I have already set up the environment with Spark and Kafka, and created the relevant topic), I am trying to spark-submit it and getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.spark.streaming.kafka010.LocationStrategies
at JavaWordCount.main(JavaWordCount.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:90)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:508)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka010.LocationStrategies
at java.net.URLClassLoader.findClass(URLClassLoader.java:609)
at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:924)
at java.lang.ClassLoader.loadClass(ClassLoader.java:869)
at java.lang.ClassLoader.loadClass(ClassLoader.java:852)
It seems that the workers cannot identify the dependencies as part of the environment. I did some research online, and many suggest creating an assembly JAR with the maven-shade-plugin. So I also tried packaging the jar this way with Maven, but still no success.
For reference, here is where the app is failing:
// Configure Spark to connect to Kafka running on the local machine
Map<String, Object> kafkaParams = new HashMap<>();
kafkaParams.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
kafkaParams.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
kafkaParams.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
kafkaParams.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);

// Configure Spark to listen to messages in topic "wordCount"
Collection<String> topics = Arrays.asList("wordCount");

SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("SparkKafkaWordCount");

// Read messages in batches of 30 seconds
JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(30));

// Start reading messages from Kafka and get a DStream
final JavaInputDStream<ConsumerRecord<String, String>> stream =
        KafkaUtils.createDirectStream(jssc, LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams));
In the last line above, the class LocationStrategies is not recognized, even though I have added the right dependency to the pom.xml.
Any ideas on how to fix this issue?
Add this to your pom.xml:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <archive>
      <manifest>
        <mainClass>your main class</mainClass>
      </manifest>
    </archive>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
</plugin>
and then build the jar with
mvn clean compile assembly:single
You should get two jars in the target directory, one without dependencies and one with dependencies (your-jar-1.0-jar-with-dependencies.jar)
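You would then submit the jar with dependencies; for instance (the main class and jar name here are placeholders):
spark-submit --class your.main.Class target/your-jar-1.0-jar-with-dependencies.jar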
I do not get this error when I include the following jars under --jars in the spark-submit command:
sql-kafka-0-10_2.11-2.2.1.jar
kafka-clients-0.10.1.0.jar
spark-streaming-kafka-0-10_2.11-2.2.1.jar
spark-streaming-kafka-0-10-assembly_2.11-2.2.1.jar
spark-streaming-kafka_2.11-1.6.3.jar
I faced the same issue too and didn't get much help from googling, but what I've understood after reading many threads is that the dependencies mentioned in the pom.xml file have the scope "provided", which means we need to supply the dependent jar files at execution time. Also, all the examples inside the Apache Spark package are compiled into a single jar file, and we need to specify the classpath to execute the required module. Download the necessary jar files which you've mentioned in pom.xml and execute like this:
spark-submit --jars kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar <brokerip:port> <topic>
But before that, you need to rewrite the consumer properties in the Java file, or else you will get an error saying the configuration is missing:
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "<group_id>");

SonarQube "Class Not Found" during Main AST Scan

My setup:
Sonarqube 5.1.1
Sonar-Maven Plugin 2.6 (also tried 2.7 and 3.6)
JDK 1.7.0_51
Example of the error:
16:00:54 [INFO] [23:00:54.219] Sensor JavaSquidSensor
16:00:55 [INFO] [23:00:55.030] Java Main Files AST scan...
16:00:55 [INFO] [23:00:55.030] 1532 source files to be analyzed
16:00:58 [ERROR] [23:00:57.927] Class not found: javax.annotation.Nullable
16:00:58 [ERROR] [23:00:57.928] Class not found: javax.annotation.CheckReturnValue
16:00:58 [ERROR] [23:00:58.114] Class not found: javax.annotation.Nullable
According to this Stack Overflow question, javax.annotation should be part of Java 1.7 and up. Furthermore, I've tried putting it in the local Maven repository, but that didn't help.
So where is Sonar trying to find this package? Any help?!?
Update:
I've tried modifying the sonar-maven-plugin to include a dependency on javax.annotation.
I've tried putting the dependency in my Maven settings.xml.
Upgrading my JDK to 1.8 has not helped.
According to http://docs.oracle.com/javase/7/docs/api/index.html?javax/annotation/package-summary.html the classes you expect are not part of JDK 7.
The classes you're looking for are part of the Google JSR-305 implementation that was initiated here https://code.google.com/p/jsr-305/source/browse/trunk/ri/src/main/java/javax/annotation/Nullable.java?r=24 and which moved to FindBugs:
<dependency>
  <groupId>com.google.code.findbugs</groupId>
  <artifactId>jsr305</artifactId>
  <version>3.0.0</version>
</dependency>
According to https://jcp.org/en/jsr/detail?id=305 the JSR-305 is finished, but is in dormant status and has not been added to a JDK release yet.
Hope it helps.
To avoid adding SonarQube-specific dependencies to your project, define a profile like this:
<profile>
  <id>sonarqube</id>
  <dependencies>
    <dependency>
      <groupId>org.joda</groupId>
      <artifactId>joda-convert</artifactId>
      <version>1.2</version>
    </dependency>
    <dependency>
      <groupId>com.google.code.findbugs</groupId>
      <artifactId>jsr305</artifactId>
      <version>3.0.0</version>
    </dependency>
  </dependencies>
</profile>
Then run your sonar analysis with a command like
mvn org.sonarsource.scanner.maven:sonar-maven-plugin:3.0.1:sonar -Psonarqube,sonarqube-dev
The sonarqube-dev profile is defined in my ~/.m2/settings.xml, and it just specifies where my development-environment SonarQube installation is:
<profile>
  <id>sonarqube-dev</id>
  <properties>
    <!-- no direct db connections in new sonar -->
    <sonar.host.url>http://localhost:9000/</sonar.host.url>
  </properties>
</profile>
What is achieved by all this?
SonarQube-analysis-specific dependencies don't pollute the project unnecessarily.
No SonarQube Maven plugin is defined in pom.xml; each developer and Jenkins can use whatever Sonar plugin and server installation they wish.
This is more an addendum to the latest answer:
I see similar problems, and adding the Google FindBugs dependency to the project dependencies helps. Similar problems occurred with joda-convert, like
[ERROR] [20:44:25.247] Class not found: org.joda.convert.ToString
Hence I also added
<dependency>
  <groupId>org.joda</groupId>
  <artifactId>joda-convert</artifactId>
  <version>1.8.1</version>
  <scope>provided</scope>
</dependency>
But note that I set the scope to provided to prevent these new dependencies from being added to the resulting WAR file.
However, I still wonder why these errors occur, since none of the analyzed classes seem to use these annotations.

Maven plugin builds but can't execute due to java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory

I am using the maven-jspc-plugin in my pom.xml.
When I try to execute the jsp-compile goal (which executes the plugin), I get:
Caused by: java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at org.apache.juli.logging.Slf4jLog.<init>(Slf4jLog.java:29)
at org.apache.juli.logging.LogFactory.getLog(LogFactory.java:54)
at org.apache.juli.logging.LogFactory.getLog(LogFactory.java:35)
at org.apache.sling.scripting.jsp.jasper.compiler.OriginalTldLocationsCache.<init>(OriginalTldLocationsCache.java:81)
at org.apache.sling.maven.jspc.JspcMojo.initServletContext(JspcMojo.java:426)
I've tried downloading the (open) source for the maven-jspc-plugin and I am able to easily "mvn install" it -- I don't get any build issues. However, when I use that build in my project POM it still crashes and tells me it can't find LoggerFactory.
I've logged an issue with the Apache Sling project but am not making much headway.
https://issues.apache.org/jira/browse/SLING-2350
This link includes some more troubleshooting info as well as a simple Maven project that uses the Maven plugin. Downloading the jspc-test.zip and running "mvn install" will result in the error I've mentioned.
Also, I took a peek at the org.apache.juli pom.xml and it doesn't appear to list any dependencies at all.
Any thoughts on how to resolve would be appreciated.
Thanks!
Plugin dependencies are supplied in a different part of the POM:
<project>
  <dependencies>
    <!-- dependencies defined here don't get included for plugins -->
    ...
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <!-- .... jspc plugin section .... -->
        <dependencies>
          <dependency>
            <!-- Try adding slf4j here; the version below is only an example -->
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.6</version>
          </dependency>
        </dependencies>
      </plugin>
    </plugins>
  </build>
</project>
Though it does sound like their POM is invalid if it doesn't already specify slf4j.
