I recently ran into a problem with Testcontainers. I am using the following dependency:
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>dynalite</artifactId>
    <version>1.17.6</version>
    <scope>test</scope>
</dependency>
Here is the relevant part of the test class:

@Rule
public DynaliteContainer dynamoDB2 =
    new DynaliteContainer(
        DockerImageName.parse("harbor-main.test.com/teamb/dynalite")
            .withTag("v1.2.1-1")
            .asCompatibleSubstituteFor("quay.io/testcontainers/dynalite"));
I can run the tests successfully in my IDE, but in Jenkins I get the following:
23/02/13 00:26:29 WARN TestcontainersConfiguration:
Attempted to read Testcontainers configuration file at file:/root/.testcontainers.properties but the file was not found.
Exception message: FileNotFoundException: /root/.testcontainers.properties
(No such file or directory)
I have the properties file on the classpath as well. Is there any way to override the config file path?
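Worth noting: that message is logged at WARN level, and Testcontainers still falls back to a testcontainers.properties file found on the classpath, so the missing user-level file by itself should not break the build. I am not aware of a supported switch for relocating ~/.testcontainers.properties, but individual settings can be overridden through environment variables (property name uppercased, dots replaced by underscores, prefixed with TESTCONTAINERS_), which is often the easiest route on a CI agent. A hypothetical Jenkins agent configuration (these particular properties are illustrative, not taken from the question):

# Hypothetical environment for the Jenkins agent; each variable mirrors a
# property that could otherwise live in testcontainers.properties.
TESTCONTAINERS_HUB_IMAGE_NAME_PREFIX=harbor-main.test.com/teamb/
TESTCONTAINERS_CHECKS_DISABLE=true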
I am able to use the SQLServerDriver when I run my project through Maven and can successfully read from and write to my database. I'm testing some changes in a specific file, and I'm trying to test that individual file by right-clicking it and running it on its own. When it gets to the setup code it throws an exception (java.lang.ClassNotFoundException: com.microsoft.sqlserver.jdbc.SQLServerDriver).
The code (with the imports it needs):

import java.sql.Connection;
import java.sql.DriverManager;

// Load the JDBC driver class, then open the connection.
Class.forName(dbDriverMsSQL);
Connection conn = DriverManager.getConnection(dbConnectionString);
with this separately defined:
public static String dbDriverMsSQL = "com.microsoft.sqlserver.jdbc.SQLServerDriver";
I'm using the following dependency in my pom.xml:
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>mssql-jdbc</artifactId>
    <version>7.4.1.jre8</version>
    <scope>test</scope>
</dependency>
Is there some additional setting that I have to configure to be able to read/write from my main()?
Posting @TomStroemer's comment to mark this resolved:

Your dependency scope is test - do you use this code in a test? If not, this might cause the error. You can also try Class.forName(SQLServerDriver.class.getName()); if IntelliJ can't find the import, you have a problem with your dependencies.

I removed the scope and could run successfully.
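In other words, the fix amounts to dropping the test scope so the driver lands on the main runtime classpath as well (Maven's default scope is compile):

<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>mssql-jdbc</artifactId>
    <version>7.4.1.jre8</version>
</dependency>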
I am trying to use log4j2 with Spring Boot (2.0.3).
I have added the dependency below to my pom file and excluded spring-boot-starter-logging as well:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
The problem is that when I deploy my war on Tomcat with a config file named log4j2.xml on the classpath (WEB-INF/classes), logs are generated; but if I try to use a different file name by putting the property below in application.properties, it does not work:
logging.config=classpath:mylog4j2.xml
I have put mylog4j2.xml in the same folder (WEB-INF/classes) as log4j2.xml, but still no logs are generated. Also, I am not allowed to set the config path using command-line arguments and have to use the properties file only. Can anyone please help me out?
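For comparison, logging.config=classpath:mylog4j2.xml is the documented Spring Boot property for pointing at a custom logging config, so the property itself looks right. A minimal sketch of what mylog4j2.xml could contain (the appender name and pattern are illustrative, not from the question):

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>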
I used an H2 database in a unit test, configured via Java config:

@Configuration
@EnableJpaRepositories(basePackageClasses = AdvertisementRepository.class)
public class EmbeddedDatabaseConfig {

    /**
     * Creates a DataSource for an embedded database (H2).
     */
    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
    }
}
and set the scope to test in pom.xml:
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <version>1.4.184</version>
    <scope>test</scope>
</dependency>
It builds successfully and the JUnit tests pass, but when I run it on the server, it reports this error:
java.lang.ClassNotFoundException: org.h2.Driver
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1274)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1108)
at org.springframework.util.ClassUtils.forName(ClassUtils.java:250)
at org.springframework.jdbc.datasource.embedded.H2EmbeddedDatabaseConfigurer.getInstance(H2EmbeddedDatabaseConfigurer.java:48)
at org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseConfigurerFactory.getConfigurer(EmbeddedDatabaseConfigurerFactory.java:39)
... 51 more
I'm confused: the JUnit tests don't run at runtime, right? Why does my code automatically read the Java config class, and why is the driver class not found?
While I cannot point out the exact issue, the problem you are facing is that org.h2.Driver is not available when the embedded database is created. Your @Configuration class is picked up by component scanning at runtime too, not just in tests, so H2 must be on the runtime classpath (or the config must be kept out of the production context). As mentioned in one of the comments, make sure the jar is available on the classpath.

PS: If you are using an Ant build and trying to run tests before deploying to production, make sure the jar is available before Ant runs your tests.
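One common way to keep such a config out of the production context is to guard it with a Spring profile; below is a minimal sketch, assuming the tests activate a profile named "test" (the profile name is illustrative):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
@Profile("test") // loaded only when the "test" profile is active, e.g. via @ActiveProfiles("test")
public class EmbeddedDatabaseConfig {

    @Bean
    public DataSource dataSource() {
        // H2 then only needs to be on the classpath while this profile is active.
        return new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).build();
    }
}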
I need to implement Spring Security in my web application. I have created a user repository, a custom service class, and the POJOs required to communicate with the backend. I am not using the in-memory configuration; instead I use:
@Autowired
public void configureGlobal(AuthenticationManagerBuilder auth) throws Exception {
    auth.parentAuthenticationManager(authenticationManager).userDetailsService(customUserDetailsService);
}
Dependencies in pom.xml:

<spring.version>4.2.2.RELEASE</spring.version>
<spring.security.version>4.0.1.RELEASE</spring.security.version>
<dependency>
    <groupId>org.springframework.security</groupId>
    <artifactId>spring-security-web</artifactId>
    <version>${spring.security.version}</version>
    <scope>provided</scope>
</dependency>
While running the application I am getting the error below:
Caused by: java.io.FileNotFoundException: class path resource
[org/springframework/security/config/annotation/web/configuration/WebSecurityConfigurerAdapter.class]
cannot be opened because it does not exist
but the jar is already downloaded.
Add the Maven dependencies in the deployment assembly and try again (I guess you are using the Eclipse IDE): open the project properties, go to Deployment Assembly, click Add, select Java Build Path Entries, then click Next and select Maven Dependencies.
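A side note beyond the original answer: WebSecurityConfigurerAdapter itself lives in the spring-security-config artifact, not in spring-security-web, and a provided scope keeps a jar out of the packaged war. So if the class still cannot be opened after fixing the deployment assembly, it may be worth checking that something like the following is present with a scope that reaches the runtime classpath:

<dependency>
    <groupId>org.springframework.security</groupId>
    <artifactId>spring-security-config</artifactId>
    <version>${spring.security.version}</version>
</dependency>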
I want to upload a file to HDFS. I compiled my code using the following jars as dependencies:

hadoop-auth-2.6.1.jar,
hadoop-common-2.6.1.jar, and
hadoop-hdfs-2.6.1.jar.
My code:
I compiled it with Ant, but it gave me this error: No FileSystem for scheme: hdfs.
Then I changed the code and compiled it again:
But then I got another error: Class org.apache.hdfs.DistributedFileSystem not found. What's wrong, and what should I do?
DistributedFileSystem is part of hadoop-core. To fix this problem, you also need to include hadoop-core-1.2.1.jar (note: I am using Maven for building):
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
Overall, I am using the following Maven dependencies:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.7.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version>
</dependency>
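Coming back to the original goal of uploading a file, here is a minimal sketch of the upload itself; the namenode URI and the paths are illustrative placeholders, not values from the question:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the cluster; replace the URI with your namenode address.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);
        // Copy a local file into HDFS.
        fs.copyFromLocalFile(new Path("/tmp/local-file.txt"), new Path("/user/me/local-file.txt"));
        fs.close();
    }
}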
While getting a Hadoop FileSystem object like the one below:

FileSystem fs = FileSystem.get(hdfsUrl, configuration);

if you get the following error:

"No FileSystem for scheme: hdfs"

you can resolve it by setting the following two properties on the configuration:
configuration.set("fs.hdfs.impl","org.apache.hadoop.hdfs.DistributedFileSystem");
configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
Now, you may get a new error like the one below:
java.lang.ClassNotFoundException: Class org.apache.hadoop.hdfs.DistributedFileSystem not found
hadoop-common.jar uses Thread.currentThread().getContextClassLoader() and configuration.getClassLoader() to load classes. So, if you set your class loader using:

Thread.currentThread().setContextClassLoader(yourClassLoader);
configuration.setClassLoader(yourClassLoader);

you will be able to load the required classes from the other Hadoop jars (e.g. hadoop-hdfs).
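Putting the pieces of this answer together, a minimal sketch (the hdfsUrl address is illustrative, and the class wrapper is just to make the snippet self-contained):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsConnect {
    public static void main(String[] args) throws Exception {
        Configuration configuration = new Configuration();
        // Map the schemes explicitly so the implementation lookup cannot fail.
        configuration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");
        configuration.set("fs.file.impl", "org.apache.hadoop.fs.LocalFileSystem");
        URI hdfsUrl = URI.create("hdfs://namenode:8020"); // illustrative address
        FileSystem fs = FileSystem.get(hdfsUrl, configuration);
        System.out.println("Connected to: " + fs.getUri());
    }
}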
Let me know if you need more help. And don't forget to upvote if you find this bit useful.
I had the same problem when I compiled my Java code into an executable jar and ran it: there was always some "not found" error (e.g., in your case, No FileSystem for scheme: hdfs), which means some Hadoop jar was not included in the compilation.
The solution is to add the correct dependencies in Maven/Gradle, or to add (all) the jars.
In my case, the hdfs scheme comes from the class org.apache.hadoop.hdfs.DistributedFileSystem, which is in the jar hadoop-hdfs-client-3.2.1.jar.
The relevant jars that were used can be found in the log file (if you ran the program successfully and have the log file). In my example it is the following:
You can simply add all the jars from the installed Hadoop folder. They are in the common/, hdfs/, ... subfolders under hadoop-3.2.1/share/hadoop. There may be other jars that are used but not shown in the log; to be safe, just include all of them. You can run hdfs classpath in a terminal to find the location of all the jars.
Once all the jars have been added, you may also need to set the Hadoop configuration in your Java code:

Configuration hadoopConfiguration = new Configuration();
// CoreSiteXMLStr is the path to your core-site.xml file.
hadoopConfiguration.addResource(new Path(CoreSiteXMLStr));
hadoopConfiguration.set("fs.hdfs.impl", "org.apache.hadoop.hdfs.DistributedFileSystem");