I am creating a project which has multiple modules. I am using the Gradle build tool and IntelliJ IDEA. I have two modules, webservice and utilities.
(Screenshot: project structure.)
I read the config.properties file in my utilities module; it defines the server port and other values. When I call the utilities-module method (which reads the property file and returns the values) from classes in my webservice module, it works fine and returns the proper values.
But when I call the same method from the test classes of the webservice module, the utility method fails to read the property file.
I cannot figure out what is going wrong.
Thanks.
Be sure that the property file exists in src/test/resources as well as src/main/resources, as the classpath used when executing tests differs from your regular application classpath.
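For illustration, a minimal sketch of reading the file from the classpath (assuming config.properties sits at the root of the resources folder; your actual name and path may differ):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.Properties;

    public class ConfigReader {
        public static Properties load() throws IOException {
            Properties props = new Properties();
            // Resolved against the classpath: src/main/resources/config.properties
            // in the application, src/test/resources/config.properties in tests.
            try (InputStream in = ConfigReader.class.getResourceAsStream("/config.properties")) {
                if (in == null) {
                    throw new IOException("config.properties not found on the classpath");
                }
                props.load(in);
            }
            return props;
        }
    }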
I needed some metadata on my S3 objects, so I had to override the S3 file system provided by Flink.
I followed this guide to the letter, and now I have a custom file system which works on my local machine when I run my application in the IDE.
Now I am trying to use it on a local Kafka cluster or in my Docker deployment, and I keep getting this error: "Could not find a file system implementation for scheme 's3c'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded."
I package my application using shadowJar, using the following configuration:
    shadowJar {
        configurations = [project.configurations.flinkShadowJar]
        mainClassName = "dev.vox.collect.delivery.Application"
        mergeServiceFiles()
    }
I have my service file at src/main/resources/META-INF/services/org.apache.flink.core.fs.FileSystemFactory; it contains a single line with the fully qualified name of my factory: dev.vox.collect.delivery.filesystem.S3CFileSystemFactory
If I unzip my shadow JAR, I can see that its org.apache.flink.core.fs.FileSystemFactory file contains both my factory and the ones declared by Flink, which should be correct:
    dev.vox.collect.delivery.filesystem.S3CFileSystemFactory
    org.apache.flink.fs.s3hadoop.S3FileSystemFactory
    org.apache.flink.fs.s3hadoop.S3AFileSystemFactory
When I use the S3 file system provided by Flink, everything works; it is just mine that does not.
I am assuming the service loader is not loading my factory, either because it does not find it or because it is not declared correctly.
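One way to verify that assumption is to list what the service loader actually resolves, e.g. with a minimal check like this run against the packaged shadow JAR (ListFileSystemFactories is just a throwaway name for illustration):

    import java.util.ServiceLoader;

    import org.apache.flink.core.fs.FileSystemFactory;

    public class ListFileSystemFactories {
        public static void main(String[] args) {
            // Prints every FileSystemFactory visible to the ServiceLoader on the
            // current classpath; the custom 's3c' factory should appear here.
            for (FileSystemFactory factory : ServiceLoader.load(FileSystemFactory.class)) {
                System.out.println(factory.getClass().getName() + " -> " + factory.getScheme());
            }
        }
    }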
How can I make it work? Am I missing something?
I have a Maven project, packaged as applicationConfig.jar, that contains properties commonly shared across other projects (WARs, e.g. a web application such as application.war, and JARs, e.g. a batch application such as applicationBatch.jar).
I have used PropertySourcesPlaceholderConfigurer, configured via annotations, to initialize these properties in applicationConfig.jar.
This applicationConfig.jar is added as a dependency in the pom.xml of both application.war and applicationBatch.jar.
1) The Java code in application.war is able to access the properties initialized by the code in applicationConfig.jar at server startup. No issues here.
2) applicationBatch.jar, which is run from the command line on a Linux machine, is unable to access the properties. It appears the property-initialization code is never executed when applicationBatch.jar runs.
Can anyone please help me ensure that the code in applicationConfig.jar (responsible for initializing properties using PropertySourcesPlaceholderConfigurer) is executed when applicationBatch.jar is run from the command line?
Code Snippet Below:
applicationConfig.jar:
(Screenshots: the configuration code, and a property referenced using an annotation together with the property from the properties file.)
Spring will manage InternalConfig only if you import it using @Import or have configured component scanning with a base package that is a parent of InternalConfig's package. applicationBatch seems to be missing one of these.
Could you check that?
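A minimal sketch of the @Import route (BatchConfig and its main method are assumptions for illustration; InternalConfig is the configuration class shipped in applicationConfig.jar):

    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Import;

    @Configuration
    @Import(InternalConfig.class) // InternalConfig comes from applicationConfig.jar
    public class BatchConfig {

        public static void main(String[] args) {
            // Unlike the WAR, a command-line JAR has no container to bootstrap
            // Spring, so the batch entry point must create the context itself.
            AnnotationConfigApplicationContext context =
                    new AnnotationConfigApplicationContext(BatchConfig.class);
            // ... run the batch job using beans from the context ...
            context.close();
        }
    }

If the batch code never builds a Spring context at all, the PropertySourcesPlaceholderConfigurer in applicationConfig.jar never runs, which would match the symptom described above.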
I am using Play Framework 2.2.x/Java.
I want to create a module to separate some of the logic from my main application, and I also want to use an application.conf inside the module for its configuration instead of using the main application's config file!
But the following snippet, used inside the module, only reads values from the main application's config file:
    Play.application().configuration().getString("myVar");
Is there any other way to get the values from the application.conf file inside my module?
Play uses the typesafe-config library for reading configuration. This is actually a Java library, even though Typesafe is a Scala company.
The documentation for typesafe-config says "The idea is that libraries and frameworks should ship with a reference.conf in their jar."
So your module's config should be stored in a file called reference.conf; the format is exactly the same, it's just the name that is different.
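A minimal sketch of the resulting behaviour (assuming the module ships a reference.conf containing a myVar entry): typesafe-config overlays application.conf on top of every reference.conf found on the classpath, so lookups keep working and the application can still override the module's defaults:

    import com.typesafe.config.Config;
    import com.typesafe.config.ConfigFactory;

    public class ModuleConfigDemo {
        public static void main(String[] args) {
            // load() merges all reference.conf files on the classpath with
            // application.conf; values from application.conf take precedence.
            Config config = ConfigFactory.load();
            System.out.println(config.getString("myVar"));
        }
    }

Since Play's Configuration wraps this same merged config, keys defined in the module's reference.conf are also visible through the Play.application().configuration() call shown above.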
The problem occurs because the two config files conflict: they have the same name, so which one wins probably comes down to classpath order or something similar. Don't use two application.conf files; this problem has bitten me in the past!
Save your config into e.g. /conf/my-module.conf of the main app and then include it at the end of application.conf like:
include "my-module.conf"
I'm using an abstract class in another module for reading input for my test data:

    package path.to.my.base.testclass; // located under src/main/java

    InputStream stream = getClass().getResourceAsStream(filename);

filename is e.g. "test.txt", located in src/main/resources/path/to/my/base/testclass.
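For context, the base class looks roughly like this (a sketch; the class and method names are assumptions):

    package path.to.my.base.testclass;

    import java.io.IOException;
    import java.io.InputStream;

    public abstract class AbstractTestDataReader {
        // getResourceAsStream with a relative name resolves against this class's
        // package, i.e. path/to/my/base/testclass/test.txt on the classpath.
        protected InputStream openTestData(String filename) throws IOException {
            InputStream stream = getClass().getResourceAsStream(filename);
            if (stream == null) {
                throw new IOException("Resource not found: " + filename);
            }
            return stream;
        }
    }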
As long as I put this abstract class into the same module as my test classes, everything works fine.
Then I extract the abstract class (as well as the resources) into another module, compile it, add it to the pom, etc.
Result: my test implementation runs, but I get an IOException because the file cannot be found.
What am I missing here? Why does the abstract class work within the same module, but not within another?
Test resources are for this artifact's tests only; they don't get deployed.
There are two possible ways around this:
Dirty: make your app deploy a test JAR along with the main JAR, and add that as a dependency with scope test to the second artifact.
Clean: create a separate test artifact for the base test classes and common test resources. Important: in this artifact, nothing goes in src/test and everything goes in src/main. Reference this test artifact from both other artifacts with scope test. (A pom sketch follows below.)
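A minimal pom sketch of the dirty variant (the coordinates are placeholders): the maven-jar-plugin test-jar goal publishes the module's test classes and test resources as an extra artifact, which the second artifact then consumes with scope test:

    <!-- In the module that owns the shared test class and resources -->
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <executions>
            <execution>
                <goals>
                    <goal>test-jar</goal>
                </goals>
            </execution>
        </executions>
    </plugin>

    <!-- In the consuming module -->
    <dependency>
        <groupId>com.example</groupId>
        <artifactId>base-module</artifactId>
        <version>1.0.0</version>
        <type>test-jar</type>
        <scope>test</scope>
    </dependency>

The clean variant looks the same on the consumer side, except the dependency is a plain JAR whose classes and resources live under src/main of a dedicated test-support module.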
Grails has a config file for Spring beans called resources.groovy. As I understand from the docs, it allows you to include another file using loadBeans(%path%).
I tried this:
    println 'loading application config ...'
    // Place your Spring DSL code here
    beans = {
        loadBeans("classpath:security") // also tried "spring/security" and "spring/security.groovy"
    }
but when Grails runs, it logs the following error:
    Caused by: org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Error evaluating bean definition script: class path resource [security] cannot be opened because it does not exist
    Offending resource: class path resource [security]; nested exception is java.io.FileNotFoundException: class path resource [security] cannot be opened because it does not exist
        at grails.spring.BeanBuilder.loadBeans(BeanBuilder.java:470)
        at grails.spring.BeanBuilder.loadBeans(BeanBuilder.java:424)
        at resources$_run_closure1.doCall(resources.groovy:13)
        at resources$_run_closure1.doCall(resources.groovy)
        ... 45 more
The script security.groovy exists at grails-app/conf/spring and is compiled by the Grails Maven plugin into target/classes/security.class.
The directory target/resources/spring is empty at this point.
How can I configure Grails or the grails-maven-plugin to copy these config files instead of compiling them into classes?
P.S. This problem also shows up when I try to include config scripts using grails.config.locations = [ %path% ] inside conf/Config.groovy: my Groovy scripts are compiled into classes, and because of that the Grails config builder can't find them :(
Did you try:
    println 'loading application config ...'
    // Place your Spring DSL code here
    beans = {
        loadBeans("classpath:*security.groovy")
    }
(this should load all Groovy files on the classpath ending with security.groovy and parse them into bean definitions).
Update: I found an interesting thread with this message as a reference, and my understanding is that one trick is to use Ant in scripts/_Events.groovy to copy the .groovy file to the classesDirPath directory and then simply use:
    beans = {
        // load spring-beans for db-access via spring-jdbc-template
        loadBeans('security.groovy')
        // load some other spring-beans
        ...
    }
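The copy step in scripts/_Events.groovy might look roughly like this (a sketch: eventCompileEnd is a standard Grails build event and classesDirPath is provided by the build, but the exact source path is an assumption):

    eventCompileEnd = {
        // Copy the Spring DSL script next to the compiled classes so that
        // loadBeans('security.groovy') can resolve it at runtime.
        ant.copy(file: "${basedir}/grails-app/conf/spring/security.groovy",
                 todir: classesDirPath)
    }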
This looks like a hack to get things working in both the war and when running run-app though. Not sure how things "should" be done (if this even makes sense).