passing variables between groovy files - java

I'm managing many jobs in Jenkins with the Job DSL plugin. That plugin uses .groovy definitions, so even someone who doesn't use Jenkins but knows Groovy may be able to help.
Generally, I want to create an additional file; it could be a Groovy file, JSON or YAML, whatever. What matters is the possibility to connect that file with my .groovy file.
In that file I'm defining variables (really just strings), for example an IP address or other values,
eg.
ip_gitlab: 1.2.3.4
default_user: admin
In my groovy files, I want to be able to use these variables.
Is that approach possible in Groovy?

I suggest using a properties file, as @JBaruch wrote:
ip_gitlab=1.2.3.4
default_user=admin
And load it
Properties properties = new Properties()
File propertiesFile = new File('test.properties')
propertiesFile.withInputStream {
    properties.load(it)
}
Then you can use it; to get the IP, for example:
def ipPropertyName= 'ip_gitlab'
properties."$ipPropertyName"

Make a Groovy file, define some general information, and load it.
E.g., hello.conf (written in Groovy):
build_name = 'hello'
build_config = [
    'git': 'your git repository',
    'build_job': ['build_a', 'build_b']
]
And use it via load:
load 'hello.conf'
println(build_name)
for (job in build_config['build_job']) {
    build job: job
}

If you want a Jenkins-specific answer:
There's a Config File Provider Plugin for Jenkins.
You can store config/properties files via Managed files.
Go to Manage Jenkins > Managed files and create a new file. It supports .groovy, .json, .xml and many others.
Once you have that, you can load the file inside a job using the Provide Config files checkbox, which makes the file available to the build through an environment variable automatically.
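In a Pipeline job the same managed file can be consumed with the configFileProvider step. A minimal sketch, assuming a managed file with the made-up ID gitlab-settings and the Pipeline Utility Steps plugin for readProperties:
node {
    // 'gitlab-settings' is a hypothetical file ID from Managed files
    configFileProvider([configFile(fileId: 'gitlab-settings', variable: 'GITLAB_PROPS')]) {
        // GITLAB_PROPS holds the path of the provided file on the agent
        def props = readProperties file: env.GITLAB_PROPS
        echo "GitLab host: ${props['ip_gitlab']}"
    }
}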

How to use environment variables on a Spring Boot and Gradle application?

Let me explain my problem:
I have a web app developed using Vue.js and Spring Boot. The application produces a PDF sheet and saves the file generated by Java. I use two lines of code to separate my development setup from my production setup (I leave you both lines so you understand the concept):
FileReader leggoFile = new FileReader(System.getProperty("user.dir") + "/temp/webapps/foolder/foolder/file.pdf");
// FileReader leggoFile = new FileReader(System.getProperty("catalina.base") + "/temp/webapps/foolder/foolder/file.pdf");
This whole application is built with the bootWar Gradle plugin, which produces a .war that I then upload to a Tomcat server.
My goal is this:
I would like to set a single environment variable so that when I build the project I don't have to comment/uncomment that line, for example:
FileReader leggoFile = new FileReader({{variableEnvironment}} + "/temp/webapps/foolder/foolder/file.pdf")
My question is this:
How do Gradle and Spring Boot handle environments? Is there a way to separate environments? Is this possible, or should I start thinking differently?
I tried to search for something, but I was held back by not understanding how the .war file is generated by the bootWar Gradle plugin. From searching the internet I understood that the Gradle environment and the Spring environment are two separate things, but even if the line of code above is wrong to begin with, my question stays the same:
How are environment variables handled in Spring and Gradle?
With Spring Boot, you can add properties to your application by adding a file named application.yaml to your resources folder (src/main/resources/). In addition, you can add properties through application-{profile}.yaml files that apply only for given Spring profiles. For instance, application-test.yaml is only read if "test" is an active profile. When booting up the application, Spring first reads application.yaml and then any profile-specific YAML files, so overlapping properties are overridden by the profile-specific values.
There are several approaches to injecting the property. A simple solution is to add a field to your component annotated with @Value("${PATH}") and replace PATH with the property's path in the YAML.
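As a rough sketch (the property name pdf.base-dir and the paths are invented here), application.yaml could hold the development path, application-prod.yaml the Tomcat path, and the component would inject whichever is active:
// src/main/resources/application.yaml       ->  pdf.base-dir: /home/dev/project
// src/main/resources/application-prod.yaml  ->  pdf.base-dir: /opt/tomcat
import java.io.FileNotFoundException;
import java.io.FileReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class PdfLocator {

    // "pdf.base-dir" is an assumed property name for this sketch
    @Value("${pdf.base-dir}")
    private String baseDir;

    public FileReader openPdf() throws FileNotFoundException {
        return new FileReader(baseDir + "/temp/webapps/foolder/foolder/file.pdf");
    }
}
Activating the profile (for example with -Dspring.profiles.active=prod, or the SPRING_PROFILES_ACTIVE environment variable on the Tomcat side) then switches the path without touching the code.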

Cannot load custom File System on Flink's shadow jar

I needed some metadata on my S3 objects, so I had to override the S3 file system provided by Flink.
I followed this guide to the letter, and now I have a custom file system which works on my local machine when I run my application in the IDE.
Now I am trying to use it on a local Kafka cluster or on my Docker deployment, and I keep getting this error: Could not find a file system implementation for scheme 's3c'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
I package my application using shadowJar, using the following configuration:
shadowJar {
    configurations = [project.configurations.flinkShadowJar]
    mainClassName = "dev.vox.collect.delivery.Application"
    mergeServiceFiles()
}
I have my service file in src/main/resources/META-INF/services/org.apache.flink.core.fs.FileSystemFactory, which contains a single line with the fully qualified name of my factory: dev.vox.collect.delivery.filesystem.S3CFileSystemFactory
If I unzip my shadow JAR, I can see that its org.apache.flink.core.fs.FileSystemFactory file contains both my factory and the ones declared by Flink, which should be correct:
dev.vox.collect.delivery.filesystem.S3CFileSystemFactory
org.apache.flink.fs.s3hadoop.S3FileSystemFactory
org.apache.flink.fs.s3hadoop.S3AFileSystemFactory
When I use the S3 file system provided by Flink everything works; it is just mine that does not.
I am assuming the service loader is not loading my factory, either because it does not find it or because it is not declared correctly.
How can I make it work? Am I missing something?

play 2.4.x current application environment

Is there any function to return the current working environment of a Play Framework application?
I tried the following, but it doesn't seem to work correctly:
String environment = play.api.Play.Mode
NOTE: I don't want to use isDev() isProd() stuff, I want to be able to create custom environments
Play Framework 2.x supports only 3 modes: Prod, Dev and Test. The first is used for production. The second provides development additions such as hot-reloading of freshly edited classes. The last one is like the second, but with test libraries.
Play 1.x also had an ID, which could be used to model additional environments, for instance a staging instance or an instance of a distributed server.
Play 2.x sadly doesn't support IDs anymore, but you can achieve the same effect manually.
Suppose you want to run your application in 'staging' mode.
First you need to put a configuration file alongside the basic configuration file, named application.staging.conf.
The second step is to add to Global.scala the code responsible for managing configuration files, something like this:
import java.io.File
import play.api._
import com.typesafe.config.ConfigFactory
object Global extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: File, classloader: ClassLoader, mode: Mode.Mode): Configuration = {
    val env = System.getProperty("environment")
    val envConfig = config ++ Configuration(ConfigFactory.load(s"application.${env}.conf"))
    super.onLoadConfig(envConfig, path, classloader, mode)
  }
}
As you can see, it reads the environment value and looks for the matching configuration file.
The last step is telling Play which mode it should use. The easiest way is via the start command:
activator run -Denvironment=staging
It should work.
In Java it is play.Application.isDev(), isProd() or isTest():
https://playframework.com/documentation/2.2.x/api/java/index.html

Is there any way to load a component from filesystem in Oracle Commerce(ATG)?

I'm trying to find out if we can load an Oracle Commerce component from the file system. Generally we assemble all the code into an EAR file and deploy it; however, I have a requirement to store some components on the file system rather than packaging them along with the EAR file.
I know that we can use URLClassLoader to load a class as shown below,
File classDir = new File("A:\\LodeeModule\\classes");
URL[] url = { classDir.toURI().toURL() };
ClassLoader loader = new URLClassLoader(url);
for (File file : classDir.listFiles()) {
    String filename = file.getName().replace(".class", "");
    loader.loadClass("com.buddha.testers." + filename).getConstructor().newInstance();
}
but how can we use the same approach for a component which has to be resolved by Nucleus at a later point in time? Is there any way to instruct Nucleus to resolve a component from the file system?
You should just be able to add the JAR that contains the component's classes to the CLASSPATH system variable used by the application server instance.
Then in the component configuration just define the implementing class as you normally would:
$class=some.class.path.class
If you are using JBoss EAP 6+ on a newer version of ATG (11.0+) you will have a bit more trouble: you have to jump through some extra hoops due to its classloader:
https://docs.jboss.org/author/display/AS7/Class+Loading+in+AS7
Essentially you would need to define a JBoss module containing your JAR files, and declare a dependency between the EAR's "module" and the module containing your classes; a rough sketch follows below.
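Very roughly, that means a module descriptor for your JAR plus a jboss-deployment-structure.xml inside the EAR; the module name com.example.atgcustom and the JAR name below are placeholders:
<!-- $JBOSS_HOME/modules/com/example/atgcustom/main/module.xml -->
<module xmlns="urn:jboss:module:1.1" name="com.example.atgcustom">
    <resources>
        <resource-root path="someClasses.jar"/>
    </resources>
</module>

<!-- META-INF/jboss-deployment-structure.xml inside the EAR -->
<jboss-deployment-structure>
    <deployment>
        <dependencies>
            <module name="com.example.atgcustom"/>
        </dependencies>
    </deployment>
</jboss-deployment-structure>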
Alternatively you can define a ClassLoaderService that will manage the classes from your JARs.
To do this, you need to define a new ClassLoaderService, so create a new properties file as you would for any other component:
/my/custom/ClassLoaderService.properties
$class=atg.nucleus.ServicesManifestClassLoaderService
$description=Custom Class Loader Service.
# The files to go into the classpath of the classloader
classpathFiles=\
/path/to/my/jars/lib/someClasses.jar,\
/path/to/my/jars/lib/someOtherClasses.jar
loggingDebug=false
Then in the actual component that needs these classes, add this line:
$classloader=/my/custom/ClassLoaderService
I think you're looking for the atg.dynamo.data-dir property. If you specify that property, Dynamo will look at that location for the "server configs", i.e. the properties files. This allows you to separate the configs from the EAR file.
Note: You can still include configs in the EAR; I believe they will still take precedence.
It's usually specified when you start the server, something like:
run.sh -c <your server> -Datg.dynamo.data-dir=/data/something/serverconfigs
This feature is largely undocumented, but many people know about it.
See http://docs.oracle.com/cd/E24152_01/Platform.10-1/ATGPlatformProgGuide/html/s0302developmentmodeandstandalonemode01.html
EDIT:
I mistook what you were originally asking. You might want to take a look at the disposable class loader that ATG provides, but keep in mind this is only intended for development purposes.

tokenize application.conf in play java app

I want to tokenize a few keys in my application.conf file to use variables from another properties file. How can I do so? Here is an example.
my-play-project/conf/application.conf
db.default.url=${env.db.url}
db.default.driver=${env.db.driver}
db.default.user=${env.db.user}
db.default.pass=${env.db.password}
my-play-project/conf/env/devlab/project.properties
db.url=myoracleserver.lab.org:1521
db.driver=oracle.thin
db.user=myname
db.password=mypassword
my-play-project/conf/env/devlab2/project.properties
db.url=myoracleserver2.lab.org:1521
db.driver=oracle.thin
db.user=myname
db.password=mypassword
Q. Is there a way to make devlab/project.properties part of the system-resolvable properties?
Play's configuration uses Typesafe Config. There are several ways of combining configuration together on the classpath and at runtime (a sketch follows the list below):
Creating multiple application.conf and application.properties files and putting them all on the classpath (i.e. in different JARs). The configuration will be combined. See Standard Behavior.
Using includes to pull in files with different names.
Using substitutions to pull in values from other config files.
Using substitutions to pull in environment variables or system properties.
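For the layout in the question, a sketch of the include-plus-substitution approach could look like this (it assumes env/devlab/project.properties ends up on the classpath alongside application.conf):
# conf/application.conf
include "env/devlab/project.properties"

db.default.url    = ${db.url}
db.default.driver = ${db.driver}
db.default.user   = ${db.user}
db.default.pass   = ${db.password}
For environment variables or system properties, an optional substitution such as db.default.url = ${?DB_URL} only overrides the value when DB_URL is actually set.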
