My plugin is loading before Vault, even though I added a depend/load-after entry to the plugin.yml.
I tried depend, softdepend and loadbefore. I even tried downgrading the version of Vault used on the server.
I also tried loadbefore without depend, and the other way around.
My plugin.yml:
name: TrainsaPlugin
version: ${project.version}
main: de.gamingcraft.trainsa.TrainsaPlugin
(...)
loadbefore:
- Vault
depend:
- Vault
commands: (...)
My main class:
public final class TrainsaPlugin extends JavaPlugin {
(...)
public static Economy econ = null;
public static Permission perms = null;
public static Chat chat = null;
@Override
public void onEnable() {
(...)
if (!setupEconomy()) {
System.out.println("Disabled due to no Vault dependency found!");
getServer().getPluginManager().disablePlugin(this);
return;
}
setupPermissions();
setupChat();
}
private boolean setupEconomy() {
if (getServer().getPluginManager().getPlugin("Vault") == null) {
return false;
}
RegisteredServiceProvider<Economy> rsp = getServer().getServicesManager().getRegistration(Economy.class);
if (rsp == null) {
return false;
}
econ = rsp.getProvider();
return econ != null;
}
private boolean setupChat() {
RegisteredServiceProvider<Chat> rsp = getServer().getServicesManager().getRegistration(Chat.class);
if (rsp == null) { // no chat provider registered; guard against an NPE below
return false;
}
chat = rsp.getProvider();
return chat != null;
}
private boolean setupPermissions() {
RegisteredServiceProvider<Permission> rsp = getServer().getServicesManager().getRegistration(Permission.class);
if (rsp == null) { // no permission provider registered
return false;
}
perms = rsp.getProvider();
return perms != null;
}
@Override
public void onDisable() {
}
(...)
}
The Log
[22:35:43 INFO]: [TrainsaPlugin] Disabling TrainsaPlugin v1.0
(...)
[22:35:43 INFO]: Server permissions file permissions.yml is empty, ignoring it
[22:35:43 INFO]: Done (1,912s)! For help, type "help" or "?"
[22:35:43 INFO]: [Vault] Checking for Updates ...
I know that my main class disables my plugin when Vault is not found, and I want that, because Vault is essential at the moment.
TL;DR: My problem is that Vault loads too late.
To your plugin.yml, add depend: [Vault].
For more info, see this.
You added Vault to loadbefore, which makes your plugin load before Vault. If you want Vault to load before your plugin, use depend: [Vault,someOtherPlugin,someOtherPlugin,etc] and drop the loadbefore entry.
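For reference, a corrected plugin.yml keeps the depend entry and drops loadbefore entirely (a sketch; the elided parts of the original file are assumed unchanged):
name: TrainsaPlugin
version: ${project.version}
main: de.gamingcraft.trainsa.TrainsaPlugin
depend:
- Vault
commands: (...)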
I fixed it by adding
<scope>provided</scope>
to every dependency in the pom.xml that is itself a plugin.
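For example, the Vault dependency in the pom.xml ends up looking something like this (the coordinates shown are the usual JitPack ones for VaultAPI; adjust them to whatever your project already declares):
<dependency>
    <groupId>com.github.MilkBowl</groupId>
    <artifactId>VaultAPI</artifactId>
    <version>1.7</version>
    <scope>provided</scope>
</dependency>
With provided scope, Maven compiles against the API but does not bundle it into your plugin jar, so at runtime the server's own Vault plugin is the copy that gets loaded.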
Related
I'm trying to create a custom Gradle plugin (100% Java) that will automatically configure Artifactory, avoiding the need for the following DSL:
artifactory {
contextUrl = "${artifactory_contextUrl}" //The base Artifactory URL if not overridden by the publisher/resolver
publish {
repository {
contextUrl = "${artifactory_contextUrl}"
repoKey = 'android-dev'
username = "${artifactory_user}"
password = "${artifactory_password}"
maven = true
}
}
resolve {
repository {
contextUrl = "${artifactory_contextUrl}"
repoKey = 'android-dev-distributions'
username = "${artifactory_user}"
password = "${artifactory_password}"
maven = true
}
}
}
I'm trying to re-create @agrosner's own solution (from https://stackoverflow.com/a/25669431/1880280) but I'm missing "ArtifactoryAction". I can't find it anywhere.
The non-working version posted by @agrosner is the following code:
// Set up plugins so we never need to add them to a build.gradle
project.getPlugins().apply(MAVEN);
project.getPlugins().apply(ARTIFACTORY);
project.setGroup(GROUP);
// Add Artifactory repo to the repositories
project.getRepositories().maven(new ArtifactoryAction(contextUrl + ARTIFACTORY_REPO_ENDPOINT, user, pass));
// We will define the plugin convention here so all of our libraries do not need to
// declare the artifactory closure manually
ArtifactoryPluginConvention pluginConvention =
ArtifactoryPluginUtil.getArtifactoryConvention(project);
pluginConvention.setContextUrl(contextUrl);
PublisherConfig publisherConfig = new PublisherConfig(pluginConvention);
publisherConfig.setContextUrl(contextUrl);
pluginConvention.setPublisherConfig(publisherConfig);
// Use reflection to access private field
PublisherConfig.Repository repository = null;
Field[] fields = PublisherConfig.class.getDeclaredFields();
for(Field field : fields) {
if(field.getName().equalsIgnoreCase("repository")) {
try {
field.setAccessible(true);
repository = (PublisherConfig.Repository) field.get(publisherConfig);
} catch (Exception e) {
e.printStackTrace();
}
}
}
if(repository != null) {
repository.setPassword(pass);
repository.setUsername(user);
repository.setRepoKey(PUBLISHER_REPO_KEY);
repository.setMavenCompatible(true);
}
GradleArtifactoryClientConfigUpdater.update(pluginConvention.getClientConfig(), project.getRootProject());
Can anyone help with an updated 100% Java version of this?
Additionally, how would this look for the following DSL?
artifactory {
publish {
repository {
repoKey = 'default-gradle-dev-local' // The Artifactory repository key to publish to
username = "${artifactory_user}" // The publisher user name
password = "${artifactory_password}" // The publisher password
maven = true
}
defaults {
publications('mavenJava')
publishArtifacts = true
publishPom = true
}
}
}
Thanks in advance,
César
PS: The DSL version that was published in @agrosner's question thread is not useful for me. I need a Java version.
For your first question, related to ArtifactoryAction: this is neither a Gradle API nor an Artifactory plugin API, but most probably a custom class that the answer's author implemented himself as a shortcut to declare his custom Artifactory Maven repo.
See this API, used to declare Maven repositories:
MavenArtifactRepository maven(Action<? super MavenArtifactRepository> action)
So you can use:
project.getRepositories().maven( mavenArtifactRepository -> {
mavenArtifactRepository.setUrl(contextUrl + MAVEN_PUBLIC_REPO);
mavenArtifactRepository.getCredentials().setUsername("user");
mavenArtifactRepository.getCredentials().setPassword("password");
});
or wrap the action code in a custom implementation of Action<? super MavenArtifactRepository>:
project.getRepositories().maven( new ArtifactoryAction(contextUrl + MAVEN_PUBLIC_REPO, "user", "password") );
[...]
// custom action class, defined somewhere else
class ArtifactoryAction implements Action<MavenArtifactRepository> {
private final String url, userName, password;
ArtifactoryAction(String url, String userName, String password) {
this.url = url; this.userName = userName; this.password = password;
}
@Override
public void execute(MavenArtifactRepository target) {
target.setUrl(url);
target.getCredentials().setUsername(userName);
target.getCredentials().setPassword(password);
}
}
For the other question, the Java translation of the artifactory { } DSL: see the full example below with some inline comments. (Not tested, but translated from my Kotlin implementation, which works fine.)
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.publish.maven.plugins.MavenPublishPlugin;
import org.jfrog.gradle.plugin.artifactory.ArtifactoryPlugin;
import org.jfrog.gradle.plugin.artifactory.ArtifactoryPluginUtil;
import org.jfrog.gradle.plugin.artifactory.dsl.ArtifactoryPluginConvention;
import org.jfrog.gradle.plugin.artifactory.dsl.PublisherConfig;
public class CustomArtifactoryPlugin implements Plugin<Project> {
private static String MAVEN_PUBLIC_REPO = "/maven-public";
private String contextUrl = "MY_CUSTOM_REPO_BASE_URL";
@Override
public void apply(Project project) {
// Base Gradle publishing plugins:
// - maven plugin for publishing Java artifacts
// - ivy plugin for publishing other types of artifacts: RPMs, archives, ...
project.getPluginManager().apply(MavenPublishPlugin.class);
// project.getPluginManager().apply(IvyPublishPlugin.class);
// Apply the Artifactory plugin
project.getPluginManager().apply(ArtifactoryPlugin.class);
// Add Artifactory repo to the repositories
project.getRepositories().maven(new ArtifactoryAction(contextUrl + MAVEN_PUBLIC_REPO, "user", "password"));
// Configure artifactory plugin - using 'withPlugin' ensures that plugin has already been applied
project.getPluginManager().withPlugin("com.jfrog.artifactory", appliedPlugin -> {
// artifactory {
ArtifactoryPluginConvention pluginConvention = ArtifactoryPluginUtil.getArtifactoryConvention(project);
// contextUrl = "${contextUrl}"
pluginConvention.setContextUrl(contextUrl);
// publish {
PublisherConfig publisherConfig = new PublisherConfig(pluginConvention);
pluginConvention.setPublisherConfig(publisherConfig);
// repository {
pluginConvention.getPublisherConfig().repository(repository -> {
// repoKey = 'default-gradle-dev-local' ...
repository.setRepoKey("default-gradle-dev-local"); // The Artifactory repository key to publish to
repository.setUsername("${artifactory_user}"); // The publisher user name
repository.setPassword("${artifactory_password}"); // The publisher password
repository.setMavenCompatible(true);
});
// defaults {
pluginConvention.getPublisherConfig().defaults(artifactoryTask -> {
// publications('mavenJava')
artifactoryTask.publications("mavenJava");
artifactoryTask.setPublishArtifacts("true");
artifactoryTask.setPublishPom("true");
});
});
}
}
EDIT: for the publication configuration, you can do as follows (this also goes inside apply() and needs imports for org.gradle.api.publish.PublishingExtension and org.gradle.api.publish.maven.MavenPublication):
// create maven publication
project.getExtensions().configure(PublishingExtension.class, publishingExtension -> {
publishingExtension.publications(publications -> {
publications.create("mavenPublication", MavenPublication.class, mavenPublication -> {
mavenPublication.setVersion("1.0.0");
mavenPublication.setGroupId("groupId");
mavenPublication.from(project.getComponents().findByName("java"));
});
});
});
TL;DR: When running tests with different @ResourceArgs, the configuration of different tests gets thrown around and overrides other tests' configuration, breaking tests meant to run with specific configurations.
So, I have a service that has tests that run in different configuration setups. The main difference at the moment is that the service can either manage its own authentication or get it from an external source (Keycloak).
I initially control this using test profiles, which seem to work fine. Unfortunately, in order to support both cases, the ResourceLifecycleManager I have set up supports starting a Keycloak instance and returns config values that break the config for self-authentication. (This is due primarily to the fact that I have not found out how the lifecycle manager can determine on its own which profile or config is currently running. If I could do this, I think I would be much better off than using @ResourceArg, so I would love to know if I missed something here.)
To remedy this shortcoming, I have attempted to use @ResourceArgs to convey to the lifecycle manager when to set up for external auth. However, I have noticed some really odd execution timings, and the config that ends up at my test/service isn't what I intend based on the test class's annotations; it is obvious the lifecycle manager has set up for external auth.
Additionally, it should be noted that I have my tests ordered such that the profiles and configs shouldn't be running out of order: all the tests that don't care run first, then the 'normal' tests with self auth, then the tests with the external auth profile. I can see this working appropriately when I run in IntelliJ, and I can tell that time is being taken to start up the new service instance between the test profiles.
Looking at the logs when I throw a breakpoint in places, some odd things are obvious:
When breaking on an erring test (before the externally-configured tests run):
- The start() method of my TestResourceLifecycleManager has been called twice.
- The first run started Keycloak and would override/break the config (though the time I would expect Keycloak's startup to take doesn't show up; I'm a little confused here).
- The second run is correct and does not start Keycloak.
- The profile config is what is expected, except for what the Keycloak setup would override.
When breaking on an externally-configured test (after all self-configured tests run):
- The start() method has now been called 4 times; it appears that things were started in the same order as before, again for the new run of the app.
There could be some weirdness in how IntelliJ/Gradle shows logs, but I am interpreting this as:
- Quarkus initializing two instances of the lifecycle manager when starting the app for some reason, with one's config overriding the other's, causing my woes.
- The lifecycle manager itself working as expected; it appropriately starts/doesn't start Keycloak when configured either way.
At this point I can't tell if I'm doing something wrong or if there's a bug.
Test class example for a self-auth test (same annotations for all tests in this (test) profile):
@Slf4j
@QuarkusTest
@QuarkusTestResource(TestResourceLifecycleManager.class)
@TestHTTPEndpoint(Auth.class)
class AuthTest extends RunningServerTest {
Test class example for an external-auth test (same annotations for all tests in this (externalAuth) profile):
@Slf4j
@QuarkusTest
@TestProfile(ExternalAuthTestProfile.class)
@QuarkusTestResource(value = TestResourceLifecycleManager.class, initArgs = @ResourceArg(name=TestResourceLifecycleManager.EXTERNAL_AUTH_ARG, value="true"))
@TestHTTPEndpoint(Auth.class)
class AuthExternalTest extends RunningServerTest {
ExternalAuthTestProfile extends this, providing the appropriate profile name:
public class NonDefaultTestProfile implements QuarkusTestProfile {
private final String testProfile;
private final Map<String, String> overrides = new HashMap<>();
protected NonDefaultTestProfile(String testProfile) {
this.testProfile = testProfile;
}
protected NonDefaultTestProfile(String testProfile, Map<String, String> configOverrides) {
this(testProfile);
this.overrides.putAll(configOverrides);
}
@Override
public Map<String, String> getConfigOverrides() {
return new HashMap<>(this.overrides);
}
@Override
public String getConfigProfile() {
return testProfile;
}
@Override
public List<TestResourceEntry> testResources() {
return QuarkusTestProfile.super.testResources();
}
}
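In other words, ExternalAuthTestProfile amounts to something like this (a sketch reconstructed from the description above; only the profile name is passed up):
public class ExternalAuthTestProfile extends NonDefaultTestProfile {
    public ExternalAuthTestProfile() {
        super("externalAuth");
    }
}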
Lifecycle manager:
@Slf4j
public class TestResourceLifecycleManager implements QuarkusTestResourceLifecycleManager {
public static final String EXTERNAL_AUTH_ARG = "externalAuth";
private static volatile MongodExecutable MONGO_EXE = null;
private static volatile KeycloakContainer KEYCLOAK_CONTAINER = null;
private boolean externalAuth = false;
public synchronized Map<String, String> startKeycloakTestServer() {
if(!this.externalAuth){
log.info("No need for keycloak.");
return Map.of();
}
if (KEYCLOAK_CONTAINER != null) {
log.info("Keycloak already started.");
} else {
KEYCLOAK_CONTAINER = new KeycloakContainer()
// .withEnv("hello","world")
.withRealmImportFile("keycloak-realm.json");
KEYCLOAK_CONTAINER.start();
log.info(
"Test keycloak started at endpoint: {}\tAdmin creds: {}:{}",
KEYCLOAK_CONTAINER.getAuthServerUrl(),
KEYCLOAK_CONTAINER.getAdminUsername(),
KEYCLOAK_CONTAINER.getAdminPassword()
);
}
String clientId;
String clientSecret;
String publicKey = "";
try (
Keycloak keycloak = KeycloakBuilder.builder()
.serverUrl(KEYCLOAK_CONTAINER.getAuthServerUrl())
.realm("master")
.grantType(OAuth2Constants.PASSWORD)
.clientId("admin-cli")
.username(KEYCLOAK_CONTAINER.getAdminUsername())
.password(KEYCLOAK_CONTAINER.getAdminPassword())
.build();
) {
RealmResource appsRealmResource = keycloak.realms().realm("apps");
ClientRepresentation qmClientResource = appsRealmResource.clients().findByClientId("quartermaster").get(0);
clientSecret = qmClientResource.getSecret();
log.info("Got client id \"{}\" with secret: {}", "quartermaster", clientSecret);
//get private key
for (KeysMetadataRepresentation.KeyMetadataRepresentation curKey : appsRealmResource.keys().getKeyMetadata().getKeys()) {
if (!SIG.equals(curKey.getUse())) {
continue;
}
if (!"RSA".equals(curKey.getType())) {
continue;
}
String publicKeyTemp = curKey.getPublicKey();
if (publicKeyTemp == null || publicKeyTemp.isBlank()) {
continue;
}
publicKey = publicKeyTemp;
log.info("Found a relevant key for public key use: {} / {}", curKey.getKid(), publicKey);
}
}
// write public key
// = new File(TestResourceLifecycleManager.class.getResource("/").toURI().toString() + "/security/testKeycloakPublicKey.pem");
File publicKeyFile;
try {
publicKeyFile = File.createTempFile("oqmTestKeycloakPublicKey",".pem");
// publicKeyFile = new File(TestResourceLifecycleManager.class.getResource("/").toURI().toString().replace("/classes/java/", "/resources/") + "/security/testKeycloakPublicKey.pem");
log.info("path of public key: {}", publicKeyFile);
// if(publicKeyFile.createNewFile()){
// log.info("created new public key file");
//
// } else {
// log.info("Public file already exists");
// }
try (
FileOutputStream os = new FileOutputStream(
publicKeyFile
);
) {
IOUtils.write(publicKey, os, UTF_8);
} catch (IOException e) {
log.error("Failed to write out public key of keycloak: ", e);
throw new IllegalStateException("Failed to write out public key of keycloak.", e);
}
} catch (IOException e) {
log.error("Failed to create public key file: ", e);
throw new IllegalStateException("Failed to create public key file", e);
}
String keycloakUrl = KEYCLOAK_CONTAINER.getAuthServerUrl().replace("/auth", "");
return Map.of(
"test.keycloak.url", keycloakUrl,
"test.keycloak.authUrl", KEYCLOAK_CONTAINER.getAuthServerUrl(),
"test.keycloak.adminName", KEYCLOAK_CONTAINER.getAdminUsername(),
"test.keycloak.adminPass", KEYCLOAK_CONTAINER.getAdminPassword(),
//TODO:: add config for server to talk to
"service.externalAuth.url", keycloakUrl,
"mp.jwt.verify.publickey.location", publicKeyFile.getAbsolutePath()
);
}
public static synchronized void startMongoTestServer() throws IOException {
if (MONGO_EXE != null) {
log.info("Flapdoodle Mongo already started.");
return;
}
Version.Main version = Version.Main.V4_0;
int port = 27018;
log.info("Starting Flapdoodle Test Mongo {} on port {}", version, port);
IMongodConfig config = new MongodConfigBuilder()
.version(version)
.net(new Net(port, Network.localhostIsIPv6()))
.build();
try {
MONGO_EXE = MongodStarter.getDefaultInstance().prepare(config);
MongodProcess process = MONGO_EXE.start();
if (!process.isProcessRunning()) {
throw new IOException();
}
} catch (Throwable e) {
log.error("FAILED to start test mongo server: ", e);
MONGO_EXE = null;
throw e;
}
}
public static synchronized void stopMongoTestServer() {
if (MONGO_EXE == null) {
log.warn("Mongo was not started.");
return;
}
MONGO_EXE.stop();
MONGO_EXE = null;
}
public synchronized static void cleanMongo() throws IOException {
if (MONGO_EXE == null) {
log.warn("Mongo was not started.");
return;
}
log.info("Cleaning Mongo of all entries.");
}
@Override
public void init(Map<String, String> initArgs) {
this.externalAuth = Boolean.parseBoolean(initArgs.getOrDefault(EXTERNAL_AUTH_ARG, Boolean.toString(this.externalAuth)));
}
@Override
public Map<String, String> start() {
log.info("STARTING test lifecycle resources.");
Map<String, String> configOverride = new HashMap<>();
try {
startMongoTestServer();
} catch (IOException e) {
log.error("Unable to start Flapdoodle Mongo server");
}
configOverride.putAll(startKeycloakTestServer());
return configOverride;
}
@Override
public void stop() {
log.info("STOPPING test lifecycle resources.");
stopMongoTestServer();
}
}
The app can be found here: https://github.com/Epic-Breakfast-Productions/OpenQuarterMaster/tree/main/software/open-qm-base-station
The tests are currently failing in the ways I am describing, so feel free to look around.
Note that to run this, you will need to run ./gradlew build publishToMavenLocal in https://github.com/Epic-Breakfast-Productions/OpenQuarterMaster/tree/main/software/libs/open-qm-core to install a dependency locally.
A GitHub issue is also tracking this: https://github.com/quarkusio/quarkus/issues/22025
Any use of @QuarkusTestResource() without restrictToAnnotatedClass set to true means that the QuarkusTestResourceLifecycleManager will be applied to all tests, no matter where the annotation is placed.
Hopefully restrictToAnnotatedClass will solve the problem.
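A minimal sketch of what the external-auth test class from the question looks like with that flag set (annotations otherwise unchanged):
@Slf4j
@QuarkusTest
@TestProfile(ExternalAuthTestProfile.class)
@QuarkusTestResource(
    value = TestResourceLifecycleManager.class,
    initArgs = @ResourceArg(name = TestResourceLifecycleManager.EXTERNAL_AUTH_ARG, value = "true"),
    restrictToAnnotatedClass = true // apply this resource (and its init args) only to this test class
)
@TestHTTPEndpoint(Auth.class)
class AuthExternalTest extends RunningServerTest {
}
The self-auth test classes get the same flag on their own @QuarkusTestResource annotation, so each lifecycle manager instance only ever sees the init args of its own test class.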
I'm trying to develop a new Jenkins plugin. I started from the hello-world archetype provided by Jenkins. My plugin works fine!
But now I want to set some environment variables from my plugin. I used this code to do it:
public void perform(Run<?, ?> run, FilePath workspace, Launcher launcher, TaskListener listener) {
...
EnvVars envVars = run.getEnvironment(listener);
envVars.put("SOME_VARIABLE", "SOME_VALUE");
...
}
But it doesn't work. I'm trying to use this variable in the next build step and get nothing. I've googled it and there are no really clear descriptions. The source code of existing plugins (EnvInject, etc.) also doesn't help.
What am I doing wrong? Can anybody provide some samples?
From my plugin...
private void putEnvVar(String key, String value) throws IOException {
Jenkins jenkins = Jenkins.getInstance();
DescribableList<NodeProperty<?>, NodePropertyDescriptor> globalNodeProperties = jenkins.getGlobalNodeProperties();
List<EnvironmentVariablesNodeProperty> envVarsNodePropertyList = globalNodeProperties.getAll(hudson.slaves.EnvironmentVariablesNodeProperty.class);
EnvironmentVariablesNodeProperty newEnvVarsNodeProperty = null;
EnvVars envVars = null;
if (envVarsNodePropertyList == null || envVarsNodePropertyList.isEmpty()) {
newEnvVarsNodeProperty = new hudson.slaves.EnvironmentVariablesNodeProperty();
globalNodeProperties.add(newEnvVarsNodeProperty);
envVars = newEnvVarsNodeProperty.getEnvVars();
} else {
envVars = envVarsNodePropertyList.get(0).getEnvVars();
}
envVars.put(key, value);
}
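For example, called from a build step's perform method (a hypothetical usage of the putEnvVar helper above):
@Override
public void perform(Run<?, ?> run, FilePath workspace, Launcher launcher, TaskListener listener) throws IOException {
    // Register SOME_VARIABLE as a global node property; subsequent build steps will see it.
    putEnvVar("SOME_VARIABLE", "SOME_VALUE");
}
Keep in mind this sets the variable globally, for every job on the controller, not just for the current build, because it writes to Jenkins' global node properties.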
I'm adding an update feature to my Eclipse E4 application. For this I used the source code and tutorial from Lars Vogel. When I test my application, the provisioningJob is always null. It should only be null when it is run from inside Eclipse. But when I try to update my exported application, the provisioningJob is still null. What am I doing wrong?
public class UpdateHandler {
private static final String REPOSITORY_LOC = System.getProperty("UpdateHandler.Repo",
"file:////updateServer/repository");
@Execute
public void execute(final IProvisioningAgent agent, final Shell shell, final UISynchronize sync,
final IWorkbench workbench) {
Job updateJob = new Job("Update Job") {
@Override
protected IStatus run(final IProgressMonitor monitor) {
return checkForUpdates(agent, shell, sync, workbench, monitor);
}
};
updateJob.schedule();
}
private IStatus checkForUpdates(final IProvisioningAgent agent, final Shell shell, final UISynchronize sync,
final IWorkbench workbench, IProgressMonitor monitor) {
// configure update operation
final ProvisioningSession session = new ProvisioningSession(agent);
final UpdateOperation operation = new UpdateOperation(session);
configureUpdate(operation);
// check for updates, this causes I/O
final IStatus status = operation.resolveModal(monitor);
// failed to find updates (inform user and exit)
if (status.getCode() == UpdateOperation.STATUS_NOTHING_TO_UPDATE) {
LogModule.log(LogLevel.INFO, "No updates have been found");
showMessage(shell, sync);
return Status.CANCEL_STATUS;
} else {
LogModule.log(LogLevel.INFO, "Updates were found");
}
// run installation
final ProvisioningJob provisioningJob = operation.getProvisioningJob(monitor);
// updates cannot run from within Eclipse IDE!!!
if (provisioningJob == null) {
System.err.println("Trying to update from the Eclipse IDE? This won't work!");
LogModule.log(LogLevel.WARNING, "Trying to update from the Eclipse IDE? This won't work!");
return Status.CANCEL_STATUS;
}
configureProvisioningJob(provisioningJob, shell, sync, workbench);
//provisioningJob.schedule();
provisioningJob.run(monitor);
return Status.OK_STATUS;
}
private void configureProvisioningJob(ProvisioningJob provisioningJob, final Shell shell, final UISynchronize sync,
final IWorkbench workbench) {
// register a job change listener to track
// installation progress and notify user upon success
provisioningJob.addJobChangeListener(new JobChangeAdapter() {
@Override
public void done(IJobChangeEvent event) {
//if (event.getResult().isOK()) {
sync.syncExec(new Runnable() {
@Override
public void run() {
LogModule.log(LogLevel.INFO, "Update ready to install");
boolean restart = MessageDialog.openQuestion(shell, "Updates installed, restart?",
"Updates have been installed. Do you want to restart?");
if (restart) {
workbench.restart();
}
}
});
// }
super.done(event);
}
});
}
private void showMessage(final Shell parent, final UISynchronize sync) {
sync.syncExec(new Runnable() {
@Override
public void run() {
MessageDialog.openWarning(parent, "No update",
"No updates for the current installation have been found.");
}
});
}
private UpdateOperation configureUpdate(final UpdateOperation operation) {
// create uri and check for validity
URI uri = null;
try {
uri = new URI(REPOSITORY_LOC);
} catch (final URISyntaxException e) {
System.err.println(e.getMessage());
LogModule.log(LogLevel.ERROR, e.getMessage());
return null;
}
// set location of artifact and metadata repo
operation.getProvisioningContext().setArtifactRepositories(new URI[] { uri });
operation.getProvisioningContext().setMetadataRepositories(new URI[] { uri });
return operation;
}
}
P2 internally uses a lot of services, and those are not explicitly referenced as bundle dependencies, so you might be missing those additional required services. Adding them via "Add required ..." inside PDE launches does not work.
Make sure that your launch configuration or product really includes all requirements. I would start with the content of org.eclipse.equinox.p2.sdk; this should definitely work. Afterwards you can try to strip the requirements down to org.eclipse.equinox.p2.core.feature or even less.
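If your application is built from a feature-based .product file, that first step would look roughly like this (a sketch; com.example.myapp.feature stands in for your own feature):
<features>
   <feature id="com.example.myapp.feature"/>
   <!-- pull in the complete p2 SDK while tracking down the missing services -->
   <feature id="org.eclipse.equinox.p2.sdk"/>
</features>
Once updating works, swap org.eclipse.equinox.p2.sdk for the slimmer org.eclipse.equinox.p2.core.feature and re-test.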
I have created a cron job that starts during application restart, but when I try to create a DB connection I am getting a NullPointerException. I am able to create and use the DB from another module using the same configuration.
Below is my application.conf:
db.abc.driver=com.mysql.jdbc.Driver
db.abc.url="jdbc:mysql://localhost:3306/db_name?useSSL=false"
db.abc.username=root
db.abc.password=""
db.abc.autocommit=false
db.abc.isolation=READ_COMMITTED
And the code that tries to access the DB is:
public class SchduleJob extends AbstractModule {
@Override
protected void configure() {
bind(JobOne.class)
.to(JobOneImpl.class)
.asEagerSingleton();
}
}
@ImplementedBy(JobOneImpl.class)
public interface JobOne {}
@Singleton
public class JobOneImpl implements JobOne {
final ActorSystem actorSystem = ActorSystem.create("name");
final ActorRef alertActor = actorSystem.actorOf(AlertActor.props);
public JobOneImpl() {
scheduleJobs();
}
private Cancellable scheduleJobs() {
return actorSystem.scheduler().schedule(
Duration.create(0, TimeUnit.MILLISECONDS), //Initial delay 0 milliseconds
Duration.create(6, TimeUnit.MINUTES), //Frequency 6 minutes
alertActor,
"alert",
actorSystem.dispatcher(),
null
);
}
}
public class AlertActor extends UntypedActor{
public static Props props = Props.create(AlertActor.class);
final ActorSystem actorSystem = ActorSystem.create("name");
final ActorRef messageActor = actorSystem.actorOf(MessageActor.props());
@Override
public void onReceive(Object message) throws Exception {
if(message != null && message instanceof String) {
RequestDAO requestDAO = new RequestDAO();
try {
List<DBRow> rows = requestDAO.getAllRow();
} catch(Exception exception) {
exception.printStackTrace();
}
}
}
}
public class RequestDAO {
public List<DBRow> getAllRow() throws Exception {
List<DBRow> rows = new ArrayList<DBRow>();
Connection connection = null;
try {
connection = DB.getDataSource("abc").getConnection();
connection.setAutoCommit(false);
} catch(Exception exception) {
exception.printStackTrace();
if(connection != null) {
connection.rollback();
} else {
System.out.println("in else***********");
}
return null;
} finally {
if(connection != null)
connection.close();
}
return rows;
}
}
When I call the getAllRow() method of the RequestDAO class, it throws
java.lang.NullPointerException
at play.api.Application$$anonfun$instanceCache$1.apply(Application.scala:235)
at play.api.Application$$anonfun$instanceCache$1.apply(Application.scala:235)
at play.utils.InlineCache.fresh(InlineCache.scala:69)
at play.utils.InlineCache.apply(InlineCache.scala:55)
at play.api.db.DB$.db(DB.scala:22)
at play.api.db.DB$.getDataSource(DB.scala:41)
at play.api.db.DB.getDataSource(DB.scala)
at play.db.DB.getDataSource(DB.java:33)
But the same code works without the cron job. What should I do to fix this error?
Play uses the Typesafe Config library for configuration.
I suspect your current working directory from the cron script isn't set, so it's probably not finding your application.conf (application.properties) file.
However, Config is nice in that it allows you to specify where to look for the file, either by its base name (choosing among the .conf | .properties | .json extensions) or by the full filename including the extension, on the java command line:
To specify the base name, use -Dconfig.resource=/path/to/application
To specify the full name, use -Dconfig.file=/path/to/application.properties
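For example, launching the app from the cron script with an absolute path (the paths here are illustrative):
# point Typesafe Config at the file explicitly so the cron job's working directory doesn't matter
java -Dconfig.file=/opt/myapp/conf/application.conf -jar /opt/myapp/myapp.jar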