luckperms non-static method getUserManager() cannot be referenced from a static context - java

I'm trying to get a user by UUID, but an error pops up and says this:
non-static method getUserManager() cannot be referenced from a static context
I tried to place it in a different method and call that method, but that didn't work; whatever I do, the error still pops up. It could be because I wrote LuckPerms instead of luckPerms, but I don't think so.
And here is the code (I deleted some useless stuff):
I tried searching but couldn't find anything; I've also read the API docs about five times and it didn't help.
// BUNGEECORD
import net.md_5.bungee.api.plugin.Plugin;
// BUNGEECORD
//JAVA
import java.util.EnumSet;
//JAVA
//LuckPerms
import net.luckperms.api.LuckPermsProvider;
import net.luckperms.api.LuckPerms;
//LuckPerms
public void onPrivateMessageReceived(final PrivateMessageReceivedEvent event) {
UUID uuid = UUID.nameUUIDFromBytes(("OfflinePlayer:" + messages.get(0).getContentDisplay()).getBytes());
net.luckperms.api.model.user.User user = LuckPerms.getUserManager().getUser(uuid); // AND HERE IS THE ERROR
DataMutateResult result = user.data().add(Node.builder("group.admin").build());
};
And here is my pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>area</groupId>
<artifactId>amogus</artifactId>
<version>1.0.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<repositories>
<repository>
<id>dv8tion</id>
<name>m2-dv8tion</name>
<url>https://m2.dv8tion.net/releases</url>
</repository>
<repository>
<id>spigot-repo</id>
<url>https://hub.spigotmc.org/nexus/content/repositories/snapshots/</url>
</repository>
<repository>
<snapshots>
<enabled>false</enabled>
</snapshots>
<id>bintray-dv8fromtheworld-maven</id>
<name>bintray</name>
<url>http://dl.bintray.com/dv8fromtheworld/maven</url>
</repository>
<repository>
<id>bungeecord-repo</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>net.dv8tion</groupId>
<artifactId>JDA</artifactId>
<version>4.3.0_277</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>2.0.0-alpha4</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>2.0.0-alpha4</version>
</dependency>
<dependency>
<groupId>net.luckperms</groupId>
<artifactId>api</artifactId>
<version>5.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.bukkit</groupId>
<artifactId>bukkit</artifactId>
<version>1.12.2-R0.1-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>net.md-5</groupId>
<artifactId>bungeecord-api</artifactId>
<version>1.16-R0.5-SNAPSHOT</version>
<type>jar</type>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>net.md-5</groupId>
<artifactId>bungeecord-api</artifactId>
<version>1.16-R0.5-SNAPSHOT</version>
<type>javadoc</type>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.1.0</version>
<configuration>
<archive>
<manifest>
<mainClass>area.amogus</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude></exclude>
</excludes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<resources>
<resource>
<directory>resources</directory>
</resource>
</resources>
</build>
</project>

Imagine we have a class:
public class MyClass {
public static void main(String[] args) {
MyClass.doThis();
}
public void doThis(){
System.out.println("do this");
}
}
In this case, you will get exactly the same error as you're getting in your code. It means we cannot access the doThis() method before an instance of MyClass is created, like:
MyClass myClass = new MyClass();
myClass.doThis(); //this is valid since we have an instance to call this method on.
I would assume you have to get an instance of a LuckPerms object in some way before calling the .getUserManager() method on it.
Like this:
LuckPerms luckPerms = new LuckPerms(); //assuming there's a respective constructor
luckPerms.getUserManager();
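In LuckPerms' case there is no public constructor; the API instead exposes the plugin's instance through LuckPermsProvider (which the question already imports). A minimal sketch, assuming the LuckPerms plugin is installed and has finished loading before this code runs:
LuckPerms luckPerms = LuckPermsProvider.get(); // throws IllegalStateException if the API isn't loaded yet
net.luckperms.api.model.user.User user = luckPerms.getUserManager().getUser(uuid); // may be null if the user isn't loaded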

You cannot use LuckPerms.getUserManager() because you have not instantiated an object of LuckPerms. Also, LuckPerms cannot be instantiated directly because it is an interface: you would have to implement the interface first and then instantiate and use it.
your code could be like this:
public void onPrivateMessageReceived(final PrivateMessageReceivedEvent event) {
UUID uuid = UUID.nameUUIDFromBytes(("OfflinePlayer:" + messages.get(0).getContentDisplay()).getBytes());
LuckPerms luckPerms = new LuckPerms() {
@Override
public @org.checkerframework.checker.nullness.qual.NonNull String getServerName() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull UserManager getUserManager() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull GroupManager getGroupManager() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull TrackManager getTrackManager() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull <T> PlayerAdapter<T> getPlayerAdapter(@org.checkerframework.checker.nullness.qual.NonNull Class<T> aClass) {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull Platform getPlatform() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull PluginMetadata getPluginMetadata() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull EventBus getEventBus() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull Optional<MessagingService> getMessagingService() {
return Optional.empty();
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull ActionLogger getActionLogger() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull ContextManager getContextManager() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull MetaStackFactory getMetaStackFactory() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull CompletableFuture<Void> runUpdateTask() {
return null;
}
@Override
public void registerMessengerProvider(@org.checkerframework.checker.nullness.qual.NonNull MessengerProvider messengerProvider) {
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull NodeBuilderRegistry getNodeBuilderRegistry() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull QueryOptionsRegistry getQueryOptionsRegistry() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull NodeMatcherFactory getNodeMatcherFactory() {
return null;
}
};
net.luckperms.api.model.user.User user = luckPerms.getUserManager().getUser(uuid); // note: the stub getUserManager() above returns null, so a real implementation is required here
DataMutateResult result = user.data().add(Node.builder("group.admin").build());
}

Related

Can I have cucumber datatable printed in the console output in Java?

E.g. If this is my feature file:
Scenario: Some dummy scenario
Given that I do something with this datatable:
| print | this |
| and | this |
And something else
The output looks like:
Given that I do something with this datatable:
And something else
I was wondering if it is possible to have an output similar to this:
Given that I do something with this datatable:
| print | this |
| and | this |
And something else
Thank you for your help
Edit: As requested, the details of my setup are shown below.
I am using Java and this is the class responsible for the configuration:
@RunWith(Cucumber.class)
@CucumberOptions(plugin = {"pretty", "html:target/cucumber"},
monochrome = false,
glue = {"my.dummy.package"},
features = {"classpath:dummy.feature"})
public class DummyFT {
@Test
public void test() throws IOException {
}
}
These tests are executed as a separate maven goal. My profile section has:
<profile>
<id>functional-tests</id>
<build>
<plugins>
<plugin>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<testSourceDirectory>test/test-functional/java</testSourceDirectory>
<includes>
<include>**/*FT.java</include>
</includes>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
and the failsafe plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<dependencies>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-junit47</artifactId>
<version>${maven.surefire.junit47}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
And the cucumber dependencies:
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>${cucumber.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>${cucumber.version}</version>
<scope>test</scope>
</dependency>
Finally, the tests are triggered by running:
mvn failsafe:integration-test -Pfunctional-tests
I noticed that there was still no solution to this and couldn't find one anywhere else. We were able to get it to work using a custom afterStep hook, though we're doing this with cucumber-jvm-groovy, so obviously it's a bit different. You should be able to take something like this and convert it to Java if you're determined, though.
import cucumber.api.PickleStepTestStep
import cucumber.api.TestCase
import cucumber.runner.Scenario
import gherkin.pickles.PickleCell
import gherkin.pickles.PickleRow
import gherkin.pickles.PickleTable
import groovy.util.logging.Log4j
import util.StringHelper
import java.lang.reflect.Field
@Log4j
class CucumberHelper {
static TestCase getTestCase(Scenario scenario) {
Field testCaseField = scenario.getClass().getDeclaredField("testCase")
testCaseField.setAccessible(true)
(TestCase) testCaseField.get(scenario)
}
static PickleStepTestStep getStepObject(TestCase testCase, int stepIndex) {
(PickleStepTestStep) testCase.getTestSteps()[stepIndex]
}
static printDataTables(PickleStepTestStep step, boolean stepPassed) {
if (stepContainsDataTable(step)) {
PickleTable table = getDataTable(step)
printTable(table, stepPassed)
}
}
static private boolean stepContainsDataTable(PickleStepTestStep step) {
step?.step?.arguments?.any { it instanceof PickleTable }
}
static private PickleTable getDataTable(PickleStepTestStep step) {
step.step.arguments.find { it instanceof PickleTable }
}
static private void printTable(PickleTable table, boolean stepPassed) {
List<Integer> widths = []
table.rows.each { PickleRow row ->
row.cells.eachWithIndex { PickleCell cell, int i ->
int max = widths[i] ?: 0
int cellWidth = cell.value.length()
if(cellWidth > max){
widths[i] = cellWidth
}
}
}
table.rows.each { PickleRow row ->
printRow(row, stepPassed, widths)
}
}
static private void printRow(PickleRow row, boolean stepPassed, List<Integer> columnWidths) {
String output = ' | '
row.cells.eachWithIndex { PickleCell cell, int i ->
output += cell.value.padRight(columnWidths[i]) + ' | '
}
println getConsoleColor(stepPassed) + output + StringHelper.ANSI_RESET
}
static private String getConsoleColor(boolean stepPassed) {
stepPassed ? StringHelper.ANSI_GREEN : StringHelper.ANSI_RED
}
static void logBeforeStep(PickleStepTestStep step) {
log.trace "BEFORE STEP:\n\n\t${step.stepText}\n"
}
static void logAfterStep(PickleStepTestStep step) {
log.trace "AFTER STEP:\n\n\t${step.stepText}\n"
}
}
And the hook:
AfterStep() { Scenario scenario ->
try {
TestCase testCase = CucumberHelper.getTestCase(scenario)
PickleStepTestStep step = CucumberHelper.getStepObject(testCase, scenario.stepResults.size() - 1)
CucumberHelper.logAfterStep(step)
CucumberHelper.printDataTables(step, !scenario.isFailed())
} catch (Throwable t) {
// ignore, this hook is only for logging
}
}
(Obviously there's some extra functionality in there for our use cases, but it might help others. The StringHelper just adds unicode characters for coloring the output but you could easily remove that and have it print in standard terminal colors.)
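If all you need is the table rendered in the console, a simpler fallback is printing it from inside the step definition itself. A minimal Java sketch, assuming a recent cucumber-java where DataTable lives in io.cucumber.datatable (older versions use cucumber.api.DataTable); the class and method names are just for illustration:
import io.cucumber.datatable.DataTable;
import io.cucumber.java.en.Given;
public class DummySteps {
    @Given("that I do something with this datatable:")
    public void doSomethingWithThisDatatable(DataTable table) {
        // asLists() returns the raw rows; re-render them pipe-delimited
        for (java.util.List<String> row : table.asLists()) {
            System.out.println("| " + String.join(" | ", row) + " |");
        }
    }
}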

Unable to consume kafka messages using apache storm

I have developed an application to consume Kafka messages using Apache Storm. When I run the topology using LocalCluster in Eclipse it works fine and messages are consumed properly, but when I run it using the storm command (bin\storm jar ..\kafka-storm-0.0.1-SNAPSHOT.jar com.kafka_storm.util.Topology storm-kafka-topology), the topology starts but doesn't consume any messages. Is there something wrong I am doing, or can you guide me on what I can do to find the problem?
Topology Code
public class Topology {
public Properties configs;
public BoltBuilder boltBuilder;
public SpoutBuilder spoutBuilder;
public Topology(String configFile) throws Exception {
configs = new Properties();
InputStream is = null;
try {
is = this.getClass().getResourceAsStream("/application.properties");
configs.load(is);
//configs.load(Topology.class.getResourceAsStream("/application.properties"));
boltBuilder = new BoltBuilder(configs);
spoutBuilder = new SpoutBuilder(configs);
} catch (Exception ex) {
ex.printStackTrace();
System.exit(0);
}
}
private void submitTopology() throws Exception {
System.out.println("Entered in submitTopology");
TopologyBuilder builder = new TopologyBuilder();
KafkaSpout<?, ?> kafkaSpout = spoutBuilder.buildKafkaSpout();
SinkTypeBolt sinkTypeBolt = boltBuilder.buildSinkTypeBolt();
MongoDBBolt mongoBolt = boltBuilder.buildMongoDBBolt();
//set the kafkaSpout to topology
//parallelism-hint for kafkaSpout - defines number of executors/threads to be spawn per container
int kafkaSpoutCount = Integer.parseInt(configs.getProperty(Keys.KAFKA_SPOUT_COUNT));
builder.setSpout(configs.getProperty(Keys.KAFKA_SPOUT_ID), kafkaSpout, kafkaSpoutCount);
//set the sinktype bolt
int sinkBoltCount = Integer.parseInt(configs.getProperty(Keys.SINK_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),sinkTypeBolt,sinkBoltCount).shuffleGrouping(configs.getProperty(Keys.KAFKA_SPOUT_ID));
//set the mongodb bolt
int mongoBoltCount = Integer.parseInt(configs.getProperty(Keys.MONGO_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.MONGO_BOLT_ID),mongoBolt,mongoBoltCount).shuffleGrouping(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),Keys.MONGODB_STREAM);
String topologyName = configs.getProperty(Keys.TOPOLOGY_NAME);
Config conf = new Config();
//Defines how many worker processes have to be created for the topology in the cluster.
conf.setNumWorkers(1);
System.out.println("Submitting Topology");
//StormSubmitter.submitTopology(topologyName, conf, builder.createTopology());
System.out.println("Topology submitted");
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(topologyName, conf, builder.createTopology());
}
public static void main(String[] args) throws Exception {
String configFile;
if (args.length == 0) {
System.out.println("Missing input : config file location, using default");
configFile = "application.properties";
} else{
configFile = args[0];
}
Topology ingestionTopology = new Topology(configFile);
ingestionTopology.submitTopology();
}
}
Spout Code
public class SpoutBuilder {
public Properties configs = null;
public SpoutBuilder(Properties configs) {
this.configs = configs;
}
public KafkaSpout<?, ?> buildKafkaSpout() {
String servers = configs.getProperty(Keys.KAFKA_BROKER);
String topic = configs.getProperty(Keys.KAFKA_TOPIC);
String group = configs.getProperty(Keys.KAFKA_CONSUMERGROUP);
return new KafkaSpout<>(getKafkaSpoutConfig(servers,topic,group));
}
protected KafkaSpoutConfig<String, String> getKafkaSpoutConfig(String bootstrapServers, String topic, String group) {
return KafkaSpoutConfig.builder(bootstrapServers, new String[]{topic})
.setProp(ConsumerConfig.GROUP_ID_CONFIG, group)
.setRetry(getRetryService())
.setOffsetCommitPeriodMs(10_000)
.setFirstPollOffsetStrategy(FirstPollOffsetStrategy.UNCOMMITTED_LATEST)
.setMaxUncommittedOffsets(250)
.setProcessingGuarantee(ProcessingGuarantee.AT_LEAST_ONCE)
.setTupleTrackingEnforced(true)
.setEmitNullTuples(false)
.setRecordTranslator(new DefaultRecordTranslator<String, String>())
.build();
}
protected KafkaSpoutRetryService getRetryService() {
return new KafkaSpoutRetryExponentialBackoff(TimeInterval.microSeconds(500),
TimeInterval.milliSeconds(2), Integer.MAX_VALUE, TimeInterval.seconds(10));
}
}
Bolt Builder
public class BoltBuilder {
public Properties configs = null;
public BoltBuilder(Properties configs) {
this.configs = configs;
}
public SinkTypeBolt buildSinkTypeBolt() {
return new SinkTypeBolt();
}
public MongoDBBolt buildMongoDBBolt() {
String host = configs.getProperty(Keys.MONGO_HOST);
int port = Integer.parseInt(configs.getProperty(Keys.MONGO_PORT));
String db = configs.getProperty(Keys.MONGO_DATABASE);
String collection = configs.getProperty(Keys.MONGO_COLLECTION);
return new MongoDBBolt(host, port, db, collection);
}
}
SinkTypeBolt Code
public class SinkTypeBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
public void execute(Tuple tuple) {
String value = tuple.getString(4);
System.out.println("Received in SinkType bolt : "+value);
if (value != null && !value.isEmpty()){
collector.emit(Keys.MONGODB_STREAM,new Values(value));
System.out.println("Emitted : "+value);
}
collector.ack(tuple);
}
public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declareStream(Keys.MONGODB_STREAM, new Fields("content"));
}
}
MongoDB Bolt
public class MongoDBBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
private MongoDatabase mongoDB;
private MongoClient mongoClient;
private String collection;
public String host;
public int port ;
public String db;
protected MongoDBBolt(String host, int port, String db,String collection) {
this.host = host;
this.port = port;
this.db = db;
this.collection = collection;
}
public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
this.mongoClient = new MongoClient(host,port);
this.mongoDB = mongoClient.getDatabase(db);
}
public void execute(Tuple input) {
Document mongoDoc = getMongoDocForInput(input);
try{
mongoDB.getCollection(collection).insertOne(mongoDoc);
collector.ack(input);
}catch(Exception e) {
e.printStackTrace();
collector.fail(input);
}
}
@Override
public void cleanup() {
this.mongoClient.close();
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
// TODO Auto-generated method stub
}
public Document getMongoDocForInput(Tuple input) {
Document doc = new Document();
String content = (String) input.getValueByField("content");
String[] parts = content.trim().split(" ");
System.out.println("Received in MongoDB bolt "+content);
try {
for(String part : parts) {
String[] subParts = part.split(":");
String fieldName = subParts[0];
String value = subParts[1];
doc.append(fieldName, value);
}
} catch(Exception e) {
}
return doc;
}
}
pom.xml code
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>1.2.2</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>1.1.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-kafka-client</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.0.4</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.kafka_storm.util.Topology</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4</version>
</plugin>
</plugins>
<resources>
<resource>
<directory>src/main/java</directory>
<includes>
<include> **/*.properties</include>
</includes>
</resource>
</resources>
</build>
(Storm UI screenshot)
Just to be sure: you are remembering to use the StormSubmitter line in Topology, rather than the LocalCluster, when you submit the topology with storm jar, right?
Also, please check that you've started all the right daemons, i.e. storm nimbus and storm supervisor should be running as a minimum (plus your Zookeeper install).
The next places to look would be in your log files. In the Storm directory, you'll have a logs directory. Look in the logs/worker-artifacts/<your-topology-id>/<your-worker-port>/worker.log files. Those will hopefully get you on the right track to figuring out what's going on. I'd open Storm UI, find your spout and look up which worker ports it's running on, so you can look in the right log files.
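For reference, the switch in submitTopology could look something like this; a rough sketch where the runLocal flag is an assumption for illustration (the original code hard-codes LocalCluster and leaves the StormSubmitter line commented out):
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.StormSubmitter;
import org.apache.storm.topology.TopologyBuilder;
// use StormSubmitter for `storm jar` runs, LocalCluster only for in-IDE testing
private void submitTopology(String topologyName, Config conf, TopologyBuilder builder, boolean runLocal) throws Exception {
    if (runLocal) {
        new LocalCluster().submitTopology(topologyName, conf, builder.createTopology()); // in-IDE testing
    } else {
        StormSubmitter.submitTopology(topologyName, conf, builder.createTopology()); // real cluster
    }
}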

NoNodeAvailableException elasticsearch in one IntelliJ project but not in other with same programmatic and elasticsearch.yml configuration and code

I know questions related to NoNodeAvailableException have been asked multiple times on this platform. I have followed all the solutions provided by experts previously, yet none of them work for me. The problem with the exception I get is unique.
I am facing NoNodeAvailableException in my Java client. The usual solution is to ensure consistency between the configuration set programmatically and the one present in elasticsearch.yml. I have already ensured this.
Just to be on the safe side, I created another IntelliJ project with the same Maven dependencies, the same elasticsearch client code and the same configuration, and I didn't get NoNodeAvailableException in it.
I could work with this new IntelliJ project by shipping the code over from my previous IntelliJ project, but I would lose all the commits I have and the other services written in my old project.
Old client code:
public class ElasticSearchTest {
private static TransportClient client;
public static void main(String[] args) {
try {
Settings elasticsearchSettings = Settings.builder().put("cluster.name", "elasticsearch_cse").put("node.name","elasticsearch_cse_data").build();
client = new PreBuiltTransportClient(elasticsearchSettings)
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
String sampleWebsite = "www.samplewebsite.com";
List<PageModel> pages = new ArrayList<PageModel>();
pages.add(new PageModel("www.samplewebsite.com/page1", "darkknight wins a lot", "First page"));
pages.add(new PageModel("www.samplewebsite.com/page1/page2", "superman survives", "Second page"));
pages.add(new PageModel("www.samplewebsite.com/page3", "wonderwoman releases", "third page"));
DomainModel domainModel = new DomainModel(sampleWebsite, pages);
insertIndicesFromDomain(domainModel);
boolean isResultFound = false;
List<PageModel> pageModels = getSearchResults(domainModel, "superman");
for(PageModel pageModel: pageModels){
if(pageModel.getPageContent().equals("superman survives")){
isResultFound = true;
System.out.println(pageModel);
}
}
// closeService();
} catch (UnknownHostException e) {
e.printStackTrace();
}
}
public static void insertIndicesFromDomain(DomainModel domainModel){
int i = 1;
for(PageModel pageModel: domainModel.getPages()){
client.prepareIndex("websiteindex", domainModel.getDomainName(), i + "").setSource(pageModel.getPageModelInMapFormat()).execute().actionGet();
i += 1;
System.out.println(i);
}
}
public static List<PageModel> getSearchResults(DomainModel domainModel, String searchKeyWords){
SearchResponse searchResponse = client.prepareSearch("websiteindex").setTypes(domainModel.getDomainName()).setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.termQuery("content", searchKeyWords)).execute().actionGet();
SearchHit[] results = searchResponse.getHits().getHits();
List<PageModel> pageModels = new ArrayList<PageModel>();
for(SearchHit searchHit: results){
pageModels.add(new PageModel(searchHit.getSource()));
}
return pageModels;
}
public static void getIndices(){
ImmutableOpenMap<String, IndexMetaData> indicies = client.admin().cluster().prepareState().get().getState().getMetaData().getIndices();
for(ObjectObjectCursor<String, IndexMetaData> entry: indicies){
System.out.printf(entry.key + " " + entry.value);
}
}
public static void closeService(){
client.close();
}
}
StackTrace of old client code:
Exception in thread "main" NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{H90yHKj8R_aa0MeTciqjvA}{localhost}{127.0.0.1:9300}]]
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:347)
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:245)
at org.elasticsearch.client.transport.TransportProxyClient.execute(TransportProxyClient.java:59)
at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:363)
at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:408)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:80)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:54)
at net.media.sitesearch.models.ElasticSearchTest.insertIndicesFromDomain(ElasticSearchTest.java:66)
at net.media.sitesearch.models.ElasticSearchTest.main(ElasticSearchTest.java:42)
New client Code:
public class Main {
private static TransportClient client;
public static void main(String[] args) {
try {
Settings elasticsearchSettings = Settings.builder().put("cluster.name", "elasticsearch_cse").put("node.name","elasticsearch_cse_data").build();
client = new PreBuiltTransportClient(elasticsearchSettings)
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
String sampleWebsite = "www.samplewebsite.com";
List<PageModel> pages = new ArrayList<PageModel>();
pages.add(new PageModel("www.samplewebsite.com/page1", "darkknight wins a lot", "First page"));
pages.add(new PageModel("www.samplewebsite.com/page1/page2", "superman survives", "Second page"));
pages.add(new PageModel("www.samplewebsite.com/page3", "wonderwoman releases", "third page"));
DomainModel domainModel = new DomainModel(sampleWebsite, pages);
insertIndicesFromDomain(domainModel);
boolean isResultFound = false;
List<PageModel> pageModels = getSearchResults(domainModel, "superman");
for(PageModel pageModel: pageModels){
if(pageModel.getPageContent().equals("superman survives")){
isResultFound = true;
System.out.println(pageModel);
}
}
// closeService();
} catch (UnknownHostException e) {
e.printStackTrace();
}
}
public static void insertIndicesFromDomain(DomainModel domainModel){
int i = 1;
for(PageModel pageModel: domainModel.getPages()){
client.prepareIndex("websiteindex", domainModel.getDomainName(), i + "").setSource(pageModel.getPageModelInMapFormat()).execute().actionGet();
i += 1;
System.out.println(i);
}
}
public static List<PageModel> getSearchResults(DomainModel domainModel, String searchKeyWords){
SearchResponse searchResponse = client.prepareSearch("websiteindex").setTypes(domainModel.getDomainName()).setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.termQuery("content", searchKeyWords)).execute().actionGet();
SearchHit[] results = searchResponse.getHits().getHits();
List<PageModel> pageModels = new ArrayList<PageModel>();
for(SearchHit searchHit: results){
pageModels.add(new PageModel(searchHit.getSource()));
}
return pageModels;
}
public static void getIndices(){
ImmutableOpenMap<String, IndexMetaData> indicies = client.admin().cluster().prepareState().get().getState().getMetaData().getIndices();
for(ObjectObjectCursor<String, IndexMetaData> entry: indicies){
System.out.printf(entry.key + " " + entry.value);
}
}
public static void closeService(){
client.close();
}
}
Output of new client code:
2
3
4
PageModel{pageUrl='www.samplewebsite.com/page1/page2', pageContent='superman survives', pageTitle='Second page', map={title=Second page, url=www.samplewebsite.com/page1/page2, content=superman survives}}
The only difference between my old code and new code is that the old code is under a Git working tree and the new one is not.
New edit:
I found another important difference between my two projects. In my new IntelliJ project I haven't shipped all the other services from the old one; I only kept the necessary ElasticSearch client code. Hence, the new IntelliJ project does not contain all the dependencies of the old project, only the elasticsearch dependency.
In my new client code, which contains no Maven dependencies other than elasticsearch, I don't get NoNodeAvailableException. Could conflicts between elasticsearch and the other Maven dependencies be an indirect reason for that exception?
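One way to check that hypothesis would be to print each project's resolved dependency tree and compare the versions of shared transitive dependencies (the transport client is often sensitive to Netty and Lucene versions in particular), e.g.:
mvn dependency:tree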
New pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sampleelasticsearch</groupId>
<artifactId>SampleES</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
</dependencies>
</project>
Old pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>net.media.siteresearch</groupId>
<artifactId>SiteSearch</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>net.media.crawler</groupId>
<artifactId>MnetCrawler</artifactId>
<version>0.995</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.8.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.24</version>
</dependency>
<dependency>
<!-- jsoup HTML parser library # https://jsoup.org/ -->
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.10.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.8.9</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.5.3</version>
</dependency>
</dependencies>
</project>
If readers want to check the logs from the elasticsearch server or anything else, please ask (I have not added them because the question is already filled with a lot of relevant snippets and would otherwise get cluttered).

Agent JAR not found or no Agent-Class attribute

// Fixed: This was not an error caused by the code. It was caused by the IDE.
I just tried to make an injection for a game called Minecraft.
But I have one problem: it's not able to load the agent.
Here is the exception:
Exception in thread "main" com.sun.tools.attach.AgentLoadException: Agent JAR not found or no Agent-Class attribute
at sun.tools.attach.HotSpotVirtualMachine.loadAgent(HotSpotVirtualMachine.java:117)
at com.sun.tools.attach.VirtualMachine.loadAgent(VirtualMachine.java:540)
at pw.razex.injectionclient.Injectable.main(Injectable.java:55)
And my code:
AgentLoader [AgentClass]
public class AgentLoader {
public static void agentmain(String agent, Instrumentation instrumentation) {
try {
Class[] loadedClasses = instrumentation.getAllLoadedClasses();
File agentFile = new File(AgentLoader.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
agentFile.deleteOnExit();
for(int i = 0; i < loadedClasses.length; i++) {
Class loadedClass = loadedClasses[i];
if(loadedClass.getName().equals("net.minecraft.client.Minecraft")) {
LaunchClassLoader launchClassLoader = (LaunchClassLoader) loadedClass.getClassLoader();
launchClassLoader.addURL(AgentLoader.class.getProtectionDomain().getCodeSource().getLocation());
launchClassLoader.loadClass(MainGui.class.getName()).newInstance();
}
}
} catch (Throwable e) {
e.printStackTrace();
}
}
}
Main Class:
public class Injectable {
private static VirtualMachineDescriptor minecraftProcess;
public static void main(String[] args) throws Throwable{
OSUtil.initOS();
if(OSUtil.osType != OSUtil.OSType.WINDOWS) {
System.out.println("[X] Invalid OS [" + OSUtil.osType.name() + "]");
System.exit(-1);
}
File sourceFile = new File(Injectable.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
File tempAttachFile = File.createTempFile("lgt", ".dat");
Files.copy(sourceFile.toPath(), tempAttachFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
System.out.println(new File("attach.dll").exists());
if (System.getProperty("java.library.path") != null) {
System.setProperty("java.library.path", new File("attach.dll").getAbsolutePath() + System.getProperty("path.separator") + System.getProperty("java.library.path"));
} else {
System.setProperty("java.library.path", new File("attach.dll").getAbsolutePath());
}
Field field = ClassLoader.class.getDeclaredField("sys_paths");
field.setAccessible(true);
field.set(null, null);
System.loadLibrary("attach");
if(VirtualMachine.list().size() == 0) {
System.out.println("[X] No injectable process found");
System.exit(-1);
}
List<VirtualMachineDescriptor> virtualMachineDescriptors = VirtualMachine.list();
for(VirtualMachineDescriptor virtualMachineDescriptor : virtualMachineDescriptors) {
if(virtualMachineDescriptor.displayName().startsWith("net.minecraft.client.main.Main")) {
minecraftProcess = virtualMachineDescriptor;
VirtualMachine virtualMachine = VirtualMachine.attach(minecraftProcess);
virtualMachine.loadAgent(tempAttachFile.getAbsolutePath());
System.out.println("[O] Attached to minecraft");
}
}
if(minecraftProcess == null) {
System.out.println("[X] Minecraft is not started yet.");
System.exit(-1);
}
}
}
And my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>InjectionClientMaven</groupId>
<artifactId>InjectionClientMaven</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifestEntries>
<Built-By>David</Built-By>
<Main-Class>de.david.injectionclient.Injectable</Main-Class>
<Agent-Class>de.david.injectionclient.AgentLoader</Agent-Class>
<Class-Path>tools.jar</Class-Path>
<Can-Retransform-Classes>true</Can-Retransform-Classes>
</manifestEntries>
</archive>
</configuration>
</plugin>
</plugins>
</build>
</project>
Make sure your agent jar has the following in its MANIFEST (Agent-Class must name the agent class itself):
Manifest-Version: 1.0
Agent-Class: com.package.AgentLoader
Permissions: all-permissions
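To verify the attribute actually made it into the jar you're loading (both maven-jar-plugin and maven-shade-plugin touch the final jar here, so the manifest is worth checking), a quick check along these lines may help; the ManifestCheck name is just for illustration:
import java.util.jar.JarFile;
public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {
            // prints the Agent-Class entry, or null if the attribute is missing
            System.out.println(jar.getManifest().getMainAttributes().getValue("Agent-Class"));
        }
    }
}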

Error loading SQLite file on Tomcat war app

I am trying to use an embedded sqlite.db file in my Tomcat app. I am working in NetBeans and deploying to a local server through NetBeans' play button; so far so good. However, I am facing a problem when I build a war file and deploy it on a Raspberry Pi Tomcat. I attach the error below.
For some reason, the same code that works locally won't work on the Tomcat server installed on the Raspberry. Any ideas?
Thank you!
==== CODE ====
ContexService.java
public class ContexService implements ServletContextListener {
/**
*
* @param sce
*/
@Override
public void contextInitialized(ServletContextEvent sce) {
System.out.println("== context initialized ==");
SqliteTest.getInstance().start();
SqliteTest.getInstance().stop();
}
/**
*
* @param sce
*/
@Override
public void contextDestroyed(ServletContextEvent sce) {
System.out.println("== context destroyed ==");
SqliteTest.getInstance().stop();
}
}
SQLiteManager.java
public class SQLiteManager {
private static final Logger logger = LoggerFactory.getLogger(SQLiteManager.class);
private static SQLiteManager instance = null;
private static Connection connection = null;
private static int QUERY_TIMEOUT = 5;
private static String TABLE_PROPERTIES = "Properties";
private static String DB_NAME = "safemo.db";
private static class TABLE_PROPERTIES_COLUMNS {
public static final String TRIP_NUMBER = "tripNumber";
}
private SQLiteManager() {
openDB();
}
public static SQLiteManager getInstance() {
if (instance == null) {
instance = new SQLiteManager();
}
return instance;
}
private void openDB() {
if (connection == null) {
connect();
} else {
try {
if (connection.isClosed()) {
connect();
}
} catch (SQLException ex) {
logger.error("Error openning connection : " + ex.getMessage());
}
}
}
private void connect() {
try {
Class.forName("org.sqlite.JDBC");
connection = DriverManager.getConnection("jdbc:sqlite::resource:" + DB_NAME);
logger.debug("Opened database successfully");
} catch (ClassNotFoundException ex) {
logger.error(ex.getMessage());
} catch (SQLException ex) {
logger.error(ex.getMessage());
}
}
public void closeDB() {
try {
if (connection != null) {
connection.close();
}
logger.debug("Connection closed successfully");
} catch (SQLException ex) {
logger.error("Error closing database : " + ex.getMessage());
}
}
public void incrementTripNumber()
{
try {
Statement statement = connection.createStatement();
statement.setQueryTimeout(QUERY_TIMEOUT); // set timeout to 5 sec.
statement.executeUpdate("UPDATE " + TABLE_PROPERTIES + " SET " + TABLE_PROPERTIES_COLUMNS.TRIP_NUMBER + " = " + TABLE_PROPERTIES_COLUMNS.TRIP_NUMBER + " + 1");
} catch (SQLException ex) {
logger.error("Error getting trip number : " + ex.getMessage());
}
}
public int getTripNumber() {
int tripNumber = -1;
try {
Statement statement = connection.createStatement();
statement.setQueryTimeout(QUERY_TIMEOUT); // set timeout to 5 sec.
ResultSet rs = statement.executeQuery("SELECT " + TABLE_PROPERTIES_COLUMNS.TRIP_NUMBER + " FROM " + TABLE_PROPERTIES);
if (rs.next()) {
tripNumber = rs.getInt(TABLE_PROPERTIES_COLUMNS.TRIP_NUMBER);
}
rs.close();
} catch (SQLException ex) {
logger.error("Error getting trip number : " + ex.getMessage());
} finally {
return tripNumber;
}
}
}
SqliteTest.java
public class SqliteTest
{
private static SqliteTest instance = null;
private static final Logger logger = LoggerFactory.getLogger(SqliteTest.class);
private SqliteTest()
{
}
public static SqliteTest getInstance()
{
if(instance == null)
{
instance = new SqliteTest();
}
return instance;
}
public void start()
{
SQLiteManager sqliteManager = SQLiteManager.getInstance();
logger.debug("TRIP NUMBER : " + sqliteManager.getTripNumber());
sqliteManager.incrementTripNumber();
logger.debug("TRIP NUMBER INCREASED: " + sqliteManager.getTripNumber());
}
public void stop()
{
SQLiteManager.getInstance().closeDB();
}
}
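In case the ":resource:" URI is the part that misbehaves on the Pi, an alternative connect() could copy the database out of the war to a writable temp file first and open it by path; a hedged sketch (the db file name is taken from the question, everything else is illustration):
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.DriverManager;
// copy safemo.db from the classpath to a writable location, then open it by path
InputStream in = SQLiteManager.class.getResourceAsStream("/" + DB_NAME);
Path target = Files.createTempFile("safemo", ".db");
Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
Connection connection = DriverManager.getConnection("jdbc:sqlite:" + target.toAbsolutePath());
// note: updates affect the temp copy only, so they won't survive redeploys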
Pom File
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.app</groupId>
<artifactId>SqliteTest</artifactId>
<version>1.0</version>
<packaging>war</packaging>
<name>SqliteTest</name>
<properties>
<endorsed.dir>${project.build.directory}/endorsed</endorsed.dir>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.21</version>
</dependency>
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.1.7</version>
</dependency>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-web-api</artifactId>
<version>7.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.xerial</groupId>
<artifactId>sqlite-jdbc</artifactId>
<version>3.8.11.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
<compilerArguments>
<endorseddirs>${endorsed.dir}</endorseddirs>
</compilerArguments>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.3</version>
<configuration>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<phase>validate</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<outputDirectory>${endorsed.dir}</outputDirectory>
<silent>true</silent>
<artifactItems>
<artifactItem>
<groupId>javax</groupId>
<artifactId>javaee-endorsed-api</artifactId>
<version>7.0</version>
<type>jar</type>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
console output on Netbeans local installation
== context initialized ==
08:40:55.398 [http-nio-8080-exec-18] DEBUG SQLiteManager - Opened database successfully
08:40:55.406 [http-nio-8080-exec-18] DEBUG SqliteTest - TRIP NUMBER : 2
08:40:55.543 [http-nio-8080-exec-18] DEBUG SqliteTest - TRIP NUMBER INCREASED: 3
08:40:55.543 [http-nio-8080-exec-18] DEBUG SQLiteManager - Connection closed successfully
And this is the error I am getting when I build the war file and deploy it on Raspberry through Tomcat manager.
== context initialized ==
Aug 25, 2016 6:05:33 PM org.apache.catalina.core.StandardContext startInternal
SEVERE: Error listenerStart
Aug 25, 2016 6:05:33 PM org.apache.catalina.core.StandardContext startInternal
SEVERE: Context [/SqliteTest-1.0] startup failed due to previous errors
== context destroyed ==
Aug 25, 2016 6:05:33 PM org.apache.catalina.loader.WebappClassLoaderBase clearReferencesJdbc
WARNING: The web application [/SqliteTest-1.0] registered the JDBC driver [org.sqlite.JDBC] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
Aug 25, 2016 6:05:33 PM org.apache.catalina.startup.HostConfig deployWAR
INFO: Deployment of web application archive /var/lib/tomcat8/webapps/SqliteTest-1.0.war has finished in 42,719 ms
java version on Raspberry :
openjdk version "1.8.0_40-internal"
OpenJDK Runtime Environment (build 1.8.0_40-internal-b04)
OpenJDK Zero VM (build 25.40-b08, interpreted mode)
Tomcat version : Apache Tomcat/8.0.14 (Debian)
Tomcat JVM 1.8.0_40-internal-b04
