Can I have a Cucumber datatable printed in the console output in Java?

E.g., if this is my feature file:
Scenario: Some dummy scenario
Given that I do something with this datatable:
| print | this |
| and | this |
And something else
The output looks like:
Given that I do something with this datatable:
And something else
I was wondering if it is possible to have an output similar to this:
Given that I do something with this datatable:
| print | this |
| and | this |
And something else
Thank you for your help
Edit: As requested, the details of my setup are described below.
I am using Java and this is the class responsible for the configuration:
@RunWith(Cucumber.class)
@CucumberOptions(plugin = {"pretty", "html:target/cucumber"},
monochrome = false,
glue = {"my.dummy.package"},
features = {"classpath:dummy.feature"})
public class DummyFT {
@Test
public void test() throws IOException {
}
}
These tests are executed as a separate Maven goal. My profile section has:
<profile>
<id>functional-tests</id>
<build>
<plugins>
<plugin>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<testSourceDirectory>test/test-functional/java</testSourceDirectory>
<includes>
<include>**/*FT.java</include>
</includes>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
and the failsafe plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<dependencies>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-junit47</artifactId>
<version>${maven.surefire.junit47}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
And the cucumber dependencies:
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>${cucumber.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>${cucumber.version}</version>
<scope>test</scope>
</dependency>
Finally, the tests are triggered by running:
mvn failsafe:integration-test -Pfunctional-tests

I noticed that there was still no solution to this and couldn't find one anywhere else. We were able to get it to work using a custom AfterStep hook, though we're doing this with cucumber-jvm-groovy, so obviously it's a bit different. You should be able to take something like this and convert it to Java if you're determined, though.
import cucumber.api.PickleStepTestStep
import cucumber.api.TestCase
import cucumber.runner.Scenario
import gherkin.pickles.PickleCell
import gherkin.pickles.PickleRow
import gherkin.pickles.PickleTable
import groovy.util.logging.Log4j
import util.StringHelper
import java.lang.reflect.Field
@Log4j
class CucumberHelper {
static TestCase getTestCase(Scenario scenario) {
Field testCaseField = scenario.getClass().getDeclaredField("testCase")
testCaseField.setAccessible(true)
(TestCase) testCaseField.get(scenario)
}
static PickleStepTestStep getStepObject(TestCase testCase, int stepIndex) {
(PickleStepTestStep) testCase.getTestSteps()[stepIndex]
}
static printDataTables(PickleStepTestStep step, boolean stepPassed) {
if (stepContainsDataTable(step)) {
PickleTable table = getDataTable(step)
printTable(table, stepPassed)
}
}
static private boolean stepContainsDataTable(PickleStepTestStep step) {
step?.step?.arguments?.any { it instanceof PickleTable }
}
static private PickleTable getDataTable(PickleStepTestStep step) {
step.step.arguments.find { it instanceof PickleTable }
}
static private void printTable(PickleTable table, boolean stepPassed) {
List<Integer> widths = []
table.rows.each { PickleRow row ->
row.cells.eachWithIndex { PickleCell cell, int i ->
int max = widths[i] ?: 0
int cellWidth = cell.value.length()
if(cellWidth > max){
widths[i] = cellWidth
}
}
}
table.rows.each { PickleRow row ->
printRow(row, stepPassed, widths)
}
}
static private void printRow(PickleRow row, boolean stepPassed, List<Integer> columnWidths) {
String output = ' | '
row.cells.eachWithIndex { PickleCell cell, int i ->
output += cell.value.padRight(columnWidths[i]) + ' | '
}
println getConsoleColor(stepPassed) + output + StringHelper.ANSI_RESET
}
static private String getConsoleColor(boolean stepPassed) {
stepPassed ? StringHelper.ANSI_GREEN : StringHelper.ANSI_RED
}
static void logBeforeStep(PickleStepTestStep step) {
log.trace "BEFORE STEP:\n\n\t${step.stepText}\n"
}
static void logAfterStep(PickleStepTestStep step) {
log.trace "AFTER STEP:\n\n\t${step.stepText}\n"
}
}
And the hook:
AfterStep() { Scenario scenario ->
try {
TestCase testCase = CucumberHelper.getTestCase(scenario)
PickleStepTestStep step = CucumberHelper.getStepObject(testCase, scenario.stepResults.size() - 1)
CucumberHelper.logAfterStep(step)
CucumberHelper.printDataTables(step, !scenario.isFailed())
} catch (Throwable t) {
// ignore, this hook is only for logging
}
}
(Obviously there's some extra functionality in there for our use cases, but it might help others. The StringHelper just adds ANSI escape codes for coloring the output, but you could easily remove that and have it print in the standard terminal colors.)
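If converting the Groovy hook isn't worth the trouble, a simpler workaround in plain Java is to echo the table from inside the step definition itself, using only the public DataTable API. This is a minimal sketch, assuming a reasonably recent io.cucumber release of cucumber-java (package names differ in older versions) and a hypothetical DummySteps glue class:
import io.cucumber.datatable.DataTable;
import io.cucumber.java.en.Given;

public class DummySteps {

    @Given("that I do something with this datatable:")
    public void doSomethingWithTable(DataTable table) {
        printTable(table);
        // ... the actual step logic ...
    }

    // Pads each cell to its column's maximum width, mirroring the Groovy helper above.
    private static void printTable(DataTable table) {
        java.util.List<java.util.List<String>> rows = table.cells();
        int columns = rows.get(0).size();
        int[] widths = new int[columns];
        for (java.util.List<String> row : rows) {
            for (int i = 0; i < columns; i++) {
                widths[i] = Math.max(widths[i], row.get(i).length());
            }
        }
        for (java.util.List<String> row : rows) {
            StringBuilder line = new StringBuilder("      | ");
            for (int i = 0; i < columns; i++) {
                line.append(String.format("%-" + widths[i] + "s", row.get(i))).append(" | ");
            }
            System.out.println(line);
        }
    }
}
This prints the table on every run regardless of which plugins are configured, at the cost of a System.out call in the glue code.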

Related

LuckPerms non-static method getUserManager() cannot be referenced from a static context

I'm trying to get a user with a UUID, but an error pops up and says this:
non-static method getUserManager() cannot be referenced from a static context
I tried to place it in a different method and call that method, but that didn't work; whatever I do, the error still pops up. It could be because I wrote LuckPerms instead of luckPerms, but I don't think so.
And here is the code (I deleted some useless stuff).
I tried searching but couldn't find anything; I've also read the API docs about five times and it didn't help.
// BUNGEECORD
import net.md_5.bungee.api.plugin.Plugin;
// BUNGEECORD
//JAVA
import java.util.EnumSet;
//JAVA
//LuckPerms
import net.luckperms.api.LuckPermsProvider;
import net.luckperms.api.LuckPerms;
//LuckPerms
public void onPrivateMessageReceived(final PrivateMessageReceivedEvent event) {
UUID uuid = UUID.nameUUIDFromBytes(("OfflinePlayer:" + messages.get(0).getContentDisplay()).getBytes());
net.luckperms.api.model.user.User user = LuckPerms.getUserManager().getUser(uuid); // AND HERE IS THE ERROR
DataMutateResult result = user.data().add(Node.builder("group.admin").build());
};
And here is my pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>area</groupId>
<artifactId>amogus</artifactId>
<version>1.0.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<repositories>
<repository>
<id>dv8tion</id>
<name>m2-dv8tion</name>
<url>https://m2.dv8tion.net/releases</url>
</repository>
<repository>
<id>spigot-repo</id>
<url>https://hub.spigotmc.org/nexus/content/repositories/snapshots/</url>
</repository>
<repository>
<snapshots>
<enabled>false</enabled>
</snapshots>
<id>bintray-dv8fromtheworld-maven</id>
<name>bintray</name>
<url>http://dl.bintray.com/dv8fromtheworld/maven</url>
</repository>
<repository>
<id>bungeecord-repo</id>
<url>https://oss.sonatype.org/content/repositories/snapshots</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>net.dv8tion</groupId>
<artifactId>JDA</artifactId>
<version>4.3.0_277</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>2.0.0-alpha4</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>2.0.0-alpha4</version>
</dependency>
<dependency>
<groupId>net.luckperms</groupId>
<artifactId>api</artifactId>
<version>5.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.bukkit</groupId>
<artifactId>bukkit</artifactId>
<version>1.12.2-R0.1-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>net.md-5</groupId>
<artifactId>bungeecord-api</artifactId>
<version>1.16-R0.5-SNAPSHOT</version>
<type>jar</type>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>net.md-5</groupId>
<artifactId>bungeecord-api</artifactId>
<version>1.16-R0.5-SNAPSHOT</version>
<type>javadoc</type>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.1.0</version>
<configuration>
<archive>
<manifest>
<mainClass>area.amogus</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude></exclude>
</excludes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<resources>
<resource>
<directory>resources</directory>
</resource>
</resources>
</build>
</project>
Imagine we have a class:
public class MyClass {
public static void main(String[] args) {
MyClass.doThis(); // compile error: non-static method doThis() cannot be referenced from a static context
}
public void doThis(){
System.out.println("do this");
}
}
In this case, you will get exactly the same error you're getting in your code. It means we cannot access the doThis() method before an instance of MyClass is created, like:
MyClass myClass = new MyClass();
myClass.doThis(); //this is valid since we have an instance to call this method on.
I would assume you have to get an instance of a LuckPerms object in some way before calling .getUserManager() on it.
Like this:
LuckPerms luckPerms = new LuckPerms(); //assuming there's a respective constructor
luckPerms.getUserManager();
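For LuckPerms specifically, you don't construct the instance yourself: the plugin creates it, and the LuckPermsProvider the question already imports hands it out. A minimal sketch, assuming the LuckPerms plugin is installed and loaded on the server:
LuckPerms luckPerms = LuckPermsProvider.get(); // throws IllegalStateException if the API isn't loaded yet
net.luckperms.api.model.user.User user = luckPerms.getUserManager().getUser(uuid);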
You cannot use LuckPerms.getUserManager() because you have not instantiated an object from LuckPerms. Also, LuckPerms cannot be instantiated directly because it is an interface; you would have to implement the interface first and then instantiate and use it.
Your code could look like this:
public void onPrivateMessageReceived(final PrivateMessageReceivedEvent event) {
UUID uuid = UUID.nameUUIDFromBytes(("OfflinePlayer:" + messages.get(0).getContentDisplay()).getBytes());
LuckPerms luckPerms = new LuckPerms() {
@Override
public @org.checkerframework.checker.nullness.qual.NonNull String getServerName() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull UserManager getUserManager() {
return null;
}
@Override
public @org.checkerframework.checker.nullness.qual.NonNull GroupManager getGroupManager() {
return null;
}
// ... every remaining method of the LuckPerms interface overridden the same way,
// returning null (or Optional.empty() for getMessagingService()), omitted here for brevity ...
};
net.luckperms.api.model.user.User user = luckPerms.getUserManager().getUser(uuid);
DataMutateResult result = user.data().add(Node.builder("group.admin").build());
}

Unable to consume kafka messages using apache storm

I have developed an application to consume Kafka messages using Apache Storm. When I run the topology using LocalCluster in Eclipse it works fine and messages are consumed properly, but when I run it using the storm command (bin\storm jar ..\kafka-storm-0.0.1-SNAPSHOT.jar com.kafka_storm.util.Topology storm-kafka-topology), the topology starts but is unable to consume any messages. Is there something wrong I am doing, or can you guide me on what I can do to find the problem?
Topology Code
public class Topology {
public Properties configs;
public BoltBuilder boltBuilder;
public SpoutBuilder spoutBuilder;
public Topology(String configFile) throws Exception {
configs = new Properties();
InputStream is = null;
try {
is = this.getClass().getResourceAsStream("/application.properties");
configs.load(is);
//configs.load(Topology.class.getResourceAsStream("/application.properties"));
boltBuilder = new BoltBuilder(configs);
spoutBuilder = new SpoutBuilder(configs);
} catch (Exception ex) {
ex.printStackTrace();
System.exit(0);
}
}
private void submitTopology() throws Exception {
System.out.println("Entered in submitTopology");
TopologyBuilder builder = new TopologyBuilder();
KafkaSpout<?, ?> kafkaSpout = spoutBuilder.buildKafkaSpout();
SinkTypeBolt sinkTypeBolt = boltBuilder.buildSinkTypeBolt();
MongoDBBolt mongoBolt = boltBuilder.buildMongoDBBolt();
//set the kafkaSpout to topology
//parallelism-hint for kafkaSpout - defines number of executors/threads to be spawn per container
int kafkaSpoutCount = Integer.parseInt(configs.getProperty(Keys.KAFKA_SPOUT_COUNT));
builder.setSpout(configs.getProperty(Keys.KAFKA_SPOUT_ID), kafkaSpout, kafkaSpoutCount);
//set the sinktype bolt
int sinkBoltCount = Integer.parseInt(configs.getProperty(Keys.SINK_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),sinkTypeBolt,sinkBoltCount).shuffleGrouping(configs.getProperty(Keys.KAFKA_SPOUT_ID));
//set the mongodb bolt
int mongoBoltCount = Integer.parseInt(configs.getProperty(Keys.MONGO_BOLT_COUNT));
builder.setBolt(configs.getProperty(Keys.MONGO_BOLT_ID),mongoBolt,mongoBoltCount).shuffleGrouping(configs.getProperty(Keys.SINK_TYPE_BOLT_ID),Keys.MONGODB_STREAM);
String topologyName = configs.getProperty(Keys.TOPOLOGY_NAME);
Config conf = new Config();
//Defines how many worker processes have to be created for the topology in the cluster.
conf.setNumWorkers(1);
System.out.println("Submitting Topology");
//StormSubmitter.submitTopology(topologyName, conf, builder.createTopology());
System.out.println("Topology submitted");
LocalCluster cluster = new LocalCluster();
cluster.submitTopology(topologyName, conf, builder.createTopology());
}
public static void main(String[] args) throws Exception {
String configFile;
if (args.length == 0) {
System.out.println("Missing input : config file location, using default");
configFile = "application.properties";
} else{
configFile = args[0];
}
Topology ingestionTopology = new Topology(configFile);
ingestionTopology.submitTopology();
}
}
Spout Code
public class SpoutBuilder {
public Properties configs = null;
public SpoutBuilder(Properties configs) {
this.configs = configs;
}
public KafkaSpout<?, ?> buildKafkaSpout() {
String servers = configs.getProperty(Keys.KAFKA_BROKER);
String topic = configs.getProperty(Keys.KAFKA_TOPIC);
String group = configs.getProperty(Keys.KAFKA_CONSUMERGROUP);
return new KafkaSpout<>(getKafkaSpoutConfig(servers,topic,group));
}
protected KafkaSpoutConfig<String, String> getKafkaSpoutConfig(String bootstrapServers, String topic, String group) {
return KafkaSpoutConfig.builder(bootstrapServers, new String[]{topic})
.setProp(ConsumerConfig.GROUP_ID_CONFIG, group)
.setRetry(getRetryService())
.setOffsetCommitPeriodMs(10_000)
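// UNCOMMITTED_LATEST: resume from this consumer group's committed offset if one exists,
// otherwise start from the latest offset (i.e. new records only).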
.setFirstPollOffsetStrategy(FirstPollOffsetStrategy.UNCOMMITTED_LATEST)
.setMaxUncommittedOffsets(250)
.setProcessingGuarantee(ProcessingGuarantee.AT_LEAST_ONCE)
.setTupleTrackingEnforced(true)
.setEmitNullTuples(false)
.setRecordTranslator(new DefaultRecordTranslator<String, String>())
.build();
}
protected KafkaSpoutRetryService getRetryService() {
return new KafkaSpoutRetryExponentialBackoff(TimeInterval.microSeconds(500),
TimeInterval.milliSeconds(2), Integer.MAX_VALUE, TimeInterval.seconds(10));
}
}
Bolt Builder
public class BoltBuilder {
public Properties configs = null;
public BoltBuilder(Properties configs) {
this.configs = configs;
}
public SinkTypeBolt buildSinkTypeBolt() {
return new SinkTypeBolt();
}
public MongoDBBolt buildMongoDBBolt() {
String host = configs.getProperty(Keys.MONGO_HOST);
int port = Integer.parseInt(configs.getProperty(Keys.MONGO_PORT));
String db = configs.getProperty(Keys.MONGO_DATABASE);
String collection = configs.getProperty(Keys.MONGO_COLLECTION);
return new MongoDBBolt(host, port, db, collection);
}
}
SinkTypeBolt Code
public class SinkTypeBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
public void execute(Tuple tuple) {
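// With storm-kafka-client's DefaultRecordTranslator the tuple fields are
// (topic, partition, offset, key, value), so index 4 below is the record value.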
String value = tuple.getString(4);
System.out.println("Received in SinkType bolt : "+value);
if (value != null && !value.isEmpty()){
collector.emit(Keys.MONGODB_STREAM,new Values(value));
System.out.println("Emitted : "+value);
}
collector.ack(tuple);
}
public void prepare(Map conf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declareStream(Keys.MONGODB_STREAM, new Fields("content"));
}
}
MongoDB Bolt
public class MongoDBBolt extends BaseRichBolt {
private static final long serialVersionUID = 1L;
private OutputCollector collector;
private MongoDatabase mongoDB;
private MongoClient mongoClient;
private String collection;
public String host;
public int port ;
public String db;
protected MongoDBBolt(String host, int port, String db,String collection) {
this.host = host;
this.port = port;
this.db = db;
this.collection = collection;
}
public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
this.collector = collector;
this.mongoClient = new MongoClient(host,port);
this.mongoDB = mongoClient.getDatabase(db);
}
public void execute(Tuple input) {
Document mongoDoc = getMongoDocForInput(input);
try{
mongoDB.getCollection(collection).insertOne(mongoDoc);
collector.ack(input);
}catch(Exception e) {
e.printStackTrace();
collector.fail(input);
}
}
@Override
public void cleanup() {
this.mongoClient.close();
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
// TODO Auto-generated method stub
}
public Document getMongoDocForInput(Tuple input) {
Document doc = new Document();
String content = (String) input.getValueByField("content");
String[] parts = content.trim().split(" ");
System.out.println("Received in MongoDB bolt "+content);
try {
for(String part : parts) {
String[] subParts = part.split(":");
String fieldName = subParts[0];
String value = subParts[1];
doc.append(fieldName, value);
}
} catch(Exception e) {
}
return doc;
}
}
pom.xml code
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.7</maven.compiler.source>
<maven.compiler.target>1.7</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>1.2.2</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>1.1.0</version>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-kafka-client</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.0.4</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.4</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.kafka_storm.util.Topology</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4</version>
</plugin>
</plugins>
<resources>
<resource>
<directory>src/main/java</directory>
<includes>
<include> **/*.properties</include>
</includes>
</resource>
</resources>
</build>
[Storm UI screenshot]
Just to be sure: you are remembering to use the StormSubmitter line in Topology, rather than the LocalCluster, when you submit the topology with storm jar, right?
Also, please check that you've started all the right daemons, i.e. storm nimbus and storm supervisor should be running as a minimum (plus your Zookeeper install).
The next place to look would be your log files. In the Storm directory, you'll have a logs directory. Look in the logs/worker-artifacts/<your-topology-id>/<your-worker-port>/worker.log files. Those will hopefully get you on the right track to figuring out what's going on. I'd open Storm UI, find your spout, and look up which worker ports it's running on, so you can look in the right log files.
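For reference: in the topology code above, the StormSubmitter call is commented out and a LocalCluster is used instead, so storm jar would run the topology inside the client JVM rather than handing it to the cluster. A minimal sketch of the corrected submit step, reusing the same variables as the code above (and dropping the LocalCluster lines):
// Submit to the real cluster; the nimbus address comes from storm.yaml when launched via `storm jar`.
StormSubmitter.submitTopology(topologyName, conf, builder.createTopology());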

NoNodeAvailableException elasticsearch in one IntelliJ project but not in other with same programmatic and elasticsearch.yml configuration and code

I know questions related to NoNodeAvailableException have already been asked multiple times on this platform. I have followed all the solutions provided by experts previously, yet none of them work for me. The problem with the exception I get is unique.
I am facing NoNodeAvailableException in my Java client. The usual solution is to ensure consistency between the configuration set programmatically and the one present in elasticsearch.yml. I have already ensured this.
Just to be on the safe side, I created another IntelliJ project with the same Maven dependencies, the same Elasticsearch client code, and the same configuration, but I didn't get NoNodeAvailableException in it.
I could work with this new IntelliJ project by shipping code over from my previous IntelliJ project, but I would lose all the commits I previously had and the other services written in my old project.
Old client code:
public class ElasticSearchTest {
private static TransportClient client;
public static void main(String[] args) {
try {
Settings elasticsearchSettings = Settings.builder().put("cluster.name", "elasticsearch_cse").put("node.name","elasticsearch_cse_data").build();
client = new PreBuiltTransportClient(elasticsearchSettings)
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300))
.addTransportAddress(new InetSocketTransportAddress(InetAddress.getByName("localhost"), 9300));
String sampleWebsite = "www.samplewebsite.com";
List<PageModel> pages = new ArrayList<PageModel>();
pages.add(new PageModel("www.samplewebsite.com/page1", "darkknight wins a lot", "First page"));
pages.add(new PageModel("www.samplewebsite.com/page1/page2", "superman survives", "Second page"));
pages.add(new PageModel("www.samplewebsite.com/page3", "wonderwoman releases", "third page"));
DomainModel domainModel = new DomainModel(sampleWebsite, pages);
insertIndicesFromDomain(domainModel);
boolean isResultFound = false;
List<PageModel> pageModels = getSearchResults(domainModel, "superman");
for(PageModel pageModel: pageModels){
if(pageModel.getPageContent().equals("superman survives")){
isResultFound = true;
System.out.println(pageModel);
}
}
// closeService();
} catch (UnknownHostException e) {
e.printStackTrace();
}
}
public static void insertIndicesFromDomain(DomainModel domainModel){
int i = 1;
for(PageModel pageModel: domainModel.getPages()){
client.prepareIndex("websiteindex", domainModel.getDomainName(), i + "").setSource(pageModel.getPageModelInMapFormat()).execute().actionGet();
i += 1;
System.out.println(i);
}
}
public static List<PageModel> getSearchResults(DomainModel domainModel, String searchKeyWords){
SearchResponse searchResponse = client.prepareSearch("websiteindex").setTypes(domainModel.getDomainName()).setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.termQuery("content", searchKeyWords)).execute().actionGet();
SearchHit[] results = searchResponse.getHits().getHits();
List<PageModel> pageModels = new ArrayList<PageModel>();
for(SearchHit searchHit: results){
pageModels.add(new PageModel(searchHit.getSource()));
}
return pageModels;
}
public static void getIndices(){
ImmutableOpenMap<String, IndexMetaData> indicies = client.admin().cluster().prepareState().get().getState().getMetaData().getIndices();
for(ObjectObjectCursor<String, IndexMetaData> entry: indicies){
System.out.printf(entry.key + " " + entry.value);
}
}
public static void closeService(){
client.close();
}
}
StackTrace of old client code:
Exception in thread "main" NoNodeAvailableException[None of the configured nodes are available: [{#transport#-1}{H90yHKj8R_aa0MeTciqjvA}{localhost}{127.0.0.1:9300}]]
at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:347)
at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:245)
at org.elasticsearch.client.transport.TransportProxyClient.execute(TransportProxyClient.java:59)
at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:363)
at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:408)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:80)
at org.elasticsearch.action.ActionRequestBuilder.execute(ActionRequestBuilder.java:54)
at net.media.sitesearch.models.ElasticSearchTest.insertIndicesFromDomain(ElasticSearchTest.java:66)
at net.media.sitesearch.models.ElasticSearchTest.main(ElasticSearchTest.java:42)
New client code: identical to the old client code above, except the class is named Main instead of ElasticSearchTest.
Output of new client code:
2
3
4
PageModel{pageUrl='www.samplewebsite.com/page1/page2', pageContent='superman survives', pageTitle='Second page', map={title=Second page, url=www.samplewebsite.com/page1/page2, content=superman survives}}
The only difference between my old code and new code is that the old code is under a Git working tree and the new one is not.
New edit:
I found another important difference between my two projects. In my new IntelliJ project, I haven't shipped over all the other services from the old one; I just kept the necessary Elasticsearch client code. Hence, the new IntelliJ project does not contain all the dependencies of the old project, only the Elasticsearch dependency.
In my new client code, which contains no Maven dependencies except Elasticsearch, I don't get NoNodeAvailableException. Could the interaction between the Elasticsearch dependency and the other Maven APIs be an indirect reason for that exception?
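(One way to test that hypothesis: compare what Maven actually resolves in the two projects, since with the transport client, conflicting transitive versions of Netty or Lucene pulled in by other dependencies are the usual suspects. Running
mvn dependency:tree -Dverbose
in both projects and diffing the output would show any such conflict.)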
New pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sampleelasticsearch</groupId>
<artifactId>SampleES</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
</dependencies>
</project>
Old pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>net.media.siteresearch</groupId>
<artifactId>SiteSearch</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.5</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
<dependency>
<groupId>net.media.crawler</groupId>
<artifactId>MnetCrawler</artifactId>
<version>0.995</version>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.8.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.24</version>
</dependency>
<dependency>
<!-- jsoup HTML parser library # https://jsoup.org/ -->
<groupId>org.jsoup</groupId>
<artifactId>jsoup</artifactId>
<version>1.10.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.8.9</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.5.3</version>
</dependency>
</dependencies>
</project>
If readers want to check the logs from the Elasticsearch server or anything else, please ask (I have not added them because the question is already filled with a lot of other relevant snippets from my perspective, and the details would get cluttered).

Agent JAR not found or no Agent-Class attribute

// Fixed: this was not an error caused by the code. It was caused by the IDE.
I just tried to make an injection for a game called Minecraft, but I have one problem: it's not able to load the agent.
Here is the exception:
Exception in thread "main" com.sun.tools.attach.AgentLoadException: Agent JAR not found or no Agent-Class attribute
at sun.tools.attach.HotSpotVirtualMachine.loadAgent(HotSpotVirtualMachine.java:117)
at com.sun.tools.attach.VirtualMachine.loadAgent(VirtualMachine.java:540)
at pw.razex.injectionclient.Injectable.main(Injectable.java:55)
And my code:
AgentLoader [AgentClass]
public class AgentLoader {
public static void agentmain(String agent, Instrumentation instrumentation) {
try {
Class[] loadedClasses = instrumentation.getAllLoadedClasses();
File agentFile = new File(AgentLoader.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
agentFile.deleteOnExit();
for(int i = 0; i < loadedClasses.length; i++) {
Class loadedClass = loadedClasses[i];
if(loadedClass.getName().equals("net.minecraft.client.Minecraft")) {
LaunchClassLoader launchClassLoader = (LaunchClassLoader) loadedClass.getClassLoader();
launchClassLoader.addURL(AgentLoader.class.getProtectionDomain().getCodeSource().getLocation());
launchClassLoader.loadClass(MainGui.class.getName()).newInstance();
}
}
} catch (Throwable e) {
e.printStackTrace();
}
}
}
Main Class:
public class Injectable {
private static VirtualMachineDescriptor minecraftProcess;
public static void main(String[] args) throws Throwable{
OSUtil.initOS();
if(OSUtil.osType != OSUtil.OSType.WINDOWS) {
System.out.println("[X] Invalid OS [" + OSUtil.osType.name() + "]");
System.exit(-1);
}
File sourceFile = new File(Injectable.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath());
File tempAttachFile = File.createTempFile("lgt", ".dat");
Files.copy(sourceFile.toPath(), tempAttachFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
System.out.println(new File("attach.dll").exists());
if (System.getProperty("java.library.path") != null) {
System.setProperty("java.library.path", new File("attach.dll").getAbsolutePath() + System.getProperty("path.separator") + System.getProperty("java.library.path"));
} else {
System.setProperty("java.library.path", new File("attach.dll").getAbsolutePath());
}
Field field = ClassLoader.class.getDeclaredField("sys_paths");
field.setAccessible(true);
field.set(null, null);
System.loadLibrary("attach");
if(VirtualMachine.list().size() == 0) {
System.out.println("[X] No injectable process found");
System.exit(-1);
}
List<VirtualMachineDescriptor> virtualMachineDescriptors = VirtualMachine.list();
for(VirtualMachineDescriptor virtualMachineDescriptor : virtualMachineDescriptors) {
if(virtualMachineDescriptor.displayName().startsWith("net.minecraft.client.main.Main")) {
minecraftProcess = virtualMachineDescriptor;
VirtualMachine virtualMachine = VirtualMachine.attach(minecraftProcess);
virtualMachine.loadAgent(tempAttachFile.getAbsolutePath());
System.out.println("[O] Attached to minecraft");
}
}
if(minecraftProcess == null) {
System.out.println("[X] Minecraft is not started yet.");
System.exit(-1);
}
}
}
And my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>InjectionClientMaven</groupId>
<artifactId>InjectionClientMaven</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifestEntries>
<Built-By>David</Built-By>
<Main-Class>de.david.injectionclient.Injectable</Main-Class>
<Agent-Class>de.david.injectionclient.AgentLoader</Agent-Class>
<Class-Path>tools.jar</Class-Path>
<Can-Retransform-Classes>true</Can-Retransform-Classes>
</manifestEntries>
</archive>
</configuration>
</plugin>
</plugins>
</build>
</project>
Make sure your agent jar has the following in its MANIFEST.MF (Agent-Class must name the agent class itself, not a method on it):
Manifest-Version: 1.0
Agent-Class: com.yourpackage.AgentLoader
Permissions: all-permissions
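Whichever plugin ends up writing the manifest, it's worth confirming the attribute actually made it into the jar you attach (the shade plugin repackages the jar after maven-jar-plugin runs), for example:
unzip -p target/InjectionClientMaven-1.0-SNAPSHOT.jar META-INF/MANIFEST.MF
and checking that the Agent-Class line is present.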

generate javascript from java class as a maven build step

I have a Java enum that is used in my web application. I also have a lot of JavaScript code that refers to the values of the enum. It would be ideal if I could generate a JavaScript file from the enum as part of the Maven build process. Does anyone know of a project that solves this problem, or of an elegant way to tackle it?
Thanks!
It turns out that there is a great way to do it, using the GMaven (Groovy) plugin in the "prepare-package" phase.
This is the code. In your pom.xml, add this entry:
<plugin>
<groupId>org.codehaus.groovy.maven</groupId>
<artifactId>gmaven-plugin</artifactId>
<executions>
<execution>
<id>script-prepare-package1</id>
<phase>prepare-package</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>${basedir}/src/main/groovy/GenerateJavaScriptEnum.groovy</source>
</configuration>
</execution>
</executions>
</plugin>
This is what the Groovy script, GenerateJavaScriptEnum.groovy, looks like (it relies on the log and project variables that the GMaven plugin binds into the script's scope):
def fields = []
com.foo.bar.YourEnum.values().each() { f->
fields << "${f.name()} : \"${f.getId()}\""
}
if (fields) {
log.info("Generating Javascript for com.foo.bar.YourEnum")
[
new File("${project.build.directory}/${project.build.finalName}/js"),
new File("${project.basedir}/src/main/webapp/js")
].each() { baseOutputDir->
if (!baseOutputDir.exists()) {
baseOutputDir.mkdirs()
log.info("Created output dir ${baseOutputDir}")
}
def outfile = new File(baseOutputDir, "YourEnum.js")
log.info("Generating ${outfile}")
def writer = outfile.newWriter("UTF-8")
writer << "// FILE IS GENERATED FROM com.foo.bar.YourEnum.java.\n"
writer << "// DO NOT EDIT IT. CHANGES WILL BE OVERWRITTEN BY THE BUILD.\n"
writer << "YourEnum = {\n"
writer << fields.join(",\n")
writer << "\n}"
writer.close()
}
}
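For a hypothetical enum com.foo.bar.YourEnum with values FOO (getId() returning "1") and BAR (returning "2"), the generated YourEnum.js would look roughly like this:
// FILE IS GENERATED FROM com.foo.bar.YourEnum.java.
// DO NOT EDIT IT. CHANGES WILL BE OVERWRITTEN BY THE BUILD.
YourEnum = {
FOO : "1",
BAR : "2"
}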
I had the same problem and ended up creating a custom tag that lets me iterate over the enum in my JSP:
public static Enum<?>[] getValues(String klass) {
try {
Method m = Class.forName(klass).getMethod("values", (Class<?>[]) null);
Object obj = m.invoke(null, (Object[]) null);
return (Enum<?>[]) obj;
} catch (Exception ex) {
return null;
}
}
Then in my JSP I just do (note this needs to be an object literal, not an array):
var MyEnum = {
<c:forEach var="type" items="${foocustomtags:enumiter('com.foo.MyEnum')}">
'${type.value}': '${type.text}',
</c:forEach>
};
