groovy-eclipse-compiler compiles but javac compilation fails - java

My project builds successfully with groovy-eclipse-compiler, but fails without it (using plain javac). The build fails with the error below, reported in a test class while mocking an invocation:
java: reference to getFileResource is ambiguous
To debug the issue, I created a project with minimal files (given below). Although the real project also contains Groovy sources, I have not included them here to keep the example minimal.
The code is also pushed to git and is available at https://github.com/kaushalkumar/project-debug
My question: the reported issue looks legitimate, and I would expect groovy-eclipse-compiler to fail as well, yet it seems to ignore the error. I am trying to understand what makes the Groovy compiler ignore it. Is this a bug in the Groovy compiler?
src/main/java/pkg1/IStrategy.java
package pkg1;
import java.util.Map;
public interface IStrategy {
Map<String, Object> getEnvMap();
}
src/main/java/pkg1/SharedResourceHelper.java
package pkg1;
import java.io.File;
import java.io.IOException;
import java.util.Map;
public class SharedResourceHelper {
public static File getFileResource(final String resourceName, final IStrategy strategy) throws IOException {
return getFileResource(resourceName, strategy.getEnvMap());
}
public static File getFileResource(final String resourceName, final Map<String, Object> envConfig) throws IOException {
return null;
}
}
src/test/java/pkg1/StrategyTest.java
package pkg1;
import pkg1.SharedResourceHelper;
import org.easymock.EasyMock;
import org.powermock.api.easymock.PowerMock;
import org.powermock.core.classloader.annotations.PrepareForTest;
import org.junit.Test;
import org.powermock.modules.junit4.PowerMockRunner;
import org.junit.runner.RunWith;
import java.io.File;
@PrepareForTest({SharedResourceHelper.class})
@RunWith(PowerMockRunner.class)
public class StrategyTest {
@Test
@PrepareForTest({SharedResourceHelper.class})
public void testGetFileResource() throws Exception {
PowerMock.mockStatic(SharedResourceHelper.class);
EasyMock.expect(SharedResourceHelper.getFileResource(EasyMock.anyString(), EasyMock.anyObject())).andReturn(File.createTempFile("tmp", "s"));
// EasyMock.expect(SharedResourceHelper.getFileResource("test", null)).andReturn(File.createTempFile("tmp", "s"));
}
}
/pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>project.debug</groupId>
<artifactId>project</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-api-easymock</artifactId>
<version>2.0.7</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.powermock</groupId>
<artifactId>powermock-module-junit4</artifactId>
<version>2.0.7</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<compilerId>groovy-eclipse-compiler</compilerId>
<source>1.8</source>
<target>1.8</target>
</configuration>
<dependencies>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-eclipse-compiler</artifactId>
<version>2.9.2-01</version>
</dependency>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-eclipse-batch</artifactId>
<version>2.4.3-01</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
</project>
Java version - 1.8.0_231
Maven - 3.6.2
OS - Mac 10.15.6
groovy-eclipse-compiler - 2.9.2-01
groovy-eclipse-batch - 2.4.3-01

You reference "SharedResourceHelper.getFileResource(EasyMock.anyString(), EasyMock.anyObject())" is indeed ambiguous. If you add a typecast before "EasyMock.anyObject()" you could disambiguate. And EasyMock probably provides an "any" method that you can pass a type into as well.
groovy-eclipse-compiler is based upon ecj (eclipse compiler for java) and not javac, so there are bound to be differences. It may also be that ecj has a different default error/warning level for this particular case. If you feel this should be an error, you can file a JDT bug at bugs.eclipse.org.
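To make that concrete, here is a minimal sketch (not part of the original answer) of two ways the expectation could be pinned to the Map overload. The class name is hypothetical, and the anyObject(Class) matcher is assumed to be available in the EasyMock version in use:

package pkg1;

import java.io.File;
import java.util.Map;
import org.easymock.EasyMock;

public class DisambiguationSketch {
    @SuppressWarnings("unchecked")
    static void recordExpectations() throws Exception {
        // Option 1: cast the matcher so only the Map overload is applicable
        EasyMock.expect(SharedResourceHelper.getFileResource(
                EasyMock.anyString(),
                (Map<String, Object>) EasyMock.anyObject()))
            .andReturn(File.createTempFile("tmp", "s"));

        // Option 2: pass the expected type to the matcher
        EasyMock.expect(SharedResourceHelper.getFileResource(
                EasyMock.anyString(),
                EasyMock.anyObject(Map.class)))
            .andReturn(File.createTempFile("tmp", "s"));
    }
}

Either form removes the ambiguity because the IStrategy overload is no longer applicable to the second argument.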

eric-milles gave some direction for exploring this further; his input is available at https://github.com/groovy/groovy-eclipse/issues/1157.
Based on his comment, we explored the history of https://github.com/groovy/groovy-eclipse/blob/master/extras/groovy-eclipse-batch-builder/build.properties and found that the behaviour changed between 2.4.12-01 (compilation works) and 2.4.12-02 (compilation breaks, as expected), which was part of release 2.9.2.
The change happened on Aug 10, 2017 (13c1c2a#diff-c8c111c3afb6080ae6b32148caaf6a0a), with the commit message "Remove codehaus references". The jdt.patch.target was e44 (Luna) in both files.
I spent some time exploring https://github.com/eclipse/eclipse.jdt.core to figure out how the compiler behaviour could have changed, but could not find much. Although I am not certain, I suspect that the change in groovy-eclipse-batch between 2.4.12-01 and 2.4.12-02 is the cause.
Having invested this much time, I do not think it is worth debugging further to pin down the root cause, since the issue is already fixed in later versions (2.4.12-02 and beyond).

Related

Facing error "cucumber.runtime.CucumberException: No backends were found."

I am working on a Selenium with Cucumber setup, and I am facing an error when I run the TestRunner file.
cucumber.runtime.CucumberException: No backends were found. Please make sure you have a backend module on your CLASSPATH.
POM.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Automation</groupId>
<artifactId>Cucumber</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Cucumber</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-junit -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-java -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
</dependency>
</dependencies>
</project>
I have mapped the feature file to the StepDefinition file as shown below.
Feature file
Feature: Login Functionality
Scenario: Home page default login
Given User is on Home page
When User logs into application with valid credentials
Then User should be logged into application and landing page should be displayed to user
StepDefinition file
package stepDefenition;
import org.junit.runner.RunWith;
import cucumber.api.junit.Cucumber;
import cucumber.api.java.en.Given;
import cucumber.api.java.en.When;
import cucumber.api.java.en.Then;
@RunWith(Cucumber.class)
public class StepDefenition {
#Given("^User is on Home page$")
public void user_is_on_home_page() throws Throwable {
System.out.println("User is on Home Page");
}
#When("^User logs into application with valid credentials$")
public void user_logs_into_application_with_valid_credentials() throws Throwable {
System.out.println("User enters valid credentials and clicks on submit");
}
#Then("^User should be logged into application and landing page should be displayed to user$")
public void user_should_be_logged_into_application_and_landing_page_should_be_displayed_to_user() throws Throwable {
System.out.println("User logged in successfully and landing page is displayed to user with all the details");
}
}
Test Runner File:
package cucumberOption;
import org.junit.runner.RunWith;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
@RunWith(Cucumber.class)
@CucumberOptions(
features = "src/test/java/features",
glue="stepDefenition")
public class TestRunner {
}
I am using version:
Neon.3 Release (4.6.3) of Eclipse
Natural 0.7.6 Cucumber plugin
Maven version : Apache Maven 3.6.3
Java Version : java version "1.8.0_241"
(screenshot of the project structure omitted)
I have tried changing the Java and JUnit versions and adding io.cucumber dependencies; nothing worked.
I added the Java 1.8 JDK to the project build path.
I have viewed all previous threads on this kind of issue.
I do not understand why you are using the @RunWith annotation in the step definition class. Cucumber automatically takes care of running it via the Cucumber runner, so you can execute your test either from the runner class or from the feature files (at scenario level or for the entire feature file) without any @RunWith annotation on the step definitions. This is probably the cause of your error; see the exception below:
cucumber.runtime.CucumberException:
Classes annotated with @RunWith(Cucumber.class) must not define any
Step Definition or Hook methods. Their sole purpose is to serve as
an entry point for JUnit. Step Definitions and Hooks should be defined
in their own classes. This allows them to be reused across features.
Offending class: class com.abc.StepDefs.FWFeatureTestSteps (for ex).
Also, you do not need a separate junit dependency in the pom; cucumber-junit alone is enough as long as you want Cucumber to do the job.
Note: don't mix io.cucumber and info.cukes dependencies; use one or the other. Mixing both will also cause this very same exception.
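As a minimal sketch of what that means for the code in the question (reusing the poster's package and class names), the step definition class drops @RunWith entirely and keeps only the step annotations, while TestRunner stays the single entry point:

package stepDefenition;

import cucumber.api.java.en.Given;
import cucumber.api.java.en.When;
import cucumber.api.java.en.Then;

// No @RunWith here: this class only holds step definitions.
public class StepDefenition {

    @Given("^User is on Home page$")
    public void user_is_on_home_page() throws Throwable {
        System.out.println("User is on Home Page");
    }

    @When("^User logs into application with valid credentials$")
    public void user_logs_into_application_with_valid_credentials() throws Throwable {
        System.out.println("User enters valid credentials and clicks on submit");
    }

    @Then("^User should be logged into application and landing page should be displayed to user$")
    public void user_should_be_logged_into_application_and_landing_page_should_be_displayed_to_user() throws Throwable {
        System.out.println("User logged in successfully and landing page is displayed to user");
    }
}

The existing TestRunner class, annotated with @RunWith(Cucumber.class) and @CucumberOptions, remains unchanged as the only runner.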

Getting a RootDoc in jdk11

I am trying to test some code that works with Javadoc and is used under the maven-javadoc-plugin, and I am trying to get it to work under JDK 11. I am after an implementation of RootDoc which I can use when running tests.
Currently the tests use EasyDoclet which gives me a RootDoc like so:
EasyDoclet easyDoclet = new EasyDoclet(new File("dir"), "com.foo.bar");
RootDoc rootDoc = easyDoclet.getRootDoc();
However, I could not get this to work under JDK 11.
The first issue I had was that tools.jar is missing, so I changed my pom.xml to have:
<dependency>
<groupId>org.seamless</groupId>
<artifactId>seamless-javadoc</artifactId>
<version>1.1.1</version>
<exclusions>
<exclusion>
<groupId>com.sun</groupId>
<artifactId>tools</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- maybe this will get what ever was in tools.jar -->
<dependency>
<groupId>com.github.olivergondza</groupId>
<artifactId>maven-jdk-tools-wrapper</artifactId>
<version>0.1</version>
</dependency>
This led to many instances of:
java.lang.NoClassDefFoundError: com/sun/tools/javadoc/PublicMessager
The PublicMessager class seems to exist to make some constructors public; I am not sure why it exists under the com.sun.tools package. I tried to make a copy of this class:
public static class PublicMessager extends
com.sun.tools.javadoc.main.Messager {
public PublicMessager(Context context, String s) {
super(context, s);
}
public PublicMessager(Context context, String s, PrintWriter printWriter, PrintWriter printWriter1, PrintWriter printWriter2) {
super(context, s, printWriter, printWriter1, printWriter2);
}
}
And the error message changes to:
java.lang.IllegalAccessError: superclass access check failed: class com.fun.javadoc.FooBar$PublicMessager (in unnamed module @0x4abdb505) cannot access class com.sun.tools.javadoc.main.Messager (in module jdk.javadoc) because module jdk.javadoc does not export com.sun.tools.javadoc.main to unnamed module @0x4abdb50
I exposed jdk.javadoc to the unnamed module using:
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>-Dfile.encoding=UTF-8</argLine>
<argLine>--add-opens=jdk.javadoc/com.sun.tools.javadoc.main=ALL-UNNAMED</argLine>
</configuration>
</plugin>
</plugins>
</build>
This meant that my custom version of PublicMessager no longer produced the errors shown, but the version from seamless under com.sun.tools could still not be found. I made my own version of EasyDoclet which used my PublicMessager, but it turned out that the following two classes are missing:
import com.sun.tools.javadoc.JavadocTool;
import com.sun.tools.javadoc.ModifierFilter;
At this point I am not sure what to do. halp!
Perhaps an alternative would be to find the JDK 11 equivalent of RootDoc, which I think is DocletEnvironment, and then somehow get an implementation of that, but I have no idea how to obtain an implementation of DocletEnvironment.
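One possible direction (a sketch under assumptions, not a confirmed solution): on JDK 11 the documentation tool can be run programmatically with a custom doclet whose run() method simply captures the DocletEnvironment it is given. The class name, source directory, and file arguments below are hypothetical:

import java.io.File;
import java.util.Arrays;
import java.util.Collections;
import java.util.Locale;
import java.util.Set;
import javax.lang.model.SourceVersion;
import javax.tools.DocumentationTool;
import javax.tools.JavaFileObject;
import javax.tools.StandardJavaFileManager;
import javax.tools.StandardLocation;
import javax.tools.ToolProvider;
import jdk.javadoc.doclet.Doclet;
import jdk.javadoc.doclet.DocletEnvironment;
import jdk.javadoc.doclet.Reporter;

// Hypothetical helper: runs the system documentation tool and keeps the DocletEnvironment.
public class CapturingDoclet implements Doclet {

    static DocletEnvironment captured; // populated when the tool invokes run()

    @Override public void init(Locale locale, Reporter reporter) { }
    @Override public String getName() { return "CapturingDoclet"; }
    @Override public Set<? extends Option> getSupportedOptions() { return Collections.emptySet(); }
    @Override public SourceVersion getSupportedSourceVersion() { return SourceVersion.latest(); }

    @Override
    public boolean run(DocletEnvironment environment) {
        captured = environment; // hand the environment back to the test
        return true;
    }

    public static DocletEnvironment capture(File sourceDir, File... sources) throws Exception {
        DocumentationTool tool = ToolProvider.getSystemDocumentationTool();
        try (StandardJavaFileManager fm = tool.getStandardFileManager(null, null, null)) {
            fm.setLocation(StandardLocation.SOURCE_PATH, Arrays.asList(sourceDir));
            Iterable<? extends JavaFileObject> units = fm.getJavaFileObjects(sources);
            DocumentationTool.DocumentationTask task =
                    tool.getTask(null, fm, null, CapturingDoclet.class, null, units);
            task.call();
            return captured;
        }
    }
}

A test could then call something like CapturingDoclet.capture(new File("dir"), new File("dir/com/foo/bar/SomeClass.java")) (paths are placeholders) and work against the returned DocletEnvironment instead of a RootDoc.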

Java hadoop api YarnClient doesn't have "init()/start()" function?

I tried the Maven dependencies like this:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-yarn-common</artifactId>
<version>2.7.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-api -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-yarn-api</artifactId>
<version>2.7.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-yarn-client -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-yarn-client</artifactId>
<version>2.7.2</version>
</dependency>
Then my java code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.ApplicationConstants;
import org.apache.hadoop.yarn.api.records.*;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.client.api.YarnClientApplication;
import org.apache.hadoop.yarn.conf.YarnConfiguration;
import org.apache.hadoop.yarn.exceptions.YarnException;
public class YarnClientDemo { // enclosing class added so the snippet compiles; the name is arbitrary
    public static void main(String[] args) {
        YarnConfiguration yarnConfiguration = new YarnConfiguration();
        YarnClient yarnClient = YarnClient.createYarnClient();
        yarnClient.init(yarnConfiguration);
        yarnClient.start();
    }
}
The IntelliJ IDE shows "Cannot resolve method 'init'" and "Cannot resolve method 'start'".
I then tried jar version 3.1.1 instead of 2.7.2, with the same result. So what is wrong with my code, and how do I fix it?
The init and start methods are inherited from the AbstractService class, so you need to verify that YarnClient and AbstractService come from the same version.
Go to YarnClient and check which jar it refers to, then click through to the AbstractService parent of YarnClient and check its version.
Change the YarnClient version according to your AbstractService version.
I had the same issue, and this worked for me with version 2.6.5.
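To check this outside the IDE, a small sketch like the following (not from the answer; the class name is arbitrary) prints which jars the two classes are actually loaded from, so a version mismatch between the YARN client and hadoop-common becomes visible:

import org.apache.hadoop.service.AbstractService;
import org.apache.hadoop.yarn.client.api.YarnClient;

public class HadoopClasspathCheck {
    public static void main(String[] args) {
        // Print the jar each class was loaded from; the Hadoop versions should match.
        System.out.println("YarnClient from:      "
                + YarnClient.class.getProtectionDomain().getCodeSource().getLocation());
        System.out.println("AbstractService from: "
                + AbstractService.class.getProtectionDomain().getCodeSource().getLocation());
    }
}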

Standalone Java Websocket client NoClassDefFoundError by ContainerProvider

I'm new to Java, but I have to use it to do a small WebSocket related project.
So, I installed JDK 1.8.0 and NetBeans 8.1 on my CentOS 7 in a VirtualBox.
I added the tyrus-standalone-client-jdk 1.12 dependency to the pom.xml to build the standalone WebSocket client, and it built fine. However, I ran into the error below:
[root@cet7 ~]# java -jar "/root/NetBeansProjects/Switchclient/target/Switchclient-1.0-SNAPSHOT.jar"
Exception in thread "main" java.lang.NoClassDefFoundError: javax/websocket/ContainerProvider
at org.sample.switchclient.Switchclient.main(Switchclient.java:21)
Caused by: java.lang.ClassNotFoundException: javax.websocket.ContainerProvider
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more
[root@cet7 ~]# java -version
java version "1.8.0_65"
Java(TM) SE Runtime Environment (build 1.8.0_65-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.65-b01, mixed mode)
I did a bit more searching and found that, for the ServiceLoader API, the "fully qualified classname of the container implementation of ContainerProvider must be listed in the META-INF/services/javax.websocket.ContainerProvider file in the implementation JAR file" according to the Oracle documentation. So I added the serviceloader-maven-plugin to the pom.xml. It did generate the META-INF/services/javax.websocket.ContainerProvider file, but without any content, and the runtime error persisted. I tried to add the content below manually and re-pack the JAR, but it did not work:
org.glassfish.tyrus.container.inmemory.InMemoryContainerProvider
org.glassfish.tyrus.client.ClientManager
I've attached the Java file and the pom.xml. I've worked on this for hours and don't have a clue what the issue is, so any response to this thread will be appreciated.
Thank you very much.
===========LIST1: pom.xml===========
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.sample</groupId>
<artifactId>Switchclient</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>org.sample.switchclient.Switchclient</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>eu.somatik.serviceloader-maven-plugin</groupId>
<artifactId>serviceloader-maven-plugin</artifactId>
<version>1.0.6</version>
<configuration>
<services>
<param>javax.websocket.ContainerProvider</param>
</services>
</configuration>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.glassfish.tyrus.bundles</groupId>
<artifactId>tyrus-standalone-client-jdk</artifactId>
<version>1.12</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
===========LIST2: Switchclient.java===========
package org.sample.switchclient;
import java.net.URI;
import javax.websocket.ClientEndpoint;
import javax.websocket.ContainerProvider;
import javax.websocket.OnMessage;
import javax.websocket.Session;
import javax.websocket.WebSocketContainer;
@ClientEndpoint
public class Switchclient {
@OnMessage
public void onRemoteMessage (String message) {
System.out.println("Received msg: "+message);
}
public static void main(String[] args) {
WebSocketContainer container = null;
Session session = null;
try{
container = ContainerProvider.getWebSocketContainer();
session = container.connectToServer (Switchclient.class, URI.create("ws://localhost:8080/Switchserver/"));
}catch (Exception e) {
e.printStackTrace();
}
}
}
Basically, Tyrus requires Java EE. That is the reason you have to list a lot of dependencies in pom.xml. If you use Java SE and want to keep your project small, use a different WebSocket client library that depends only on Java SE, for example nv-websocket-client (mine).
Just add the following dependency to pom.xml,
<dependency>
<groupId>com.neovisionaries</groupId>
<artifactId>nv-websocket-client</artifactId>
<version>1.13</version>
</dependency>
then try:
import com.neovisionaries.ws.client.*;
public class Switchclient
{
public static void main(String[] args) throws Exception
{
WebSocket websocket = new WebSocketFactory()
.createSocket("ws://localhost:8080/Switchserver/")
.addListener(new WebSocketAdapter() {
@Override
public void onTextMessage(WebSocket ws, String message) {
System.out.println("Received msg: " + message);
}
})
.connect();
// Don't forget to call disconnect() after use.
// websocket.disconnect();
}
}
I'm not sure what exactly caused the problem, since I kept trying and problems kept jumping out during the past day, but finally here it is:
Client dependencies:
<dependency>
<groupId>javax.websocket</groupId>
<artifactId>javax.websocket-client-api</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>org.glassfish.tyrus.bundles</groupId>
<artifactId>tyrus-standalone-client</artifactId>
<version>1.12</version>
</dependency>
<dependency>
<groupId>org.glassfish.tyrus</groupId>
<artifactId>tyrus-container-grizzly-client</artifactId>
<version>1.12</version>
</dependency>
At first glance it seems javax.websocket-client-api alone should be enough, but in the end the ContainerProvider implementation could not be found without the Tyrus artifacts.
Then everything built OK (with different Java code from my original post; I experimented a lot with the sources, but the code itself does not matter much here, the environment setup does. It is mostly based on the examples in the Tyrus 1.9 user guide, however.)
Running from NetBeans via Maven was OK, but when I used "java -jar Switchclient.jar", the same/similar problem appeared, this time complaining about "Endpoint".
Finally (as a last try) I copied all the jar files referenced by the classpath (which was generated by maven-jar-plugin through "<addClasspath>true</addClasspath>") into one directory, copied the generated jar file in as well, and then it worked:
[root@cet7 neededjars]# ls
grizzly-framework-2.3.22.jar tyrus-client-1.12.jar
grizzly-http-2.3.22.jar tyrus-container-grizzly-client-1.12.jar
grizzly-http-server-2.3.22.jar tyrus-core-1.12.jar
javax.websocket-api-1.1.jar tyrus-spi-1.12.jar
javax.websocket-client-api-1.1.jar tyrus-standalone-client-1.12.jar
Switchclient-1.1-SNAPSHOT.jar
[root@cet7 neededjars]# java -jar Switchclient-1.1-SNAPSHOT.jar
Received message: Hello world
That's it: dirty, but it worked, and I'm at a new start. Again, I'm really new to Java (one of those non-hard-tech guys who just picks it up when needed); this showed me the complexity of community-based development, especially when the technology is relatively new. Dependencies and pitfalls everywhere. That's part of its nature, I guess...

How can I connect Storm and D3.js using Redis and Flask?

I have my Storm testing topology done. Earlier I created a d3 script in an HTML page that read the data from a text file. I now want it to read the data directly from the Storm topology (a bolt maybe?), but I have no clue how to do it. I'm using the Hortonworks Sandbox for testing. Any help would be appreciated.
Thanks in advance!
I've found a Storm package for Redis that I'm trying to use now. It lets you set up a bolt that writes to Redis, and I've already set up the node. My problem now is that Eclipse can't resolve the imports in the Java code or the entries in the pom.xml, even though I've downloaded the package. My current Java bolt and imports are:
package Storm.practice.Storm.Prova;
import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.StormSubmitter;
import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.testing.TestWordSpout;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import backtype.storm.utils.Utils;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.topology.base.BaseRichSpout;
import java.util.Map;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;
import storm.external.*;// error from here
import storm.external.storm-redis.org.apache.storm.redis.common.config.JedisClusterConfig;
import org.apache.storm.redis.common.config.JedisPoolConfig;
import org.apache.storm.redis.common.mapper.RedisDataTypeDescription;
import org.apache.storm.redis.common.mapper.RedisStoreMapper;
import redis.clients.jedis.JedisCommands;//to here
..........
class MortsStoreMapper implements RedisStoreMapper {
private RedisDataTypeDescription description;
private final String hashKey = "wordCount";
public MortsStoreMapper() {
description = new RedisDataTypeDescription(
RedisDataTypeDescription.RedisDataType.HASH, hashKey);
}
@Override
public RedisDataTypeDescription getDataTypeDescription() {
return description;
}
@Override
public String getKeyFromTuple(ITuple tuple) {
return tuple.getStringByField("word");
}
@Override
public String getValueFromTuple(ITuple tuple) {
return tuple.getStringByField("count");
}
}
And my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Storm.practice</groupId>
<artifactId>Storm.Prova</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Storm.Prova</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.storm</groupId>
<artifactId>storm-core</artifactId>
<version>0.9.1-incubating</version>
</dependency>
<dependency> <!-- error from here... -->
<groupId>org.apache.storm</groupId>
<artifactId>storm-redis</artifactId>
<version>{0.9.1-incubating}</version>
<type>jar</type>
</dependency> <!-- ... to here -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
</dependencies> <build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>Storm.practice.Storm.Prova.ProvaTopology</mainClass>
</configuration>
</plugin>
</plugins>
</build>
</project>
The errors are that Eclipse can't find the dependencies and the packages.
Based on your scenario, I think you will need some system or code in the middle that will read data from Storm and push to D3. You can try out something like WSO2 CEP [1], which has the ability to connect to Storm and uses websockets to push events to a dashboard based on d3 [2].
In your scenario, you can map your logic in the Storm bolt to a Siddhi query [3] and then get those events from Storm to WSO2 CEP. Then you can create a websocket publisher to send events to your D3 code using the built-in websocket capabilities of the server.
Please note that this is one of the possible solutions based on your requirements and you might be better off utilizing the capabilities of an already existing CEP system that has integration to Storm and D3.
Hope this helps!
[1] http://wso2.com/products/complex-event-processor/
[2] https://docs.wso2.com/display/CEP400/Visualizing+Results+in+the+Analytics+Dashboard
[3] https://docs.wso2.com/display/CEP400/Sample+0501+-+Processing+a+Simple+Filter+Query+with+Apache+Storm+Deployment
I know it's a bit late, almost a year, but I was reviewing my account and saw this question.
I finally used Redis, interfaced with Jedis, which I import as a Maven artifact. Once this was working and I was able to see the results with the Redis Monitor via telnet, I created a simple Node.js script, launched it, and the data was arriving at the client, and hence at d3. I needed Socket.io and Redis.js to achieve this, but it is working now.
If someone needs details, please ask me and I will happily help.
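As a rough illustration of that setup (not the poster's actual code; the field names, Redis key, and channel name are assumptions), a bolt can push each tuple into Redis with Jedis and publish it on a channel that a Node.js/Socket.io bridge then forwards to d3:

import java.util.Map;
import backtype.storm.task.OutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseRichBolt;
import backtype.storm.tuple.Tuple;
import redis.clients.jedis.Jedis;

// Hypothetical bolt: stores word counts in a Redis hash and publishes updates
// on a channel so a Node.js + Socket.io bridge can forward them to d3.
public class RedisPublisherBolt extends BaseRichBolt {
    private OutputCollector collector;
    private transient Jedis jedis;

    @Override
    public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
        this.jedis = new Jedis("localhost", 6379); // assumed Redis host and port
    }

    @Override
    public void execute(Tuple tuple) {
        String word = tuple.getStringByField("word");   // assumed field names
        String count = tuple.getStringByField("count");
        jedis.hset("wordCount", word, count);           // keep the latest count
        jedis.publish("wordCountUpdates", word + ":" + count); // notify subscribers
        collector.ack(tuple);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // terminal bolt: nothing emitted downstream
    }
}

On the Node.js side, the bridge subscribes to the same channel and emits each message over Socket.io to the d3 page.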
