I have been testing remote submission of Storm topologies from an IDE (Eclipse).
I succeeded in uploading a simple topology to a remote Storm cluster, but the odd thing is that when I checked the Storm UI to verify that the remotely submitted topology was working, I saw only the _acker bolt; the other bolts and the spout were missing. After that I submitted the topology manually from the command line, checked the Storm UI again, and it worked as expected. I have been looking for the reason but couldn't find it. I have attached the topology, the remote submitter class, and the corresponding Storm UI screenshots below:
This is the output from the Eclipse console (after remote submission):
225 [main] INFO backtype.storm.StormSubmitter - Uploading topology jar T:\STORM_TOPOLOGIES\Benchmark.jar to assigned location: /app/storm/nimbus/inbox/stormjar-d3ca2e14-c1d4-45e1-b21c-70f62c62cd84.jar
234 [main] INFO backtype.storm.StormSubmitter - Successfully uploaded topology jar to assigned location: /app/storm/nimbus/inbox/stormjar-d3ca2e14-c1d4-45e1-b21c-70f62c62cd84.jar
Here is the topology:
public class StormBenchmark {
// ******************************************************************************************
public static class GenSpout extends BaseRichSpout {
//private static final Logger logger = Logger.getLogger(StormBenchmark.class.getName());
private Long count = 1L;
private Object msgID;
private static final long serialVersionUID = 1L;
private static final Character[] CHARS = new Character[] { 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'};
private static final String[] newsagencies = {"bbc", "cnn", "reuters", "aljazeera", "nytimes", "nbc news", "fox news", "interfax"};
SpoutOutputCollector _collector;
int _size;
Random _rand;
String _id;
String _val;
// Constructor
public GenSpout(int size) {
_size = size;
}
public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
_collector = collector;
_rand = new Random();
_id = randString(5);
_val = randString2(_size);
}
//Business logic
public void nextTuple() {
count++;
msgID = count;
_collector.emit(new Values(_id, _val), msgID);
}
public void ack(Object msgID) {
this.msgID = msgID;
}
private String randString(int size) {
StringBuffer buf = new StringBuffer();
for(int i=0; i<size; i++) {
buf.append(CHARS[_rand.nextInt(CHARS.length)]);
}
return buf.toString();
}
private String randString2(int size) {
StringBuffer buf = new StringBuffer();
for(int i=0; i<size; i++) {
buf.append(newsagencies[_rand.nextInt(newsagencies.length)]);
}
return buf.toString();
}
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("id", "item"));
}
}
// =======================================================================================================
// =================================== B O L T ===========================================================
public static class IdentityBolt extends BaseBasicBolt {
private static final long serialVersionUID = 1L;
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("id", "item"));
}
public void execute(Tuple tuple, BasicOutputCollector collector) {
String character = tuple.getString(0);
String agency = tuple.getString(1);
List<String> box = new ArrayList<String>();
box.add(character);
box.add(agency);
try {
fileWriter(box);
} catch (IOException e) {
e.printStackTrace();
}
box.clear();
}
public void fileWriter(List<String> listjon) throws IOException {
String pathname = "/home/hduser/logOfStormTops/logs.txt";
File file = new File(pathname);
if (!file.exists()){
file.createNewFile();
}
BufferedWriter writer = new BufferedWriter(new FileWriter(file, true));
writer.write(listjon.get(0) + " : " + listjon.get(1));
writer.newLine();
writer.flush();
writer.close();
}
}
//storm jar storm-benchmark-0.0.1-SNAPSHOT-standalone.jar storm.benchmark.ThroughputTest demo 100 8 8 8 10000
public static void main(String[] args) throws Exception {
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("spout", new GenSpout(8), 2).setNumTasks(4);
builder.setBolt("bolt", new IdentityBolt(), 4).setNumTasks(8)
.shuffleGrouping("spout");
Config conf = new Config();
conf.setMaxSpoutPending(200);
conf.setStatsSampleRate(0.0001);
//topology.executor.receive.buffer.size: 8192 #batched
//topology.executor.send.buffer.size: 8192 #individual messages
//topology.transfer.buffer.size: 1024 # batched
conf.put("topology.executor.send.buffer.size", 1024);
conf.put("topology.transfer.buffer.size", 8);
conf.put("topology.receiver.buffer.size", 8);
conf.put(Config.TOPOLOGY_WORKER_CHILDOPTS, "-Xdebug -Xrunjdwp:transport=dt_socket,address=1%ID%,server=y,suspend=n");
StormSubmitter.submitTopology("SampleTop", conf, builder.createTopology());
}
}
And here is the RemoteSubmitter class:
public class RemoteSubmissionTopo {
@SuppressWarnings({ "unchecked", "rawtypes", "unused" })
public static void main(String... args) {
Config conf = new Config();
TopologyBuilder topoBuilder = new TopologyBuilder();
conf.put(Config.NIMBUS_HOST, "117.16.142.49");
conf.setDebug(true);
Map stormConf = Utils.readStormConfig();
stormConf.put("nimbus.host", "117.16.142.49");
String jar_path = "T:\\STORM_TOPOLOGIES\\Benchmark.jar";
Client client = NimbusClient.getConfiguredClient(stormConf).getClient();
try {
NimbusClient nimbus = new NimbusClient(stormConf, "117.16.142.49", 6627);
String uploadedJarLocation = StormSubmitter.submitJar(stormConf, jar_path);
String jsonConf = JSONValue.toJSONString(stormConf);
nimbus.getClient().submitTopology("benchmark-tp", uploadedJarLocation, jsonConf, topoBuilder.createTopology());
} catch (TTransportException e) {
e.printStackTrace();
} catch (AlreadyAliveException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (InvalidTopologyException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
try {
Thread.sleep(6000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
And here is the Storm UI picture (remote submission):
And here is the other Storm UI picture (manual submission):
In RemoteSubmissionTopo you use TopologyBuilder topoBuilder = new TopologyBuilder(); but never call setSpout(...)/setBolt(...). Thus, you are submitting a topology with no operators...
Btw: RemoteSubmissionTopo is actually not required at all. You can use StormBenchmark to submit remotely. Just add conf.put(Config.NIMBUS_HOST, "117.16.142.49"); in main, set the JVM option -Dstorm.jar=/path/to/topology.jar, and you are good to go.
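A minimal sketch of that suggestion, reusing the Nimbus host and jar path from the question (untested; it needs a packaged topology jar and a reachable cluster):

```java
// Inside StormBenchmark.main, alongside the existing configuration:
Config conf = new Config();
conf.put(Config.NIMBUS_HOST, "117.16.142.49");
// Launch the JVM with the packaged jar set on the storm.jar property, e.g.:
//   java -Dstorm.jar=T:\STORM_TOPOLOGIES\Benchmark.jar ... StormBenchmark
StormSubmitter.submitTopology("SampleTop", conf, builder.createTopology());
```

With storm.jar pointing at the jar, StormSubmitter uploads it to Nimbus just like the command-line client would.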
Related
I have to write a simple "Word Count" topology in Java and Storm. In particular, I have an external data source generating CSV (comma-separated) strings like
Daniel, 0.5654, 144543, user, 899898, Comment,,,
These strings are inserted into a RabbitMQ queue called "input". This data source works well, and I can see the strings in the queue.
Now, I modified the classic topology by adding the RabbitMQSpout. The goal is to do a word count on the first field of every CSV line and publish the results to a new queue called "output". The problem is that I cannot see any tuples inside the new queue, even though the topology was submitted and is RUNNING.
So, summing up:
external data source puts items into the input queue
RabbitMQSpout takes items from input queue and insert them into topology
classic word-count topology is performed
last bolt puts results into output queue
Problem:
I can see items inside the input queue, but nothing in the output queue, even though I use the same method to send items to a queue in the external data source (where it works) and in the RabbitMQExporter (where it does not...)
Some code below
RabbitMQSpout
public class RabbitMQSpout extends BaseRichSpout {
public static final String DATA = "data";
private SpoutOutputCollector _collector;
private RabbitMQManager rabbitMQManager;
@Override
public void open(Map map, TopologyContext topologyContext, SpoutOutputCollector spoutOutputCollector) {
_collector = spoutOutputCollector;
rabbitMQManager = new RabbitMQManager("localhost", "rabbitmq", "rabbitmq", "test");
}
@Override
public void nextTuple() {
Utils.sleep(1000);
String data = rabbitMQManager.receive("input");
if (data != null) {
_collector.emit(new Values(data));
}
}
@Override
public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
outputFieldsDeclarer.declare(new Fields(DATA));
}
}
SplitBolt
public class SplitSentenceBolt extends BaseRichBolt {
private OutputCollector _collector;
private Pattern SPACE;
public SplitSentenceBolt() { }
@Override
public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
this._collector = collector;
this.SPACE = Pattern.compile(",");
}
@Override
public void execute(Tuple input) {
String sentence = input.getStringByField(RabbitMQSpout.DATA);
String[] words = sentence.split(",");
if (words.length > 0)
_collector.emit(new Values(words[0]));
}
@Override
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("word"));
}
@Override
public Map<String, Object> getComponentConfiguration() {
return null;
}
}
WordCountBolt
public class WordCountBolt extends BaseBasicBolt {
Map<String, Integer> counts = new HashMap<String, Integer>();
@Override
public void execute(Tuple tuple, BasicOutputCollector collector) {
String word = tuple.getString(0);
Integer count = counts.get(word);
if (count == null)
count = 0;
count++;
counts.put(word, count);
System.out.println(count);
collector.emit(new Values(word, count));
}
@Override
public void declareOutputFields(OutputFieldsDeclarer declarer) {
declarer.declare(new Fields("word", "count"));
}
}
RabbitMQExporterBolt
public class RabbitMQExporterBolt extends BaseRichBolt {
private String rabbitMqHost;
private String rabbitMqUsername;
private String rabbitMqPassword;
private String defaultQueue;
private OutputCollector collector;
private RabbitMQManager rabbitmq;
public RabbitMQExporterBolt(String rabbitMqHost, String rabbitMqUsername, String rabbitMqPassword,
String defaultQueue) {
super();
this.rabbitMqHost = rabbitMqHost;
this.rabbitMqUsername = rabbitMqUsername;
this.rabbitMqPassword = rabbitMqPassword;
this.defaultQueue = defaultQueue;
}
@Override
public void prepare(@SuppressWarnings("rawtypes") Map map, TopologyContext topologyContext, OutputCollector outputCollector) {
this.collector=outputCollector;
this.rabbitmq = new RabbitMQManager(rabbitMqHost, rabbitMqUsername, rabbitMqPassword, defaultQueue);
}
@Override
public void execute(Tuple tuple) {
String word = tuple.getString(0);
Integer count = tuple.getInteger(1);
String output = word + " " + count;
rabbitmq.send(output);
}
@Override
public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
outputFieldsDeclarer.declare(new Fields("word"));
}
}
Topology
public class WordCountTopology {
private static final String RABBITMQ_HOST = "rabbitmq";
private static final String RABBITMQ_USER = "rabbitmq";
private static final String RABBITMQ_PASS = "rabbitmq";
private static final String RABBITMQ_QUEUE = "output";
public static void main(String[] args) throws Exception {
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout("spout", new RabbitMQSpout(), 1);
builder.setBolt("split", new SplitSentenceBolt(), 1)
.shuffleGrouping("spout");
builder.setBolt("count", new WordCountBolt(), 1)
.fieldsGrouping("split", new Fields("word"));
Config conf = new Config();
conf.setDebug(true);
if (args != null && args.length > 0) {
builder.setBolt("exporter",
new RabbitMQExporterBolt(
RABBITMQ_HOST, RABBITMQ_USER,
RABBITMQ_PASS, RABBITMQ_QUEUE ),
1)
.shuffleGrouping("count");
conf.setNumWorkers(3);
StormSubmitter.submitTopologyWithProgressBar(args[0], conf, builder.createTopology());
} else {
conf.setMaxTaskParallelism(3);
LocalCluster cluster = new LocalCluster();
cluster.submitTopology("word-count", conf, builder.createTopology());
Thread.sleep(10000);
cluster.shutdown();
}
}
}
RabbitMQManager
public class RabbitMQManager {
private String host;
private String username;
private String password;
private ConnectionFactory factory;
private Connection connection;
private String defaultQueue;
public RabbitMQManager(String host, String username, String password, String queue) {
super();
this.host = host;
this.username = username;
this.password = password;
this.factory = null;
this.connection = null;
this.defaultQueue = queue;
this.initialize();
this.initializeQueue(defaultQueue);
}
private void initializeQueue(String queue){
ConnectionFactory factory = new ConnectionFactory();
factory.setHost(host);
factory.setUsername(username);
factory.setPassword(password);
Connection connection;
try {
connection = factory.newConnection();
Channel channel = connection.createChannel();
boolean durable = false;
boolean exclusive = false;
boolean autoDelete = false;
channel.queueDeclare(queue, durable, exclusive, autoDelete, null);
channel.close();
connection.close();
} catch (IOException | TimeoutException e) {
e.printStackTrace();
}
}
private void initialize(){
factory = new ConnectionFactory();
factory.setHost(host);
factory.setUsername(username);
factory.setPassword(password);
try {
connection = factory.newConnection();
} catch (IOException | TimeoutException e) {
e.printStackTrace();
}
}
public void terminate(){
if (connection != null && connection.isOpen()){
try {
connection.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
private boolean reopenConnectionIfNeeded(){
try {
if (connection == null){
connection = factory.newConnection();
return true;
}
if (!connection.isOpen()){
connection = factory.newConnection();
}
} catch (IOException | TimeoutException e) {
e.printStackTrace();
return false;
}
return true;
}
public boolean send(String message){
return this.send(defaultQueue, message);
}
public boolean send(String queue, String message){
try {
reopenConnectionIfNeeded();
Channel channel = connection.createChannel();
channel.basicPublish("", queue, null, message.getBytes());
channel.close();
return true;
} catch (IOException | TimeoutException e) {
e.printStackTrace();
}
return false;
}
public String receive(String queue) {
try {
reopenConnectionIfNeeded();
Channel channel = connection.createChannel();
Consumer consumer = new DefaultConsumer(channel);
return channel.basicConsume(queue, true, consumer);
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
}
I have a class Logger that uses three arrays as shared variables.
The arrays are initialized in the constructor,
but when I access them in any other method of the class, I get a
NullPointerException.
I need to know the reason and the solution.
Please see the comments in the code.
file Logger.java
package logger_010.standard;
import java.io.FileOutputStream;
import java.io.PrintStream;
public class Logger {
// declaration
private FileOutputStream[] files;
private PrintStream[] pss;
private String[] messages;
public Logger() {
// initialisation
try {
FileOutputStream[] files = {
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger0.log"),
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger1.log"),
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger2.log"),
};
PrintStream[] pss = {
new PrintStream(files[0]),
new PrintStream(files[1]),
new PrintStream(files[2]),
};
String[] messages = {
new String ("Write error message to log file 0"),
new String ("Write error message to log file 1 + user"),
new String ("Write error message to log file 2 + user+ email"),
};
// Array instantiation is OK
System.out.println(files[0].toString());
System.out.println(files[1].toString());
System.out.println(files[2].toString());
System.out.println(pss[0].toString());
System.out.println(pss[1].toString());
System.out.println(pss[2].toString());
System.out.println(messages[0].toString());
System.out.println(messages[1].toString());
System.out.println(messages[2].toString());
System.out.println("++++++++++++");
} catch (Exception e) {
System.out.println("Exception " + e.getMessage());
} finally {
}
}
public void LogMessage(int level) {
// Here I get a Null pointer exception
System.out.println(files[0].toString());
System.out.println(files[1].toString());
System.out.println(files[2].toString());
System.out.println(pss[0].toString());
System.out.println(pss[1].toString());
System.out.println(pss[2].toString());
System.out.println(messages[0].toString());
System.out.println(messages[1].toString());
System.out.println(messages[2].toString());
System.out.println("++++++++++++");
// PrintStream[] files = OpenFiles();
WriteLogMessage(this.getPss(), level);
CloseFiles(pss);
}
private void CloseFiles(PrintStream[] pss2) {
// TODO Auto-generated method stub
}
private PrintStream[] OpenFiles() {
// TODO Auto-generated method stub
return null;
}
private void WriteLogMessage(PrintStream[] files, int level) {
this.getPss()[level].println(messages[level]);
this.getPss()[level].flush();
}
public FileOutputStream[] getFiles() {
return files;
}
public void setFiles(FileOutputStream[] files) {
this.files = files;
}
public PrintStream[] getPss() {
return pss;
}
public void setPss(PrintStream[] pss) {
this.pss = pss;
}
public String[] getMessages() {
return messages;
}
public void setMessages(String[] messages) {
this.messages = messages;
}
}
This is the file containing the main function:
package logger_010.standard;
public class Start {
public static void main(String[] args) {
// TODO Auto-generated method stub
Logger l = new Logger();
for (int i = 0; i < 15; i++) {
int level = i % 2;
l.LogMessage(level);
}
}
}
You declare new files, messages, and pss variables inside the constructor instead of using the fields already declared in the class. So when the LogMessage method runs, it uses the uninitialized fields, which causes the error.
You never actually bind your class attributes to the objects you create in your constructor.
By writing FileOutputStream[] files = ... instead of files = ... (which refers to your object attribute), you are just creating a local variable whose scope is limited to the constructor.
Your constructor should be:
public Logger() {
// initialisation
try {
files = new FileOutputStream[] {
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger0.log"),
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger1.log"),
new FileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger2.log"),
};
pss = new PrintStream[] {
new PrintStream(files[0]),
new PrintStream(files[1]),
new PrintStream(files[2]),
};
messages = new String[] {
new String ("Write error message to log file 0"),
new String ("Write error message to log file 1 + user"),
new String ("Write error message to log file 2 + user+ email"),
};
// Array instantiation is OK
System.out.println(files[0].toString());
System.out.println(files[1].toString());
System.out.println(files[2].toString());
System.out.println(pss[0].toString());
System.out.println(pss[1].toString());
System.out.println(pss[2].toString());
System.out.println(messages[0].toString());
System.out.println(messages[1].toString());
System.out.println(messages[2].toString());
System.out.println("++++++++++++");
} catch (Exception e) {
System.out.println("Exception " + e.getMessage());
} finally {
}
}
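The shadowing issue itself can be reproduced in a few lines. The following is a hypothetical minimal example (class and field names invented for illustration): the constructor's local declaration hides the field, so the field stays null.

```java
public class ShadowDemo {
    private String[] messages; // field: never assigned, stays null

    public ShadowDemo() {
        // This declares a NEW local variable that shadows the field above;
        // the field itself is still null after the constructor returns.
        String[] messages = { "log file 0", "log file 1" };
        System.out.println(messages.length); // the local is fine here: prints 2
    }

    public boolean fieldIsNull() {
        return messages == null; // true: this reads the field, not the local
    }

    public static void main(String[] args) {
        System.out.println(new ShadowDemo().fieldIsNull()); // prints true
    }
}
```

Dropping the type from the assignment (`messages = ...`) is exactly what makes it target the field instead of a new local.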
If I initialize the FileOutputStream array in a field initializer (with a subclass of FileOutputStream whose constructor throws FileNotFoundException), I get this compilation error:
"Default constructor cannot handle exception type FileNotFoundException thrown by implicit super constructor. Must define an explicit constructor".
I finally solved the problem by using a method (makeFileOutputStream) that calls the FileOutputStream constructor in a try/catch block.
Here is the code for my class Logger:
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;
public class Logger {
// declarations
private FileOutputStream[] files = {
makeFileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger0.log"),
makeFileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger1.log"),
makeFileOutputStream("G:\\Users\\TarekEZZAT\\Documents\\logs\\logger2.log"),
};
private PrintStream[] pss = {
new PrintStream(files[0]),
new PrintStream(files[1]),
new PrintStream(files[2]),
};
private String[] messages = {
new String("Write error message to log file 0"),
new String("Write error message to log file 1 + user"),
new String("Write error message to log file 2 + user+ email"),
};
private FileOutputStream makeFileOutputStream(String string) {
// TODO Auto-generated method stub
FileOutputStream fos = null;
try {
fos = new FileOutputStream(string);
} catch (FileNotFoundException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return fos;
}
public void LogMessage(int level) {
WriteLogMessage(this.getPss(), level);
}
public void CloseFile(PrintStream[] files, int level){
// TODO Auto-generated method stub
this.getPss()[level].close();
}
private void WriteLogMessage(PrintStream[] files, int level) {
this.getPss()[level].println(messages[level]);
this.getPss()[level].flush();
}
// Getters and Setters
public FileOutputStream[] getFiles() {
return files;
}
public void setFiles(FileOutputStream[] files) {
this.files = files;
}
public PrintStream[] getPss() {
return pss;
}
public void setPss(PrintStream[] pss) {
this.pss = pss;
}
public String[] getMessages() {
return messages;
}
public void setMessages(String[] messages) {
this.messages = messages;
}
}
I am trying to process a few million records from a text file (i.e. reading the file sequentially with one thread and processing the retrieved lines with multiple threads). The method call after 'queue.take();' executes only as many times as the initial capacity allocated to the BlockingQueue (100 in this example), and then the process doesn't pick up any more records.
Could you please help me debug or identify the issue?
Main Method:
############
final int threadCount = 10;
BlockingQueue<String> queue = new ArrayBlockingQueue<String>(100);
ExecutorService service = Executors.newFixedThreadPool(threadCount);
for (int i = 0; i < (threadCount - 1); i++) {
service.submit(new EvaluateLine(queue));
}
service.submit(new ProcessFile(queue)).get();
service.shutdownNow();
service.awaitTermination(365, TimeUnit.DAYS);
EvaluateLine:
#############
private final BlockingQueue<String> queue;
public EvaluateLine(BlockingQueue<String> queue){
this.queue = queue;
}
@Override
public void run() {
String line;
while(true) {
try {
line = queue.take();
SyncOutput.ProcessExpression(line);
} catch (InterruptedException ex) {
break;
}
}
}
ProcessFile:
############
private final BlockingQueue<String> queue;
public ProcessFile(BlockingQueue<String> queue) {
this.queue = queue;
}
@Override
public void run() {
Path path = Paths.get("C:\\Desktop\\testdata.txt");
BufferedReader br = null;
try {
br =Files.newBufferedReader(path, StandardCharsets.UTF_8);
for (String line; (line = br.readLine()) != null; ) {
queue.put(line);
}
} catch(IOException e){
e.printStackTrace();
} catch(InterruptedException e){
e.printStackTrace();
} finally{
try {
if (br != null) br.close();
}catch(IOException e){
e.printStackTrace();
}
}
}
Edit:
SyncOutput
##########
class SyncOutput{
public static void ProcessExpression(String inputLine) {
evalExpression(inputLine);
}
public static double evalExpression(String s){
Expression e = new ExpressionBuilder(s)
.build();
return e.evaluate();
}
}
The following code segment (using the expression-evaluation library exp4j) was blocking the multi-threaded execution; I am not sure why. I replaced this code block with a different library (parsii) and now everything looks fine.
public static void ProcessExpression(String inputLine) {
evalExpression(inputLine);
}
public static double evalExpression(String s){
Expression e = new ExpressionBuilder(s)
.build();
return e.evaluate();
}
I am new to Storm, but I have configured Storm on my local machine. I made an Eclipse project and followed a simple example from the internet. Now my topology is getting submitted, but it is not working.
Was the topology submitted?
Yes, it was submitted successfully, as I can see it on the Storm UI.
The job of my topology is just to print a number if it is a prime number. But it is not printing anything.
I have provided my code below:
Spout Class:
public class NumberSpout extends BaseRichSpout
{
private SpoutOutputCollector collector;
private static final Logger LOGGER = Logger.getLogger(SpoutOutputCollector.class);
private static int currentNumber = 1;
@Override
public void open( Map conf, TopologyContext context, SpoutOutputCollector collector )
{
this.collector = collector;
}
@Override
public void nextTuple()
{
// Emit the next number
LOGGER.info("Coming in spout nextTuple method");
collector.emit( new Values( new Integer( currentNumber++ ) ) );
}
@Override
public void ack(Object id)
{
}
@Override
public void fail(Object id)
{
}
@Override
public void declareOutputFields(OutputFieldsDeclarer declarer)
{
declarer.declare( new Fields( "number" ) );
}
}
Bolt Class:
public class PrimeNumberBolt extends BaseRichBolt
{ private static final Logger LOGGER = Logger.getLogger(PrimeNumberBolt.class);
private OutputCollector collector;
public void prepare( Map conf, TopologyContext context, OutputCollector collector )
{
this.collector = collector;
}
public void execute( Tuple tuple )
{
int number = tuple.getInteger( 0 );
if( isPrime( number) )
{
LOGGER.info("Prime number printed is: " + number);
System.out.println( number );
}
collector.ack( tuple );
}
public void declareOutputFields( OutputFieldsDeclarer declarer )
{
declarer.declare( new Fields( "number" ) );
}
private boolean isPrime( int n )
{
if( n < 2 )
{
return false;
}
if( n == 2 || n == 3 )
{
return true;
}
// Is n an even number?
if( n % 2 == 0 )
{
return false;
}
//if not, then just check the odds
for( int i=3; i*i<=n; i+=2 )
{
if( n % i == 0)
{
return false;
}
}
return true;
}
}
Topology Class:
public class PrimeNumberTopology
{
public static void main(String[] args)
{
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout( "spout", new NumberSpout(),1 );
builder.setBolt( "prime", new PrimeNumberBolt(),1 )
.shuffleGrouping("spout");
Config conf = new Config();
conf.put(Config.NIMBUS_HOST, "127.0.0.1");
conf.setDebug(true);
Map storm_conf = Utils.readStormConfig();
storm_conf.put("nimbus.host", "127.0.0.1");
Client client = NimbusClient.getConfiguredClient(storm_conf)
.getClient();
String inputJar = "/home/jamil/Downloads/storm-twitter-word-count-master/target/storm-test-1.0-SNAPSHOT.jar";
NimbusClient nimbus = new NimbusClient(storm_conf, "127.0.0.1", 6627);
// upload topology jar to Cluster using StormSubmitter
String uploadedJarLocation = StormSubmitter.submitJar(storm_conf,
inputJar);
try {
String jsonConf = JSONValue.toJSONString(storm_conf);
nimbus.getClient().submitTopology("newtesttopology",
uploadedJarLocation, jsonConf, builder.createTopology());
} catch (AlreadyAliveException ae) {
ae.printStackTrace();
} catch (InvalidTopologyException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
Now I want to ask: why is it not printing? Or why is it not writing to the log files?
PLUS: I am submitting the topology from Eclipse.
In addition to what @Thomas Jungblut said (regarding your log4j configuration), and assuming this is the complete source code of your topology, have a look at the nextTuple() method of your spout.
Your spout is simply emitting one value, and that's it. There is a good chance you are missing the output of that emit in your console because it is buried under a ton of other logging output.
Are you sure that you want to emit just one value?
Hello everyone, I have created a multi-threaded chat server that looks like this:
public class Main {
public static ServerSocket server;
public static Socket connection;
public static int backLog = 100;
public static int numberOfConnected;
public static boolean connected = false;
public final static int potNumber = 6080;
public static PrintWriter pw;
public static Scanner input;
public static int i = 0;
public static void main(String[] args) {
startServer();
}
public static void startServer(){
try {
server = new ServerSocket(potNumber, backLog);
waitingForConnection();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private static void waitingForConnection() {
connected = false;
i++;
while (!connected) {
try {
if (connected) {
}
connection = server.accept();
Server s = new Server(connection, pw = new PrintWriter(connection.getOutputStream()), input = new Scanner(connection.getInputStream()));
s.start();
numberOfConnected++;
waitingForConnection();
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
}
The idea is that this is supposed to be a chat server, so when someone connects to the server it starts the following thread:
Thread
public void run(){
while (connection.isConnected()) {
if (input.hasNext()) {
String fullMessage = input.nextLine();
if (fullMessage.equalsIgnoreCase("Connect")) {
connectHim();
}else {
chatMessage(fullMessage);
}
}
}
}
private void chatMessage(String fullMessage) {
String name = fullMessage.substring(0, fullMessage.indexOf(" "));
String message = fullMessage.substring(fullMessage.indexOf(" "), fullMessage.length());
pw.println(name+": "+message);
pw.flush();
}
private void connectHim() {
String name = input.nextLine();
pw.println(0);
pw.flush();
pw.println(1);
pw.flush();
pw.println();
pw.flush();
pw.println(name);
pw.flush();
}
So my problem is the following:
If the user bound to thread 2 sends a message to the server, how will I send that message to the user bound to thread 1 (for example)?
One option is to use a Hashtable or HashMap (just wrap it with Collections.synchronizedMap(myMap) if you use a Map). When you start a new Thread, give it a unique name (for example, the user's nickname) and put it into your collection, where the key is the thread name and the value is the Thread object.
If the user bound to thread 2 sends a message to the server, how will I send that message to the user bound to thread 1 (for example)?
For example, you have user1, user2, and user3. Now you build 3 threads and put them into a HashMap, like:
Map<String, Thread> threadMap = new HashMap<String,Thread>();
threadMap = Collections.synchronizedMap(threadMap);
YourThread th1 = new YourThread();
threadMap.put("user1", th1);
YourThread th2 = new YourThread();
threadMap.put("user2", th2);
YourThread th3 = new YourThread();
threadMap.put("user3", th3);
....
Set<String> userSet = threadMap.keySet();
Iterator<String> it = userSet.iterator();
Thread currThread = null;
while(it.hasNext()){
String key = it.next();
currThread = threadMap.get(key);
// do something with currThread
}