So, I'm working on a plugin at work and I've run into a situation where I could use a ContentProposalAdapter to my benefit. Basically, a person will start typing someone's name and a list of names matching the current query will be returned in a type-ahead manner (a la Google). So, I created a class implementing IContentProposalProvider which, upon calling its getProposals() method, fires off a thread that handles fetching the proposals in the background. The problem I'm having is a race condition: the proposals are retrieved over HTTP in the background, and I try to read them before they have actually been retrieved.
Now, I'm trying not to descend into thread hell, and what I have isn't getting me very far anyway. Here is what I've done so far. Does anyone have any suggestions as to what I can do?
public class ProfilesProposalProvider implements IContentProposalProvider, PropertyChangeListener {
private IContentProposal[] props;
@Override
public IContentProposal[] getProposals(String arg0, int arg1) {
Display display = PlatformUI.getWorkbench().getActiveWorkbenchWindow().getShell().getDisplay();
RunProfilesJobThread t1 = new RunProfilesJobThread(arg0, display);
t1.run();
return props;
}
@Override
public void propertyChange(PropertyChangeEvent arg0) {
if (arg0.getSource() instanceof RunProfilesJobThread){
RunProfilesJobThread thread = (RunProfilesJobThread)arg0.getSource();
props = thread.getProps();
}
}
}
public class RunProfilesJobThread extends Thread {
private ProfileProposal[] props;
private Display display;
private String query;
public RunProfilesJobThread(String query, Display display){
this.query = query;
}
@Override
public void run() {
if (!(query.equals(""))){
GetProfilesJob job = new GetProfilesJob("profiles", query);
job.schedule();
try {
job.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
GetProfilesJobInfoThread thread = new GetProfilesJobInfoThread(job.getResults());
try {
thread.join();
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
props = thread.getProps();
}
}
public ProfileProposal[] getProps(){
return props;
}
}
public class GetProfilesJobInfoThread extends Thread {
private ArrayList<String> names;
private ProfileProposal[] props;
public GetProfilesJobInfoThread(ArrayList<String> names){
this.names = names;
}
@Override
public void run() {
if (names != null){
props = new ProfileProposal[names.size()];
for (int i = 0; i < props.length - 1; i++){
ProfileProposal temp = new ProfileProposal(names.get(i), names.get(i));
props[i] = temp;
}
}
}
public ProfileProposal[] getProps(){
return props;
}
}
OK, I'll try it...
I haven't tried to run it, but it should work more or less. At least it's a good start. If you have any questions, feel free to ask.
public class ProfilesProposalProvider implements IContentProposalProvider {
private List<IContentProposal> proposals;
private String proposalQuery;
private Thread retrievalThread;
public void setProposals( List<IContentProposal> proposals, String query ) {
synchronized( this ) {
this.proposals = proposals;
this.proposalQuery = query;
}
}
public IContentProposal[] getProposals( String contents, int position ) {
// Synchronize incoming thread and retrieval thread, so that the proposal list
// is not replaced while we're processing it.
synchronized( this ) {
/**
* Get proposals if the query is longer than one character, or if the current list of proposals
* was retrieved with a different prefix than the new query, and only if the current
* retrieval thread has finished.
*/
if ( retrievalThread == null && contents.length() > 1 && ( proposals == null || !contents.startsWith( proposalQuery ) ) ) {
getProposals( contents );
}
/**
* Select valid proposals from retrieved list.
*/
if ( proposals != null ) {
List<IContentProposal> validProposals = new ArrayList<IContentProposal>();
for ( IContentProposal prop : proposals ) {
if(prop == null) {
continue;
}
String propVal = prop.getContent();
if ( isProposalValid( propVal, contents )) {
validProposals.add( prop );
}
}
return validProposals.toArray( new IContentProposal[ validProposals.size() ] );
}
}
return new IContentProposal[0];
}
protected void getProposals( final String query ) {
retrievalThread = new Thread() {
@Override
public void run() {
GetProfilesJob job = new GetProfilesJob("profiles", query);
job.schedule();
try {
job.join();
ArrayList<String> names = job.getResults();
if (names != null){
List<IContentProposal> props = new ArrayList<IContentProposal>();
for ( String name : names ) {
props.add( new ProfileProposal( name, name ) );
}
setProposals( props, query );
}
} catch (InterruptedException e) {
e.printStackTrace();
}
retrievalThread = null;
}
};
retrievalThread.start();
}
protected boolean isProposalValid( String proposalValue, String contents ) {
return ( proposalValue.length() >= contents.length() && proposalValue.substring(0, contents.length()).equalsIgnoreCase(contents));
}
}
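For reference, here is a minimal wiring sketch (not from the original post) showing how a provider like this is typically hooked up to a SWT Text field with JFace's ContentProposalAdapter. The parent composite, the field name "text", and the acceptance style are assumptions; imports from org.eclipse.swt.widgets and org.eclipse.jface.fieldassist are omitted.
Text text = new Text(parent, SWT.BORDER);
ProfilesProposalProvider provider = new ProfilesProposalProvider();

// With a null key stroke and null auto-activation characters the proposal
// popup opens on every keystroke, which suits a type-ahead field.
ContentProposalAdapter adapter = new ContentProposalAdapter(
        text,                      // the control to decorate
        new TextContentAdapter(),  // reads/replaces the Text's contents
        provider,                  // the provider shown above
        null,                      // no explicit invocation key stroke
        null);                     // no auto-activation characters
adapter.setProposalAcceptanceStyle(ContentProposalAdapter.PROPOSAL_REPLACE);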
Related
I'm a novice when it comes to JSPs and Java.
How do I get the output from the code below to display on a JSP, considering that it runs everything from main and contains non-public methods, a nested static class, etc.?
I know that we are not supposed to use Java code in a JSP, but my first step in this proof-of-concept exercise is to get the code running and returning data from a backend; then I can set about using EL, etc.
I can run the program, with the correct config settings, from within Eclipse, and all works fine with the output appearing on the console, but I'm really not sure how to access it from within a JSP.
How do I access the static class and static methods from a JSP if they aren't public?
All help greatly appreciated.
public class CustomDestinationDataProvider
{
static class MyDestinationDataProvider implements DestinationDataProvider
{
private DestinationDataEventListener eL;
private HashMap<String, Properties> secureDBStorage = new HashMap<String, Properties>();
public Properties getDestinationProperties(String destinationName)
{
try
{
//read the destination from DB
Properties p = secureDBStorage.get(destinationName);
if(p!=null)
{
//check if all is correct, for example
if(p.isEmpty())
throw new DataProviderException(DataProviderException.Reason.INVALID_CONFIGURATION, "destination configuration is incorrect", null);
return p;
}
return null;
}
catch(RuntimeException re)
{
throw new DataProviderException(DataProviderException.Reason.INTERNAL_ERROR, re);
}
}
public void setDestinationDataEventListener(DestinationDataEventListener eventListener)
{
this.eL = eventListener;
}
public boolean supportsEvents()
{
return true;
}
//implementation that saves the properties in a very secure way
void changeProperties(String destName, Properties properties)
{
synchronized(secureDBStorage)
{
if(properties==null)
{
if(secureDBStorage.remove(destName)!=null)
eL.deleted(destName);
}
else
{
secureDBStorage.put(destName, properties);
eL.updated(destName); // create or updated
}
}
}
} // end of MyDestinationDataProvider
//business logic
void executeCalls(String destName)
{
JCoDestination dest;
try
{
dest = JCoDestinationManager.getDestination(destName);
dest.ping();
System.out.println("Destination " + destName + " works");
step4WorkWithTable(dest);
}
catch(JCoException e)
{
e.printStackTrace();
System.out.println("Execution on destination " + destName+ " failed");
}
}
static Properties getDestinationPropertiesFromUI()
{
//adapt parameters in order to configure a valid destination
Properties connectProperties = new Properties();
// Add code here to set config settings
return connectProperties;
}
public static void main(String[] args)
{
MyDestinationDataProvider myProvider = new MyDestinationDataProvider();
//register the provider with the JCo environment;
//catch IllegalStateException if an instance is already registered
try
{
com.sap.conn.jco.ext.Environment.registerDestinationDataProvider(myProvider);
}
catch(IllegalStateException providerAlreadyRegisteredException)
{
//somebody else registered its implementation,
//stop the execution
throw new Error(providerAlreadyRegisteredException);
}
String destName = "????";
CustomDestinationDataProvider test = new CustomDestinationDataProvider();
//set properties for the destination and ...
myProvider.changeProperties(destName, getDestinationPropertiesFromUI());
//... work with it
test.executeCalls(destName);
}
public static void step4WorkWithTable(JCoDestination dest) throws JCoException
{
JCoFunction function = dest.getRepository().getFunction("BAPI_COMPANYCODE_GETLIST");
if(function == null)
throw new RuntimeException("BAPI_COMPANYCODE_GETLIST not found in SAP.");
try
{
function.execute(dest);
}
catch(AbapException e)
{
System.out.println(e.toString());
return;
}
JCoStructure returnStructure = function.getExportParameterList().getStructure("RETURN");
if (! (returnStructure.getString("TYPE").equals("")||returnStructure.getString("TYPE").equals("S")) )
{
throw new RuntimeException(returnStructure.getString("MESSAGE"));
}
JCoTable codes = function.getTableParameterList().getTable("COMPANYCODE_LIST");
for (int i = 0; i < codes.getNumRows(); i++)
{
codes.setRow(i);
System.out.println(codes.getString("COMP_CODE") + '\t' + codes.getString("COMP_NAME"));
}
//move the table cursor to first row
codes.firstRow();
for (int i = 0; i < codes.getNumRows(); i++, codes.nextRow())
{
function = dest.getRepository().getFunction("BAPI_COMPANYCODE_GETDETAIL");
if (function == null)
throw new RuntimeException("BAPI_COMPANYCODE_GETDETAIL not found in SAP.");
function.getImportParameterList().setValue("COMPANYCODEID", codes.getString("COMP_CODE"));
//We do not need the addresses, so set the corresponding parameter to inactive.
//Inactive parameters will be either not generated or at least converted.
function.getExportParameterList().setActive("COMPANYCODE_ADDRESS",false);
try
{
function.execute(dest);
}
catch (AbapException e)
{
System.out.println(e.toString());
return;
}
returnStructure = function.getExportParameterList().getStructure("RETURN");
if (! (returnStructure.getString("TYPE").equals("") ||
returnStructure.getString("TYPE").equals("S") ||
returnStructure.getString("TYPE").equals("W")) )
{
throw new RuntimeException(returnStructure.getString("MESSAGE"));
}
JCoStructure detail = function.getExportParameterList().getStructure("COMPANYCODE_DETAIL");
System.out.println(detail.getString("COMP_CODE") + '\t' +
detail.getString("COUNTRY") + '\t' +
detail.getString("CITY"));
}//for
}
}
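One way to make this callable from a JSP is to wrap the flow in a public facade class with a public static method, deploy it under WEB-INF/classes (or as a jar in WEB-INF/lib), and keep the JSP down to a single expression. This is only a hedged sketch; the package and class names (com.example.jco.CompanyCodeReport, run) are invented for illustration.
package com.example.jco; // hypothetical package, compiled into WEB-INF/classes

// Public facade so the JSP never has to touch the package-private
// MyDestinationDataProvider or the non-public methods directly.
public class CompanyCodeReport {

    // Registers the provider and runs the same flow as main(), but collects
    // the output in a String instead of printing it to the console.
    public static String run(String destName) {
        StringBuilder out = new StringBuilder();
        // move the logic from main(), executeCalls() and step4WorkWithTable()
        // here, appending to 'out' instead of calling System.out.println(...)
        return out.toString();
    }
}
The JSP then needs only one expression, e.g. <%= com.example.jco.CompanyCodeReport.run("MY_DEST") %>, and once that works the scriptlet can be replaced by a servlet that puts the string into a request attribute for EL.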
I'm using an asynchronous XML-RPC client (https://github.com/gturri/aXMLRPC) in my project and wrote some methods using the asynchronous callback methods of this client, like this:
public void xmlRpcMethod(final Object callbackSync) {
XMLRPCCallback listener = new XMLRPCCallback() {
public void onResponse(long id, final Object result) {
// Do something
if (callbackSync != null) {
synchronized (callbackSync) {
callbackSync.notify();
}
}
}
public void onError(long id, final XMLRPCException error) {
// Do something
if (callbackSync != null) {
synchronized (callbackSync) {
callbackSync.notify();
}
}
}
public void onServerError(long id, final XMLRPCServerException error) {
Log.e(TAG, error.getMessage());
if (callbackSync != null) {
synchronized (callbackSync) {
callbackSync.notifyAll();
}
}
}
};
XMLRPCClient client = new XMLRPCClient("<url>");
long id = client.callAsync(listener, "<method>");
}
In other methods I'd like to call this method (here "xmlRpcMethod") and wait until it has finished. I wrote methods like this:
public void testMethod(){
Object sync = new Object();
xmlRpcMethod(sync);
synchronized (sync){
try{
sync.wait();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
// Do something after xmlRcpFinished
}
But this way of waiting and synchronizing gets ugly as the project grows larger and I need to wait for many requests to finish.
So is this the only possible / best way? Or does someone know a better solution?
My first shot to create blocking RPC calls would be:
// Little helper class:
class RPCResult<T>{
private final T result;
private final Exception ex;
private final long id;
public RPCResult( long id, T result, Exception ex ){
// TODO set fields
}
// TODO getters
public boolean hasError(){ return null != this.ex; }
}
public Object xmlRpcMethod() {
final BlockingQueue<RPCResult> pipe = new ArrayBlockingQueue<RPCResult>(1);
XMLRPCCallback listener = new XMLRPCCallback() {
public void onResponse(long id, final Object result) {
// Do something
pipe.put( new RPCResult<Object>(id, result, null) );
}
public void onError(long id, final XMLRPCException error) {
// Do something
pipe.put( new RPCResult<Object>(id, null, error) );
}
public void onServerError(long id, final XMLRPCServerException error) {
Log.e(TAG, error.getMessage());
pipe.put(new RPCResult<Object>(id, null, error));
}
};
XMLRPCClient client = new XMLRPCClient("<url>");
long id = client.callAsync(listener, "<method>");
RPCResult result = pipe.take(); // blocks until there is an element available
// TODO: catch and handle InterruptedException!
if( result.hasError() ) throw result.getError(); // Relay Exceptions - do not swallow them!
return result.getResult();
}
Client:
public void testMethod(){
Object result = xmlRpcMethod(); // blocks until result is available or throws exception
}
The next step would be to make a strongly typed version, public <T> T xmlRpcMethod().
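Here is a hedged sketch of that strongly typed variant, assuming the RPCResult getters above are filled in and reusing the same placeholder "<url>" / "<method>" strings as the snippets above. It uses offer() instead of put() so the callbacks don't have to handle InterruptedException (the queue has capacity 1 and each callback fires at most once).
public <T> T xmlRpcCall(final Class<T> resultType, String method, Object... params) throws Exception {
    final BlockingQueue<RPCResult<T>> pipe = new ArrayBlockingQueue<RPCResult<T>>(1);
    XMLRPCCallback listener = new XMLRPCCallback() {
        public void onResponse(long id, Object result) {
            pipe.offer(new RPCResult<T>(id, resultType.cast(result), null));
        }
        public void onError(long id, XMLRPCException error) {
            pipe.offer(new RPCResult<T>(id, null, error));
        }
        public void onServerError(long id, XMLRPCServerException error) {
            pipe.offer(new RPCResult<T>(id, null, error));
        }
    };
    XMLRPCClient client = new XMLRPCClient("<url>");
    client.callAsync(listener, method, params);

    RPCResult<T> result = pipe.take();       // blocks until a callback fires
    if (result.hasError()) throw result.getError();
    return result.getResult();
}
A call site would then look like String name = xmlRpcCall(String.class, "<method>");.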
I have an .rpt file from which I will be generating multiple reports in PDF format, using the Engine class from inet clear reports. The process takes very long, as I have nearly 10,000 reports to generate. Can I use multi-threading or some other approach to speed up the process?
Any help on how it can be done would be appreciated.
My partial code:
//Loops
Engine eng = new Engine(Engine.EXPORT_PDF);
eng.setReportFile(rpt); //rpt is the report name
if (cn.isClosed() || cn == null ) {
cn = ds.getConnection();
}
eng.setConnection(cn);
System.out.println(" After set connection");
eng.setPrompt(data[i], 0);
ReportProperties repprop = eng.getReportProperties();
repprop.setPaperOrient(ReportProperties.DEFAULT_PAPER_ORIENTATION, ReportProperties.PAPER_FANFOLD_US);
eng.execute();
System.out.println(" After excecute");
try {
PDFExportThread pdfExporter = new PDFExportThread(eng, sFileName, sFilePath);
pdfExporter.execute();
} catch (Exception e) {
e.printStackTrace();
}
The execute method of PDFExportThread:
public void execute() throws IOException {
FileOutputStream fos = null;
try {
String FileName = sFileName + "_" + (eng.getPageCount() - 1);
File file = new File(sFilePath + FileName + ".pdf");
if (!file.getParentFile().exists()) {
file.getParentFile().mkdirs();
}
if (!file.exists()) {
file.createNewFile();
}
fos = new FileOutputStream(file);
for (int k = 1; k <= eng.getPageCount(); k++) {
fos.write(eng.getPageData(k));
}
fos.flush();
fos.close();
} catch (Exception e) {
e.printStackTrace();
} finally {
if (fos != null) {
fos.close();
fos = null;
}
}
}
This is very basic code. A ThreadPoolExecutor with a fixed number of threads in the pool is the backbone.
Some considerations:
The thread pool size should be equal to or less than the DB connection pool size, and it should be an optimal number that is reasonable for running Engines in parallel.
The main thread should wait for sufficient time before killing all threads. I have put 1 hour as the wait time, but that's just an example.
You'll need to have proper Exception handling.
From the API doc, I saw the stopAll and shutdown methods on the Engine class, so I'm invoking those as soon as the work is done. That's, again, just an example.
Hope this helps.
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.sql.Connection;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
public class RunEngine {
public static void main(String[] args) throws Exception {
final String rpt = "/tmp/rpt/input/rpt-1.rpt";
final String sFilePath = "/tmp/rpt/output/";
final String sFileName = "pdfreport";
final Object[] data = new Object[10];
ThreadPoolExecutor executor = (ThreadPoolExecutor) Executors.newFixedThreadPool(10);
for (int i = 0; i < data.length; i++) {
PDFExporterRunnable runnable = new PDFExporterRunnable(rpt, data[i], sFilePath, sFileName, i);
executor.execute(runnable);
}
executor.shutdown();
executor.awaitTermination(1L, TimeUnit.HOURS);
Engine.stopAll();
Engine.shutdown();
}
private static class PDFExporterRunnable implements Runnable {
private final String rpt;
private final Object data;
private final String sFilePath;
private final String sFileName;
private final int runIndex;
public PDFExporterRunnable(String rpt, Object data, String sFilePath,
String sFileName, int runIndex) {
this.rpt = rpt;
this.data = data;
this.sFilePath = sFilePath;
this.sFileName = sFileName;
this.runIndex = runIndex;
}
@Override
public void run() {
// Loops
Engine eng = new Engine(Engine.EXPORT_PDF);
eng.setReportFile(rpt); // rpt is the report name
Connection cn = null;
/*
* DB connection related code. Check and use.
*/
//if (cn.isClosed() || cn == null) {
//cn = ds.getConnection();
//}
eng.setConnection(cn);
System.out.println(" After set connection");
eng.setPrompt(data, 0);
ReportProperties repprop = eng.getReportProperties();
repprop.setPaperOrient(ReportProperties.DEFAULT_PAPER_ORIENTATION,
ReportProperties.PAPER_FANFOLD_US);
eng.execute();
System.out.println(" After excecute");
FileOutputStream fos = null;
try {
String FileName = sFileName + "_" + runIndex;
File file = new File(sFilePath + FileName + ".pdf");
if (!file.getParentFile().exists()) {
file.getParentFile().mkdirs();
}
if (!file.exists()) {
file.createNewFile();
}
fos = new FileOutputStream(file);
for (int k = 1; k <= eng.getPageCount(); k++) {
fos.write(eng.getPageData(k));
}
fos.flush();
fos.close();
} catch (Exception e) {
e.printStackTrace();
} finally {
if (fos != null) {
try {
fos.close();
} catch (IOException e) {
e.printStackTrace();
}
fos = null;
}
}
}
}
/*
* Dummy classes to avoid compilation errors.
*/
private static class ReportProperties {
public static final String PAPER_FANFOLD_US = null;
public static final String DEFAULT_PAPER_ORIENTATION = null;
public void setPaperOrient(String defaultPaperOrientation, String paperFanfoldUs) {
}
}
private static class Engine {
public static final int EXPORT_PDF = 1;
public Engine(int exportType) {
}
public static void shutdown() {
}
public static void stopAll() {
}
public void setPrompt(Object singleData, int i) {
}
public byte[] getPageData(int k) {
return null;
}
public int getPageCount() {
return 0;
}
public void execute() {
}
public ReportProperties getReportProperties() {
return null;
}
public void setConnection(Connection cn) {
}
public void setReportFile(String reportFile) {
}
}
}
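On the exception-handling point above, a hedged variation is to submit Callables instead of Runnables and then walk the returned Futures, so a failed report surfaces as an ExecutionException in the main thread instead of just a stack trace on some worker thread. The Engine/export body is left as a placeholder comment.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class RunEngineWithFutures {
    public static void main(String[] args) throws InterruptedException {
        final Object[] data = new Object[10];
        ExecutorService executor = Executors.newFixedThreadPool(10);
        List<Future<String>> results = new ArrayList<Future<String>>();
        for (int i = 0; i < data.length; i++) {
            final int runIndex = i;
            results.add(executor.submit(new Callable<String>() {
                @Override
                public String call() throws Exception {
                    // the Engine/export logic from PDFExporterRunnable would go
                    // here; let exceptions propagate instead of swallowing them
                    return "pdfreport_" + runIndex;   // e.g. return the file name
                }
            }));
        }
        executor.shutdown();
        for (Future<String> f : results) {
            try {
                System.out.println("Generated: " + f.get()); // blocks until done
            } catch (ExecutionException e) {
                System.err.println("Report failed: " + e.getCause());
            }
        }
    }
}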
I will offer this "answer" as a possible quick & dirty solution to get you started on a parallelization effort.
One way or another you're going to build a render farm.
I don't think there is a trivial way to do this in Java; I would love to have someone post an answer that shows how to parallelize your example in just a few lines of code. But until that happens, this will hopefully help you make some progress.
You're going to have limited scaling in the same JVM instance.
But... let's see how far you get with that and see if it helps enough.
Design challenge #1: restarting.
You will probably want a place to keep the status of each of your reports, i.e. your "units of work".
You want this in case you need to restart everything (maybe your server crashes) and you don't want to re-run all of the reports completed so far.
There are lots of ways you can do this: a database, or checking whether a "completed" marker file exists in your report folder. It's not sufficient for the *.pdf to exist, as it may be incomplete; for xyz_200.pdf you could create an empty xyz_200.done or xyz_200.err file to help with re-running any problem children. By the time you've coded up that file manipulation/checking/initialization logic, it may turn out to have been easier to add a column to a database table holding the list of work to be done.
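A minimal sketch of that marker-file idea (the class name and file suffixes are just illustrative):
import java.io.File;
import java.io.IOException;

public class ReportStatus {
    private final File dir;

    public ReportStatus(File reportDir) {
        this.dir = reportDir;
    }

    // a report only counts as done when its ".done" file exists
    public boolean isDone(String reportName) {
        return new File(dir, reportName + ".done").exists();
    }

    public void markDone(String reportName) throws IOException {
        new File(dir, reportName + ".done").createNewFile();
    }

    // ".err" flags a failed run so it can be retried or inspected
    public void markFailed(String reportName) throws IOException {
        new File(dir, reportName + ".err").createNewFile();
    }
}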
Design consideration #2: maximizing throughput (avoiding overload).
You don't want to saturate your system and run one thousand reports in parallel.
Maybe 10.
Maybe 100.
Probably not 5,000.
You will need to do some sizing research and see what gets you near 80 to 90% system utilization.
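As a hedged starting point for that sizing, begin near the CPU core count, capped by the DB connection pool size, then measure and adjust; dbPoolSize below is an assumed value.
int cores = Runtime.getRuntime().availableProcessors();
int dbPoolSize = 10;                         // assumption: match this to your data source
int nWorkers = Math.min(cores, dbPoolSize);  // then tune toward 80-90% utilization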
Design consideration #3: scaling across multiple servers
Overly complex, outside the scope of a Stack Exchange answer.
You'd have to spin up JVMs on multiple systems running something like the workers below, plus a report-manager that can pull work items from a shared "queue" structure; again, a database table is probably easier here than something file-based (or a network feed).
Sample Code
Caution: None of this code is well tested, it almost certainly has an abundance of typos, logic errors and poor design. Use at your own risk.
So anyway... I do want to give you the basic idea of a rudimentary task runner.
Replace your "// Loops" example in the question with code like the following:
main loop (original code example)
This is more or less doing what your example code did, modified to push most of the work into ReportWorker (new class, see below). Lots of stuff seems to be packed into your original question's example of "// Loop", so I'm not trying to reverse engineer that.
fwiw, it was unclear to me where "rpt" and "data[i]" are coming from so I hacked up some test data.
public class Main {
public static boolean complete( String data ) {
return false; // for testing nothing is complete.
}
public static void main(String args[] ) {
String data[] = new String[] {
"A",
"B",
"C",
"D",
"E" };
String rpt = "xyz";
// Loop
ReportManager reportMgr = new ReportManager(); // a new helper class (see below), it assigns/monitors work.
long startTime = System.currentTimeMillis();
for( int i = 0; i < data.length; ++i ) {
// complete is something you should write that knows if a report "unit of work"
// finished successfully.
if( !complete( data[i] ) ) {
reportMgr.assignWork( rpt, data[i] ); // so... where did values for your "rpt" variable come from?
}
}
reportMgr.waitForWorkToFinish(); // out of new work to assign, let's wait until everything in-flight complete.
long endTime = System.currentTimeMillis();
System.out.println("Done. Elapsed time = " + (endTime - startTime)/1000 +" seconds.");
}
}
ReportManager
This class is not thread safe; just have your original loop keep calling assignWork() until you're out of reports to assign, then call waitForWorkToFinish() until all work is done, as shown above. (FWIW, I don't think you could say any of the classes here are especially thread safe.)
public class ReportManager {
public int polling_delay = 500; // wait 0.5 seconds for testing.
//public int polling_delay = 60 * 1000; // wait 1 minute.
// not high throughput millions of reports / second, we'll run at a slower tempo.
public int nWorkers = 3; // just 3 for testing.
public int assignedCnt = 0;
public ReportWorker workers[];
public ReportManager() {
// initialize our manager.
workers = new ReportWorker[ nWorkers ];
for( int i = 0; i < nWorkers; ++i ) {
workers[i] = new ReportWorker( i );
System.out.println("Created worker #"+i);
}
}
private ReportWorker handleWorkerError( int i ) {
// something went wrong, update our "report" status as one of the reports failed.
System.out.println("handleWorkerError(): failure in "+workers[i]+", resetting worker.");
workers[i].teardown();
workers[i] = new ReportWorker( i ); // just replace everything.
return workers[i]; // the new worker will, incidentally, be available.
}
private ReportWorker handleWorkerComplete( int i ) {
// this unit of work was completed, update our "report" status tracker as success.
System.out.println("handleWorkerComplete(): success in "+workers[i]+", resetting worker.");
workers[i].teardown();
workers[i] = new ReportWorker( i ); // just replace everything.
return workers[i]; // the new worker will, incidentally, be available.
}
private int activeWorkerCount() {
int activeCnt = 0;
for( int i = 0; i < nWorkers; ++i ) {
ReportWorker worker = workers[i];
System.out.println("activeWorkerCount() i="+i+", checking worker="+worker);
if( worker.hasError() ) {
worker = handleWorkerError( i );
}
if( worker.isComplete() ) {
worker = handleWorkerComplete( i );
}
if( worker.isInitialized() || worker.isRunning() ) {
++activeCnt;
}
}
System.out.println("activeWorkerCount() activeCnt="+activeCnt);
return activeCnt;
}
private ReportWorker getAvailableWorker() {
// check each worker to see if anybody recently completed...
// This (rather lazily) creates completely new ReportWorker instances.
// You might want to try pooling (salvaging and reinitializing them)
// to see if that helps your performance.
System.out.println("\n-----");
ReportWorker firstAvailable = null;
for( int i = 0; i < nWorkers; ++i ) {
ReportWorker worker = workers[i];
System.out.println("getAvailableWorker(): i="+i+" worker="+worker);
if( worker.hasError() ) {
worker = handleWorkerError( i );
}
if( worker.isComplete() ) {
worker = handleWorkerComplete( i );
}
if( worker.isAvailable() && firstAvailable==null ) {
System.out.println("Apparently worker "+worker+" is 'available'");
firstAvailable = worker;
System.out.println("getAvailableWorker(): i="+i+" now firstAvailable = "+firstAvailable);
}
}
return firstAvailable; // May (or may not) be null.
}
public void assignWork( String rpt, String data ) {
ReportWorker worker = getAvailableWorker();
while( worker == null ) {
System.out.println("assignWork: No workers available, sleeping for "+polling_delay);
try { Thread.sleep( polling_delay ); }
catch( InterruptedException e ) { System.out.println("assignWork: sleep interrupted, ignoring exception "+e); }
// any workers available now?
worker = getAvailableWorker();
}
++assignedCnt;
worker.initialize( rpt, data ); // or whatever else you need.
System.out.println("assignment #"+assignedCnt+" given to "+worker);
Thread t = new Thread( worker );
t.start( ); // that is pretty much it, let it go.
}
public void waitForWorkToFinish() {
int active = activeWorkerCount();
while( active >= 1 ) {
System.out.println("waitForWorkToFinish(): #active workers="+active+", waiting...");
// wait a minute....
try { Thread.sleep( polling_delay ); }
catch( InterruptedException e ) { System.out.println("assignWork: sleep interrupted, ignoring exception "+e); }
active = activeWorkerCount();
}
}
}
ReportWorker
public class ReportWorker implements Runnable {
int test_delay = 10*1000; //sleep for 10 seconds.
// (actual code would be generating PDF output)
public enum StatusCodes { UNINITIALIZED,
INITIALIZED,
RUNNING,
COMPLETE,
ERROR };
int id = -1;
StatusCodes status = StatusCodes.UNINITIALIZED;
boolean initialized = false;
public String rpt = "";
public String data = "";
//Engine eng;
//PDFExportThread pdfExporter;
//DataSource_type cn;
public boolean isInitialized() { return initialized; }
public boolean isAvailable() { return status == StatusCodes.UNINITIALIZED; }
public boolean isRunning() { return status == StatusCodes.RUNNING; }
public boolean isComplete() { return status == StatusCodes.COMPLETE; }
public boolean hasError() { return status == StatusCodes.ERROR; }
public ReportWorker( int id ) {
this.id = id;
}
public String toString( ) {
return "ReportWorker."+id+"("+status+")/"+rpt+"/"+data;
}
// the example code doesn't make clear if there is a relationship between rpt & data[i].
public void initialize( String rpt, String data /* data[i] in original code */ ) {
try {
this.rpt = rpt;
this.data = data;
/* uncomment this part where you have the various classes available.
* I have it commented out for testing.
cn = ds.getConnection();
Engine eng = new Engine(Engine.EXPORT_PDF);
eng.setReportFile(rpt); //rpt is the report name
eng.setConnection(cn);
eng.setPrompt(data, 0);
ReportProperties repprop = eng.getReportProperties();
repprop.setPaperOrient(ReportProperties.DEFAULT_PAPER_ORIENTATION, ReportProperties.PAPER_FANFOLD_US);
*/
status = StatusCodes.INITIALIZED;
initialized = true; // want this true even if we're running.
} catch( Exception e ) {
status = StatusCodes.ERROR;
throw new RuntimeException("initialize(rpt="+rpt+", data="+data+")", e);
}
}
public void run() {
status = StatusCodes.RUNNING;
System.out.println("run().BEGIN: "+this);
try {
// delay for testing.
try { Thread.sleep( test_delay ); }
catch( InterruptedException e ) { System.out.println(this+".run(): test interrupted, ignoring "+e); }
/* uncomment this part where you have the various classes available.
* I have it commented out for testing.
eng.execute();
PDFExportThread pdfExporter = new PDFExportThread(eng, sFileName, sFilePath);
pdfExporter.execute();
*/
status = StatusCodes.COMPLETE;
System.out.println("run().END: "+this);
} catch( Exception e ) {
System.out.println("run().ERROR: "+this);
status = StatusCodes.ERROR;
throw new RuntimeException("run(rpt="+rpt+", data="+data+")", e);
}
}
public void teardown() {
if( ! isInitialized() || isRunning() ) {
System.out.println("Warning: ReportWorker.teardown() called but I am uninitialized or running.");
// should never happen, fatal enough to throw an exception?
}
/* commented out for testing.
try { cn.close(); }
catch( Exception e ) { System.out.println("Warning: ReportWorker.teardown() ignoring error on connection close: "+e); }
cn = null;
*/
// any need to close things on eng?
// any need to close things on pdfExporter?
}
}
I am new to Storm, but I have configured Storm on my local machine. I made an Eclipse project and followed a simple example from the internet. Now my topology is getting submitted, but it's not working.
Was the topology submitted?
Yes, it was submitted successfully, as I can see it on the Storm UI.
The job of my topology is just to print a number if it is a prime number. But it's not printing anything.
I have provided my code as follows:
Spout Class:
public class NumberSpout extends BaseRichSpout
{
private SpoutOutputCollector collector;
private static final Logger LOGGER = Logger.getLogger(SpoutOutputCollector.class);
private static int currentNumber = 1;
@Override
public void open( Map conf, TopologyContext context, SpoutOutputCollector collector )
{
this.collector = collector;
}
@Override
public void nextTuple()
{
// Emit the next number
LOGGER.info("Coming in spout nextTuple method");
collector.emit( new Values( new Integer( currentNumber++ ) ) );
}
@Override
public void ack(Object id)
{
}
@Override
public void fail(Object id)
{
}
@Override
public void declareOutputFields(OutputFieldsDeclarer declarer)
{
declarer.declare( new Fields( "number" ) );
}
}
Bolt Class:
public class PrimeNumberBolt extends BaseRichBolt
{ private static final Logger LOGGER = Logger.getLogger(PrimeNumberBolt.class);
private OutputCollector collector;
public void prepare( Map conf, TopologyContext context, OutputCollector collector )
{
this.collector = collector;
}
public void execute( Tuple tuple )
{
int number = tuple.getInteger( 0 );
if( isPrime( number) )
{
LOGGER.info("Prime number printed is: " + number);
System.out.println( number );
}
collector.ack( tuple );
}
public void declareOutputFields( OutputFieldsDeclarer declarer )
{
declarer.declare( new Fields( "number" ) );
}
private boolean isPrime( int n )
{
if( n == 1 || n == 2 || n == 3 )
{
return true;
}
// Is n an even number?
if( n % 2 == 0 )
{
return false;
}
//if not, then just check the odds
for( int i=3; i*i<=n; i+=2 )
{
if( n % i == 0)
{
return false;
}
}
return true;
}
}
Topology Class:
public class PrimeNumberTopology
{
public static void main(String[] args)
{
TopologyBuilder builder = new TopologyBuilder();
builder.setSpout( "spout", new NumberSpout(),1 );
builder.setBolt( "prime", new PrimeNumberBolt(),1 )
.shuffleGrouping("spout");
Config conf = new Config();
conf.put(Config.NIMBUS_HOST, "127.0.0.1");
conf.setDebug(true);
Map storm_conf = Utils.readStormConfig();
storm_conf.put("nimbus.host", "127.0.0.1");
Client client = NimbusClient.getConfiguredClient(storm_conf)
.getClient();
String inputJar = "/home/jamil/Downloads/storm-twitter-word-count-master/target/storm-test-1.0-SNAPSHOT.jar";
NimbusClient nimbus = new NimbusClient("127.0.0.1",6627);
// upload topology jar to Cluster using StormSubmitter
String uploadedJarLocation = StormSubmitter.submitJar(storm_conf,
inputJar);
try {
String jsonConf = JSONValue.toJSONString(storm_conf);
nimbus.getClient().submitTopology("newtesttopology",
uploadedJarLocation, jsonConf, builder.createTopology());
} catch (AlreadyAliveException ae) {
ae.printStackTrace();
} catch (InvalidTopologyException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
Now I want to ask: why is it not printing? Or why is it not writing to the log files?
PLUS: I am submitting the topology from Eclipse.
In addition to what @Thomas Jungblut said (regarding your log4j configuration), and assuming that is the complete source code of your topology, have a look at the nextTuple() method of your spout.
Your spout simply emits one value and that's it. There's a great chance that you are missing the output of that emit in your console because it is buried under a ton of other logging output.
Are you sure that you want to emit just one value?
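If the goal for now is just to see the primes printed while running from Eclipse, a hedged alternative to submitting to Nimbus is a LocalCluster. This sketch assumes the same NumberSpout and PrimeNumberBolt classes as above and the pre-1.0 backtype.storm API the rest of the code appears to use; imports are omitted as in the other snippets.
public class PrimeNumberLocalTopology {
    public static void main(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("spout", new NumberSpout(), 1);
        builder.setBolt("prime", new PrimeNumberBolt(), 1).shuffleGrouping("spout");

        Config conf = new Config();
        conf.setDebug(false);                      // keep the console readable

        LocalCluster cluster = new LocalCluster(); // runs everything in-process
        cluster.submitTopology("prime-local", conf, builder.createTopology());
        Utils.sleep(10000);                        // let it run for ten seconds
        cluster.killTopology("prime-local");
        cluster.shutdown();
    }
}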
I have some places in an Excel file; each of the points has a lng and lat coordinate.
Now I am trying to create a static map for each point using the Google Static Maps API.
I have two components, a parser and a loader.
The parser is used to read the Excel file, while the loader is used to load tiles.
And I make the loader run in a separate thread.
public class Parser {
private static Parser instance;
private StaticMapLoader loader;
private Parser(StaticMapLoader loader) {
this.loader = loader;
}
public synchronized static Parser getInstance(StaticMapLoader loader) {
if (instance == null) {
instance = new Parser(loader);
}
return instance;
}
public void parse(String path) {
List<Branch> result = new ArrayList<Branch>();
InputStream inp;
try {
inp = new FileInputStream(path);
Workbook wb = WorkbookFactory.create(inp);
Sheet sheet = wb.getSheetAt(0);
int rows = sheet.getLastRowNum();
for(Row r : sheet.getRows){
loader.addTask(r.type,r.name,r.x,r.y);
}
} catch (InvalidFormatException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
// Branch bc = new Branch("网点1", null, null);
return result;
}
}
Loader:
public class StaticMapLoader extends Thread {
private final static Logger log = Logger.getLogger(StaticMapLoader.class);
private List<Task> tasks = new ArrayList<Task>();
private String tilePath;
private boolean running = false;
public StaticMapLoader(String saveDir) {
this.tilePath = saveDir;
}
@Override
public void run() {
while (running) {
log.debug("run " + tasks.size());
if (tasks.size() > 0) {
Task t = tasks.get(0);
if (t != null && t.status == Status.waiting) {
tasks.remove(0);
t.status = Status.running;
downLoad(t);
}
}
}
}
private void downLoad(Task t) {
log.debug(String.format("load data for " + t.toString()));
//down tiles and save
t.status=Status.success;
}
public void addTask(String type, String name, double x, double y) {
log.debug(String.format("add task of :%s,%s", type, name));
tasks.add(new Task(type,name,x,y));
}
public void startRunning() {
running = true;
this.start();
}
public void stopRunning() {
running = false;
this.interrupt();
}
class Task {
Status status = Status.waiting;
String type, name;
double x,y;
Task(String type, String name, double x,double y) {
this.type = type;
this.name = name;
this.x = x;
this.y = y;
}
}
enum Status {
waiting, running, fail, success
}
}
The process is rather simple: the StaticMapLoader has an ArrayList field. When the Parser parses a record (place), the record is added to the list.
The loader then iterates over the list and downloads the data.
However, I've run into a strange problem here:
@Override
public void run() {
while (running) {
log.debug("run " + tasks.size());
if (tasks.size() > 0) {
Task t = tasks.get(0);
if (t != null && t.status == Status.waiting) {
tasks.remove(0);
t.status = Status.running;
downLoad(t);
}
}
}
}
The above code runs, and I get logs like this:
run 1
add task of ..
run 2
add task of ...
However, if I comment out the log line, downLoad will never be called, and I will get:
run 1
run 2
......
It seems that this may be caused by the thread. Am I missing anything?
BTW, the above code runs inside an HttpServlet context, and I start it like this:
@Override
public void init() throws ServletException {
super.init();
try {
URL fileUrl = getServletContext().getResource(getInitParameter("xlsxFile"));
URL tilePath = getServletContext().getResource(getInitParameter("tilePath"));
StaticMapLoader loader = new StaticMapLoader(tilePath.getPath());
loader.startRunning();
Parser.getInstance(loader).parse(fileUrl.getPath());
} catch (MalformedURLException e) {
}
}
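For what it's worth, the loop above reads an ArrayList that another thread writes to without any synchronization, so the worker thread may never see the newly added tasks; the log call probably only "fixes" it by accident, because the logger's internal synchronization happens to make the updates visible. Below is a hedged sketch of the same loader built around a BlockingQueue, which makes the hand-off safe and removes the busy-wait (only the threading-relevant parts are kept).
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class StaticMapLoader extends Thread {
    private final BlockingQueue<Task> tasks = new LinkedBlockingQueue<Task>();
    private volatile boolean running = false;   // volatile so stopRunning() is seen

    @Override
    public void run() {
        while (running) {
            try {
                Task t = tasks.take();          // blocks until a task is available
                t.status = Status.running;
                downLoad(t);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();  // restore the flag and exit
                return;
            }
        }
    }

    public void addTask(String type, String name, double x, double y) {
        tasks.add(new Task(type, name, x, y));
    }

    public void startRunning() {
        running = true;
        start();
    }

    public void stopRunning() {
        running = false;
        interrupt();                            // wakes up a blocked take()
    }

    private void downLoad(Task t) {
        // download the tiles and save them, as in the original code
        t.status = Status.success;
    }

    static class Task {
        volatile Status status = Status.waiting;
        final String type, name;
        final double x, y;
        Task(String type, String name, double x, double y) {
            this.type = type;
            this.name = name;
            this.x = x;
            this.y = y;
        }
    }

    enum Status { waiting, running, fail, success }
}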