I am trying to check the status of a MapReduce job.
When I run job.isComplete(), I get the exception
"Job in state DEFINE instead of RUNNING".
try {
if (job.isComplete()) {
printInfoLog(LOG, this.filename,
"** " + job.getTrackingURL());
break;
}
} catch (Exception e) {
LOG.warn("** " + e.getMessage());
}
But there is no such state; I checked all the fields in JobStatus (https://hadoop.apache.org/docs/stable/api/org/apache/hadoop/mapreduce/JobStatus.html).
My feeling is that the job has not yet been submitted. Can anyone please suggest how to check whether the job has been submitted or not, as I could not find any such method in the API?
I solved it as follows:
if (job.getJobState() == JobStatus.State.RUNNING
        || job.getJobState() == JobStatus.State.SUCCEEDED
        || job.getJobState() == JobStatus.State.KILLED
        || job.getJobState() == JobStatus.State.FAILED) {
try {
if (job.isComplete()) {
printInfoLog(LOG, this.filename,
"** " + job.getTrackingURL());
break;
}
} catch (Exception e) {
LOG.warn("** " + e.getMessage());
}
}
}
Although I do agree it's a crude solution.
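The poll-until-ready pattern behind that workaround can be sketched generically in plain JDK Java. This is only an illustration of the polling idea, not the Hadoop API; the helper name and parameters are made up:

```java
import java.util.function.BooleanSupplier;

public class Poll {
    // Poll a condition at a fixed interval until it holds or a timeout expires.
    // Returns true if the condition became true within the timeout.
    static boolean waitUntil(BooleanSupplier done, long timeoutMs, long intervalMs)
            throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (System.currentTimeMillis() < deadline) {
            if (done.getAsBoolean()) {
                return true;
            }
            Thread.sleep(intervalMs);
        }
        return done.getAsBoolean(); // one last check at the deadline
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Hypothetical condition that becomes true after ~50 ms.
        boolean ok = waitUntil(() -> System.currentTimeMillis() - start > 50, 1000, 10);
        System.out.println(ok); // prints true
    }
}
```

With Hadoop specifically, the same effect could be had by polling job.getJobState() inside the supplier, as the answer above does.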
Since last week I have been stuck on a problem that I can't resolve.
I have an ear project containing an EJB project and a WAR project that worked fine before.
When I execute my project, I first get the login page, authenticate, and reach my home page.
But when I want to type into an input, I wait until the element is visible, but it throws a WebDriverException:
Can't send keys to the element com.sun.proxy.$Proxy23 Expected condition failed: waiting for visibility of [[ChromeDriver: chrome on XP (508d2b6115709e937cfa289fdb0a438b)] -> xpath: //div[@class='form-control browse__browse-name-display___2s17-']/following-sibling::input[@type='file']] (tried for 20 second(s) with 500 milliseconds interval)
The problem here is that I have an old project with the same files; when I execute it through the main class it works fine, but when I run it with JUnit, I get this exception.
This is my code:
public void sendKeyOnElement(WebElement element, String string) {
    try {
        if (new WebDriverWait(driver, 20).until(ExpectedConditions.visibilityOf(element)) != null) {
            System.out.println("d5allll");
            if (element.getText().equals(""))
                element.sendKeys(string);
            else {
                System.out.println("d5al louta");
                element.clear();
                element.sendKeys(string);
            }
        } else {
            System.out.println("Can't send keys element not visible ");
        }
    } catch (ElementNotVisibleException v) {
        System.out.println("Element Not Visible");
    } catch (WebDriverException e) {
        System.out.println("Can't send keys to the element " + element.getClass().getName() + " " + e.getMessage());
    }
}
After modifying my code, it turns out to be a TimeoutException, but the element exists: it returns its tag name and shows that it is enabled.
public boolean waitVisibilityOfElement(WebElement element) {
    try {
        System.out.println("Waiting visibility of element : " + element.getTagName());
        if (element.isEnabled())
            System.out.println("Element " + element.getTagName() + " is enabled");
        else
            System.out.println("Element " + element.getTagName() + " is not enabled");
        fluentWait.until(ExpectedConditions.visibilityOf(element));
        return true;
    } catch (TimeoutException e) {
        System.out.println("Time out for visibility");
        return false;
    } catch (ElementNotVisibleException v) {
        System.out.println("Element Not Visible");
        return false;
    } catch (NoSuchElementException u) {
        System.out.println("Element does not exist");
        return false;
    }
}
public void sendKeyOnElement(WebElement element, String string) {
    try {
        if (waitVisibilityOfElement(element)) {
            System.out.println("d5allll");
            if (element.getText().equals(""))
                element.sendKeys(string);
            else {
                System.out.println("d5al louta");
                element.clear();
                element.sendKeys(string);
            }
        } else {
            System.out.println("Can't send keys element not visible ");
        }
    } catch (ElementNotVisibleException v) {
        System.out.println("Element Not Visible");
    } catch (WebDriverException e) {
        System.out.println("Can't send keys to the element " + element.getClass().getName() + " " + e.getMessage());
    }
}
I can't show my HTML because it is confidential, but I can show the div and input elements:
<div class="form-control browse__browse-name-display___2s17-"> </div>
<input type="file" style="display:none">
I found the solution.
I don't know if this is normal with Selenium, but by default an input whose type is file is reported as not displayed, even though it appears in the web browser: isEnabled() returns true while isDisplayed() returns false, so the wait keeps waiting and in the end throws a TimeoutException. The funniest thing is that even though Selenium considers the input not displayed, you can still send keys to it, which is not logical.
I am trying to find the row count of all tables of a database on source and destination, the source being Greenplum and the destination being Hive (on HDFS).
To do the parallel processing, I have created two threads which call the methods that calculate the counts on both ends independently. The code can be seen below:
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            gpTableCount = getGpTableCount();
        } catch (SQLException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}).start();
new Thread(new Runnable() {
    @Override
    public void run() {
        try {
            hiveTableCount = getHiveTableCount();
        } catch (SQLException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}).start();
while (!(gpTableCount != null && gpTableCount.size() > 0 && hiveTableCount != null && hiveTableCount.size() > 0)) {
    Thread.sleep(5000);
}
The results of both the threads are stored in two separate Java Hashmaps.
Below is the method for calculating the GP counts. The method for calculating the Hive counts is the same except for the database name, hence I show just the one method.
public Map<String,String> getGpTableCount() throws SQLException {
Connection gpAnalyticsCon = (Connection) DbManager.getGpConnection();
while(keySetIterator_gpTableList.hasNext()) {
gpTabSchemakey = keySetIterator_gpTableList.next();
tablesnSSNs = gpTabSchemakey.split(",");
target = tablesnSSNs[1].split(":");
analyticsTable = target[0].split("\\.");
gpCountQuery = "select '" + analyticsTable[1] + "' as TableName, count(*) as Count, source_system_name, max(xx_last_update_tms) from " + tablesnSSNs[0] + " where source_system_name = '" + target[1] + "' group by source_system_name";
try {
gp_pstmnt = gpAnalyticsCon.prepareStatement(gpCountQuery);
ResultSet gpCountRs = gp_pstmnt.executeQuery();
while(gpCountRs.next()) {
System.out.println(gpCountRs.getLong(2) + ", Max GP Tms: " + gpCountRs.getTimestamp(4).toString());
gpDataMap.put(gpCountRs.getString(1) + "," + gpCountRs.getString(3), gpCountRs.getLong(2) + "," + gpCountRs.getTimestamp(4).toString());
}
} catch(org.postgresql.util.PSQLException e) {
e.printStackTrace();
} catch(SQLException e) {
e.printStackTrace();
} catch(Exception e) {
e.printStackTrace();
}
}
System.out.println("GP Connection closed");
gp_pstmnt.close();
gpAnalyticsCon.close();
return gpDataMap;
}
Hive's Method:
public Map<String, String> getHiveTableCount() throws IOException, SQLException {
Connection hiveConnection = DbManager.getHiveConnection();
while(hiveIterator.hasNext()) {
gpHiveRec = hiveIterator.next();
hiveArray = gpHiveRec.split(",");
hiveDetails = hiveArray[1].split(":");
hiveTable = hiveDetails[0].split("\\.");
hiveQuery = "select '" + hiveTable[1] + "' as TableName, count(*) as Count, source_system_name, max(xx_last_update_tms) from " + hiveDetails[0] + " where source_system_name='" + hiveDetails[1] + "' group by source_system_name";
try {
hive_pstmnt = hiveConnection.prepareStatement(hiveQuery);
ResultSet hiveCountRs = hive_pstmnt.executeQuery();
while(hiveCountRs.next()) {
hiveDataMap.put(hiveCountRs.getString(1) + "," + hiveCountRs.getString(3), hiveCountRs.getLong(2) + "," + hiveCountRs.getTimestamp(4).toString());
}
} catch(HiveSQLException e) {
e.printStackTrace();
} catch(SQLException e) {
e.printStackTrace();
} catch(Exception e) {
e.printStackTrace();
}
}
return hiveDataMap;
}
When the jar is submitted, both threads are launched and the SQL queries for GP and Hive start executing simultaneously.
But the problem here is that as soon as the GP thread finishes executing getGpTableCount(), I see the print statement "GP Connection closed", and the Hive thread hangs for at least 30 minutes before resuming.
I checked for locks on the Hive tables in case, but none were locked. After 30-40 minutes, the Hive thread starts again and finishes. This happens even for a small number of tables (around 20) on Hive.
This is how I submit the jar:
/usr/jdk64/jdk1.8.0_112/bin/java -Xdebug -Dsun.security.krb5.debug=true -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.krb5.realm=PROD.COM -Djava.security.krb5.kdc=ip-xx-xxx-xxx-xxx.ec2.internal -Djavax.security.auth.useSubjectCredsOnly=false -jar /home/etl/ReconTest/ReconAuto_Test_Prod.jar
Could anyone let me know if there is any issue with the way I create threads in the code, and how I can fix it?
Assuming your gpTableCount and hiveTableCount are plain HashMaps, you're running into synchronization issues.
This is a broad topic to fully explain here, but here's a short intro:
Since they are populated in different threads, your main thread does not 'see' these changes until the memory is synchronized. There's no guarantee when this happens (and it's best to assume it will never happen unless you force it).
To do this properly, either use thread-safe versions (see Collections.synchronizedMap or ConcurrentHashMap), or manually synchronize your checks on the same monitor (i.e. put the check itself in a synchronized method, and put the code that populates the map in a synchronized method, too). Alternatively, you could put the counts themselves in two volatile ints and update those in the other two threads.
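Another way to sidestep the sleep-and-poll loop entirely is to submit both tasks to an ExecutorService and block on the Futures; Future.get() both waits for completion and safely publishes the result to the calling thread. A minimal sketch, where the two count methods are stubbed out with hypothetical data (the real code would call getGpTableCount() and getHiveTableCount()):

```java
import java.util.Map;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class CountCompare {
    // Stand-ins for the real getGpTableCount()/getHiveTableCount() methods.
    static Map<String, String> gpCounts()   { return Map.of("t1,src", "42,2020-01-01"); }
    static Map<String, String> hiveCounts() { return Map.of("t1,src", "42,2020-01-01"); }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        // submit() runs each task on its own worker thread.
        Future<Map<String, String>> gp = pool.submit(CountCompare::gpCounts);
        Future<Map<String, String>> hive = pool.submit(CountCompare::hiveCounts);
        // get() blocks until each task is done; the result hand-off is a
        // happens-before edge, so no extra synchronization is needed.
        Map<String, String> gpResult = gp.get();
        Map<String, String> hiveResult = hive.get();
        pool.shutdown();
        System.out.println(gpResult.equals(hiveResult)); // prints true for the stub data
    }
}
```

This replaces both the shared mutable fields and the Thread.sleep(5000) busy wait with a single blocking call per task.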
I have a GUI-based application that takes in a file and displays it to the user in a table format, gets some input in the form of column annotations and a bunch of parameters. Then it parses the file accordingly and initiates an "analysis".
I just found a deadlock, one I have not encountered before.
Found one Java-level deadlock:
=============================
"RMI TCP Connection(5)-130.235.214.23":
waiting to lock monitor 0x00007fac650875e8 (object 0x0000000793267298, a java.util.logging.ConsoleHandler),
which is held by "AWT-EventQueue-0"
"AWT-EventQueue-0":
waiting to lock monitor 0x00007fac65086b98 (object 0x00000006c00dd8d0, a java.io.PrintStream),
which is held by "SwingWorker-pool-1-thread-3"
"SwingWorker-pool-1-thread-3":
waiting to lock monitor 0x00007fac65087538 (object 0x00000006c001db48, a java.awt.Component$AWTTreeLock),
which is held by "AWT-EventQueue-0"
Essentially there is a parsing error, and trying to log it hangs the application altogether. Interestingly, logging appears to work normally before and after that particular step.
Here's the part of the code that's relevant for the analysis task:
// Activate progress indicator
frame.getMainFrame().activateInfiGlass();
SwingWorker<Map<Analyte,AnalysisResult>, Void> worker = new SwingWorker<Map<Analyte,AnalysisResult>, Void>() {
@Override
protected Map<Analyte,AnalysisResult> doInBackground() {
try {
// register parameters
param.addParam(AnalysisParams.value_key,descPanel.getValueTypeComboIndex());
param.addParam(AnalysisParams.sepchar_key,descPanel.getSepCharComboIndex());
paramPanel.registerParams();
StringBuilder sb = new StringBuilder("Data preview completed, initiating analysis...");
sb.append(System.lineSeparator())
.append("... column annotations: ")
.append(Arrays.toString(annots));
logger.info(sb.toString() + System.lineSeparator());
// Create dataset; to be passed on to SwingWorker which will
// execute the analysis
ds = new Dataset();
String[] line;
for (int i=0; i < data.length; i++){
line = data[i];
// If ignore button is clicked, skip row..
if(!(Boolean) table.getValueAt(i, 0))
ds.addRow(line, annots); // <-- This step is where the parsing exception occurs
}
System.out.println("Dataset parsed...");
logger.info("Dataset parsing complete "
+ System.lineSeparator()
+ ds.toString()
+ System.lineSeparator());
visualizeDataset();
conserv = new ConcurrencyService(ds, dbMan);
conserv.serve();
} catch (InterruptedException e) {
logger.severe("Concurrency service interrupted"
+ System.lineSeparator()
+ DebugToolbox.getStackTraceAsString(e)
+ System.lineSeparator());
System.err.println("Interrupt exception!!");
}
return conserv.getAnalyzedPaths();
}
@Override
protected void done() {
try{
results = get();
visualizeResults();
}
catch (InterruptedException ignore) {}
catch (java.util.concurrent.ExecutionException e) {
String why = null;
Throwable cause = e.getCause();
if (cause != null) {
why = cause.getMessage();
} else {
why = e.getMessage();
}
System.err.println("Error analysing data: " + why);
} catch (SQLException e) {
e.printStackTrace();
}
logger.info("#DEBUG: Conserv should have been terminated by now..." + System.lineSeparator());
frame.getMainFrame().deactivateInfiGlass();
DebugToolbox.stopExecTimer();
}
};
worker.execute();
}});
The parsing of the values happens in an instance of Dataset, using the method addRow(). The following piece of code shows the way the parsing error is handled:
public double valueToIntensity(String val){
if(val.equalsIgnoreCase(""))
return missingVal;
try{
double d = Double.parseDouble(val);
switch(valType){
case RAW: break;
case LOG2: d = StrictMath.pow(2,d); break;
case LOGN: d = StrictMath.pow(StrictMath.E, d); break;
case LOG10: d = StrictMath.pow(10,d); break;
default: throw new RuntimeException("Unrecognized value type");
}
if(Double.isInfinite(d)){
StringBuilder msg = new StringBuilder("Double precision overflow occurred: 'd' is infinite!!");
msg.append(System.lineSeparator())
.append("chosen value scale is ").append(valType)
.append(System.lineSeparator())
.append("value = ").append(val);
logger.severe(msg.toString() + System.lineSeparator());
System.err.println("Data parsing error!!" +
"Please make sure that you have selected the correct scale...");
System.exit(FeverMainFrame.exitCodes.get(this.getClass()));
}
else
return d;
} catch (NumberFormatException e){
System.err.println("Data parsing error!!");
// THE FOLLOWING LINE IS WHERE DEADLOCK OCCURS
logger.severe("Expected: string representation of a numerical value, "
+ "Found: " + val + System.lineSeparator());
System.err.println("Please make sure the datafile does not include any strings "
+ "like 'N/A' or '-' for denoting missing values.");
System.exit(FeverMainFrame.exitCodes.get(this.getClass()));
}
// TODO: This should never happen!
throw new RuntimeException("Assertion failed during dataset parsing...");
}
If I remove the values that are causing the parsing error, without changing anything else, both the logging framework and the rest of application runs as expected.
I would really appreciate any insight as to what is going on in this particular case.
Absent a complete example, verify that your implementation of doInBackground() does not attempt to update any GUI component or model. Instead, publish() interim results and process() them on the EDT as they become available. A complete example is shown here.
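A minimal, self-contained sketch of that publish()/process() split, with a trivial counter standing in for the real parsing work (the class and values here are made up for illustration):

```java
import java.util.List;
import javax.swing.SwingWorker;

public class PublishDemo {
    // Heavy work stays in doInBackground(); interim results are handed to
    // process(), which SwingWorker invokes on the Event Dispatch Thread.
    static class Counter extends SwingWorker<Integer, Integer> {
        @Override
        protected Integer doInBackground() {
            int sum = 0;
            for (int i = 1; i <= 5; i++) {
                sum += i;
                publish(i); // queue interim value for process() on the EDT
            }
            return sum;
        }

        @Override
        protected void process(List<Integer> chunks) {
            // Runs on the EDT: the only safe place to touch Swing components.
            System.out.println("progress: " + chunks);
        }
    }

    public static void main(String[] args) throws Exception {
        Counter worker = new Counter();
        worker.execute();
        System.out.println(worker.get()); // blocks until done; prints 15
    }
}
```

In the asker's code, visualizeDataset() and the progress logging inside doInBackground() are the kind of calls that belong in process() or done() instead.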
My abandon() method may throw AbandonException.
While handling the exception, I have to call the same method again if there is some element left in the Vector.
How should I proceed? And if I am not thinking straight, what would be the best solution?
if (i + 1 < lc.size()) {
    try {
        lc.get(i + 1).abandon();
    } catch (AbandonException e1) {
        lc.get(i + 2).abandon();
    }
}
Following is some pseudo-code:
List<Integer> errorIndexList = new ArrayList<>();
for (...) {
    if (i + 1 < lc.size()) {
        try {
            lc.get(i + 1).abandon();
        } catch (AbandonException e1) {
            errorIndexList.add(i + 1);
            // do some error handling work...
            // print error log/info if needed
            continue; // optional, in case it's the last statement
        }
    }
}
// use errorIndexList to handle your errors, if needed
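A runnable version of that pseudo-code; the Abandonable interface and AbandonException here are stand-ins for the asker's actual types:

```java
import java.util.ArrayList;
import java.util.List;

public class AbandonAll {
    static class AbandonException extends Exception {}

    // Stand-in for whatever type the asker's Vector holds.
    interface Abandonable {
        void abandon() throws AbandonException;
    }

    // Attempt to abandon every element; instead of recursing from inside the
    // catch block, record the indices that failed and keep going.
    static List<Integer> abandonAll(List<Abandonable> lc) {
        List<Integer> errorIndexList = new ArrayList<>();
        for (int i = 0; i < lc.size(); i++) {
            try {
                lc.get(i).abandon();
            } catch (AbandonException e) {
                errorIndexList.add(i);
            }
        }
        return errorIndexList;
    }

    public static void main(String[] args) {
        List<Abandonable> lc = List.of(
                () -> {},                                // succeeds
                () -> { throw new AbandonException(); }, // fails, recorded
                () -> {});                               // still attempted
        System.out.println(abandonAll(lc)); // prints [1]
    }
}
```

This avoids the original snippet's problem that lc.get(i + 2).abandon() inside the catch can itself throw (or run off the end of the list) with no further handling.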
You could use finally here.
try {
    lc.get(i + 1).abandon();
} catch (AbandonException e1) {
} finally {
    // your code
}
I've got the following code snippet that I'm thinking of refactoring into a more abstract application exception handler, but I want to make sure I've got it as tidy as possible first.
Any suggestions on how to improve this code or make it more reusable?
int id = -1;
final StringBuilder errorMessage = new StringBuilder("Bad Input Value: ");
try {
id = Integer.parseInt(edtId.getText().toString());
} catch (final NumberFormatException e) {
errorMessage.append("Failed to parse id " + e.getMessage());
}
if (id < 0) {
errorToast(errorMessage.toString());
} else {
    // go ahead and retrieve values from the database, knowing the id has been
    // parsed correctly to a positive int.
}
Why pre-assign id to a magic number?
try {
    int id = Integer.parseInt(edtId.getText().toString());
    // go on as normal
} catch (NumberFormatException e) {
    // handle error
}