Why does the NetBeans Java debugger never reach this code? - java

I'm trying to debug a method in Java using NetBeans.
That method is:
public Integer getNumberOfClamps(Locations paLocation) {
    Integer ret = -1;
    List list = new ArrayList();
    String removeme = "ABC";
    if (paLocation == null) {
        return ret;
    }
    try {
        IO io = new IO(this.getSchemaName());
        Session session = io.getSession();
        String sql = "select count(*) from assets a join assettypes at on (at.id = a.assettype_id) ";
        sql += "where a.currentlocation_id = " + paLocation.getId() + " and at.clamp = 1 and at.active = 1;";
        list = session.createQuery(sql).list();
        // for some reason, list is empty yet MySQL reports 40 records
        // and the following two lines are never reached!
        ret = list.size();
        removeme = "WHAT???";
    } catch (Exception ex) {
        ret = -1; // explicitly set return
    } finally {
        return ret;
    }
}
Towards the middle of the method you will see list = session.createQuery(sql).list();
For some reason, this is returning an empty list even though when the SQL is run manually, I get 40 results.
But the odd part is that once .list() is called, execution jumps straight to the finally block and never reaches the rest of the try block! So for testing, 'removeme' should equal "WHAT???", but the debugger reports it is still "ABC".
What gives?

You are using the wrong method: createQuery expects HQL syntax, not native SQL. Change the call to createSQLQuery.
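For illustration, a minimal sketch of that change (it also binds the location id as a parameter instead of concatenating it into the SQL, and reads the single numeric value that count(*) returns instead of the size of the result list):
SQLQuery query = session.createSQLQuery(
        "select count(*) from assets a join assettypes at on (at.id = a.assettype_id) "
      + "where a.currentlocation_id = :locationId and at.clamp = 1 and at.active = 1");
query.setParameter("locationId", paLocation.getId());
Number count = (Number) query.uniqueResult(); // count(*) comes back as one row with one value
ret = count.intValue();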

Related

Using JPA, a method is taking 60 seconds

I am working with JPA. My web application takes 60 seconds to execute this method, and I want to execute it faster. How can I achieve that?
public boolean evaluateStudentTestPaper (long testPostID, long studentID, long howManyTimeWroteExam) {
Gson uday = new Gson();
Logger custLogger = Logger.getLogger("StudentDao.java");
// custLogger.info("evaluateTestPaper test paper for testPostID: " +
// testPostID);
long subjectID = 0;
// checking in table
EntityManagerFactory EMF = EntityManagerFactoryProvider.get();
EntityManager em = EMF.createEntityManager();
List<StudentExamResponse> studentExamResponses = null;
try {
studentExamResponses = em
.createQuery(
"SELECT o FROM StudentExamResponse o where o.studentId=:studentId And o.testPostID=:testPostID and o.howManyTimeWroteExam=:howManyTimeWroteExam")
.setParameter("studentId", studentID).setParameter("testPostID", testPostID)
.setParameter("howManyTimeWroteExam", howManyTimeWroteExam).getResultList();
System.out.println("studentExamResponses--------------------------------------------------"
+ uday.toJson(studentExamResponses) + "---------------------------------------");
} catch (Exception e) {
custLogger.info("exception at getting student details:" + e.toString());
studentExamResponses = null;
}
int studentExamResponseSize = studentExamResponses.size();
if (AppConstants.SHOWLOGS.equalsIgnoreCase("true")) {
custLogger.info("student questions list:" + studentExamResponseSize);
}
// Get all questions based on student id and test post id
List<ExamPaperRequest> examPaperRequestList = new ArrayList<ExamPaperRequest>();
List<Questions> questionsList = new ArrayList<Questions>();
// StudentExamResponse [] studentExamResponsesArgs =
// (StudentExamResponse[]) studentExamResponses.toArray();
// custLogger.info("Total questions to be evaluated: " +
// examPaperRequestList.size());
List<StudentTestResults> studentTestResultsList = new ArrayList<StudentTestResults>();
StudentTestResults studentTestResults = null;
StudentResults studentResults = null;
String subjectnames = "", subjectMarks = "";
int count = 0;
boolean lastIndex = false;
if (studentExamResponses != null && studentExamResponseSize > 0) {
// studentExamResponses.forEach(studentExamResponses->{
for (StudentExamResponse o : studentExamResponses.stream().parallel()) {
// 900 lines of code inside, which includes getting data from the database via queries
}
}
As @Nikos Paraskevopoulos mentioned, it is probably the ~900 * N database calls inside that for loop.
I'd say avoid DB calls as much as you can, especially inside a loop like that.
You can try to expand your current StudentExamResponse query to include more clauses - mainly those you are applying inside the for loop - which could also reduce the number of items you iterate over.
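For example, a rough sketch using a fetch join to pull related data in the same round trip (the o.questions association is hypothetical; adjust it to your actual mapping):
List<StudentExamResponse> responses = em.createQuery(
        "SELECT DISTINCT o FROM StudentExamResponse o "
        + "JOIN FETCH o.questions q "   // hypothetical association name
        + "WHERE o.studentId = :studentId "
        + "AND o.testPostID = :testPostID "
        + "AND o.howManyTimeWroteExam = :howManyTimeWroteExam",
        StudentExamResponse.class)
        .setParameter("studentId", studentID)
        .setParameter("testPostID", testPostID)
        .setParameter("howManyTimeWroteExam", howManyTimeWroteExam)
        .getResultList();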
My guess would be that your select query is taking the time.
If possible, set a query timeout of less than 60 seconds and confirm this.
Ways of setting a query timeout can be found in "How to set the timeout period on a JPA EntityManager query".
If the query is the cause, then you will need to work on making the select query optimal.
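For example, using the standard JPA 2 query hint (a sketch; the 30-second value is arbitrary and actual enforcement depends on the JPA provider and JDBC driver):
studentExamResponses = em
        .createQuery(
                "SELECT o FROM StudentExamResponse o "
                + "WHERE o.studentId = :studentId "
                + "AND o.testPostID = :testPostID "
                + "AND o.howManyTimeWroteExam = :howManyTimeWroteExam",
                StudentExamResponse.class)
        .setParameter("studentId", studentID)
        .setParameter("testPostID", testPostID)
        .setParameter("howManyTimeWroteExam", howManyTimeWroteExam)
        .setHint("javax.persistence.query.timeout", 30000) // milliseconds; provider support varies
        .getResultList();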

Fixing GC Overhead Limit Exceeded Without Increasing Heap Size

I am working on a Java program that will take data from a Sybase database and, using UCanAccess, import it into a Microsoft Access Database. However, I am currently running into a problem, receiving the error “java.lang.OutOfMemoryError: GC overhead limit exceeded”.
To put the situation into context, I am attempting to import approximately 1.3 million records into the Access Database. The program currently encounters the error after approximately 800,000 of these records have been imported, about ten minutes at run time, and long after the ResultSet has been retrieved from the Sybase Database.
I have attempted to modify the heap size, but that causes the program to slow down significantly. Note that this is an ad hoc program to be run multiple times as needed, so the run time should be in the order of minutes or possibly hours, whereas increasing the heap size, based on my observations, would increase the run time to the order of days.
For reference, the error occurs in the main method, during the subroutine called getRecords (the exact line of code that this occurs on varies on a run-by-run basis). I have included the code to the program below, with some minor changes to parts of the code, such as the exact query I am using and the username and password to the access database, so as not to reveal sensitive information.
Is there anything that I can change in the code of my program to ease the load on the garbage collector without increasing the run time beyond a few hours?
EDIT: It appears that I was mistaken as to the default max heap size of Java. When I thought I was increasing the heap size by setting it to 512m, I was unintentionally cutting the heap size in half. When I set the heap size to 2048m instead, I got a java heap space error. I would still like to solve the problem without modifying the heap size, if possible.
EDIT 2: Apparently, I was misled as to the number of records I needed to process. It is double the size I originally thought, which means I need to change my approach drastically. I am going to accept an answer, because that answer did result in large improvements.
getRecords method:
public static void getRecords(SybaseDatabase sdb, AccessDatabase adb)
{
ArrayList<Record> records = new ArrayList<Record>();
StringBuffer sql = new StringBuffer();
Record currentRecord = null;
try{
Statement sybStat = sdb.connection.createStatement();
PreparedStatement resetADB = adb.connection.prepareStatement("DELETE FROM Table");
PreparedStatement accStat = adb.connection.prepareStatement("INSERT INTO Table (A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)");
sql.append(query);//query is a placeholder, as I cannot give out the actual query to the database. I have confirmed that the query itself gives the ResultSet that I am looking for
ResultSet rs = sybStat.executeQuery(sql.toString());
resetADB.executeUpdate();
boolean nextWatch = true;
Integer i = 1;
Record r = new Record();
while(nextWatch)
{
for (int j = 0; j < 1000 && nextWatch; j++)
{
nextWatch = rs.next();
r.setColumn(i, 0);
r.setColumn(rs.getString("B"), 1);
r.setColumn(rs.getString("C"), 2);
r.setColumn(rs.getString("D"), 3);
r.setColumn(rs.getString("E"), 4);
r.setColumn(rs.getString("F"), 5);
r.setColumn(rs.getString("G"), 6);
r.setColumn(rs.getString("H"), 7);
r.setColumn(rs.getString("I"), 8);
r.setColumn(rs.getString("J"), 9);
r.setColumn(rs.getString("K"), 10);
r.setColumn(rs.getInt("L"), 11);
r.setColumn(rs.getString("M"), 12);
r.setColumn(rs.getString("N"), 13);
r.setColumn(rs.getString("O"), 14);
r.setColumn(rs.getString("P"), 15);
records.add(r);
i++;
}
for(int k = 0; k < records.size(); k++)
{
currentRecord = records.get(k);
for(int m = 0; m < currentRecord.getNumOfColumns(); m++)
{
if (currentRecord.getColumn(m) instanceof String)
{
accStat.setString(m + 1, "\"" + currentRecord.getColumn(m) + "\"");
}
else
{
accStat.setInt(m + 1, Integer.parseInt(currentRecord.getColumn(m).toString()));
}
}
accStat.addBatch();
}
accStat.executeBatch();
accStat.clearBatch();
records.clear();
}
adb.connection.commit();
}
catch(Exception e){
e.printStackTrace();
}
finally{
}
}
}
Full code:
import java.util.*;
import java.sql.*;
import com.sybase.jdbc2.jdbc.SybDriver;//This is an external file that is used to connect to the Sybase database. I will not include the full code here for the sake of space but will provide it upon request.
public class SybaseToAccess {
public static void main(String[] args){
String accessDBPath = "C:/Users/me/Desktop/Database21.accdb";//This is a placeholder, as I cannot give out the exact file path. However, I have confirmed that it points to the correct file on the system.
String sybaseDBPath = "{sybServerName}:{sybServerPort}/{sybDatabase}";//See above comment
try{
AccessDatabase adb = new AccessDatabase(accessDBPath);
SybaseDatabase sdb = new SybaseDatabase(sybaseDBPath, "user", "password");
getRecords(sdb, adb);
}
catch(Exception e){
e.printStackTrace();
}
finally{
}
}
public static void getRecords(SybaseDatabase sdb, AccessDatabase adb)
{
ArrayList<Record> records = new ArrayList<Record>();
StringBuffer sql = new StringBuffer();
Record currentRecord = null;
try{
Statement sybStat = sdb.connection.createStatement();
PreparedStatement resetADB = adb.connection.prepareStatement("DELETE FROM Table");
PreparedStatement accStat = adb.connection.prepareStatement("INSERT INTO Table (A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)");
sql.append(query);//query is a placeholder, as I cannot give out the actual query to the database. I have confirmed that the query itself gives the ResultSet that I am looking for
ResultSet rs = sybStat.executeQuery(sql.toString());
resetADB.executeUpdate();
boolean nextWatch = true;
Integer i = 1;
Record r = new Record();
while(nextWatch)
{
for (int j = 0; j < 1000 && nextWatch; j++)
{
nextWatch = rs.next();
r.setColumn(i, 0);
r.setColumn(rs.getString("B"), 1);
r.setColumn(rs.getString("C"), 2);
r.setColumn(rs.getString("D"), 3);
r.setColumn(rs.getString("E"), 4);
r.setColumn(rs.getString("F"), 5);
r.setColumn(rs.getString("G"), 6);
r.setColumn(rs.getString("H"), 7);
r.setColumn(rs.getString("I"), 8);
r.setColumn(rs.getString("J"), 9);
r.setColumn(rs.getString("K"), 10);
r.setColumn(rs.getInt("L"), 11);
r.setColumn(rs.getString("M"), 12);
r.setColumn(rs.getString("N"), 13);
r.setColumn(rs.getString("O"), 14);
r.setColumn(rs.getString("P"), 15);
records.add(r);
i++;
}
for(int k = 0; k < records.size(); k++)
{
currentRecord = records.get(k);
for(int m = 0; m < currentRecord.getNumOfColumns(); m++)
{
if (currentRecord.getColumn(m) instanceof String)
{
accStat.setString(m + 1, "\"" + currentRecord.getColumn(m) + "\"");
}
else
{
accStat.setInt(m + 1, Integer.parseInt(currentRecord.getColumn(m).toString()));
}
}
accStat.addBatch();
}
accStat.executeBatch();
accStat.clearBatch();
records.clear();
}
adb.connection.commit();
}
catch(Exception e){
e.printStackTrace();
}
finally{
}
}
}
class AccessDatabase{
public Connection connection = null;
public AccessDatabase(String filePath)
throws Exception
{
String dbString = null;
dbString = "jdbc:ucanaccess://" + filePath;
connection = DriverManager.getConnection(dbString);
connection.setAutoCommit(false);
}
}
class Record{
ArrayList<Object> columns;
public
Record(){
columns = new ArrayList<Object>();
columns.add("Placeholder1");
columns.add("Placeholder2");
columns.add("Placeholder3");
columns.add("Placeholder4");
columns.add("Placeholder5");
columns.add("Placeholder6");
columns.add("Placeholder7");
columns.add("Placeholder8");
columns.add("Placeholder9");
columns.add("Placeholder10");
columns.add("Placeholder11");
columns.add("Placeholder12");
columns.add("Placeholder13");
columns.add("Placeholder14");
columns.add("Placeholder15");
columns.add("Placeholder16");
}
<T> void setColumn(T input, int colNum){
columns.set(colNum, input);
}
Object getColumn(int colNum){
return columns.get(colNum);
}
int getNumOfColumns()
{
return columns.size();
}
}
class SybaseDatabase{
public Connection connection;
@SuppressWarnings("deprecation")
public SybaseDatabase(String filePath, String Username, String Password)
throws Exception
{
SybDriver driver;
try
{
driver = (SybDriver)Class.forName("com.sybase.jdbc2.jdbc.SybDriver").newInstance();
driver.setVersion(SybDriver.VERSION_6);
DriverManager.registerDriver(driver);
}
catch (Exception e)
{
e.printStackTrace(System.err);
}
connection = DriverManager.getConnection("jdbc:sybase:Tds:" + filePath, Username, Password);
}
}
If you want to use less memory, you should process fewer rows at a time, and reuse every object you can (like the PreparedStatement).
First: you use an ArrayList<> in Record with a fixed size. You can just use a plain array for that; the point of ArrayList is to have an array with a dynamic size, which you don't need here.
Second: don't load all the data from the database before handling it; load a small chunk of data, process it, and continue.
You can do that by extracting the part of your code that processes some rows, and changing your query to limit the number of returned rows.
For example: load 1000 rows (indexes 0 to 999), process and commit them; then load the next 1000 rows (1000 to 1999), process and commit them; and so on. Between each batch of rows, don't keep any reference to the processed data (such as records), so that it can be garbage-collected when necessary.
If you still don't have enough memory, my guess is that you are keeping a reference to some objects that therefore cannot be garbage-collected, causing a memory leak: your program needs more and more memory as it processes the data. You can use a tool like jvisualvm (provided with the JDK) to investigate memory usage.
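A minimal sketch of that batched loop, reusing the identifiers from the question (illustrative rather than a drop-in replacement): a new Record is created per row, the result of rs.next() is checked before the row's columns are read, and each batch is committed and released before the next one is loaded.
boolean more = rs.next();
int rowNumber = 1;
while (more) {
    for (int j = 0; j < 1000 && more; j++) {
        Record row = new Record();           // fresh object per row instead of reusing one instance
        row.setColumn(rowNumber++, 0);
        row.setColumn(rs.getString("B"), 1);
        row.setColumn(rs.getString("C"), 2);
        // ... columns D through P exactly as in the original code ...
        records.add(row);
        more = rs.next();                    // advance only after the current row has been read
    }
    for (Record rec : records) {
        for (int m = 0; m < rec.getNumOfColumns(); m++) {
            Object value = rec.getColumn(m);
            if (value instanceof String) {
                accStat.setString(m + 1, (String) value);
            } else {
                accStat.setInt(m + 1, ((Number) value).intValue());
            }
        }
        accStat.addBatch();
    }
    accStat.executeBatch();
    accStat.clearBatch();
    adb.connection.commit();                 // commit per batch instead of once at the very end
    records.clear();                         // drop references so the batch can be garbage-collected
}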

Java returning null pointer exception for SQL query that gets passed to JSP

I am working on a school assignment that required us to use SQL statements in Java code as well as use the LIKE operator for a search. In order to properly search I have to get a string from the user, and split the string by any delimiter, and then run the query like so:
SELECT * FROM movies WHERE (movies.title LIKE '%userInput%');
I then return this query in the form of an ArrayList.
Now, when I was testing it out, I originally tested it with no user input, and my query became: SELECT * FROM movies WHERE (movies.title LIKE '%%');. This gave me the correct results.
However when I put a title in there, all of the sudden I get a NullPointerException on this line:
if(title.equals("")) { return "(movies.title LIKE '%%') "; from this section of my code:
public String getSearchString(String title) {
if(title.equals("")) { return "(movies.title LIKE '%%') "; }
String ret = "(";
ArrayList<String> titleArray = Util.splitSearch(title);
for(int i = 0; i < titleArray.size() - 1; ++i) {
String temp = titleArray.get(i);
String stmt = "movies.title LIKE '%" + temp + "%' OR ";
ret += stmt;
}
String temp = "movies.title LIKE '%" + titleArray.get(titleArray.size() - 1) + "%')";
ret += temp;
return ret;
}
This is then called like so:
public List<Movie> listMovies(String title) throws SQLException {
List<Movie> search = new ArrayList<Movie>();
if(null != title && title.isEmpty()) { title = ""; }
ResultSet res = queryMovies(getSearchString(title));
while(res.next()) {
Movie mov = new Movie();
mov.setTitle(res.getString("title"));
search.add(mov);
}
return search;
}
private static ResultSet queryMovies(String st) throws SQLException {
ResultSet res = null;
try {
PreparedStatement ps = dbcon.prepareStatement(st);
res = ps.executeQuery();
} catch(SQLException e) {
e.printStackTrace();
}
return res;
}
I unfortunately have to do this since I won't know how much a user will enter, and I am also not allowed to use external libraries that make the formatting easier. For reference, my Util.splitSearch(...) method looks like this. It should retrieve anything that is an alphanumeric character and split on anything that is not alphanumeric:
public static ArrayList<String> splitSearch(String str) {
String[] strArray = str.split("[^a-zA-Z0-9']");
return new ArrayList(Arrays.asList(strArray));
}
What is interesting is that when I pass in getSearchString("") explicitly, I do not get a NullPointerException. It is only when I allow the variable title to be used that I get one. And I still get one when no string is entered.
Am I splitting the String wrong? Am I somehow giving SQL the wrong statement? Any help would be appreciated, as I am very new to this.
the "title" which is passed from input is null, hence you're getting nullpointerexception when you do title.equals("").
Best practices suggest you do a null check like (null != title && title.equals("")).
You can also do "".equals(title)
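For example, a small sketch of the calling code from the question with the null handled up front, so getSearchString never receives a null title:
public List<Movie> listMovies(String title) throws SQLException {
    List<Movie> search = new ArrayList<Movie>();
    if (title == null) {
        title = ""; // normalize null input before building the search string
    }
    ResultSet res = queryMovies(getSearchString(title));
    while (res.next()) {
        Movie mov = new Movie();
        mov.setTitle(res.getString("title"));
        search.add(mov);
    }
    return search;
}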

Multithreading issues for database insertion

I have a piece of Java code that is accessed by multiple threads.
synchronized (this.getClass())
{
System.out.println("stsrt");
certRequest.setRequestNbr(
generateRequestNumber(
certInsuranceRequestAddRq.getAccountInfo().getAccountNumberId()));
System.out.println("outside funcvtion"+certRequest.getRequestNbr());
reqId = Utils.getUniqueId();
certRequest.setRequestId(reqId);
System.out.println(reqId);
ItemIdInfo itemIdInfo = new ItemIdInfo();
itemIdInfo.setInsurerId(certRequest.getRequestId());
certRequest.setItemIdInfo(itemIdInfo);
dao.insert(certRequest);
addAccountRel();
System.out.println("end");
}
The function generateRequestNumber() generates a request number based on data fetched from two database tables.
public String generateRequestNumber(String accNumber) throws Exception
{
String requestNumber = null;
if (accNumber != null)
{
String SQL_QUERY = "select CERTREQUEST.requestNbr from CertRequest as CERTREQUEST, "
+ "CertActObjRel as certActObjRel where certActObjRel.certificateObjkeyId=CERTREQUEST.requestId "
+ " and certActObjRel.certObjTypeCd=:certObjTypeCd "
+ " and certActObjRel.certAccountId=:accNumber ";
String[] parameterNames = { "certObjTypeCd", "accNumber" };
Object[] parameterVaues = new Object[]
{
Constants.REQUEST_RELATION_CODE, accNumber
};
List<?> resultSet = dao.executeNamedQuery(SQL_QUERY,
parameterNames, parameterVaues);
// List<?> resultSet = dao.retrieveTableData(SQL_QUERY);
if (resultSet != null && resultSet.size() > 0) {
requestNumber = (String) resultSet.get(0);
}
int maxRequestNumber = -1;
if (requestNumber != null && requestNumber.length() > 0) {
maxRequestNumber = maxValue(resultSet.toArray());
requestNumber = Integer.toString(maxRequestNumber + 1);
} else {
requestNumber = Integer.toString(1);
}
System.out.println("inside function request number"+requestNumber);
return requestNumber;
}
return null;
}
The tables CertRequest and CertActObjRel used in generateRequestNumber() are updated by the calls "dao.insert(certRequest);" and "addAccountRel();" in my initial code, respectively. Also, the System.out.println() statements used in my initial code produce the following output.
stsrt
inside function request number73
outside funcvtion73
A1664886-5F84-45A9-AB5F-C69768B83EAD
end
stsrt
inside function request number73
outside funcvtion73
44DAD872-6A1D-4524-8A32-15741FAC0CA9
end
If you notice, both threads run in a synchronized manner, but the generated request number is the same for both. My assumption is that the database updates for CertRequest and CertActObjRel are only done when both threads finish their execution.
Could anyone help me to fix this?

Improve the search functionality

How can I improve the search functionality? I have written some code to search for something, and the search is taking too much time. The code snippets are below.
I am pulling the data from the database using this method:
OracleConnection connection = null;
OraclePreparedStatement ptmst = null;
OracleResultSet rs = null;
OracleCallableStatement cstmt = null;
StringBuffer strBfr = new StringBuffer();
ArrayList myList = new ArrayList();
try
{
connection = (OracleConnection) TransactionScope.getConnection();
strBfr.append("select distinct .......... ");
ptmst = (OraclePreparedStatement)connection.prepareStatement(strBfr.toString());
rs = (OracleResultSet)ptmst.executeQuery();
while (rs.next())
{
HashMap hashItems = new HashMap();
hashItems.put("first",rs.getString(1));
hashItems.put("second",rs.getString(2));
myList.add(hashItems);
}
}
catch (Exception e) {
}
finally {
try {
if (ptmst != null) {
ptmst.close();
}
} catch (Exception e) {
}
try {
if (connection != null) {
TransactionScope.releaseConnection(connection);
}
} catch (Exception e) {
}
}
return myList;
In my jsp:
ArrayList getValues = new ArrayList();
getValues = //calling Method here.
for(int i=0; i < getValues.size();i++)
{
HashMap quoteSrch=(HashMap)allPOV.get(i);
first = (String)quoteSrch.get("first");
second = (String)quoteSrch.get("second");
}
Query:
SELECT DISTINCT(mtl.segment1),
mtl.description ,
mtl.inventory_item_id ,
mtl.attribute16
FROM mtl_system_items_b mtl,
mtl_system_items_tl k
WHERE 1 =1
AND mtl.organization_id = ?
AND k.inventory_item_id = mtl.inventory_item_id
AND NVL(orderable_on_web_flag,'N')= 'Y'
AND NVL(web_status,'UNPUBLISHED') = 'PUBLISHED'
AND mtl.SEGMENT1 LIKE ? --Here is the search term
Make sure organization_id, inventory_item_id, and especially SEGMENT1 are indexed in your tables.
Your query is pretty standard; if indexing doesn't help, then it seems like your DB server is responding slowly, which could be due to a number of reasons such as low disk space, low memory, or slow disk reads.
You can then ask your DBA/server admins to check that.
First you need to find out the real problem:
Is it the DB query?
Is it the network (are the app and the DB located on the same machine)?
Once you have identified that it is the DB query, then it becomes more of a DB question:
What do the two tables look like?
Are any indexes used?
What does the data look like (how many rows, etc.)?
After you have analyzed this, you should be able to post the question differently and expect an answer. I am not a DB guy, but I am sure someone will be able to provide some pointers.
Tuning has to be done:
Check that TransactionScope.getConnection(); is giving a connection without any delay.
Instead of creating a new HashMap hashItems = new HashMap(); per row, you can use
while (rs.next()) {
    myList.add(rs.getString(1) + "delimiter" + rs.getString(2));
}
and in the JSP use
first = ((String) allPOV.get(i)).split("delimiter")[0];
second = ((String) allPOV.get(i)).split("delimiter")[1];
so that you can reduce memory.
If possible, use a limit in your query, and use an index on SEGMENT1.
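For example, a rough sketch of building the statement with the search term bound as a parameter and the row count capped via ROWNUM (Oracle syntax; the column list is shortened, and organizationId, searchTerm, and the 200-row cap are illustrative assumptions):
strBfr.append("select * from (");
strBfr.append(" select distinct mtl.segment1, mtl.description");
strBfr.append(" from mtl_system_items_b mtl");
strBfr.append(" where mtl.organization_id = ? and mtl.segment1 like ?");
strBfr.append(") where rownum <= 200");
ptmst = (OraclePreparedStatement) connection.prepareStatement(strBfr.toString());
ptmst.setInt(1, organizationId);            // assumed to be available in scope
ptmst.setString(2, "%" + searchTerm + "%"); // assumed user-supplied search term
rs = (OracleResultSet) ptmst.executeQuery();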
