I am experiencing something strange while using Oracle SQL Developer 4.0.1.14.
When I connect to a particular DB and run a simple select * from table1; I get the result set. (This happens regardless of the number of records in the table, which is small; the table does, however, contain over 170 fields.)
If I try to run it a second time I get a Java heap space error.
If I run it again it starts throwing protocol violation errors, with a differently numbered protocol error on each successive run.
I have never experienced this problem with other Oracle DBs, even when connecting through the same installation of SQL Developer.
The only way for me to query that table again is to reconnect to the DB. Other users of this same DB do not experience this problem. Has anyone ever experienced this issue?
You can edit sqldeveloper.conf and increase the heap size by adding the following line:
AddVMOption -Xmx4096M
I know it may be too late, but perhaps this could help someone else.
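For reference, a minimal sketch of what the relevant part of sqldeveloper.conf might look like after the edit; the file normally sits in sqldeveloper/bin under the install directory, and the -Xms line here is only illustrative:
AddVMOption -Xms256M
AddVMOption -Xmx4096M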
In Windows Explorer, enter %appdata%
That will take you to:
C:\Users\username\AppData\Roaming\
Find your SQL Developer folder; in my case it was:
sqldeveloper
Find a file named product.conf.
Near the end of the file, change the line:
AddVMOption -Xmx800m
to:
AddVMOption -Xmx2048m
In my case I increased the heap size in "sqldeveloper.conf" to 3072M (3 GB), but this didn't fix the issue:
AddVMOption -Xms1024M
AddVMOption -Xmx3072M
I found that there was another worksheet with a different structure in the xlsx file I was trying to import. After deleting that extra, unneeded worksheet I could import successfully. I was using a Mac.
Final note: converting the Excel file to CSV and importing the latter was much faster than importing the Excel file for the same data.
Inside NetBeans 8.0.2:
Steps: New File > Hibernate > Hibernate Reverse Engineering
Retrieval of the tables and views begins, gets to 98%, and then hangs. It freezes on the same view or the very next view.
I've tried on multiple machines with the same result.
Is there a limit on the size of the input data in the wizard? Or maybe a problem with the database itself?
This is a VisualVM snapshot.
Thanks
One guess: try increasing the heap size of NetBeans. Edit its netbeans.conf file and increase the -Xmx and -Xms parameters. It is a pure guess, but who knows :)
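A hedged sketch of the relevant netbeans.conf line (found in etc/ under the NetBeans installation directory; the exact flags in your file will differ, and the -J prefix passes an option through to the JVM):
netbeans_default_options="-J-Xms512m -J-Xmx2048m"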
Second suggestion :) Use VisualVM, YourKit, or whatever profiler you have, attach it to the NetBeans process, and find out exactly where, on which operation, it is blocking. Then we/you will know more about the nature of the error and how to resolve it :D
I would recommend cleaning the cache dir (details); after that, read this topic, and you should probably open a bug on NetBeans Bugzilla.
When I run my project in the NetBeans IDE (compiling and testing it), it works fine. It lets me read an xls file of 25,000 rows, extract all the information above, and save it into a database.
The problem appears when I generate the installer and deliver it. When I install my application and run it, I get this error:
java.lang.OutOfMemoryError: GC overhead limit exceeded
at jxl.read.biff.File.read(File.java:217)
at jxl.read.biff.Record.getData(Record.java:117)
at jxl.read.biff.CellValue.<init>(CellValue.java:94)
at jxl.read.biff.LabelSSTRecord.<init>(LabelSSTRecord.java:53)
at jxl.read.biff.SheetReader.read(SheetReader.java:412)
at jxl.read.biff.SheetImpl.readSheet(SheetImpl.java:716)
at jxl.read.biff.WorkbookParser.getSheet(WorkbookParser.java:257)
at com.insy2s.importer.SemapExcelImporter.launchImport(SemapExcelImporter.java:82)
...
I even used the POI libraries, but I got the same scenario.
UPDATE:
In the messages.log file of my application, I found these strange values (I had changed them in netbeans.conf):
Input arguments:
-Xms24m
-Xmx64m
-XX:MaxPermSize=256m
-Dnetbeans.user.dir=C:\Program Files\insy2s_semap_app
-Djdk.home=C:\Program Files\Java\jdk1.8.0_05
-Dnetbeans.home=C:\Program Files\insy2s_semap_app\platform
OK, I got the answer... Let's begin from the beginning.
It is true that the Microsoft document handler libraries need a lot of resources, but not so much as to cause the application to fail, as I thought at the beginning. In fact, that problem revealed a weakness and a gap to me.
Because this is NetBeans 8.0.2, the new configuration file
app.conf
should be taken into consideration. It contains everything needed to configure our applications.
But it is not possible to edit it directly, so to increase the maximum permitted memory we have to change the values in
harness/etc/app.conf
in the NetBeans installation directory. For more details look here.
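For illustration, a hedged sketch of the kind of line to change in harness/etc/app.conf; the branding token and the memory values are placeholders and will differ per application:
default_options="--branding ${branding.token} -J-Xms256m -J-Xmx1024m"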
I'm using JDK 1.8.0_25 and I'm trying to work with a big database; it weighs about 2 GB.
I run the program through Eclipse.
I use the 64-bit Java version on 64-bit Windows 7.
I've got 8 GB of RAM.
Every time I try connecting to it, I get Java heap errors, so I tried increasing my heap size, and I didn't manage it!
VisualVM says my max is still 2 GB.
What I did was: Control Panel > Programs > Java > Java > View.
I've added the -Xmx6g parameter to my JDK (and I'm sure it's the right JDK), but still nothing works.
Any other suggestions on how to increase my heap size?
EDIT:
Here is the failing code line, just to assure you that it's not the code that is failing.
try {
    Class.forName("net.ucanaccess.jdbc.UcanaccessDriver");
    conn = DriverManager
            .getConnection("jdbc:ucanaccess://D:/Work/currentFolder/reallyBigDB.mdb");
} catch (Exception e) {
    e.printStackTrace();
}
The conn = ...... line is the failing one.
From the UCanAccess home page (first hit on Google):
When dealing with large databases and using the default memory settings (i.e., with driver property memory=true), it is recommended that users allocate sufficient memory to the JVM using the -Xms and -Xmx options. Otherwise it's necessary to set the driver property memory=false:
Connection conn = DriverManager.getConnection("jdbc:ucanaccess://c:/pippo.mdb;memory=false");
Now obviously you have another problem: the heap size. That's an Eclipse issue. You could compile it in Eclipse and run it on the command line, giving the memory parameters there; that's at least a surefire way to make sure the parameters are correct.
Nevertheless, unless you absolutely have to load a huge chunk of data into memory, you usually don't want to. Luckily UCanAccess has that parameter.
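To make that concrete, here is a minimal, self-contained sketch of opening the connection with memory=false; the file path comes from the question, while the class name and the printed message are made up:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class UcanaccessMemoryFalse {
    public static void main(String[] args) throws SQLException {
        // memory=false makes UCanAccess back its internal HSQLDB mirror
        // with disk instead of heap, so a ~2 GB .mdb need not fit in RAM.
        // With a JDBC 4 driver, the Class.forName call is not required.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:ucanaccess://D:/Work/currentFolder/reallyBigDB.mdb;memory=false")) {
            System.out.println("Connected, autocommit=" + conn.getAutoCommit());
        }
    }
}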
To increase the heap size of the JVM when using Eclipse:
Window -> Preferences -> Java -> Installed JREs
Then select the JRE you are using, click Edit, and enter the arguments for the JVM under Default VM arguments.
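For example, something like the following in the Default VM arguments field (the values are illustrative, not a recommendation):
-Xms1024m -Xmx6g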
PS: As already mentioned in the comment section, you should not load the entire DB into memory, so it may be a better idea to review your code instead of increasing the heap.
Two other parameters, alternatives to memory=false, that may be useful are:
skipIndexes=true (it avoids the memory occupied by indexes that aren't integrity constraints)
lobScale=1 (if the DB size is due to BLOB/OLE data; in this specific case both the load time and the memory occupation will be dramatically reduced)
They were both introduced in version 2.0.9.4.
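If they fit your case, they go into the connection URL the same way as memory=false above; a hedged example, since I haven't verified the exact property-name casing against the UCanAccess docs:
conn = DriverManager.getConnection("jdbc:ucanaccess://c:/pippo.mdb;skipIndexes=true;lobScale=1");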
I have two web projects running in Tomcat 7.0.5.
The 1st project hosts the services (web services) which are used in the 2nd project.
Both projects have separate databases, but on the same MySQL server instance.
The technologies used are Spring and Hibernate; the database is MySQL; the server is Apache Tomcat 7.0.5.
Initially everything was working fine.
As more records (1-2 lakh, i.e., 100,000-200,000) were added to the database, it started giving OutOfMemoryError exceptions and Exception in thread "ajp-9009-AsyncTimeout" java.lang.OutOfMemoryError: Java heap space errors.
I googled the errors and, following the solutions explained there, updated the catalina.bat file as follows:
set JAVA_OPTS="-Djava.awt.headless=true -Dfile.encoding=UTF-8
-server -Xms1536m -Xmx1536m
-XX:NewSize=256m -XX:MaxNewSize=256m -XX:PermSize=256m
-XX:MaxPermSize=256m -XX:+DisableExplicitGC"
and
set CATALINA_OPTS=-server -Xms256m -Xmx1024m
But nothing worked for me.
How do I get rid of the above errors?
Am I doing anything wrong?
I also have a question: can a web service return lakhs of records?
Without specifics, it's difficult to identify your issues.
I would look at tools like YourKit to profile your app (you can get a free 30-day trial, ISTR) and VisualVM to dig down into your JVM.
But if your web service takes your 100,000 records, renders them into XML/JSON, and sends them out in one request/response, then that's a lot of data to pull from the DB, marshal, and transmit. I would investigate that design and see if that's the fundamental issue; paging is sketched below.
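If the service really must expose that many rows, paging keeps any single response small. A minimal sketch with plain JDBC against MySQL, assuming a hypothetical records table and an open Connection conn, pageSize, and pageNumber in scope (java.sql imports and error handling omitted):
String sql = "SELECT id, name FROM records ORDER BY id LIMIT ? OFFSET ?";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setInt(1, pageSize);              // rows per page
    ps.setInt(2, pageNumber * pageSize); // rows to skip
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            // marshal one row at a time instead of holding all rows in memory
        }
    }
}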
Since you are using an ORM tool to handle the database-related operations, you can limit the batch size to an optimum that will not trigger an OutOfMemoryError.
Another possible cause (I'm not sure of it): this may occur because of the JDBC driver. If possible, try using an updated JDBC driver JAR.
I have gone through many links and similar questions related to java.lang.OutOfMemoryError: Java heap space, but none of the solutions resolved my problem. So here is my question:
I have a web application where the user uploads an Excel file that has around 2,500 records; my application reads the contents of this file and inserts them into a database.
But after inserting 700 records I get an exception: throwable object caught = java.lang.OutOfMemoryError: Java heap space
The same code works if the file contains 500 or fewer records. The following are my JAVA_OPTS and CATALINA_OPTS variables in the catalina.bat file:
JAVA_OPTS=-Xmx1536m -XX:MaxPermSize=128m
CATALINA_OPTS=-Xms512m -Xmx512m
Can anyone please tell me what can be done to resolve this issue?
Probably you are loading the entire file into memory.
Can you try converting your Excel file to plain-text CSV and then reading it line by line with the Java Scanner class?
// Note: new Scanner("yourCsvFile.csv") would scan the literal string,
// not the file; pass a java.io.File instead (and declare or catch
// the FileNotFoundException).
Scanner scanner = new Scanner(new File("yourCsvFile.csv"));
while (scanner.hasNextLine()) {
    String line = scanner.nextLine();
    // process your line
}
scanner.close();
You are probably using Hibernate to insert the records. Hibernate maintains a session cache of all saved objects and postpones the actual inserts until the transaction commits. You should flush (and clear) the session cache every 100 records or so, as in the sketch below. It is also advisable to disable the second-level cache while doing a massive insert.
There are several other issues related to batch inserts with Hibernate, and the official documentation is mandatory reading for anyone implementing them.
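A minimal sketch of that flush-and-clear pattern, assuming a plain org.hibernate.Session obtained from a sessionFactory and a hypothetical entities list to persist:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < entities.size(); i++) {
    session.save(entities.get(i));
    if (i % 100 == 0) {
        session.flush(); // push the pending inserts to the database
        session.clear(); // evict the saved objects from the session cache
    }
}
tx.commit();
session.close();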
Did you close your file stream? If you didn't close it, heap usage keeps growing and can eventually throw an OutOfMemoryError. So check that you close the file stream!
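One way to make that automatic is try-with-resources, which closes the stream even when an exception is thrown; a hedged sketch using java.io.FileInputStream, with the file name as a placeholder:
try (FileInputStream in = new FileInputStream("upload.xlsx")) {
    // read and process the spreadsheet here; the stream is closed
    // automatically when this block exits, normally or via an exception
} catch (IOException e) {
    e.printStackTrace();
}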
Can anyone please tell me what can be done to resolve this issue?
It sounds to me like you have a memory leak (i.e., a bug) in your application. Without seeing your code, we can't tell you where it is. However, the answers to these questions explain how to find memory leaks in Java programs:
General strategy to resolve Java memory leak?
How to find a Java Memory Leak
I should point out that you have specified two different values for the heap size in the JAVA_OPTS and CATALINA_OPTS variables. Only one of these is going to take effect, but which one depends on the details of the script used to start Tomcat.
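One way to avoid that ambiguity is to set the heap in exactly one place; with Tomcat the conventional spot is a setenv.bat next to catalina.bat, which catalina.bat sources if present. A hedged sketch with illustrative values:
rem %CATALINA_BASE%\bin\setenv.bat
set CATALINA_OPTS=-Xms512m -Xmx1536m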
However, a properly written application shouldn't run out of memory while uploading data from a spreadsheet. A heap size of 512 MB should be plenty.