Java Out of Memory Exception: Java heap space

I wrote code in Java which reads one Excel file and, after encrypting the data, writes it back out. The code works fine, but when I pass a 12 MB file to it, I get this error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at jxl.read.biff.File.next(File.java:181)
at jxl.read.biff.SheetReader.read(SheetReader.java:375)
at jxl.read.biff.SheetImpl.readSheet(SheetImpl.java:716)
at jxl.read.biff.WorkbookParser.getSheet(WorkbookParser.java:257)
at ReadExcel.read(ReadExcel.java:41)
at ReadExcel.main(ReadExcel.java:162)
How can I resolve it?
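One first thing to check, assuming nothing beyond the default heap size has been configured, is whether the JVM simply needs more memory when running ReadExcel, for example:
java -Xmx1024m ReadExcel
jxl keeps the whole workbook in memory while reading, so the heap needed can be well above the size of the .xls file itself.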

Related

Is there a limit on loading dynamic libraries in Java using JNA?

Good morning,
We are executing the following code and we get an error after loading a certain number of DLLs:
File file = new File("C:\\Users\\jevora\\Downloads\\dng_tests\\dllsCopies");
file.mkdirs();
for (int i = 1; i < 10000; i++) {
String filename = "heatedTankCvode" + i + ".dll";
Files.copy(new File("C:\\Users\\jevora\\Downloads\\dng_tests\\heatedTankCvode.dll").toPath(),
new File(file, filename).toPath(), StandardCopyOption.REPLACE_EXISTING);
NativeLibrary.getInstance(new File(file, filename).getAbsolutePath());
System.out.println("Loaded: " + filename);
}
As you can see, we want to load 10,000 DLLs using JNA. However, as the following log shows, the process stops when loading instance 1,051:
Loaded: heatedTankCvode1048.dll
Loaded: heatedTankCvode1049.dll
Loaded: heatedTankCvode1050.dll
Exception in thread "main" java.lang.UnsatisfiedLinkError: Unable to load library 'C:\Users\jevora\Downloads\dng_tests\dllsCopies\heatedTankCvode1051.dll': Native library (win32-x86-64/C:\Users\jevora\Downloads\dng_tests\dllsCopies\heatedTankCvode1051.dll)
About the code: first we copy the DLL to a new location with a different name, and then we try to load it. We wonder if there is a limit to the number of DLLs that can be loaded. Is there such a limit, and can we overcome it?
Thanks in advance
EDIT: I've tried several memory configurations and it always stops at instance 1051.
I think that the cause might be explained by this old Microsoft Forum post:
DLL Limit?
It appears that each DLL that you are loading is consuming a TLS (thread local storage) slot. There is a per process limit of 1088 on the number of TLS slots. From all that I have read, the limit is hard ... and there is no way to increase it.
From what I have read, a DLL doesn't have to use TLS, so you should investigate if you can change the way that your DLLs are created so that they don't do this.
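If the DLLs do not all have to stay loaded at the same time, one possible workaround, sketched here only as an assumption and not verified against the TLS limit, is to release each library through JNA once it is no longer needed:
import java.io.File;
import com.sun.jna.NativeLibrary;

// Sketch: load a copy, use it, then unload it so its resources (including any
// TLS slot it claimed) can be released before the next copy is loaded.
NativeLibrary lib = NativeLibrary.getInstance(new File(file, filename).getAbsolutePath());
try {
    // ... call the needed functions via lib.getFunction(...) ...
} finally {
    lib.dispose(); // unloads the native library
}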

OutOfMemoryError while creating XSSFWorkbook in Apache POI

I have a Spring Boot REST service which I am using to create an Excel file (xlsm). I'm getting a strange issue: when the application starts, the first call creates the Excel file without problems, but calling the REST endpoint again generates an OutOfMemoryError.
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.xmlbeans.impl.store.Cur$CurLoadContext.attr(Cur.java:3044)
at org.apache.xmlbeans.impl.store.Cur$CurLoadContext.attr(Cur.java:3065)
at org.apache.xmlbeans.impl.store.Locale$SaxHandler.startElement(Locale.java:3198)
at org.apache.xerces.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:498)
at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:180)
at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:275)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(XMLDocumentFragmentScannerImpl.java:1653)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:324)
at org.apache.xerces.parsers.XML11Configuration.parse(XML11Configuration.java:890)
at org.apache.xerces.parsers.XML11Configuration.parse(XML11Configuration.java:813)
at org.apache.xerces.parsers.XMLParser.parse(XMLParser.java:108)
at org.apache.xerces.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1198)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:564)
at org.apache.xmlbeans.impl.store.Locale$SaxLoader.load(Locale.java:3414)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1272)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1259)
at org.apache.xmlbeans.impl.schema.SchemaTypeLoaderBase.parse(SchemaTypeLoaderBase.java:345)
at org.openxmlformats.schemas.spreadsheetml.x2006.main.WorksheetDocument$Factory.parse(Unknown Source)
at org.apache.poi.xssf.usermodel.XSSFSheet.read(XSSFSheet.java:228)
at org.apache.poi.xssf.usermodel.XSSFSheet.onDocumentRead(XSSFSheet.java:220)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.parseSheet(XSSFWorkbook.java:452)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.onDocumentRead(XSSFWorkbook.java:417)
at org.apache.poi.ooxml.POIXMLDocument.load(POIXMLDocument.java:184)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.<init>(XSSFWorkbook.java:286)
at com.service.ExcelReportManager.runReport(ExcelReportManager.java:248)
at com.report.controller.ReportingEndPoint.runReport(ReportingEndPoint.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:209)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:136)
Here is the code which creates this exception:
try (OPCPackage pkg = OPCPackage.open(fileCopy);
     XSSFWorkbook workbook = new XSSFWorkbook(pkg))
I am closing the resources, but it is still somehow causing a problem. I have read various existing issues here, but nothing seems to work for me. Is there any clue to resolve this?
Is the Excel file very big?
If it is, that can cause this error: the memory POI uses to generate and read big Excel files is very high.
Your try-with-resources syntax is correct, so the resources will always be closed.
You can try increasing the maximum memory for the JVM.
Use -Xmx2048m at startup, for example.
Instead of XSSFWorkbook (which keeps the entire Excel workbook in memory), try the much more memory-efficient streaming SXSSFWorkbook class, like below:
SXSSFWorkbook workbook = new SXSSFWorkbook(100);
where 100 is the number of rows kept in memory at a time; rows beyond that window are flushed to disk.
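For writing a large file, a minimal sketch could look like this (the output file name, sheet name, cell values and row count are only placeholders; note that SXSSF streams rows out while writing and cannot be used to read back an existing workbook):
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

// Keep a sliding window of 100 rows in memory; older rows are flushed
// to a temporary file instead of staying on the heap.
try (SXSSFWorkbook workbook = new SXSSFWorkbook(100);
     FileOutputStream out = new FileOutputStream("report.xlsx")) {
    Sheet sheet = workbook.createSheet("data");
    for (int r = 0; r < 100_000; r++) {
        Row row = sheet.createRow(r);
        row.createCell(0).setCellValue("value " + r);
    }
    workbook.write(out);
    workbook.dispose(); // remove the temporary files backing the flushed rows
}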

Exception java.lang.OutOfMemoryError: Failed to allocate a 65548 byte allocation with 55872 free bytes and 54KB until OOM

There are many questions about this type of error, but I have not been able to find a solution for mine. I checked on my different devices, but I still could not find exactly where this error occurs. My app is currently live and installed on 50k active devices; I got this error through Firebase and it occurs many times.
Exception java.lang.OutOfMemoryError: Failed to allocate a 65548 byte allocation with 55872 free bytes and 54KB until OOM
com.android.okhttp.okio.Segment.<init> (Segment.java:62)
com.android.okhttp.okio.SegmentPool.take (SegmentPool.java:46)
com.android.okhttp.okio.Buffer.writableSegment (Buffer.java:1114)
com.android.okhttp.okio.InflaterSource.read (InflaterSource.java:66)
com.android.okhttp.okio.GzipSource.read (GzipSource.java:80)
com.android.okhttp.okio.RealBufferedSource$1.read (RealBufferedSource.java:374)
bmr.a (:com.google.android.gms.DynamiteModulesC:95)
bmk.a (:com.google.android.gms.DynamiteModulesC:1055)
bmq.a (:com.google.android.gms.DynamiteModulesC:5055)
bmq.run (:com.google.android.gms.DynamiteModulesC:54)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1113)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:588)
java.lang.Thread.run (Thread.java:818)
It seems that you are trying to upload a file with OkHttp. If so, try to use a file path instead of a File as the parameter.
Seems like you are trying to unzip a file so large that all the device's memory is spent; available memory differs on each device.
If possible, split the file into smaller chunks and handle them individually. Otherwise, use a streaming solution where you unzip from a stream instead of loading the entire file into memory before starting.
Try this: unzip using a stream, or read the documentation for GZIPInputStream.
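A minimal sketch of that streaming approach (the method name, target path and buffer size are only placeholders):
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPInputStream;

// Decompress a gzip stream in small chunks instead of buffering the whole
// payload in memory before writing it out.
void gunzipToFile(InputStream compressed, String targetPath) throws IOException {
    try (GZIPInputStream in = new GZIPInputStream(compressed);
         OutputStream out = new BufferedOutputStream(new FileOutputStream(targetPath))) {
        byte[] buffer = new byte[8 * 1024];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }
}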
Just remove HttpLoggingInterceptor.Level.BODY.

Out of memory exception when extracting xmp metadata from picture

I have a problem like the one mentioned above when extracting metadata from a TIFF file. The file is over 450 MB. I am extracting with the http://commons.apache.org/sanselan/ library in the newest version (0.97). When I execute this code:
String xmpMeta = null;
try {
    xmpMeta = Sanselan.getXmpXml(file);
} catch ...
I get the following stack trace:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.sanselan.common.byteSources.ByteSourceInputStream.readBlock(ByteSourceInputStream.java:65)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream.access$000(ByteSourceInputStream.java:24)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream$CacheBlock.getNext(ByteSourceInputStream.java:54)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream$CacheReadingInputStream.read(ByteSourceInputStream.java:147)
...
I have to admit that I tried increasing the Xms and Xmx properties of my VM and it still failed, but in the end I am not interested in increasing these properties because I may get even larger pictures to parse. I would be grateful for help with this issue, or a reference to another library for parsing XMP metadata from JPEG/TIFF files.
Well, you could run Java with more heap space by calling
java -Xmx512M FooProgramm
This will run Java with 512 MB of heap space. I know that this is not a good solution.
Maybe you could try something from these examples:
http://www.example-code.com/java/java-xmp-extract.asp
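Another option to look at is Apache Commons Imaging, the successor project of Sanselan. A minimal sketch, assuming the getXmpXml entry point carries over from the Sanselan API (not verified against a specific release, and memory behaviour on a 450 MB TIFF would still need to be tested):
import java.io.File;
import org.apache.commons.imaging.Imaging;

public class XmpDump {
    public static void main(String[] args) throws Exception {
        // Read only the XMP packet from the image file passed on the command line.
        String xmpMeta = Imaging.getXmpXml(new File(args[0]));
        System.out.println(xmpMeta);
    }
}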

java.io.IOException: Invalid argument

I have a web application running in cluster mode with a load balancer.
It consists of two Tomcats (T1 and T2) addressing a single DB.
T2 is NFS-mounted to T1; this is the only difference between the two nodes.
I have a Java method generating some files. If the request runs on T1 there is no problem, but if the request runs on node 2 I get an exception as follows:
java.io.IOException: Invalid argument
at java.io.FileOutputStream.close0(Native Method)
at java.io.FileOutputStream.close(FileOutputStream.java:279)
The corresponding code is as follows:
for (int i = 0; i < dataFileList.size(); i++) {
outputFileName = outputFolder + fileNameList.get(i);
FileOutputStream fileOut = new FileOutputStream(outputFileName);
fileOut.write(dataFileList.get(i), 0, dataFileList.get(i).length);
fileOut.flush();
fileOut.close();
}
The exception appears at fileOut.close().
Any hint?
Luis
Setting this line in the .profile resolved the issue:
ulimit -n 2048
How large do dataFileList and fileNameList get? You could be running out of file descriptors. It's odd that it happens on close(), though.
Finally I found the reason.
First, I noticed that this exception does NOT always come at the same point.
Sometimes it was
java.io.IOException: Invalid argument
at java.io.FileOutputStream.close0(Native Method)
at java.io.FileOutputStream.close(FileOutputStream.java:279)
^^^^^
and sometimes was
java.io.IOException: Invalid argument
at java.io.FileOutputStream.writeBytes(Native Method)
at java.io.FileOutputStream.write(FileOutputStream.java:260)
Therefore the problem is NOT a Java problem, nor even an NFS problem. The problem is the underlying file system type, which is DRBD.
Testing from a shell shows that writing across the nodes works for a small file. For example, at the NFS-mounted node
cd /tmp
date > /shared/path-to-some-not-mounted-dir/today
will work, but
cat myBigFile > /shared/path-to-some-not-mounted-dir/today
delivers the following error:
cat: write error: Invalid argument
Therefore the solution is to use another type of file system, GFS for example.