I have a Spring Boot REST service which I am using to create an Excel file (.xlsm). I am seeing a strange issue: when the application starts, the first call creates the Excel file without trouble, but calling the REST endpoint again throws an OutOfMemoryError.
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.xmlbeans.impl.store.Cur$CurLoadContext.attr(Cur.java:3044)
at org.apache.xmlbeans.impl.store.Cur$CurLoadContext.attr(Cur.java:3065)
at org.apache.xmlbeans.impl.store.Locale$SaxHandler.startElement(Locale.java:3198)
at org.apache.xerces.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:498)
at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:180)
at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:275)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(XMLDocumentFragmentScannerImpl.java:1653)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:324)
at org.apache.xerces.parsers.XML11Configuration.parse(XML11Configuration.java:890)
at org.apache.xerces.parsers.XML11Configuration.parse(XML11Configuration.java:813)
at org.apache.xerces.parsers.XMLParser.parse(XMLParser.java:108)
at org.apache.xerces.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1198)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:564)
at org.apache.xmlbeans.impl.store.Locale$SaxLoader.load(Locale.java:3414)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1272)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1259)
at org.apache.xmlbeans.impl.schema.SchemaTypeLoaderBase.parse(SchemaTypeLoaderBase.java:345)
at org.openxmlformats.schemas.spreadsheetml.x2006.main.WorksheetDocument$Factory.parse(Unknown Source)
at org.apache.poi.xssf.usermodel.XSSFSheet.read(XSSFSheet.java:228)
at org.apache.poi.xssf.usermodel.XSSFSheet.onDocumentRead(XSSFSheet.java:220)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.parseSheet(XSSFWorkbook.java:452)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.onDocumentRead(XSSFWorkbook.java:417)
at org.apache.poi.ooxml.POIXMLDocument.load(POIXMLDocument.java:184)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.<init>(XSSFWorkbook.java:286)
at com.service.ExcelReportManager.runReport(ExcelReportManager.java:248)
at com.report.controller.ReportingEndPoint.runReport(ReportingEndPoint.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:209)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:136)
Here is the code that throws this exception:
try (OPCPackage pkg = OPCPackage.open(fileCopy);
XSSFWorkbook workbook = new XSSFWorkbook(pkg))
As shown, I am closing the resources, yet something is still causing the problem. I have read several existing questions here, but nothing seems to work for me. Is there any clue how to resolve this?
Is the Excel file very large? If it is, that can cause this error: the memory needed to generate and read big Excel files with POI is very high.
Your try-with-resources syntax is correct; the resources will always be closed.
You can try increasing the maximum heap size of the JVM running the application, for example by starting it with -Xmx2048m.
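To confirm the flag actually took effect, you can print the JVM's maximum heap at startup. A minimal stdlib-only check (class name is arbitrary):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use; governed by -Xmx
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Running with `java -Xmx2048m HeapCheck` should report roughly 2048 MB (the exact figure can differ slightly by JVM).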
Instead of XSSFWorkbook (which keeps the entire workbook in memory), try the streaming SXSSFWorkbook class when writing files, like below:
SXSSFWorkbook workbook = new SXSSFWorkbook(100);
where 100 is the number of rows kept in the in-memory window (100 is also the default); rows outside the window are flushed to disk.
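A minimal write-side sketch of that approach (assuming a recent Apache POI with poi-ooxml on the classpath; the file name, sheet name, and row count are arbitrary):

```java
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class StreamingWriteExample {
    public static void main(String[] args) throws Exception {
        // Keep at most 100 rows in memory; older rows are flushed to temp files
        try (SXSSFWorkbook workbook = new SXSSFWorkbook(100);
             FileOutputStream out = new FileOutputStream("report.xlsx")) {
            Sheet sheet = workbook.createSheet("data");
            for (int r = 0; r < 100_000; r++) {
                Row row = sheet.createRow(r);
                row.createCell(0).setCellValue(r);
            }
            workbook.write(out);
            workbook.dispose(); // remove the temporary files backing the stream
        }
    }
}
```

Note that SXSSF only reduces memory when writing; for reading large files, POI's separate streaming (SAX-based) XSSF event API is the analogous approach.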
There are many questions about this type of error, but I have not been able to find a solution for mine. I checked on my different devices but still cannot pin down exactly where the error occurs. My app is currently live and installed on 50k active devices; I see this error through Firebase, and it occurs many times.
Exception java.lang.OutOfMemoryError: Failed to allocate a 65548 byte allocation with 55872 free bytes and 54KB until OOM
com.android.okhttp.okio.Segment.<init> (Segment.java:62)
com.android.okhttp.okio.SegmentPool.take (SegmentPool.java:46)
com.android.okhttp.okio.Buffer.writableSegment (Buffer.java:1114)
com.android.okhttp.okio.InflaterSource.read (InflaterSource.java:66)
com.android.okhttp.okio.GzipSource.read (GzipSource.java:80)
com.android.okhttp.okio.RealBufferedSource$1.read (RealBufferedSource.java:374)
bmr.a (:com.google.android.gms.DynamiteModulesC:95)
bmk.a (:com.google.android.gms.DynamiteModulesC:1055)
bmq.a (:com.google.android.gms.DynamiteModulesC:5055)
bmq.run (:com.google.android.gms.DynamiteModulesC:54)
java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1113)
java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:588)
java.lang.Thread.run (Thread.java:818)
It seems that you are trying to upload a file with OkHttp. If so, try passing a file path (streaming the content) instead of a File read fully into memory as the parameter.
It seems you are trying to unzip a file so large that it exhausts the device's memory; available memory differs from device to device.
If possible, split the file into smaller chunks and handle them individually. Otherwise, use a streaming solution, where you unzip from a stream instead of loading the entire file into memory before starting.
Try this: unzip using a stream; see the documentation for GZIPInputStream.
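For the gzip case specifically, java.util.zip.GZIPInputStream lets you decompress in fixed-size chunks, so memory use stays flat regardless of file size. A self-contained sketch (the file name is arbitrary; it writes a small gzip file first so the example runs on its own):

```java
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipStreamExample {
    public static void main(String[] args) throws IOException {
        // Write a small gzip file so the example is self-contained
        try (GZIPOutputStream gz = new GZIPOutputStream(new FileOutputStream("data.gz"))) {
            gz.write("hello streaming world".getBytes("UTF-8"));
        }

        // Decompress in 8 KB chunks instead of loading the whole file
        ByteArrayOutputStream result = new ByteArrayOutputStream();
        try (GZIPInputStream in = new GZIPInputStream(new FileInputStream("data.gz"))) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                result.write(buffer, 0, n);
            }
        }
        System.out.println(result.toString("UTF-8")); // prints "hello streaming world"
    }
}
```

The same read-loop shape applies to ZipInputStream when the payload is a zip archive rather than a raw gzip stream.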
Just remove HttpLoggingInterceptor.Level.BODY; at that level the interceptor buffers entire response bodies into memory in order to log them.
I'm currently working with Apache POI, as I need to read data from an XLSX file that will later be converted to CSV. Here's the code I'm using to create my XSSFWorkbook, and it causes an exception every single time. From what I could find, XMLBeans is part of the cause; it has been deprecated, but it is a dependency of POI here.
public static void appendCSV(File inputFile, String outputFile, String tag)
{
    System.out.println(inputFile.getAbsolutePath());
    InputStream inp = null;
    try {
        inp = new FileInputStream(inputFile);
        XSSFWorkbook wb = new XSSFWorkbook(inp);
My exception gets thrown at the last line in the block above.
Exception in thread "main" org.apache.poi.POIXMLException: java.lang.reflect.InvocationTargetException
at org.apache.poi.POIXMLFactory.createDocumentPart(POIXMLFactory.java:65)
at org.apache.poi.POIXMLDocumentPart.read(POIXMLDocumentPart.java:601)
at org.apache.poi.POIXMLDocument.load(POIXMLDocument.java:174)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.<init>(XSSFWorkbook.java:279)
at BigBangarang.appendCSV(BigBangarang.java:68)
at BigBangarang.main(BigBangarang.java:268)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.poi.xssf.usermodel.XSSFFactory.createDocumentPart(XSSFFactory.java:56)
at org.apache.poi.POIXMLFactory.createDocumentPart(POIXMLFactory.java:62)
... 5 more
Caused by: java.lang.NoSuchMethodError: org.apache.xmlbeans.XmlOptions.setLoadEntityBytesLimit(I)Lorg/apache/xmlbeans/XmlOptions;
at org.apache.poi.POIXMLTypeLoader.<clinit>(POIXMLTypeLoader.java:50)
at org.apache.poi.xssf.model.SharedStringsTable.readFrom(SharedStringsTable.java:127)
at org.apache.poi.xssf.model.SharedStringsTable.<init>(SharedStringsTable.java:108)
... 11 more
Has anybody run into this situation before? I have the most recent release of XMLBeans, and I'm almost at the point of looking for an older version, since a method is reported missing. I'm not sure whether there is an alternate/easier way to read an XLSX, or to convert it to CSV before handling any data.
You either need to upgrade your XMLBeans version to 2.6, or upgrade your Apache POI version to 3.15 beta 1 or later.
You're hitting Apache POI bug #59195, for which a temporary workaround was applied about a month ago; it is included in the 3.15 beta 1 release (and in nightly builds from the time of the commit onwards). A full fix will take a bit longer; follow that bug if you're interested!
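In Maven terms, the fix amounts to aligning the two artifacts. A sketch using the standard coordinates for both libraries (pick one of the two options, not both):

```xml
<!-- Option 1: pin XMLBeans to 2.6.0 alongside your current POI -->
<dependency>
    <groupId>org.apache.xmlbeans</groupId>
    <artifactId>xmlbeans</artifactId>
    <version>2.6.0</version>
</dependency>

<!-- Option 2: upgrade POI; poi-ooxml pulls in a compatible XMLBeans transitively -->
<dependency>
    <groupId>org.apache.poi</groupId>
    <artifactId>poi-ooxml</artifactId>
    <version>3.15-beta1</version>
</dependency>
```

After changing versions, check `mvn dependency:tree` to confirm no other dependency is still forcing an older xmlbeans onto the classpath.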
I have the code below to open an Excel file:
public static void setExcelFile(String Path) throws Exception {
try {
FileInputStream ExcelFile = new FileInputStream(Path);
ExcelWBook = new XSSFWorkbook(ExcelFile);
} catch (Exception e){
Log.error("Class Utils | Method setExcelFile | Exception desc : "+e.getMessage());
}
}
This is called from a loop in another class, and the loop repeats for each Excel file in a location. Do I need to close the FileInputStream for each Excel file? If I do not close it at the end of each file, will that affect memory utilization? Or does creating a new stream object for the next Excel file automatically close the previous one? I ran into the following error:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.OutOfMemoryError: Java heap space
at org.apache.xmlbeans.impl.store.Cur.createElementXobj(Cur.java:260)
at org.apache.xmlbeans.impl.store.Cur$CurLoadContext.startElement(Cur.java:2997)
at org.apache.xmlbeans.impl.store.Locale$SaxHandler.startElement(Locale.java:3211)
at org.apache.xmlbeans.impl.piccolo.xml.Piccolo.reportStartTag(Piccolo.java:1082)
at org.apache.xmlbeans.impl.piccolo.xml.PiccoloLexer.parseAttributesNS(PiccoloLexer.java:1822)
at org.apache.xmlbeans.impl.piccolo.xml.PiccoloLexer.parseOpenTagNS(PiccoloLexer.java:1521)
at org.apache.xmlbeans.impl.piccolo.xml.PiccoloLexer.parseTagNS(PiccoloLexer.java:1362)
at org.apache.xmlbeans.impl.piccolo.xml.PiccoloLexer.yylex(PiccoloLexer.java:4682)
at org.apache.xmlbeans.impl.piccolo.xml.Piccolo.yylex(Piccolo.java:1290)
at org.apache.xmlbeans.impl.piccolo.xml.Piccolo.yyparse(Piccolo.java:1400)
at org.apache.xmlbeans.impl.piccolo.xml.Piccolo.parse(Piccolo.java:714)
at org.apache.xmlbeans.impl.store.Locale$SaxLoader.load(Locale.java:3479)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1277)
at org.apache.xmlbeans.impl.store.Locale.parseToXmlObject(Locale.java:1264)
at org.apache.xmlbeans.impl.schema.SchemaTypeLoaderBase.parse(SchemaTypeLoaderBase.java:345)
at org.openxmlformats.schemas.spreadsheetml.x2006.main.WorksheetDocument$Factory.parse(Unknown Source)
at org.apache.poi.xssf.usermodel.XSSFSheet.read(XSSFSheet.java:194)
at org.apache.poi.xssf.usermodel.XSSFSheet.onDocumentRead(XSSFSheet.java:186)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.onDocumentRead(XSSFWorkbook.java:354)
at org.apache.poi.POIXMLDocument.load(POIXMLDocument.java:166)
at org.apache.poi.xssf.usermodel.XSSFWorkbook.<init>(XSSFWorkbook.java:263)
at utility...
If you do not close the stream, the file will stay locked until the input stream is closed or the JVM shuts down. You should close the stream; otherwise you may run into an IOException the next time you read that file from Java or access it directly in Windows.
You should call ExcelFile.close() in a finally block, once you are done using the resource. This prevents resource leaks and, in your case, exceptions.
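In pre-Java 7 style, that pattern looks like the following (a stdlib-only sketch; the file name is arbitrary, and the file is created up front so the example runs on its own):

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class CloseInFinallyExample {
    public static void main(String[] args) throws IOException {
        // Create a small file to read so the example is self-contained
        try (FileOutputStream out = new FileOutputStream("sample.bin")) {
            out.write(new byte[] {1, 2, 3});
        }

        InputStream in = null;
        try {
            in = new FileInputStream("sample.bin");
            System.out.println("first byte: " + in.read()); // prints "first byte: 1"
        } finally {
            // Runs on both the success and the exception path,
            // unlike a close() placed only inside a catch block
            if (in != null) {
                in.close();
            }
        }
    }
}
```

On Java 7 and later, a try-with-resources statement (shown in another answer below on this page) achieves the same thing with less code.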
From XSSFWorkbook(InputStream):
Constructs a XSSFWorkbook object, by buffering the whole stream into memory and then opening an OPCPackage object for it.
Using an InputStream requires more memory than using a File, so if a File is available then you should instead do something like
OPCPackage pkg = OPCPackage.open(path);
XSSFWorkbook wb = new XSSFWorkbook(pkg);
// work with the wb object
// ...
pkg.close(); // gracefully closes the underlying zip file
As you have a path string, you should use XSSFWorkbook(File) or simply XSSFWorkbook(String).
As far as closing resources: Always close streams. From Java Practices -> Recovering resources:
Expensive resources should be reclaimed as soon as possible, by an explicit call to a clean-up method defined for this purpose. If this is not done, then system performance can degrade. In the worst cases, the system can even fail entirely.
Resources include:
input-output streams
database result sets, statements, and connections
threads
graphic resources
sockets
As others have already told you: yes, you should always close the streams. Additionally, on some systems each unclosed, non-finalized stream holds a file handle, and you might hit a per-process maximum. (The limit depends on your OS.)
Probably the easiest fix in your context is to auto-close with a try-with-resources statement:
try (
FileInputStream ExcelFile = new FileInputStream(Path);
) {
ExcelWBook = new XSSFWorkbook(ExcelFile);
} catch (Exception e) {
Log.error("Class Utils | Method setExcelFile | Exception desc : "+ e.getMessage());
}
I am very new to R, and am reading an Excel file (size 27,964 KB) into R using read.xlsx from the openxlsx library.
filename = "myfile.xlsx"
df1 = read.xlsx(filename, sheet="df1",colNames= TRUE)
df2 = read.xlsx(filename, sheet="df2",colNames= TRUE)
There are multiple sheets in the Excel file. Right now I am reading one sheet at a time, but I want to automate the process and create data frames according to the sheet names.
I read that we can use XLConnect for this purpose, but when I tried XLConnect I got an error:
wb = loadWorkbook("myfile.xlsx")
Error: OutOfMemoryError (Java): GC overhead limit exceeded
In order to fix this error I tried
options(java.parameters = "-Xmx1024m") as well as "-Xmx2g",
but neither option helped. I tried reading about excel.link but could not really figure it out.
Can someone please suggest another way to read an Excel file with multiple sheets?
I have a problem like the one mentioned above when extracting metadata from a TIFF file of over 450 MB. I am using the http://commons.apache.org/sanselan/ library in its newest version (0.97). When I execute this code:
String xmpMeta = null;
try {
xmpMeta = Sanselan.getXmpXml(file);
} catch ...
I get the following stack trace:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.sanselan.common.byteSources.ByteSourceInputStream.readBlock(ByteSourceInputStream.java:65)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream.access$000(ByteSourceInputStream.java:24)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream$CacheBlock.getNext(ByteSourceInputStream.java:54)
at org.apache.sanselan.common.byteSources.ByteSourceInputStream$CacheReadingInputStream.read(ByteSourceInputStream.java:147)
...
I have to admit I tried increasing the Xms and Xmx properties of my VM and it still failed, but in the end I am not interested in raising these values, because I may get even larger images to parse. I would be grateful for help with this issue, or for a pointer to another library that can parse XMP metadata from JPEG/TIFF files.
Well, you could run Java with more heap space:
java -Xmx512M FooProgramm
This runs Java with 512 MB of heap. I know this is not a good solution.
Maybe you could try something from these examples:
http://www.example-code.com/java/java-xmp-extract.asp