poi-user mailing list archives

From Greg Woolsey <greg.wool...@gmail.com>
Subject Re: Loading Large XLSM file using POI without Memory issue .
Date Wed, 10 Apr 2019 14:59:07 GMT
First, try the latest POI, 4.1.0. The version you are using, 3.8, is
very out of date; much has changed since then, and you are unlikely to get
help for such an old version.
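
For a 500MB file, even a current XSSFWorkbook may not fit in the heap, because
the usermodel keeps the entire workbook in memory. Since an .xlsm file is just a
ZIP archive of XML parts, one memory-light alternative is to stream-copy the
archive, skip the worksheet part being deleted, and patch the matching <sheet>
element out of xl/workbook.xml. The sketch below is illustrative only (class and
parameter names are made up); a real file also needs the matching <Relationship>
removed from xl/_rels/workbook.xml.rels, and any cross-sheet formulas cleaned up:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.zip.*;

public class SheetRemover {
    // Stream-copies the OOXML archive from src to dst, skipping the worksheet
    // part named sheetPart and deleting the first match of sheetElementRegex
    // from xl/workbook.xml. Only one entry is buffered at a time (workbook.xml,
    // which is small), so memory use stays flat regardless of file size.
    public static void removeSheet(InputStream src, OutputStream dst,
                                   String sheetPart, String sheetElementRegex)
            throws IOException {
        try (ZipInputStream zin = new ZipInputStream(src);
             ZipOutputStream zout = new ZipOutputStream(dst)) {
            ZipEntry e;
            byte[] buf = new byte[8192];
            while ((e = zin.getNextEntry()) != null) {
                if (e.getName().equals(sheetPart)) {
                    continue; // drop the deleted sheet's XML part entirely
                }
                zout.putNextEntry(new ZipEntry(e.getName()));
                if (e.getName().equals("xl/workbook.xml")) {
                    // Buffer this one small part and strip the <sheet> element
                    ByteArrayOutputStream b = new ByteArrayOutputStream();
                    int n;
                    while ((n = zin.read(buf)) > 0) b.write(buf, 0, n);
                    String xml = b.toString(StandardCharsets.UTF_8.name())
                                  .replaceFirst(sheetElementRegex, "");
                    zout.write(xml.getBytes(StandardCharsets.UTF_8));
                } else {
                    int n;
                    while ((n = zin.read(buf)) > 0) zout.write(buf, 0, n);
                }
                zout.closeEntry();
            }
        }
    }
}
```

This avoids POI's object model entirely for the delete, which sidesteps the heap
problem, at the cost of having to get the package-level XML edits right yourself.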

On Wed, Apr 10, 2019, 05:24 Dhanaraja, R (R.) <RDHANARA@ford.com> wrote:

> Hi Team,
>
> We use the below three POI jars in our organization for Excel processing:
>
> ·         poi-3.8
>
> ·         poi-ooxml-3.8-20120326.jar
>
> ·         poi-ooxml-schemas-3.8-20120326.jar
>
> The below Excel formats are processed with the help of the mentioned jar
> files:
>
> ·         .xls
>
> ·         .xlsx
>
> ·         .xlsm
>
> We have a new requirement to load a 500MB .xlsm file, delete a
> particular sheet from that workbook, and save the updated Excel file
> to a server location.
>
> We tried something like the following, but loading the OPC package into an
> XSSFWorkbook gave us an out-of-memory error:
> import org.apache.poi.openxml4j.opc.OPCPackage;
> import org.apache.poi.xssf.streaming.SXSSFWorkbook;
> import org.apache.poi.xssf.usermodel.XSSFSheet;
> import org.apache.poi.xssf.usermodel.XSSFWorkbook;
>
> OPCPackage pkg = OPCPackage.open("C:\\PROJECTS\\Test.xlsm");
> XSSFWorkbook wb = new XSSFWorkbook(pkg);
> SXSSFWorkbook wb1 = new SXSSFWorkbook(wb, 100);
>
>
> Out Of Memory Error log:
> JVMDUMP032I JVM requested System dump using 'C:\PROJECTS\
> Workspace\webcontent\core.20190409.141414.14460.0001.dmp' in response to an
> event
> JVMDUMP010I System dump written to 'C:\PROJECTS\ Workspace\webcontent
> \core.20190409.141414.14460.0001.dmp
>
> We have other logic that currently runs fine, where we parse the same
> file and read some data:
>
> OPCPackage pkg = OPCPackage.open(filename);
> XSSFReader r = new XSSFReader(pkg);
> XSSFReader.SheetIterator iter = (XSSFReader.SheetIterator) r.getSheetsData();
> int index = 1;
> while (iter.hasNext()) {
>     iter.next();
>     String sheetName = iter.getSheetName();
>     if (sheetName.equals("SHEET1")) {
>         break;
>     }
>     index++;
> }
> SharedStringsTable sst = r.getSharedStringsTable();
> // fetchSheetParser returns an org.apache.xerces.parsers.SAXParser
> XMLReader parser = fetchSheetParser(sst);
> InputStream sheet2 = r.getSheet("rId" + index);
> InputSource sheetSource = new InputSource(sheet2);
>
> parser.parse(sheetSource);
> sheet2.close();
> pkg.close();
>
> From a Stack Overflow thread we understood that one solution is to
> increase the JVM heap size.
> But we have already enabled the maximum possible heap size for our JVM,
> since our application has many concurrent users, so increasing it
> further is not possible in our case.
>
> Question: Is there any way to delete a particular sheet without memory
> issues using Apache POI?
>
>
> Note: We are using JRE 1.8
>
> Please provide your suggestions on this.
>
>
> Thanks & Regards,
> Dhanaraja.R
>
>
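
As an aside on the streaming read shown in the quoted code: the
fetchSheetParser(sst) helper is not included there. A minimal sketch of what
such a SAX handler might look like is below, using the JDK's built-in SAX
parser in place of Xerces and a plain List<String> standing in for POI's
SharedStringsTable (both substitutions, and all names here, are assumptions
for illustration, not the original poster's code):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.DefaultHandler;

public class SheetHandler extends DefaultHandler {
    private final List<String> sharedStrings; // stand-in for SharedStringsTable
    private final List<String> cellValues = new ArrayList<>();
    private boolean inValue = false;
    private boolean isSharedString = false;
    private final StringBuilder text = new StringBuilder();

    public SheetHandler(List<String> sharedStrings) {
        this.sharedStrings = sharedStrings;
    }

    @Override
    public void startElement(String uri, String local, String qName, Attributes attrs) {
        if ("c".equals(qName)) {
            // t="s" marks a shared-string cell in SpreadsheetML
            isSharedString = "s".equals(attrs.getValue("t"));
        } else if ("v".equals(qName)) {
            inValue = true;
            text.setLength(0);
        }
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        if (inValue) text.append(ch, start, length);
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if ("v".equals(qName)) {
            inValue = false;
            String v = text.toString();
            // shared-string cells store an index into the shared strings table
            cellValues.add(isSharedString ? sharedStrings.get(Integer.parseInt(v)) : v);
        }
    }

    public List<String> getCellValues() { return cellValues; }

    // Mirrors the quoted fetchSheetParser(sst) shape: an XMLReader wired to this handler
    public static XMLReader newParser(List<String> sharedStrings) throws Exception {
        XMLReader reader = SAXParserFactory.newInstance().newSAXParser().getXMLReader();
        reader.setContentHandler(new SheetHandler(sharedStrings));
        return reader;
    }
}
```

Because the handler sees one SAX event at a time, memory stays constant no
matter how large the sheet XML is, which is why this read path works where
XSSFWorkbook does not.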
