xml-xmlbeans-dev mailing list archives

From "Matthias Kubik" <KU...@de.ibm.com>
Subject RE: high volume processing
Date Thu, 11 Dec 2003 13:09:40 GMT
Not a bad thing for something that's optional. You can't have it all, right?
Anyway, thanks for the explanations from you and Eric. Sounds very
interesting and promising.

Scott Ziegler <zieg@bea.com>
12/10/2003 09:03 PM
RE: high volume processing
For V2 we are also working on an unmarshalling path that will simply
read the XML and create the corresponding Java objects, which then
lose any connection to the original XML.  This will be faster and less
memory-intensive, but at the price of a significant feature of the
traditional XMLBeans approach (two-way read/write of XML and Java).
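[The one-way path Scott describes can be sketched in miniature with a plain SAX pass that populates an ordinary Java object and then discards the document. The `PurchaseOrder` class and `unmarshal` method below are illustrative assumptions only, not the XMLBeans V2 API:]

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical POJO: after unmarshalling it carries no link back to the XML.
class PurchaseOrder {
    String customer;
    int quantity;
}

public class UnmarshalSketch {
    // One-way unmarshalling: parse once, keep only the Java object.
    static PurchaseOrder unmarshal(String xml) throws Exception {
        final PurchaseOrder po = new PurchaseOrder();
        DefaultHandler handler = new DefaultHandler() {
            private final StringBuilder text = new StringBuilder();
            public void characters(char[] ch, int start, int len) {
                text.append(ch, start, len);
            }
            public void startElement(String uri, String local, String qname,
                                     Attributes atts) {
                text.setLength(0); // start collecting this element's text
            }
            public void endElement(String uri, String local, String qname) {
                if (qname.equals("customer")) po.customer = text.toString();
                if (qname.equals("quantity")) po.quantity =
                    Integer.parseInt(text.toString().trim());
            }
        };
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")), handler);
        return po; // the XML is gone; edits to po cannot round-trip back
    }

    public static void main(String[] args) throws Exception {
        PurchaseOrder po = unmarshal(
            "<purchase-order><customer>Acme</customer>"
            + "<quantity>7</quantity></purchase-order>");
        System.out.println(po.customer + " " + po.quantity); // Acme 7
    }
}
```

[This is exactly the trade Scott names: the document is consumed rather than retained, so the bound-in-both-directions editing of the traditional store is lost.]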


On Wed, 2003-12-10 at 09:24, Eric Vasilik wrote:
> The V2 store architecture will have this capability.  Briefly, I am
> designing the V2 store with an abstracted store backend which will
> have multiple implementations.  One will be an in-memory backend with
> requirements similar to those of the V1 store.  Another will be a
> memory-mapped, file-based store that can handle instances larger
> than will fit into memory.  This is a backend which may interest
> you.
> - Eric
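[A memory-mapped backend of the kind Eric describes would typically sit on `java.nio`'s file mapping, which gives random access to file contents without pulling them into the Java heap. This is only a toy illustration of the mechanism under that assumption, not the V2 store itself:]

```java
import java.io.File;
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedStoreSketch {
    // Write one byte into a memory-mapped file and read it back.
    // The OS pages the file in and out; the JVM heap stays small
    // no matter how large the mapped region is.
    static byte roundTrip() throws Exception {
        File f = File.createTempFile("store", ".bin");
        f.deleteOnExit();
        try (RandomAccessFile raf = new RandomAccessFile(f, "rw")) {
            raf.setLength(1024);
            MappedByteBuffer buf = raf.getChannel()
                .map(FileChannel.MapMode.READ_WRITE, 0, 1024);
            buf.put(0, (byte) 42);   // absolute put: no heap copy of the file
            return buf.get(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip()); // 42
    }
}
```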
>         -----Original Message-----
>         From: Matthias Kubik [mailto:KUBIK@de.ibm.com]
>         Sent: Wednesday, December 10, 2003 4:26 AM
>         To: xmlbeans-dev@xml.apache.org
>         Subject: high volume processing
>         Hi all,
>         I'm new to this list; I could not find anything in the list
>         archive that would indicate memory problems.
>         Now, here's what happened:
>         I was trying the easypo sample as described on the web site.
>         After some script fixing (Linux) I finally got the sample to
>         work.
>         As I have a requirement to process xml files that are 100MB+
>         in size, I had some expectations...that were not fulfilled.
>         It seems that even a 30MB file would run into an out-of-memory
>         error.  I know I could temporarily fix that by giving the
>         JVM more memory, but this is not a solution. To me it looks
>         like the whole DOM tree (if any) and the object hierarchy are
>         kept in memory. I'd love to see something more "intelligent"
>         there.
>         My question now is, will that be addressed in V2 or is it even
>         a design goal? (didn't find anything in the project mgt, tho).
>         Thanks
>          - matthias
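[For documents in the 100MB+ range Matthias mentions, the standard way to keep memory flat without waiting for a new store is a streaming parse that never builds a tree. A hedged sketch with plain SAX, where the `countElements` helper is hypothetical and not part of XMLBeans:]

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class StreamCount {
    // Count occurrences of an element without building a document tree.
    // Memory use stays constant regardless of document size.
    static int countElements(InputStream in, final String name) throws Exception {
        final int[] count = {0};
        SAXParserFactory.newInstance().newSAXParser().parse(in, new DefaultHandler() {
            public void startElement(String uri, String local, String qname,
                                     Attributes atts) {
                if (qname.equals(name)) count[0]++;
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String xml = "<po><line-item/><line-item/><line-item/></po>";
        System.out.println(countElements(
            new ByteArrayInputStream(xml.getBytes("UTF-8")), "line-item")); // 3
    }
}
```

[The cost, as with the V2 unmarshalling path above, is that nothing is retained: the stream is consumed in one pass and there is no object graph to write back out.]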

---------------------------------------------------------------------
To unsubscribe, e-mail:   xmlbeans-dev-unsubscribe@xml.apache.org
For additional commands, e-mail: xmlbeans-dev-help@xml.apache.org
Apache XMLBeans Project -- URL: http://xml.apache.org/xmlbeans/
