Out of memory when parsing HUGE input files

Posted By:   Fred_Morkel
Posted On:   Wednesday, December 3, 2008 04:51 AM


Hi,

I'm trying to parse a rather simple but very HUGE log file (~300 MB) with ANTLR 3 and the C# target.

The grammar is simple. Each line in the log file looks like:

'logentry' ID (FLOAT)+ NL

The log file contains millions of log entries.
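(For reference, a minimal ANTLR 3 grammar matching that line shape could look roughly like the sketch below. The grammar name, rule names, and lexer rules are illustrative only, not my exact grammar, and the whitespace action may need adjusting for the C# target:)

```antlr
grammar Log;

// One entry per line, millions of lines in the file.
file    : entry+ EOF ;
entry   : 'logentry' ID FLOAT+ NL ;

ID      : ('a'..'z'|'A'..'Z'|'_') ('a'..'z'|'A'..'Z'|'0'..'9'|'_')* ;
FLOAT   : ('0'..'9')+ ('.' ('0'..'9')+)? ;
NL      : '\r'? '\n' ;
WS      : (' '|'\t')+ { $channel = HIDDEN; } ; // skip spaces/tabs between tokens
```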



When I try to parse the log file, my machine runs out of memory because the parser application allocates more than 1.5 GB of RAM.

I noticed that after creating the ANTLRFileStream input stream alone, my parser already allocates ~600 MB.

I'm fairly new to ANTLR 3, so I'm a little stuck now. Where does this huge memory requirement come from, and what do I have to do to reduce it?



Many thanks in advance,
Fred
