Reading Bulk Data

Posted By:   pradeep_kumar
Posted On:   Friday, May 4, 2001 08:50 AM

We have to read a .txt file that contains rows numbering in the lakhs (hundreds of thousands) and write them into a SQL database. Currently we
are using a BufferedInputStream for this purpose, but the speed of reading the lines is not satisfactory. Any way to optimize this technique?
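
Roughly, the read loop we use is along these lines (a simplified sketch, not the exact code; insertRow() stands in for our actual database write):

import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;

// Simplified version of the current approach: read the file line by line
// through a buffered stream and hand each line off to a database insert.
public class CurrentLoader {

    public static void load(String fileName) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(
                new BufferedInputStream(new FileInputStream(fileName))));
        String line;
        while ((line = reader.readLine()) != null) {
            // insertRow(line);  hypothetical routine that writes the row to SQL
        }
        reader.close();
    }
}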

Thanx in advance

Pradeep Kumar

Re: Reading Bulk Data

Posted By:   venkatesha_subbegowda  
Posted On:   Friday, May 4, 2001 04:21 PM

Instead of a BufferedInputStream you can
read directly from the FileInputStream
into a large byte array for this bulk read.
Depending on the main memory available,
you can decide the size of the buffer you want to use.
Here is some sample code.

String fileName = "a .txt";
static final BUFFER_SIZE = 8000;
static final byte[] BUFFER= new byte[BUFFER_SIZE ];
/*
You can use increased buffersize if you are having more main memory.
*/
.
.
FileInputStream fIn = new byteFileInputStream(fileName );
InputStream in = new InputStream(fIn);

int byteRead = 0;
while(byteRead != -1){
int byteRead = in.read(BUFFER,0,BUFFER.length);
// you can pass the BUFFER as
//an argument to a prasing routine
// which writes to a database.
// byteRead --> this is the actual number of bytes read in to BUFFER
}
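
The parsing routine mentioned in the comments is not shown above; a minimal sketch of what its database side could look like, using JDBC batch updates (the table name, column names, and comma-separated row format are just assumptions for illustration), is:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.StringTokenizer;

// Hypothetical example only: table and column names and the comma-separated
// row format are assumptions, not taken from the post above.
public class RowWriter {

    // Inserts a batch of already-parsed lines with a single PreparedStatement
    // and one executeBatch() round trip instead of one INSERT per row.
    public static void insertRows(Connection con, String[] rows) throws SQLException {
        PreparedStatement ps = con.prepareStatement(
                "INSERT INTO bulk_data (col1, col2) VALUES (?, ?)");
        try {
            for (int i = 0; i < rows.length; i++) {
                StringTokenizer st = new StringTokenizer(rows[i], ",");
                ps.setString(1, st.nextToken());
                ps.setString(2, st.nextToken());
                ps.addBatch();
            }
            ps.executeBatch();
        } finally {
            ps.close();
        }
    }
}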

Hope this helps you.
This speeds up the I/O by roughly
4 to 8 times compared to using a BufferedInputStream.