How to read a large file line by line with JavaScript (Node.js)


I have a very large file with data to import into MongoDB using JavaScript. The problem is that I cannot load the entire file into memory.

I would like to read this file line by line, since each line is a record to insert into my database. I know the fs library also works with streams, but it only reads chunks of a predefined size in bytes, without caring whether a line is complete or not. Has anyone done this?

asked by anonymous 01.01.2015 / 01:16

2 answers


There is an interesting project on GitHub created precisely to deal with this problem:

Line Reader


Asynchronous, line-by-line file reader.

Example:

var lineReader = require('line-reader');

lineReader.eachLine('file.txt', function (line, last) {
  console.log(line);

  // returning false stops the iteration and closes the file;
  // here we simply stop at the last line
  if (last) {
    return false;
  }
});

The eachLine function reads the given file line by line. For each new line, the callback is invoked with two parameters: the line that was read and a Boolean indicating whether it was the last line of the file. If the callback returns false, reading stops and the file is closed.
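Tying this back to the question, here is a minimal sketch of the import itself, assuming the official mongodb driver of that era (the connection URL, collection name and document shape are hypothetical). It uses line-reader's asynchronous callback form (a third cb parameter) so the next line is read only after the previous record is stored:

var lineReader = require('line-reader');
var MongoClient = require('mongodb').MongoClient;

// Hypothetical URL, database and collection names; adjust to your setup.
MongoClient.connect('mongodb://localhost:27017/mydb', function (err, db) {
  if (err) throw err;
  var records = db.collection('records');

  // With a three-parameter callback, eachLine waits for cb() before
  // reading the next line, so inserts do not pile up in memory.
  lineReader.eachLine('file.txt', function (line, last, cb) {
    records.insert({ raw: line }, function (err) { // { raw: line } is illustrative
      if (err) throw err;
      if (last) {
        db.close(); // last record stored; close the connection
      }
      cb();
    });
  });
});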

01.01.2015 / 01:44

Using native Node.js modules, you have at least two options.

Using readline

var readline = require('readline');
var fs = require('fs');

var rl = readline.createInterface({
  input: fs.createReadStream('/path/file.txt'),
  output: process.stdout,
  terminal: false
});

rl.on('line', function (line) {
  console.log(line); // here you can do what you need with each line
});
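The interface also emits a close event when the input stream ends, which is where you would report completion or close a database connection. A short sketch:

// Emitted once the stream ends and all lines have been delivered.
rl.on('close', function () {
  console.log('finished reading the file');
});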

Using fs.readFile()

In this case you read the entire file first and then split its content on line breaks. Note that this loads the whole file into memory, so it only suits files that fit in RAM.

var fs = require('fs');

fs.readFile('/path/file.txt', 'utf-8', function (err, data) {
  if (err) throw err;
  var linhas = data.split(/\r?\n/);
  linhas.forEach(function (linha) {
    console.log(linha); // here you can do what you need with each line
  });
});
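For completeness, the manual approach the question hints at (reading stream chunks of arbitrary byte sizes) also works if you buffer the leftover of each chunk and only emit complete lines. A rough sketch, without any extra library:

var fs = require('fs');

var stream = fs.createReadStream('/path/file.txt', { encoding: 'utf-8' });
var leftover = ''; // partial line carried over from the previous chunk

stream.on('data', function (chunk) {
  var parts = (leftover + chunk).split(/\r?\n/);
  leftover = parts.pop(); // the last piece may be an incomplete line
  parts.forEach(function (linha) {
    console.log(linha); // a complete line
  });
});

stream.on('end', function () {
  if (leftover) {
    console.log(leftover); // final line without a trailing line break
  }
});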
01.01.2015 / 11:18