
tail -f logfile.txt outputs the last 10 lines of logfile.txt, and then continues to output appended data as the file grows.

What's the recommended way of doing the -f part in node.js?

The following outputs the entire file (ignoring the "show the last 10 lines" part) and then exits.

var fs = require('fs');
var rs = fs.createReadStream('logfile.txt', { flags: 'r', encoding: 'utf8'});
rs.on('data', function(data) {
  console.log(data);
});

I understand the event loop exits because after the stream's end and close events there are no more events -- I'm curious about the best way of continuing to monitor the stream.

– mike

5 Answers


The canonical way to do this is with fs.watchFile.
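For illustration, a minimal sketch of that approach might look like the following (the file name, polling interval, and the choice to start from the current end of the file are just assumptions for the example):

var fs = require('fs');

var filename = 'logfile.txt';

// Start from the current end of the file so only newly appended data is printed.
var position = fs.statSync(filename).size;

fs.watchFile(filename, { interval: 1000 }, function(curr, prev) {
  if (curr.size < position) {
    // The file shrank (truncated or rotated); start over from the beginning.
    position = 0;
  }
  if (curr.size > position) {
    // Stream only the bytes appended since the last check.
    fs.createReadStream(filename, { encoding: 'utf8', start: position, end: curr.size - 1 })
      .on('data', function(chunk) {
        process.stdout.write(chunk);
      });
    position = curr.size;
  }
});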

Alternatively, you could just use the node-tail module, which uses fs.watchFile internally and has already done the work for you. Here is an example of using it straight from the documentation:

var Tail = require('tail').Tail;

var tail = new Tail("fileToTail");

tail.on("line", function(data) {
  console.log(data);
});
– Rohan Singh

The Node.js API documentation on fs.watchFile states:

Stability: 2 - Unstable. Use fs.watch instead, if available.

Funny, though, that it says almost exactly the same thing for fs.watch:

Stability: 2 - Unstable. Not available on all platforms.

In any case, I went ahead and wrote yet another small web app, TailGate, that will tail your files using the fs.watch variant.

Feel free to check it out here: TailGate on github.
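For reference, a rough sketch of the fs.watch variant (not TailGate's actual code; the file name is a placeholder) could look like this:

var fs = require('fs');

var filename = 'logfile.txt';
var position = fs.statSync(filename).size;

fs.watch(filename, function(eventType) {
  if (eventType !== 'change') return;
  fs.stat(filename, function(err, stats) {
    if (err) return console.error(err);
    if (stats.size < position) position = 0;   // truncated or rotated
    if (stats.size > position) {
      // Stream only what was appended since the last event.
      fs.createReadStream(filename, { encoding: 'utf8', start: position, end: stats.size - 1 })
        .on('data', function(chunk) { process.stdout.write(chunk); });
      position = stats.size;
    }
  });
});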

– Tackle

Substack has a file slice module, slice-file, that behaves exactly like tail -f: it can stream updates after the initial slice of 10 lines.

var sf = require('slice-file');

var xs = sf('/var/log/mylogfile.txt');
xs.follow(-10).pipe(process.stdout);

Source: https://github.com/substack/slice-file#follow

– Woody

You can try using fs.read instead of a ReadStream:

var fs = require('fs');

// Buffer.alloc replaces the deprecated, zero-filled new Buffer(16).
var buf = Buffer.alloc(16);

function read(fd)
{
    fs.read(fd, buf, 0, buf.length, null, function(err, bytesRead, buf1) {
        if (bytesRead > 0) {
            // Print only the bytes actually read, not the whole buffer.
            console.log(buf1.toString('utf8', 0, bytesRead));
            read(fd);
        } else {
            // Reached the current end of file; poll again in a second.
            setTimeout(function() {
                read(fd);
            }, 1000);
        }
    });
}

fs.open('logfile', 'r', function(err, fd) {
    read(fd);
});

Note that fs.read calls the callback even when there is no data and it has simply reached the end of the file; without the timeout you would spin at 100% CPU here. You could try using fs.watchFile instead to get new data immediately.
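As a rough, self-contained sketch of that idea (the file name, buffer size, and interval below are only illustrative), the reads can be driven by fs.watchFile instead of a timer:

var fs = require('fs');

var filename = 'logfile';

fs.open(filename, 'r', function(err, fd) {
  if (err) throw err;

  var position = 0;      // how far into the file we have read so far
  var reading = false;   // guard so two read chains never run at once

  function readNew() {
    reading = true;
    var buf = Buffer.alloc(16);
    fs.read(fd, buf, 0, buf.length, position, function(err, bytesRead) {
      if (err) throw err;
      if (bytesRead > 0) {
        process.stdout.write(buf.toString('utf8', 0, bytesRead));
        position += bytesRead;
        readNew();        // keep reading until we catch up with the end of file
      } else {
        reading = false;
      }
    });
  }

  readNew();              // drain whatever the file already contains

  // Read again only when the file actually changes, instead of sleeping on a timer.
  fs.watchFile(filename, { interval: 1000 }, function(curr, prev) {
    if (!reading && curr.size > position) readNew();
  });
});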

– Andrey Sidorov

https://github.com/jandre/always-tail seems like a great option if you have to worry about log rotation. Example from the readme:

var Tail = require('always-tail');
var fs = require('fs');
var filename = "/tmp/testlog";

if (!fs.existsSync(filename)) fs.writeFileSync(filename, "");

var tail = new Tail(filename, '\n');

tail.on('line', function(data) {
  console.log("got line:", data);
});


tail.on('error', function(data) {
  console.log("error:", data);
});

tail.watch();
– user1278519
  • always-tail looks like it was abandoned; the last update was in 2015. See https://github.com/lucagrulla/node-tail for something more recent. – James Moore Oct 06 '21 at 16:50