How can I parse records on demand with Node.js?
Is there a Node module that can parse a specific number of records from a CSV file? The use case is to parse a large log file and deliver records to a paging client as requested.
node-csv can't yet do this, and the closest I've found is to read lines one by one, which requires reinventing the CSV parsing wheel, and will break on multi-line records.
But let's lower the bar: how can I parse single-line CSV records one by one with Node.js? This is a pretty trivial task in most other languages.
As far as I understand, you just want to parse the comma-separated line of values into an array? If so, try this one:
https://npmjs.org/package/csvrow
Parsing a single 'line' (which can also have embedded newlines):
var csv = require('csv'); // node-csv
csv()
  .from.string(SINGLE_LINE_OF_CSV)
  .to.array(function (record) {
    console.log('R', record);
  });
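If you'd rather avoid a dependency for the single-line case, here is a minimal, dependency-free sketch of a field splitter (the function name parseCsvLine is my own; it handles quoted fields and doubled quotes, but is not a complete RFC 4180 parser):

```javascript
// Split one CSV record into fields; a sketch, not RFC 4180 complete.
// Handles quoted fields, embedded commas, and "" as an escaped quote.
function parseCsvLine(line) {
  var fields = [];
  var field = '';
  var inQuotes = false;
  for (var i = 0; i < line.length; i++) {
    var ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // escaped quote
        else inQuotes = false;                           // closing quote
      } else {
        field += ch; // commas (and newlines) inside quotes are kept
      }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === ',') {
      fields.push(field);
      field = '';
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}

console.log(parseCsvLine('a,"b,c",d'));          // [ 'a', 'b,c', 'd' ]
console.log(parseCsvLine('x,"he said ""hi"""')); // [ 'x', 'he said "hi"' ]
```

For anything beyond this (embedded newlines spanning physical lines, odd quoting), a real parser like node-csv is the safer choice.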
I'm not sure what you mean by 'a specific number of records from a CSV file', or what the issue is exactly. Once you've read the number of records you need, just send the response back to the client and you're done.
EDIT: if you want to implement paging, you can use node-csv too:
var csv = require('csv');
var skip = 100;
var limit = 10;
csv()
  .from.path('file.csv')
  .on('record', function (row, index) {
    if (index >= skip && index < (skip + limit))
      console.log('R', row);
  });