Parsing a large JSON file in Node.js and processing each object independently

I need to read a large JSON file (about 630 MB) in Node.js and insert each object into MongoDB.

However, the answers I have found process the JSON file line by line rather than object by object, so I still don't know how to extract the individual objects from the file and operate on them.

There are roughly 100,000 objects like this in my JSON file.

Data format:

[
    {
        "id": "0000000",
        "name": "Donna Blak",
        "livingSuburb": "Tingalpa",
        "age": 53,
        "nearestHospital": "Royal Children's Hospital",
        "treatments": {
            "19890803": {
                "medicine": "Stomach flu B",
                "disease": "Stomach flu"
            },
            "19740112": {
                "medicine": "Progeria C",
                "disease": "Progeria"
            },
            "19830206": {
                "medicine": "Poliomyelitis B",
                "disease": "Poliomyelitis"
            }
        },
        "class": "patient"
    },
    ...
]
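For context on why streaming matters here: the straightforward approach is a single JSON.parse call, which holds the entire file in memory, both as a string and as the parsed object graph. That works for small files but becomes impractical at ~630 MB. A minimal sketch of the naive approach, using an abbreviated record in the format above:

```javascript
// Naive approach: parse the whole document in one go.
// Fine for small inputs, but a ~630 MB file would be loaded
// into memory in its entirety before any object can be processed.
const sampleJson = JSON.stringify([
    {
        id: '0000000',
        name: 'Donna Blak',
        class: 'patient',
        treatments: {'19890803': {medicine: 'Stomach flu B', disease: 'Stomach flu'}}
    }
]);

const records = JSON.parse(sampleJson);

// Every object is available, but only after the full parse completes.
console.log(records.length);   // 1
console.log(records[0].class); // prints: patient
```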

Cheers,

Answer:

There is a nice module named 'stream-json' that does exactly what you want.

It can parse JSON files far exceeding available memory.

StreamArray handles a frequent use case: a huge array of relatively small objects, similar to database dumps produced by Django. It streams the array components one by one, taking care of assembling them automatically.

Here is a very basic example:

const StreamArray = require('stream-json/streamers/StreamArray');
const path = require('path');
const fs = require('fs');

const jsonStream = StreamArray.withParser();

//You'll get json objects here
//Key is an array-index here
jsonStream.on('data', ({key, value}) => {
    console.log(key, value);
});

jsonStream.on('end', () => {
    console.log('All done');
});

const filename = path.join(__dirname, 'sample.json');
fs.createReadStream(filename).pipe(jsonStream.input);
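Each 'data' event delivers a {key, value} record, where key is the array index and value is one fully assembled object. A sketch of the kind of per-object handling you might put inside that handler, run here on hardcoded records shaped like the data format above (the second record and the filtering criteria are made up for illustration, not part of stream-json):

```javascript
// Simulated {key, value} records, as StreamArray would emit them.
const records = [
    {key: 0, value: {id: '0000000', name: 'Donna Blak', class: 'patient',
                     treatments: {'19890803': {disease: 'Stomach flu'},
                                  '19740112': {disease: 'Progeria'}}}},
    {key: 1, value: {id: '0000001', name: 'John Doe', class: 'staff',
                     treatments: {}}}
];

// Example per-object logic: keep only patients, count their treatments.
const patients = records
    .map(({value}) => value)
    .filter((obj) => obj.class === 'patient')
    .map((obj) => ({id: obj.id, treatmentCount: Object.keys(obj.treatments).length}));

console.log(patients); // one record: id '0000000' with treatmentCount 2
```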

If you would like to do something more complex, e.g. process one object after another sequentially (preserving the order) and apply some async operations to each of them, you could do a custom Writable stream like this:

const StreamArray = require('stream-json/streamers/StreamArray');
const {Writable} = require('stream');
const path = require('path');
const fs = require('fs');

const fileStream = fs.createReadStream(path.join(__dirname, 'sample.json'));
const jsonStream = StreamArray.withParser();

const processingStream = new Writable({
    write({key, value}, encoding, callback) {
        //Save to mongo or do any other async actions
        setTimeout(() => {
            console.log(value);
            //Next record will be read only when the current one is fully processed
            callback();
        }, 1000);
    },
    //Don't skip this, as we need to operate with objects, not buffers
    objectMode: true
});

//Pipe the streams as follows
fileStream.pipe(jsonStream.input);
jsonStream.pipe(processingStream);

//So we're waiting for the 'finish' event when everything is done
processingStream.on('finish', () => console.log('All done'));
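Inside the write() callback you would typically buffer objects and flush them to MongoDB in batches via insertMany rather than issuing one insert per object, which saves round trips for ~100,000 records. The actual MongoDB calls need a driver and a running server, so this sketch covers only the batching logic; the batch size of 1000 and the onBatch callback are assumptions for illustration:

```javascript
// Split incoming objects into fixed-size batches; in real code each
// completed batch would go to something like collection.insertMany(batch).
function makeBatcher(batchSize, onBatch) {
    let batch = [];
    return {
        push(obj) {
            batch.push(obj);
            if (batch.length >= batchSize) {
                onBatch(batch);
                batch = [];
            }
        },
        // Call once at the end (e.g. on the 'finish' event) to flush a partial batch.
        flush() {
            if (batch.length > 0) {
                onBatch(batch);
                batch = [];
            }
        }
    };
}

// Usage with a stand-in for insertMany: just collect the batches.
const batches = [];
const batcher = makeBatcher(1000, (b) => batches.push(b));

for (let i = 0; i < 2500; i++) {
    batcher.push({id: String(i)});
}
batcher.flush();

console.log(batches.map((b) => b.length)); // [ 1000, 1000, 500 ]
```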

Please note: the examples above were tested against stream-json@1.1.3. For some previous versions (presumably earlier than 1.0.0) you may have to use:

const StreamArray = require('stream-json/utils/StreamArray');

and then

const jsonStream = StreamArray.make();

Source: utcz.com/qa/424650.html
