
This Node.js file is supposed to read a file line by line. Each line represents an object that I create and add to an array. After it finishes reading the file, it should return that array. I'm not an expert on JavaScript, but this seems to return an empty array each time. I thought it had something to do with globals, but creating a temp array and pushing to it in parseLine() didn't work either. What am I doing wrong?


var exports = module.exports = {};
const lineReader = require('line-reader');
const Obj = require("./Data")
const Data = Obj.Data;
var records = [];
exports.readAllLines = async function() {
 await lineReader.eachLine('./datafile.dat', function(line) {
 parseLine(line);
 });
 return records;
}
function parseLine(inputLine) {
 var splitArray = inputLine.split("\t");
 var date = new Date(Date.parse(splitArray[0]));
 var o= splitArray[1];
 var h= splitArray[2];
 var l= splitArray[3];
 var c= splitArray[4];
 var v= splitArray[5];
 var dataObject = new Data (date, o, h, l, c, v);
 records.push(dataObject);
}

Calling Code

var readFiles = require("./ReadFile.js");
readFiles.readAllLines().then(function(result) {
 console.log(result);
});
asked Sep 6, 2019 at 4:46
  • show the structure of the file you are trying to read from Commented Sep 6, 2019 at 4:48
  • I know the structure of the file is read correctly. Creating the objects is no problem. It's a tab-delimited file of numbers. From some testing, it looks like it attempts to return before finishing the .eachLine function. Commented Sep 6, 2019 at 4:53
  • Where's the code that calls readAllLines()? Are you using the promise that it returns? Commented Sep 6, 2019 at 4:55
  • Well that is unfortunate Commented Sep 6, 2019 at 4:59
  • If you read the doc, it shows you how to promisify it and use it with promises. Commented Sep 6, 2019 at 5:02

3 Answers


A simple solution using native APIs:

var fs = require('fs');
// Pass 'utf8' so readFileSync returns a string -- a Buffer has no split().
let fileArray = fs.readFileSync(filepath, 'utf8').split('\n');
answered Sep 6, 2019 at 4:52

3 Comments

The OP is doing a lot more than accumulating a list of lines. They're parsing each line and putting it into an object. The desired output is an array of objects.
True, this answer is not the whole solution; it's a step in the right direction and a generic example, so other people who come here can use it as a guideline.
Also, reading an entire file into memory all at once is often not the recommended approach when the file can get large. It is often the simplest mechanism, but not always the best. And any server process should use asynchronous file I/O everywhere except server startup code.

As per the line-reader docs:

eachLine and open are compatible with promisify from bluebird

So, in order to wait for each line to finish and then return the data, you can install bluebird as per the example and change your code to the below:

var exports = module.exports = {};
const lineReader = require('line-reader');
const Obj = require("./Data")
const Data = Obj.Data;
Promise = require('bluebird'); // replaces the global Promise with bluebird's
var eachLine = Promise.promisify(lineReader.eachLine); // resolves once eachLine's completion callback fires
var records = [];
exports.readAllLines = async function() {
 await eachLine('./datafile.dat', function (line) {
 parseLine(line);
 });
 return records;
}
function parseLine(inputLine) {
 var splitArray = inputLine.split("\t");
 var date = new Date(Date.parse(splitArray[0]));
 var o= splitArray[1];
 var h= splitArray[2];
 var l= splitArray[3];
 var c= splitArray[4];
 var v= splitArray[5];
 var dataObject = new Data (date, o, h, l, c, v);
 records.push(dataObject);
}
answered Sep 6, 2019 at 5:31

1 Comment

Ohhh I like this. I must have looked at the wrong docs. A cleaner-looking solution and easier to read.

Thanks to jfriends00 and others, this is what I came up with. It was indeed a race condition: the array was being returned before the file was read.

var exports = module.exports = {};
const fs = require('fs');
const readline = require('readline');
const Obj = require("./Data")
const Data = Obj.Data;
exports.readAllLines = async function processLineByLine() {
 var records = [];
 const fileStream = fs.createReadStream('./datafile.dat');
 const rl = readline.createInterface({
 input: fileStream,
 crlfDelay: Infinity
 });
 for await (const line of rl) {
 records.push(parseLine(line));
 }
 return records;
}
function parseLine(inputLine) {
 var splitArray = inputLine.split("\t");
 var date = new Date(Date.parse(splitArray[0]));
 var o= splitArray[1];
 var h= splitArray[2];
 var l= splitArray[3];
 var c= splitArray[4];
 var v= splitArray[5];
 return new Data(date, o, h, l, c, v);
}

Calling Code

var readFiles = require("./ReadFile.js");
readFiles.readAllLines().then(result => {
 console.log(result);
}).catch(exception => {
 console.log(exception);
});
answered Sep 6, 2019 at 5:27

2 Comments

One enhancement suggestion. Move the definition of records inside the processLineByLine() function so it won't conflict if someone has more than one of these function calls in flight at the same time.
But, your answer hasn't yet been changed to show that. Answers here are references for a good way to solve some problem. It would be best if you updated the answer. Otherwise, this is a bug waiting to happen.
