Node.js: readFile reads just one line?

I'm trying to parse a CSV file line by line.
However, when I output the contents of the file, it shows just one line?
Here's the code:
fs.readFile('data.csv', 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  console.log(data);
  var tbl = data.split('\n');
  console.log(tbl.length);
});
The first console.log outputs just one line of data, while tbl.length outputs 1.
Why is it reading just one line instead of the entire file?
EDIT: Something strange is going on. If I do data.length I get 580218, which is much more than the one line I'm getting as output?

Wanted to give Jonathan the chance to answer so he could get the points.
So, a couple of issues were going on here.
Listing just one line from the CSV instead of the whole data: turns out JSON.stringify(string) did the trick. The extra line-ending or invalid characters had caused it to print as just one line instead of the whole file.
The array.length for the split operation returned 1. I noticed later that the entire CSV file was the [0] element of the array; apparently something to do with the newlines in the string.
So I stringified the CSV, improved my split line a bit, and it worked.
Here's the modified code:
var tbl = data.replace(/(\r\n|\n|\r)/gm, "|");
tbl = tbl.split("|");
console.log(tbl.length);
Voila!
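For what it's worth, the same fix can be done in one step by splitting on a regular expression that matches all three line-ending styles; JSON.stringify is also handy while debugging because it makes hidden characters like \r visible. A minimal sketch of that variant (not the original fix above):

var fs = require('fs');

fs.readFile('data.csv', 'utf8', function (err, data) {
  if (err) {
    return console.log(err);
  }
  // JSON.stringify escapes hidden characters such as \r so they show up in the output
  console.log(JSON.stringify(data.slice(0, 80)));
  // split on \r\n, \n, or \r in a single pass
  var tbl = data.split(/\r\n|\n|\r/);
  console.log(tbl.length);
});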

Related

In Ruby, how would one create new CSVs conditionally from an original CSV?

I'm going to use this as sample data to simplify the problem:
data_set_1
I want to split the contents of this CSV according to Column A, DEPARTMENT, and place the rows in new CSVs named after each department.
If it were done in the same workbook (so it can fit in one image) it would look like:
data_set_2
My initial thought was something pretty simple like:
CSV.foreach('test_book.csv', headers: true) do |asset|
  CSV.open("/import_csv/#{asset[1]}", "a") do |row|
    row << asset
  end
end
Since that should take care of the logic for me. However, from looking into it, CSV#foreach does not accept file access rights as a second parameter, and it raises an error when I run it. Any help would be appreciated, thanks!
I don't see why you would need to pass file access rights to CSV#foreach. This method just reads the CSV. How I would do this is like so:
require 'csv'

# Parse the entire CSV into an array of rows.
orig_rows = CSV.parse(File.read('test_book.csv'), headers: true)

# Group the rows by department.
# This becomes { 'deptA' => [<rows>], 'deptB' => [<rows>], etc. }
groups = orig_rows.group_by { |row| row[1] }

# Write each group of rows to its own file.
groups.each do |dept, rows|
  CSV.open("/import_csv/#{dept}.csv", "w") do |csv|
    rows.each do |row|
      csv << row.fields
    end
  end
end
A caveat, though. This approach loads the entire CSV into memory, so if your file is very large, it won't work. In that case, the "streaming" (line-by-line) approach you show in your question would be preferable, along the lines of the sketch below.
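A rough sketch of that streaming variant, assuming the department is still at column index 1 and that keeping one open file handle per department is acceptable (both are assumptions on my part):

require 'csv'

# Stream the source CSV row by row, keeping one open writer per department.
handles = {}
CSV.foreach('test_book.csv', headers: true) do |row|
  dept = row[1]
  handles[dept] ||= CSV.open("/import_csv/#{dept}.csv", "w")
  handles[dept] << row.fields
end
handles.each_value(&:close)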

Reading a text file and inserting data into a database

I have an exported text file that looks like this:
Text file
So it is built up like a table, and it is Unicode-encoded. I don't create the export file, so please don't tell me to use CSV files.
I already have a MariaDB database in place, with a table that contains the respective headers (ID, Name, ...).
My goal is to read the data from the text file and insert it correctly into the database. I am using Node.js and would like to know what steps I need to follow to accomplish my goal.
Can I use this instruction URL? I already tried it that way, but I think the Unicode encoding caused some problems.
You should really consider using CSV files to import/export any column-oriented data; a tabular structure is more or less implied by the format.
In this case, you'll have to write some sort of parser that reads your file one line at a time and splits the data using multiple spaces as the delimiter; overall, not really worth it.
Look into using CSVs; there are even npm modules available for working with CSV.
The way you would approach this is with streams: read each line into a buffer, parse it, and save the data into the database.
Have a look at the csv-streamify library (https://github.com/klaemo/csv-stream). It does the parsing for you, and you can configure it to use tab as the delimiter.
const csv = require('csv-streamify')
const fs = require('fs')

const parser = csv({
  delimiter: '\t',
  columns: true,
})

// emits each line as an object, with headers as keys and row values as properties
parser.on('data', (line) => {
  console.log(line)
  // { ID: '1', Name: 'test', Date: '2010', State: 'US' }
  // Insert the row into the DB
  // ...
})

fs.createReadStream('data.txt').pipe(parser)
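A sketch of the full pipeline with the database step filled in. The mysql2 package, the connection settings, and the records table are illustrative assumptions (the question doesn't name any of them), and if the export really is UTF-16, giving the read stream an explicit encoding may be what solves the Unicode problems:

const csv = require('csv-streamify')
const fs = require('fs')
const mysql = require('mysql2')

// placeholder connection settings; use your real credentials
const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  password: 'secret',
  database: 'mydb',
})

const parser = csv({
  delimiter: '\t',
  columns: true,
})

parser.on('data', (line) => {
  // 'records' is a hypothetical table whose columns match the file's headers
  connection.query('INSERT INTO records SET ?', line, (err) => {
    if (err) console.error(err)
  })
})

// utf16le matches a UTF-16 little-endian export; adjust to the file's actual encoding
fs.createReadStream('data.txt', { encoding: 'utf16le' }).pipe(parser)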

Why would String(contentsOfFile) fail even though the file exists?

In the below code, I am trying to access a file 0.txt located in my home directory. The path to the home directory is saved within a string and the name 0.txt is appended to it upon call (0 is a reference counter which will change values as the program runs. For the sake of the question, I'll refer to it as 0).
func loadfile(counter: Int) -> String { // counter here is assumed to be "0"
    var contents = String()
    var defaultpath = ("~/" as NSString).stringByExpandingTildeInPath as String
    do {
        contents = try String(contentsOfFile: defaultpath.stringByAppendingString(String("\(counter).txt")))
        return contents
    } catch {
        print("For some reason, the file couldn't be accessed.")
        return "failed"
    }
}
However, every time this block of code runs, the return value is failed and the line For some reason, the file couldn't be accessed is printed, even though ~/0.txt exists. Does anyone have an idea as to why this behavior is occurring and, if so, how I should resolve it?
Side question: is there a way to print the errors generated by the try-catch block to stdout?
You need to add a separator to the filename:
contents = try String(contentsOfFile: defaultpath.stringByAppendingString(String("/\(counter).txt")))
Note the forward slash at the beginning of the filename; defaultpath does not end with a slash.
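As for the side question: the catch block already receives the error in an implicit error constant, so printing it writes the underlying reason to stdout. A small sketch combining that with stringByAppendingPathComponent, which inserts the separator for you (same Swift 2 era APIs as the question; treat it as illustrative):

import Foundation

func loadfile(counter: Int) -> String {
    let home = ("~/" as NSString).stringByExpandingTildeInPath
    // stringByAppendingPathComponent adds the "/" between the two parts
    let path = (home as NSString).stringByAppendingPathComponent("\(counter).txt")
    do {
        return try String(contentsOfFile: path)
    } catch {
        print(error) // prints the underlying NSError, e.g. the bad path
        return "failed"
    }
}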

headerlinesIn not working correctly in importdata MATLAB

I am importing some data from Excel, and the code looks like this:
Code:
%Import Data
filename = 'Stocks.xlsx';
delimiterIn = ' ';
headerlinesIn = 1;
A = importdata(filename,delimiterIn,headerlinesIn);
The excel file looks like this:
When I read in the data, A looks like this:
I thought that with headerlinesIn = 1, the first line should not be read. Why is it being read, and how can I avoid it?
Need some guidance.
As far as I can tell, your code is alright.
With your example file and your code, I get a struct A.
A = importdata('Stocks.xlsx',' ',1);
In A.data.Sheet1, all the data is read correctly:
And in A.textdata.Sheet1 appears exactly what you posted you are getting.
So the problem must be something I can't reproduce.
Alternatively, you could try whether xlsread works for you.
B = xlsread('Stocks.xlsx',1)
I get the same result as before.
Now I finally get your problem: you're not concerned about the data, you actually want to skip the first line of the header, i.e. the first row of textdata.
Well, headerlinesIn just tells importdata where your data starts, that is, where it should begin to read actual data. Everything above, which is thereby declared not to be data, is put into A.textdata.Sheet1, including the first line. So the code works as intended.
If you want to get rid of the first line of your header, you could apply the following:
N = 2; %// number of columns before data starts
%// use parentheses, not braces, so the result stays a rectangular cell array
A.textdata.Sheet1 = A.textdata.Sheet1(headerlinesIn+1:end, 1:N);

Reading a text file and omitting lines

Is there any method of reading from a text file and omitting certain lines from the output into a text box?
The text file will look like this:
Name=Test Name
Date=19/02/14
Message blurb spanning over several lines
The format will always be the same: Name and Date will always be the first two rows, and those are the rows I want to omit, returning the rest of the message blurb to a text box.
I know how to use the ReadAllLines function and StreamReader, but I'm not sure how to start coding it.
Any pointers or directions to some relevant online documentation?
Thanks in advance
You can read the file line by line and just skip lines that start with given prefixes:
string[] startsToOmit = new string[] { "Name=", "Date=" };
var result = File.ReadLines(path)
    .Where(line => !startsToOmit.Any(start => line.StartsWith(start)));
You then have an IEnumerable<string> as the result, which you can materialize with result.ToList(), for example.
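Continuing from the snippet above, putting the remaining lines into the question's text box is then a one-liner (textBox1 is a made-up control name, since the question doesn't give one; assumes a multiline WinForms TextBox):

textBox1.Text = string.Join(Environment.NewLine, result);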
Just read the stream line by line:
using (StreamReader sr = new StreamReader(path))
{
    sr.ReadLine(); // skip the "Name=" line
    sr.ReadLine(); // skip the "Date=" line
    string line;
    while ((line = sr.ReadLine()) != null)
        Console.WriteLine(line); // the message blurb
}
The first two ReadLine calls discard Name and Date; process everything after that however you need.
