I'm using fast-csv to export some data from a DB to a CSV file. When I use the code from the example in the docs:
var csvStream = csv.createWriteStream({headers: true}),
    writableStream = fs.createWriteStream('./csv/list.csv');

writableStream.on('finish', function() {
  console.log('DONE!');
});

csvStream.pipe(writableStream);
csvStream.write([
  {
    a: "a1",
    b: "b1",
    c: "c1"
  }
]);
csvStream.end();

res.send('export done!');
my CSV file has one entry: [object Object]
It looks like csvStream.write() will only accept object arguments:
csvStream.write({
  a: "a1",
  b: "b1",
  c: "c1"
});
If you want to write arrays, you should use csv.write() or csv.writeToStream() (documented here, search for "Writing data" as I can't link to it directly).
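For context, that stray `[object Object]` cell is just JavaScript's default string coercion at work; a quick stdlib check, independent of fast-csv, reproduces it:

```javascript
// Array#toString joins the string forms of its elements, and a plain object's
// default toString is "[object Object]", so coercing the wrongly-passed array
// to a string yields exactly the cell that showed up in the CSV.
const row = [{ a: "a1", b: "b1", c: "c1" }];
console.log(String(row)); // "[object Object]"
```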
I want to read a string that is encoded in CSV format synchronously in Node.js. (I don't want to read a CSV file asynchronously).
To illustrate, below is asynchronous code to read a CSV file in Node.js. This code works well.
const fs = require("fs");
// https://www.npmjs.com/package/csv
const { parse } = require("csv-parse");
const readCSV = () => {
  fs.createReadStream("./testfile.csv").pipe(
    parse({ delimiter: ",", from_line: 1 })
      .on("data", function (row) {
        console.log(row);
      })
      .on("end", function () {
        console.log("finished");
      })
      .on("error", function (error) {
        console.log(error.message);
      })
  );
};
Below is the content of testfile.csv. Although it's just one line, it is not an easy one (notice the multiple special characters and commas, arbitrarily placed for the purposes of this test), and it is best handled with a proper CSV parser.
Guatemala,United States,"Congo, ""Dem."" Rep. 'of' `the` (Kinshasa)",other country name
How do I read the same content as an encoded string in Node.js?
Below is the same example as an encoded string in Node.js which I wish to read synchronously. Notice that I had to escape the ' characters given the syntax of the language.
const encodedString = 'Guatemala,United States,"Congo, ""Dem."" Rep. \'of\' `the` (Kinshasa)",other country name';
I came across this problem for two reasons:
I wish to encode a Node.js array as a string that can handle special characters and commas (and decode it as needed); and
I couldn't figure out how to use the CSV package (or any other package, and I really looked at a lot of them) for strings. That is, they seem to always assume that you will be reading a CSV file with the fs package asynchronously, while my use case assumes that I already have the data encoded as a string variable. Thus, I also wish to do it synchronously; there's no reason to do it differently, since I am not getting the data from a stream.
I've already looked at a lot of Stack Overflow questions (How do I read the contents of a Node.js stream into a string variable?, Is there a way to read CSV in node.js synchronously?, etc.), and all I could find are examples of how to read a CSV file asynchronously. So the solution to this basic question is still missing.
To summarize
A successful answer to this question will be the one that provides a function to decode the encodedString variable into an array like the one below. An even more ideal answer would also provide the corresponding function to encode the Node.js array into the same original encoded string.
[
  'Guatemala',
  'United States',
  'Congo, "Dem." Rep. \'of\' `the` (Kinshasa)',
  'other country name',
]
I learned how to use the csv-parse package synchronously. It is in fact supported by the package; you just need to import the synchronous version of the module. See the documentation here: https://csv.js.org/parse/api/sync/.
Below is a script that can handle the encoded string I provided as an example.
// https://csv.js.org/parse/api/sync/
const { parse } = require("csv-parse/sync");
const decodeCSVStringSync = () => {
  const encodedString =
    'Guatemala,United States,"Congo, ""Dem."" Rep. \'of\' `the` (Kinshasa)",other country name';
  const records = parse(encodedString, { delimiter: ",", from_line: 1 });
  console.log(records[0]);
};

decodeCSVStringSync();
Output:
[
  'Guatemala',
  'United States',
  'Congo, "Dem." Rep. \'of\' `the` (Kinshasa)',
  'other country name'
]
And below is the corresponding function to encode an array of data into a string that follows the CSV format. (The output is the original encoded string.)
// https://csv.js.org/stringify/options/cast/#field-level-options
const { stringify } = require("csv-stringify/sync");

const encodeArrayToCSVStringSync = () => {
  const arrayData = [
    "Guatemala",
    "United States",
    "Congo, \"Dem.\" Rep. 'of' `the` (Kinshasa)",
    "other country name",
  ];
  const encodedString = stringify([arrayData]);
  console.log(encodedString);
};

encodeArrayToCSVStringSync();
Output:
Guatemala,United States,"Congo, ""Dem."" Rep. 'of' `the` (Kinshasa)",other country name
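As a side note, the quoting visible in that output follows the RFC 4180 convention: wrap a field in double quotes when it contains a comma, quote, or newline, and double any embedded quotes. Below is a minimal stdlib sketch of just that rule (the helper name `quoteField` is mine, not part of csv-stringify):

```javascript
// Minimal RFC 4180-style field quoting: wrap a field in double quotes when it
// contains a comma, quote, or newline, and double any embedded quotes.
function quoteField(field) {
  return /[",\n]/.test(field)
    ? '"' + field.replace(/"/g, '""') + '"'
    : field;
}

const arrayData = [
  'Guatemala',
  'United States',
  'Congo, "Dem." Rep. \'of\' `the` (Kinshasa)',
  'other country name',
];
console.log(arrayData.map(quoteField).join(','));
// Guatemala,United States,"Congo, ""Dem."" Rep. 'of' `the` (Kinshasa)",other country name
```

This only covers field quoting; a real stringifier also handles delimiters, casting, and record terminators, which is why csv-stringify remains the better choice.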
The streaming csv-parse API expects to act on streams, not strings. But you can easily create your own Readable stream and push your string into it.
const { Readable } = require('stream');
const { parse } = require("csv-parse");

const stream = new Readable();
const encodedString =
  'Guatemala,United States,"Congo, ""Dem."" Rep. \'of\' `the` (Kinshasa)",other country name';

stream.push(encodedString);
stream.push(null);

stream.pipe(
  parse({ delimiter: ",", from_line: 1 })
    .on("data", (row) => {
      console.log(row);
    })
    .on("error", (error) => {
      console.log(error.message);
    })
);
Output:
[
  'Guatemala',
  'United States',
  'Congo, "Dem." Rep. \'of\' `the` (Kinshasa)',
  'other country name'
]
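For completeness, the quote handling that makes a real parser worthwhile can be sketched in pure stdlib JavaScript. This toy splitter (the name `splitCSVLine` is hypothetical) handles quoted fields and doubled quotes on a single line, but none of the other cases (embedded newlines, custom delimiters, BOMs) that csv-parse covers:

```javascript
// Toy single-line CSV splitter: tracks whether we are inside a quoted field,
// treats a doubled quote ("") inside quotes as a literal quote character,
// and splits on commas only outside quotes.
function splitCSVLine(line) {
  const fields = [];
  let field = '';
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"' && line[i + 1] === '"') { field += '"'; i++; }
      else if (ch === '"') { inQuotes = false; }
      else { field += ch; }
    } else if (ch === '"') {
      inQuotes = true;
    } else if (ch === ',') {
      fields.push(field);
      field = '';
    } else {
      field += ch;
    }
  }
  fields.push(field);
  return fields;
}

const encodedString =
  'Guatemala,United States,"Congo, ""Dem."" Rep. \'of\' `the` (Kinshasa)",other country name';
console.log(splitCSVLine(encodedString)[2]);
// Congo, "Dem." Rep. 'of' `the` (Kinshasa)
```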
I'm new to Node.js/AWS Lambda. I've successfully created several DocumentClient query functions that return a single- or multiple-item JSON document in this format:
[
  {
    "name": "andy",
    "color": "purple",
    "snack": "oreos"
  }
]
When I use DocumentClient get and get back my single record, it's in THIS format, which is not playing well with the client code (Apple / iOS Swift):
{
  "name": "andy",
  "color": "purple",
  "snack": "oreos"
}
I'm hoping I can change the format returned from documentClient.get() to include the full JSON document format, including the leading and trailing brackets [].
I am a Node.js, AWS Lambda, and DocumentClient novice, so apologies if this is a very basic question.
If I understood well, you're receiving an object instead of an array.
You can use the scan function to retrieve an array of results:
var params = {
  TableName: 'Table',
  FilterExpression: 'Year = :this_year',
  ExpressionAttributeValues: { ':this_year': 2015 }
};

var documentClient = new AWS.DynamoDB.DocumentClient();
documentClient.scan(params, function(err, data) {
  if (err) console.log(err);
  else console.log(data);
});
Or you can transform the result into an array (note that with the v2 SDK you need .promise() to await the call, and the item itself is under the Item property):
const data = await documentClient.get({
  TableName: "table-of-example",
  Key: {
    id: "id-of-example"
  }
}).promise();

return [data.Item];
Please read the documentation to understand more about how the DynamoDB DocumentClient works: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html#scan-property
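If the iOS client always expects a JSON array, another option is to normalize on the server before sending the response. A small sketch (the helper name `toArray` is mine, not an SDK API):

```javascript
// Wrap a single DynamoDB item in an array so get() results look like query()
// results to the client; the empty array covers the no-match case, where
// documentClient.get(...).promise() resolves to {} with no Item property.
function toArray(item) {
  return item === undefined || item === null ? [] : [item];
}

// Shape of a successful get() result: { Item: {...} }
const data = { Item: { name: "andy", color: "purple", snack: "oreos" } };
console.log(toArray(data.Item)); // one-element array around the record
console.log(toArray(undefined)); // []
```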
I want to parse this JSON data to a file like this:
JSON
{
  "nodes": [
    {"id": 1, "name": "a", "radius": 0.5, "angle": 2.64159265359},
    {"id": 2, "name": "b", "radius": 0.6, "angle": 3.64159265359}
  ],
  "links": [
    {"source": "a", "target": "b", "delay": "10ms"}
  ]
}
File:
[nodes]
a: _ radius=0.5 angle=2.64159265359
b: _ radius=0.6 angle=3.64159265359
[links]
a:b delay=10ms
So far, my code just reads the JSON file:
const fs = require('fs');
const data = JSON.parse(fs.readFileSync("topo.json", "utf-8"));

for (const node in data) {
  for (const link in data[node]) {
    console.log(data[node][link]);
  }
}
How can I get those values saved and create a new file having those values in them ?
You did not specify what file type you want to use, so I went with a simple .txt file.
The only small issue with the code posted is that these are arrays, so you have to loop over them a bit differently. Apart from that, simply write the data into placeholders in your format string and append it to a global string that will be written to the file. Then you should be good, like this:
const fs = require('fs');
const data = JSON.parse(fs.readFileSync('topo.json', 'utf-8'));

let textFileContent = '';

textFileContent += '[nodes]\n';
data.nodes.forEach(node => {
  textFileContent += `${node.name}: _ radius=${node.radius} angle=${node.angle}\n`;
});

textFileContent += '[links]\n';
data.links.forEach(link => {
  textFileContent += `${link.source}:${link.target} delay=${link.delay}\n`;
});

fs.writeFile('parsed.txt', textFileContent, function(err) {
  if (err) {
    return console.log(err);
  }
});
Note that this could probably be done more elegantly, and it won't scale very well if your JSON is more complex than what's shown in the sample...
I am trying to store a list with Firebase Firestore, but I always get an encoding error:
UnhandledPromiseRejectionWarning: Error: Cannot encode value: a,b,c
I have tried the following code snippets for node js:
const data = {
  a: "hello world",
  b: ["a", "b", "c"]
};
const res = await db.collection('data').doc('one').set(data);
and
const data = {
  a: "hello world",
  b: new Array(["a", "b", "c"])
};
const res = await db.collection('data').doc('one').set(data);
and
const data = {
  a: "hello world",
  b: []
};
const res = await db.collection('data').doc('one').set(data);
all of which result in the same encoding error.
I have also tried to store a list as described in the blog post here: https://firebase.googleblog.com/2014/04/best-practices-arrays-in-firebase.html
{
  0: "a",
  1: "b",
  2: "c"
}
but this stored it as an object, not a list, and I need to store it as a list.
I can create new documents with lists in the Firebase console, but I can't do it programmatically. I am using Node.js.
Thank you for any help you can offer.
So, I found a solution after looking at the Firebase docs a bit more.
This page, https://firebase.google.com/docs/firestore/manage-data/add-data, told me that I could use arrayUnion to do this. Here is the working code sample:
const data = {
  a: "hello world",
  b: admin.firestore.FieldValue.arrayUnion("a", "b", "c")
};
const res = await db.collection('data').doc('one').set(data);
I have a JSON response containing inner JSON content. I've added the json2xls library and converted my JSON to Excel. But in the Excel file, in the address column, I'm getting [object Object]. Is there any way I could get those inner JSON values into Excel?
const fs = require('fs');
const json2xls = require('json2xls');

const option = [{ id: 1, name: "jason", address: [{ city: "XYZ", state: "ABC" }] }];
const xls = json2xls(option);
fs.writeFileSync('data-report.xlsx', xls, 'binary');
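One common workaround is to flatten the nested objects into top-level keys before handing the rows to json2xls, since it only renders scalar values as cells. A sketch (the helper `flattenRow` and the `key_index_field` column-naming scheme are my own choices, not json2xls API):

```javascript
// Flatten one level of nesting: arrays of objects become indexed columns like
// "address_0_city", and plain nested objects become "key_field" columns, so
// json2xls (or any flat-row converter) can render them as ordinary cells.
function flattenRow(row) {
  const flat = {};
  for (const [key, value] of Object.entries(row)) {
    if (Array.isArray(value)) {
      value.forEach((inner, i) => {
        for (const [k, v] of Object.entries(inner)) {
          flat[`${key}_${i}_${k}`] = v;
        }
      });
    } else if (value !== null && typeof value === 'object') {
      for (const [k, v] of Object.entries(value)) {
        flat[`${key}_${k}`] = v;
      }
    } else {
      flat[key] = value;
    }
  }
  return flat;
}

const option = [{ id: 1, name: "jason", address: [{ city: "XYZ", state: "ABC" }] }];
const flatRows = option.map(flattenRow);
console.log(flatRows);
// The flattened rows can then be passed to json2xls in place of the nested ones.
```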