How to deserialize Avro in Node.js?

I have an avro file.
I want to use nodejs to open and read its schema and iterate through its records.
How can I do this? The Avro libraries I see for Node.js appear to require you to pass in a schema instead of reading the schema out of the .avro file itself. Also, I want to be able to support arrays, which no Node library seems to handle (I tried node-avro-io).
My Avro schema contains:
A nested field: {a: {suba: vala, subb: vala}}.
An array field: {a: ["A", "B"]}, which node-avro-io does not handle.
Error I get with node-avro-io:
Avro Invalid Schema Error: Primitive type must be one of: ["null","boolean","int","long","float","double","bytes","string"]; got DependencyNode

In case you're still looking, you can do this with avsc. The code would look something like:
var avro = require('avsc');

// To stream the file's records (they will already be decoded):
avro.createFileDecoder('your/data.avro')
  .on('data', function (record) { /* Do something with the record. */ });

// Or, if you just want the file's header (which includes the schema):
var header = avro.extractFileHeader('your/data.avro');
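If you want the schema itself as a JavaScript object, Avro container files store the writer's schema as JSON under the 'avro.schema' metadata key; a minimal sketch, assuming avsc returns that entry as a buffer (or string) of schema JSON:
var avro = require('avsc');

// The container file header's metadata holds the schema JSON
// under the 'avro.schema' key.
var header = avro.extractFileHeader('your/data.avro');
var schema = JSON.parse(header.meta['avro.schema'].toString());
console.log(schema); // e.g. a {type: 'record', ...} definition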

If you want to open a file with node-avro-io, the following code, found at https://www.npmjs.com/package/node-avro-io, will do the trick:
var DataFile = require("node-avro-io").DataFile;
var avro = DataFile.AvroFile();
var reader = avro.open('test.avro', { flags: 'r' });
reader.on('data', function (data) {
  console.log(data);
});
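Since the reader is a stream, it is also worth attaching an error handler so that schema problems like the one in the question surface cleanly; a small addition, assuming the reader emits 'error' like other Node streams:
reader.on('error', function (err) {
  // Schema or decoding problems end up here instead of crashing the process.
  console.error('Failed to read avro file:', err);
});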

Related

Node.js fs.writeFile() not creating new files?

I need to create many .json files for the system I am trying to develop. To do this, I ran a for loop over the file names I needed, then used fs.writeFileSync('filename.json', [data]).
However, when trying to open these later, and when I try to find them in the project folder, I cannot find them anywhere.
I have tried writing to a simpler file name, one that should have appeared in the same directory as my script, but that was fruitless as well. To my understanding, even if the file name wasn't what I expected, I should still end up with something, somewhere; however, nothing changes.
My current code looks like this:
function addEmptyDeparture(date) {
  fs.readFileSync(
    __dirname + '/reservations/templates/wkend_dep_template.json',
    (err, data) => {
      if (err) throw err
      fs.writeFileSync(
        getDepartureFileName(date),
        data
      )
    }
  )
}

function getDepartureFileName(date) {
  return __dirname + '/reservations/' +
    date.getFullYear() +
    '/departing/' +
    (date.getMonth() + 1).toString().padStart(2, "0") +
    date.getDate().toString().padStart(2, "0") +
    '.json'
}
Where data is the JSON object returned from fs.readFileSync() and is immediately written into fs.writeFileSync(). I don't think I need to stringify this, since it's already a JSON object, but I may be wrong.
The only reason I think it's not working at all (as opposed to simply not showing up in my project) is that, in a later part of the code, we have this:
fs.readFileSync(
  getDepartureFileName(date)
).toString()
which is where I get an error for not having a file by that name.
It is also worth noting that date is a valid date object, as I was able to test that part in a fiddle.
Is there something I'm misunderstanding in the effects of fs.writeFile(), or is this not the best way to write .json files for use on a server?
You probably are forgetting to stringify the data:
fs.writeFileSync('x.json', JSON.stringify({id: 1}))
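One more thing worth checking: fs.readFileSync() does not take a callback (that is fs.readFile()'s signature), so Node ignores the function passed to it in the question's addEmptyDeparture() and fs.writeFileSync() is never reached. A minimal corrected sketch of that function, keeping the synchronous style:
var fs = require('fs')

function addEmptyDeparture(date) {
  // readFileSync returns the file contents directly; no callback involved.
  var data = fs.readFileSync(
    __dirname + '/reservations/templates/wkend_dep_template.json'
  )
  fs.writeFileSync(getDepartureFileName(date), data)
}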
I have tried to create a similar case: a demo that uses writeFileSync() in a for loop to create different files and write JSON data to them. In my case it works; each iteration creates a new file name. Here is the code:
var fs = require('fs');

// Write four files, each containing a small JSON payload;
// JSON.stringify() turns the object into a string we can write.
for (let i = 0; i < 4; i++) {
  var readMe = JSON.stringify({ "data": i });
  fs.writeFileSync('writeMe' + i + '.json', readMe, "utf8");
}
Let me know if this is what you have been trying at your end.

How can I read json values from a file?

So basically I have these JSON values in my config.json file, but how can I read them from a .txt file, for example:
{"prefix": $}
This would set a variable configPrefix to $. Any help?
You can use require() to read and parse your JSON file in one step:
let configPrefix = require("./config.json").prefix;
Or, if you wanted to get multiple values from that config:
const configData = require("./config.json");
let configPrefix = configData.prefix;
If your data is not actually JSON formatted, then you have to read the file yourself with something like fs.readFile() or fs.readFileSync() and then parse it yourself according to whatever formatting rules you have for the file.
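And if the file is JSON but you want to avoid require()'s module cache, reading and parsing it manually is straightforward; a minimal sketch using the config.json from the question:
const fs = require('fs');

// Read the file synchronously and parse its contents as JSON.
const configData = JSON.parse(fs.readFileSync('./config.json', 'utf8'));
const configPrefix = configData.prefix; // '$'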
If you are going to read this file just once, at the start of the program, then go ahead and use require (or import if you have Babel). Just a tip: surround the require with a try/catch block to handle possible errors.
let config
try {
  config = require('path.to.file.json')
} catch (error) {
  // handle error
  config = {}
}
If you will be changing this file externally and you feel the need to re-source it, then apart from reading it at the start you will need a function that uses fs.readFile. Consider doing it like this, and not with fs.readFileSync, unless you need to block the program until you are done reading the config file.
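A minimal sketch of such a reload function (the reloadConfig name and './config.json' path are just for illustration):
const fs = require('fs');

let config = {};

// Re-read and re-parse the config file without blocking the event loop.
function reloadConfig(callback) {
  fs.readFile('./config.json', 'utf8', (err, contents) => {
    if (err) return callback(err);
    config = JSON.parse(contents);
    callback(null, config);
  });
}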
After all of that you can do const configPrefix = config.prefix which will have the value '$'.

neo4j, nodejs, session expire error, how to fix it?

I am trying to use neo4j on the backend. First I want to import a CSV into neo4j (to start, I tried to count how many lines the CSV file has).
But I am having a problem; the code is the following:
var neo4j = require('neo4j-driver').v1;
var driver = neo4j.driver("bolt://localhost", neo4j.auth.basic("neo4j", "neo4j"));

function createGraphDataBase(csvfilepath) {
  var session = driver.session();
  return session
    .run('LOAD CSV FROM {csvfilepath} AS line RETURN count(*)',
      { csvfilepath }
    )
    .then(result => {
      session.close();
      console.log(' %d lines in csv.file', result);
      return result;
    })
    .catch(error => {
      session.close();
      console.log(error);
      return error;
    });
}
the "csvfilepath" is the path of csv file, it is as follows.
'/Users/.../Documents/Project/.../test/spots.csv';
is there something wrong with giving path like this?
I am calling that function from another module as:
var api = require('./neo4j.js');
const csvFile = path.join(__dirname,csvFileName);
api.createGraphDataBase(csvFile);
I am getting this error:
Error: Connection was closed by server
....
I am new to these, please help!
The URL that you specify in a LOAD CSV clause must be a legal URL.
As stated in this guide:
Make sure to use the right URLs, esp. file URLs. On OSX and Unix use
file:///path/to/data.csv; on Windows, please use file:c:/path/to/data.csv
In your case, csvfilepath needs to specify the file:/// protocol (since you seem to be running on OSX) for a local file. Based on your example, the value should be something like this:
'file:///Users/.../Documents/Project/.../test/spots.csv'
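A minimal sketch of building that URL on the calling side (prefixing the scheme by hand; since path.join with __dirname yields an absolute path, the result has the file:/// form):
var path = require('path');
var api = require('./neo4j.js');

// LOAD CSV expects a URL, not a bare filesystem path.
var csvFile = 'file://' + path.join(__dirname, csvFileName);
api.createGraphDataBase(csvFile);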

Node.js import csv with blank fields

I'm trying to import & parse a CSV file using the csv-parse package, but having difficulty with requireing the csv file in the first place.
When I do input = require('../../path-to-my-csv-file')
I get an error due to consecutive commas because some fields are blank:
e","17110","CTSZ16","Slitzerâ„¢ 16pc Cutlery Set in Wood Block",,"Spice up
^
SyntaxError: Unexpected token ,
How do I import the CSV file into the node environment to begin with?
Package examples are here.
To solve your first problem, reading CSV with empty entries:
Use the 'fast-csv' node package. It will parse CSV with empty entries.
To answer your second question, how to import a CSV into node:
You don't really "import" csv files into node. You should fs.open the file
or use fs.createReadStream to read the csv file at the appropriate location.
Below is a script that uses fs.createReadStream to parse a CSV called 'test.csv' that is one directory up from the script that is running it.
The first section sets up our program and makes basic declarations of the objects we're going to use to store our parsed list.
var csv = require('fast-csv') // require fast-csv module
var fs = require('fs') // require the fs, filesystem module
var uniqueindex = 0 // just an index for our array
var dataJSON = {} // our JSON object, (make it an array if you wish)
This next section declares a stream that will intercept data as it's read from our CSV file and do stuff to it. In this case we're intercepting the data and storing it in a JSON object and then saving that JSON object once the stream is done. It's basically a filter that intercepts data and can do what it wants with it.
var csvStream = csv() // use the fast-csv module to create a csv parser
  .on('data', function (data) { // when we get a row, perform function(data)
    dataJSON[uniqueindex] = data // store the row in our JSON object dataJSON
    uniqueindex++ // the index of the data item in our array
  })
  .on('end', function () { // when the data stream ends, perform function()
    console.log(dataJSON) // log our whole object on the console
    fs.writeFile('../test.json', // use the fs module to write a file
      JSON.stringify(dataJSON, null, 4), // turn our JSON object into a string that can be written
      function (err) { // this callback runs once the file is saved; err is null if there is no error
        if (err) throw err // if there's an error while saving the file, throw it
        console.log('data saved as JSON yay!')
      })
  })
This section creates what is called a "readStream" from our csv file. The path to the file is relative. A stream is just a way of reading a file. It's pretty powerful though because the data from a stream can be piped into another stream.
So we'll create a stream that reads the data from our CSV file, and then we'll pipe it into our pre-defined read stream / filter from section 2.
var stream = fs.createReadStream('../test.csv')
stream.pipe(csvStream)
This will create a file called 'test.json' one directory up from the place where our csv parsing script is. test.json will contain the parsed CSV list inside a JSON object. The order in which the code appears here is how it should appear in a script you make.
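If you would rather stay with the csv-parse package the question started from, the streaming API handles blank fields fine once you read the file instead of require()ing it; a minimal sketch, assuming csv-parse v5's named export (older versions export the parser function directly):
var fs = require('fs');
var { parse } = require('csv-parse'); // v5+; on v4 use: var parse = require('csv-parse')

fs.createReadStream('../test.csv')
  .pipe(parse({ relax_column_count: true })) // tolerate rows with varying field counts
  .on('data', function (row) {
    console.log(row); // each row arrives as an array of field values; blank fields are ''
  })
  .on('end', function () {
    console.log('done parsing CSV');
  });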

export via mongoose streams to node-csv

I'm trying to get a CSV dump of some data (~500 MB) in MongoDB. I thought streams would be the way to go, to avoid building up an array in memory and then building the CSV all at once.
But, it seems the stream that mongoose creates and the one that csv expects are not the same thing.
var stream = Subscriber.find().stream()
stream.setEncoding = function () { }

csv().from.stream(stream).on('record', function (record, index) {
  console.log(record)
  console.log(index)
})
Without the setEncoding() stub above, I get an error when csv calls setEncoding() on the stream. With it, the result is:
TypeError: Object #<Object> has no method 'indexOf'
at [object Object].stringify (/home/project/node_modules/csv/lib/stringifier.js:98:35)
So, is this even the right approach? If so, what is the problem with the streams?
As zeMirco said: to get a CSV dump of a collection, I'd use the mongoexport tool that comes with MongoDB. Here's an example of exporting a collection called "users" in a database "mydatabase" to CSV format:
$ mongoexport --csv --host localhost:27017 --db mydatabase --collection users --fields name,email,age -o output.csv
And you'll get something that looks like this:
$ cat output.csv
name,email,age
renold,renold.ronaldson#gmail.com,21
jacob,xXxjacobxXx#hotmail.com,16
Something like this should work. Replace process.stdout with a filestream to write it to a file.
var csv = require('csv')
var through = require('through')
var Model = require('...')
var _ = require('underscore')

var modelStream = Model.find().stream();
modelStream.pipe(through(write, end)).pipe(csv()).pipe(process.stdout);

function end() { console.log('done'); }

function write(doc) {
  this.queue(_.values(doc.toObject({ getters: true, virtuals: false })));
}
If you want to download the CSV from a web server by accessing a URL, and you're using Express, you can do this:
var through = require('through');
var csv = require('csv');
var MyModel = require('./my_model');

app.get('/download_csv/', function (req, res) {
  res.setHeader('Content-disposition', 'attachment; filename=attendances.csv');
  res.contentType('csv');
  res.write('property 1,property 2\n');

  var modelStream = MyModel.find().stream();

  modelStream
    .pipe(through(write, end))
    .pipe(csv.stringify())
    .pipe(res);

  function end() {
    res.end();
    console.log('done outputting file');
  }

  function write(doc) {
    var myObject = doc.toObject({ getters: true, virtuals: false });
    this.queue([
      myObject.property_1,
      myObject.property_2
    ]);
  }
});
NOTE: This is using the latest version of the csv module (v0.4) whereas the previous answers are using an older version of the module.
