I'm using websockets in Node.js to create a server, and I want to parse a CSV file and print its data on a web page. The file is updated frequently, so is there a way to keep track of the updating file and append the newly printed data whenever it changes?
I am able to parse the CSV and print its data, but I want to keep track of updates.
Have a look at using fs.watch to watch a file for changes.
When this event fires for your file, you can send a socket event with the latest version of the file's contents to the client. You will need to re-read the file (or re-create your read stream) when the event fires, and then send the results over the socket.
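For example, a minimal sketch assuming the ws package, a file called data.csv, and port 8080 (all of these are placeholders, not from the question):

const fs = require('fs');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

fs.watch('data.csv', (eventType) => {
  if (eventType !== 'change') return;
  // Re-read the whole file on every change and push it to every connected client.
  fs.readFile('data.csv', 'utf8', (err, contents) => {
    if (err) return console.error(err);
    wss.clients.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(contents);
      }
    });
  });
});

On the client side you can append the received contents to the page in the socket's message handler.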
I have a rather large JSON file that stores user information, and when my server starts it loads the entire file into memory. Obviously, this is not ideal. I have looked into using read/write streams, but I can't quite understand how they'd work.
The data in the JSON file is formatted as such:
"accountName": {
"favoriteColor": "blue"
}
The process currently goes in this order:
Server starts, and data.json is loaded into a variable (dataVar).
User johndoe logs in, and their data is used to make an object.
The user changes their object's data, and dataVar is updated.
Server autosaves to the data.json file with the new contents.
I want to continue being able to access user data as needed, without loading everything into memory. I assume there are stream equivalents of things like dataVar.johndoe, but I can't seem to find that information.
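For reference, a minimal sketch of the current flow described above (dataVar and data.json come from the question; the update helper and the save interval are assumptions):

const fs = require('fs');

// 1. Server starts: the whole of data.json is loaded into memory.
let dataVar = JSON.parse(fs.readFileSync('data.json', 'utf8'));

// 2-3. A user logs in and their in-memory data is updated.
function setFavoriteColor(accountName, color) {
  dataVar[accountName] = dataVar[accountName] || {};
  dataVar[accountName].favoriteColor = color;
}

// 4. The server autosaves the entire object back to data.json.
setInterval(() => {
  fs.writeFile('data.json', JSON.stringify(dataVar), (err) => {
    if (err) console.error(err);
  });
}, 60 * 1000);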
I am new to Node.js.
I want to print Excel data without using any library, since libraries add overhead and we have to deal with large files in memory.
I tried reading a .xlsx file using fs.createReadStream() and logging the data to the console, but it prints odd characters rather than the actual data.
Node.js code:
const fs = require('fs');

const stream = fs.createReadStream('example.xlsx');
stream.on('data', function (data) {
  console.log(Buffer.from(data, 'base64').toString('utf8'));
});
Can someone tell me how to get the actual data, or suggest a library that can read large Excel files?
let dataCer = '0�\u0007\u00060�\u0006��\u0003\u0002\u0001\u0002\u0002\u0010Q��\u0000����K��Z�Q��0\n\u0006\b*�\u0003\u0007\u0001\u0001\u0003\u00020�\u0001l1\u001e0\u001c\u0006\.............'
fs.writeFile('111.cer', dataCer, (err) => { if (err) throw err; });
let dataPdf = '%PDF-1.4\r\n1 0 obj\r\n<< \r\n/Length 9947\r\n/Filter /FlateDecode\r\n>>\r\nstream\r\nX��]�n#9p}���\u000f���\u0005\b\u0002X��<\'X \u001f�\u001b\u0010 \u0001���H�,6�R�Z�\u0014�N`�\n�T�t�ڼT\u0015���?ԋz��_�{IN_Bz�����O.............'
fs.writeFile('111.pdf', dataPdf, (err) => { if (err) throw err; });
I get dataCer and dataPdf from the application via GET requests, and I can only get the data in this encoding.
Now I need to save them as files.
I will also need to save other kinds of data to files in the same way (zip, rar, png, jpeg, ...).
When I use fs.writeFile, I get files that do not open.
fs.writeFile does not preserve the original data, and changing or omitting the encoding option does not give me the desired result.
Please tell me how to get around this problem.
Or is there a library that can save any kind of data to a file in Node.js, regardless of encoding?
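One approach that is often used for this kind of problem (an assumption here, not something given in the question) is to treat the data as raw bytes and write a Buffer rather than a plain string, for example:

const fs = require('fs');

// 'binary' (an alias for 'latin1') maps each character back to a single byte,
// so the Buffer holds the raw file contents; the filenames come from the question.
fs.writeFile('111.cer', Buffer.from(dataCer, 'binary'), (err) => {
  if (err) throw err;
});
fs.writeFile('111.pdf', Buffer.from(dataPdf, 'binary'), (err) => {
  if (err) throw err;
});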
I have an application that streams data to a file. Can I use Node.js to read the file while it's being written to?
I tried using createReadStream, but it only read one chunk and then the stream ended.
You could try watching for file changes with fs.watchFile(filename[, options], listener) or node-watch. On each file change you could then read just the last line with read-last-lines.
Although I'm not sure how efficient it would be.
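A minimal sketch of that suggestion, assuming a file called output.log, a 500 ms polling interval, and that read-last-lines exposes a promise-based read(path, lineCount) (the filename and interval are placeholders):

const fs = require('fs');
const readLastLines = require('read-last-lines');

// Poll the file for changes and read its last line whenever it grows.
fs.watchFile('output.log', { interval: 500 }, (curr, prev) => {
  if (curr.size <= prev.size) return;
  readLastLines.read('output.log', 1).then((line) => {
    console.log(line.trim());
  });
});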
I'm trying to read the contents of some .gz log files using streams in node.
I've started simply with: fs.createReadStream('log.gz').pipe(zlib.createUnzip()).
This works, and I can pipe to process.stdout to verify. I'd like to pipe this into a new writable stream so that I can handle a data event and actually work with the contents. I guess I just don't fully understand how streams work. I tried just creating a new writable stream, var writer = fs.createWriteStream(), but this doesn't work because it requires a path.
Any ideas how I can go about doing this (without creating any other files to write to)?
var unzipStream = zlib.createUnzip()
unzipStream.on('data', myDataHandler).on('end', myEndHandler)
fs.createReadStream('log.gz').pipe(unzipStream)
That will get you the data and end events. Since it's log data, you may also find the split module useful for getting events line by line.
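For example, a sketch that pipes the unzipped stream through split so each data event is one line (log.gz comes from the question; the rest is an assumption):

const fs = require('fs');
const zlib = require('zlib');
const split = require('split');

fs.createReadStream('log.gz')
  .pipe(zlib.createUnzip())
  .pipe(split())                  // emits one data event per line
  .on('data', (line) => {
    console.log(line);            // work with each decompressed log line here
  })
  .on('end', () => {
    console.log('finished reading log.gz');
  });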