Read a file in a Vue template - node.js

I need to read a file from the system using Vue.js. Something like:
import fs from "fs";
// or
var fs = require("fs");
fs.readFile("path");
I get an error when importing fs. What am I doing wrong?

I'm using Axios for this purpose, like this:
axios.get('/' + 'settings.json').then(response => {
  console.log(response.data);
  // do anything you want with response.data (the content of the file)
})
Your file must be in the public folder.
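If you'd rather not depend on axios, the browser's built-in fetch API does the same job. A minimal sketch, assuming settings.json sits in your public folder:
fetch('/settings.json')
  .then(response => response.json())
  .then(settings => {
    console.log(settings); // the parsed contents of public/settings.json
  });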

Related

How to store an image from a URL to a server folder using Node.js?

I am new to Node.js and am trying to practice uploading a file from a URL, like this:
I have a URL where an image is hosted. I want to save that image to my server without downloading it anywhere else first. How can I do that?
I tried the node-fetch package to fetch the image as a buffer from the URL and then write it on my server using createWriteStream, but it failed.
Please help me; I don't know the correct way to upload the file.
What I have tried:
import { createWriteStream } from 'fs';
import { pipeline } from 'stream';
import { promisify } from 'util';
import fetch from 'node-fetch';
const streamPipeline = promisify(pipeline);
const response = await fetch('http://dummy.com/5.png');
if (!response.ok) throw new Error(`unexpected response ${response.statusText}`);
// Stream the response body straight into a file on the server.
await streamPipeline(response.body, createWriteStream('./img.png'));
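If the streaming version keeps failing, a simpler (if less memory-friendly) alternative is to buffer the whole image and write it in one call. A minimal sketch, assuming the same node-fetch ESM setup and placeholder URL as above:
import { writeFile } from 'fs/promises';
import fetch from 'node-fetch';

const response = await fetch('http://dummy.com/5.png');
if (!response.ok) throw new Error(`unexpected response ${response.statusText}`);

// Read the whole image into memory, then write it to disk in one call.
// Fine for small images; prefer the streaming version for large files.
const buffer = Buffer.from(await response.arrayBuffer());
await writeFile('./img.png', buffer);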

Node Express Fast CSV download to client

I've set up a small Node.js back-end app built with Express and the fast-csv module on top of it. The desired outcome is to download a CSV file to the client without storing it anywhere on the server, since the data is generated according to user criteria.
So far I've gotten somewhere with it. I'm using streams, since the CSV file could be pretty large depending on the user's selection. I'm pretty sure something is missing in the code below:
const fs = require('fs');
const fastCsv = require('fast-csv');
.....
(inside api request)
.....
router.get('/', async (req, res) => {
  const gatheredData ...
  const filename = 'sometest.csv'
  res.writeHead(200, {
    'Content-Type': 'text/csv',
    'Content-Disposition': 'attachment; filename=' + filename
  })
  const csvDataStream = fastCsv.write(data, {headers: true}).pipe(res)
})
The above code 'works' in a way, in that it does deliver a response, but not an actual file download: just the contents of the CSV, which I can view in the preview tab of the response. To sum up, I'm trying to stream that data into a CSV and push it to the client as a file download, without storing it on the server. Any tips or pointers are very much appreciated.
Here's what worked for me after creating a CSV file on the server using the fast-csv package. You need to specify the full, absolute directory path where the output CSV file was created:
const csv = require("fast-csv");
const csvDir = "abs/path/to/csv/dir";
const filename = "my-data.csv";
const csvOutput = `${csvDir}/${filename}`;
console.log(`csvOutput: ${csvOutput}`); // full path
/*
CREATE YOUR csvOutput FILE USING 'fast-csv' HERE
*/
res.type("text/csv");
res.header("Content-Disposition", `attachment; filename="${filename}"`);
res.header("Content-Type", "text/csv");
res.sendFile(filename, { root: csvDir });
Make sure to change the response Content-Type header to "text/csv", and try enclosing the filename=... part in double quotes, as in the example above.
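If you'd rather skip the temporary file entirely (the original goal), a minimal sketch that streams fast-csv straight into the response could look like the following; gatherData is a hypothetical helper standing in for however you build the rows per request:
const fastCsv = require('fast-csv');

router.get('/', async (req, res) => {
  const rows = await gatherData(req.query); // hypothetical: build the data per request
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="sometest.csv"');
  fastCsv.write(rows, { headers: true }).pipe(res); // stream rows straight to the client
});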

Uncaught TypeError: fs.readFile is not a function

Node.js, Webpack
In this project I'm using webpack, with fs installed.
This code needs to read a file, but it returns the error "Uncaught TypeError: fs.readFile is not a function":
const bookForm = document.querySelector(".book-form");
const select = document.querySelector(".select");
const fs = require("fs");

export function abc() {
  bookForm.addEventListener("submit", e => {
    console.log(select.options[select.selectedIndex].text);
    e.preventDefault();
    fs.readFile("file.txt", function(error, data) {
      console.log("file read");
      if (error) throw error;
      console.log(data);
    });
  });
}
You cannot import the fs module in the browser, because the browser environment does not have access to the user's file system. fs is only available in the Node.js context (on the server) but not on the client (browser).
If you want to send files from the browser to the server, you can use <input type="file"> and let the user manually select the files they want to send. If you want to send a server file's contents to the browser, you can use HTTP communication (AJAX) or render its content in a server-side computed HTML template.
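For the first case, here is a minimal sketch of reading a user-selected file in the browser with FileReader (no fs involved; the selector assumes a matching <input type="file"> in your markup):
const input = document.querySelector('input[type="file"]');
input.addEventListener('change', () => {
  const reader = new FileReader();
  reader.onload = () => {
    console.log(reader.result); // text content of the file the user picked
  };
  reader.readAsText(input.files[0]);
});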
Within your webpack config you can set the way assets like your txt file are loaded. Then you simply use require('file.txt') to load it - no need to use fs.
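For example, with webpack 5 you could mark .txt files as asset/source (older setups used raw-loader). A minimal sketch of the relevant webpack.config.js rule, assuming file.txt lives next to the module that requires it:
// webpack.config.js
module.exports = {
  // ...the rest of your config
  module: {
    rules: [
      { test: /\.txt$/, type: 'asset/source' } // import the raw text as a string
    ]
  }
};

// in your code:
const text = require('./file.txt');
console.log(text);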

How to download a .gz file with Node.js without any third party libraries

I simply want to download a .gz file from a URL and save it in a folder. I would like to do this without any third party libraries if possible. Here's what I have so far, but it only downloads an empty file:
const fs = require('fs')
const https = require('https')
let file = fs.createWriteStream('./folder/filename.gz')
let request = https.get('https://someurl/somefile.gz', function(res) {
res.pipe(file)
})
You can try this, using the http module for Node.js. It looks similar to downloading any other file; just remember to include the extension of the downloaded file in the destination filename. Here is an example:
NOTE: If you are trying to download from an HTTPS link, use the https module instead. It works exactly the same; just replace all http in the following code with https.
const http = require('http');
const fs = require('fs');

// I added './' assuming that you want to download it where the server
// file is located; just change it to your desired path, followed by the
// filename and the EXTENSION.
const file = fs.createWriteStream("./result.tar.gz");
const request = http.get("http://alpha.gnu.org/gnu/gzip/gzip-1.3.6.tar.gz", (response) => {
  response.pipe(file);
});
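One common reason for an empty file is that https.get does not follow redirects and nothing waits for the write stream to finish. A minimal sketch of the same idea with a status-code check and a 'finish' handler, using the original URL and path as placeholders:
const fs = require('fs');
const https = require('https');

const file = fs.createWriteStream('./folder/filename.gz');
https.get('https://someurl/somefile.gz', (response) => {
  if (response.statusCode !== 200) {
    // a 301/302 here means the URL redirects; https.get will not follow it
    console.error(`download failed with status ${response.statusCode}`);
    response.resume(); // discard the body
    return;
  }
  response.pipe(file);
  file.on('finish', () => {
    file.close(); // the file is only complete once 'finish' has fired
    console.log('download complete');
  });
});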

How to dynamically import data in a nodejs app?

I would like to use require in a Node/Express app with TypeScript to import a JSON file. I tried it like this:
const url = `./data/${resource}.json`;
const data = require(url);
but I get the error Cannot find module './data/my-data.json'.
I'd like to use require instead of an import in order to create the data variable dynamically depending on the value of the resource variable.
const path = require('path');
const url = path.resolve(__dirname, `./data/${resource}.json`);
const data = require(url);
The require keyword is special in Node.js. It is used to load modules, and since your JSON file is not being resolved as a module here, you get the error. Try this instead; this way you can dynamically load your JSON:
import fs from 'fs';
const file = fs.readFileSync(`./data/${resource}.json`).toString();
const data = JSON.parse(file);
There may be better ways to write this; read more about the fs module here.
Edit: As someone has already pointed out, it is actually possible to dynamically require a JSON file. Here's how:
import path from 'path';
const uri = path.resolve(__dirname, `<path_to_json_file>`);
const data = require(uri);
However, as standard practice, use the fs module to load static assets in your project.
import fs from 'fs';
const file = fs.readFileSync(`./data/${resource}.json`).toString();
const data = JSON.parse(file);
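For context, either variant usually sits inside a route handler where resource comes from the request. A minimal sketch with illustrative route and parameter names (validate the parameter in real code to avoid path traversal):
const path = require('path');
const express = require('express');

const router = express.Router();

router.get('/api/:resource', (req, res) => {
  // e.g. GET /api/my-data loads ./data/my-data.json
  const uri = path.resolve(__dirname, `./data/${req.params.resource}.json`);
  const data = require(uri);
  res.json(data);
});

module.exports = router;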
