I have a couple of images in an images/source directory at the root level of my project. I want to read all the images in that folder and get their paths into an array like the one below:
var images = ['images/source/first.jpg', 'images/source/second.png', 'images/source/third.png'];
I am trying to read the directory like below, but it's not working:
const imgPath= fs.readdirSync(path.join(__dirname, '/images/source')).sort();
The above code reads all the image names, but I need each image with its path in an array.
Have a look at this question; it may help:
How do you get a list of the names of all files present in a directory in Node.js?
const testFolder = './tests/';
const fs = require('fs');
fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
  });
});
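For the original question, the file names returned by readdirSync just need to be mapped onto their directory. A minimal sketch of that idea, using the same folder as the question (path.posix.join keeps forward slashes on every platform):
const fs = require('fs');
const path = require('path');
// Read the file names, sort them, and prefix each one with its directory
// so the result looks like ['images/source/first.jpg', ...]
const images = fs
  .readdirSync(path.join(__dirname, 'images/source'))
  .sort()
  .map(file => path.posix.join('images/source', file));
console.log(images);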
I have several files in Google Cloud Storage named 0.jpg, 1.jpg, 2.jpg, etc. I want to get the metadata for each file without setting the file names one by one, and then send that metadata to a React application. In the React application, when someone clicks on an image, a popup displays the metadata for the clicked image.
For only one file, I used the following code:
const express = require("express");
const cors = require("cors");
// Imports the Google Cloud client library
const { Storage } = require("@google-cloud/storage");
const bucketName = "bitirme_1";
const filename = "detected/0.jpg";
const storage = new Storage();
const app = express();
app.get("/api/metadata", cors(), async (req, res, next) => {
try {
// Gets the metadata for the file
const [metadata] = await storage
.bucket(bucketName)
.file(filename)
.getMetadata();
const metadatas = [
{id: 0, name: `Date: ${metadata.updated.substring(0,10)}, Time: ${metadata.updated.substring(11,19)}`},
{id: 1, name: metadata.contentType}
];
res.json(metadatas);
} catch (e) {
next(e);
}
});
const port = 5000;
app.listen(port, () => console.log(`Server started on port ${port}`));
I first set the bucket name. Then I set filename to an array (89 is the number of files):
const filename = Array(89).fill(1).map((_, i) => ('detected/' + i + '.jpg'));
These files are in the detected folder. When I try this, it gives me this error:
Error: No such object: bitirme_1/detected/0.jpg, detected/1.jpg, detected/2.jpg, detected/3.jpg, detected/4.jpg,detected/5.jpg,detected/6.jpg, ....
How can I solve this problem of getting metadata for multiple files?
Also, I want to get the number of files in the bucket (or in the detected folder). I searched the API but could not find anything for this. I do not want to hard-code the total number of files as 89; I want to get it from the API.
I found a solution for counting the files in a bucket, or in a folder inside a bucket:
const [files] = await storage.bucket(bucketName).getFiles();
const fileStrings = files.map(file => file.name);
// 'detected/' is 9 characters long, so the number starts at index 9
const fileSliced = fileStrings.map(el => el.slice(9, 11));
for (let i = 0; i < fileSliced.length; i++) {
  if (fileSliced[i].includes('.')) {
    fileSliced[i] = fileSliced[i].slice(0, 1);
  }
}
const fileNumbers = fileSliced.map(item => parseInt(item, 10));
const numOfFiles = Math.max(...fileNumbers) + 1;
console.log(numOfFiles);
First, I get all the files, with their names, in a string array. In my case the file names are detected/0.jpg, detected/1.jpg, detected/2.jpg, etc. I only want the number part of each name, so I slice each string from index 9 up to index 11 (not included). That gives me the numbers, except that one-digit numbers come out with an extra character.
To handle the one-digit case, where the sliced name ends with a '.', I also strip the '.' from those names.
As a result, I get ['0', '1', '2', '3', ...]. Next, I convert this string array to a number array using parseInt. Finally, to get the number of files, I take the maximum of the array and add 1.
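As for the original problem of getting metadata for multiple files, getMetadata has to be called per file rather than with an array of names. A minimal sketch of one way to do it, reusing the same bucketName and the detected/ prefix (the function name is just for illustration):
const { Storage } = require("@google-cloud/storage");
const storage = new Storage();
const bucketName = "bitirme_1";
async function getAllMetadata() {
  // List only the objects under the 'detected/' prefix
  const [files] = await storage.bucket(bucketName).getFiles({ prefix: "detected/" });
  // Fetch the metadata for every file in parallel
  return Promise.all(
    files.map(async (file) => {
      const [metadata] = await file.getMetadata();
      return { name: file.name, updated: metadata.updated, contentType: metadata.contentType };
    })
  );
}
Counting the files then comes for free: it is just files.length, with no need to parse the numbers out of the file names.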
I have an image detail component that shows the sender's IP location, a download option, and a control for closing the popup. This detail popup opens at /#i, where i is the name of the image file, such as 1.jpg or 2.jpg. So, for example, when I click the first image the popup opens at /#1. In this popup I want to show the metadata for the opened image, but I could not find a solution for this.
I've set up a small Node.js back-end app, built with Express and the fast-csv module on top of it. The desired outcome is to download a CSV file to the client side without storing it anywhere on the server, since the data is generated depending on user criteria.
So far I've gotten somewhere with it. I'm using streams, since the CSV file could be pretty large depending on the user's selection. I'm pretty sure something is missing in the code below:
const fs = require('fs');
const fastCsv = require('fast-csv');
.....
(inside api request)
.....
router.get('/', async (req, res) => {
  const gatheredData = ... // built from the user's criteria
  const filename = 'sometest.csv';
  res.writeHead(200, {
    'Content-Type': 'text/csv',
    'Content-Disposition': 'attachment; filename=' + filename
  });
  fastCsv.write(gatheredData, { headers: true }).pipe(res);
});
The above code 'works' in a way: it does deliver a response, but not an actual file download, just the contents of the CSV, which I can see in the preview tab of the response. To sum up, I'm trying to stream the data into a CSV and push it to the client as a file download, without storing it on the server. Any tips or pointers are very much appreciated.
Here's what worked for me after creating a CSV file on the server using the fast-csv package. You need to specify the full, absolute directory path where the output CSV file was created:
const csv = require("fast-csv");
const csvDir = "abs/path/to/csv/dir";
const filename = "my-data.csv";
const csvOutput = `${csvDir}/${filename}`;
console.log(`csvOutput: ${csvOutput}`); // full path
/*
CREATE YOUR csvOutput FILE USING 'fast-csv' HERE
*/
res.type("text/csv");
res.header("Content-Disposition", `attachment; filename="${filename}"`);
res.header("Content-Type", "text/csv");
res.sendFile(filename, { root: csvDir });
Make sure to change the response Content-Type header to "text/csv", and try enclosing the filename=... part in double quotes, as in the example above.
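If you want to skip writing the file to the server entirely, as the question asks, another option is to pipe the fast-csv stream straight into the response. A minimal sketch, assuming gatheredData is an array of row objects and buildRows is a hypothetical helper that gathers the data:
const fastCsv = require('fast-csv');
router.get('/', async (req, res) => {
  const gatheredData = await buildRows(req.query); // hypothetical data-gathering helper
  res.setHeader('Content-Type', 'text/csv');
  res.setHeader('Content-Disposition', 'attachment; filename="sometest.csv"');
  // fast-csv produces a readable stream, so nothing is ever stored on the server
  fastCsv.write(gatheredData, { headers: true }).pipe(res);
});
Also note that Content-Disposition only triggers a save dialog when the browser navigates to the URL directly (for example via a plain link); a response fetched with XHR/fetch just shows up in the dev-tools preview tab, which matches the behaviour described in the question.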
I've got multiple JSON files in a directory that will be updated dynamically by users. The users can add categories, which creates new JSON files in that directory, and they can also remove categories, which deletes JSON files from it. I'm looking for a method to read all the JSON files in that directory and push them into a single array of objects. I imagine doing it asynchronously would be desirable too.
I'm very new to using fs. I've managed to read a single JSON file by path using:
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories/category1.json');
let categories = JSON.parse(data);
console.log(categories);
But of course this only covers what I could otherwise have done with require(): reading a single, known file.
As I'll have no idea which JSON files the directory will contain, since the users also name them, I need a way to read all the JSON files simply by pointing at the folder that holds them.
I'm imagining something like this (which is obviously foolish):
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories');
let categories = JSON.parse(data);
console.log(categories);
What would be the best approach to achieve this?
Thanks in advance.
First of all, you need to scan the directory for files, then filter down to just the JSON files, and finally read and parse each one, collecting the results into a single array:
const fs = require('fs');
const path = require('path');

const dirPath = './sw_lbi/categories';
// Keep only the .json files in the directory
const jsonsInDir = fs.readdirSync(dirPath).filter(file => path.extname(file) === '.json');

// Parse every file and collect the results into a single array of objects
const categories = jsonsInDir.map(file => {
  const fileData = fs.readFileSync(path.join(dirPath, file));
  return JSON.parse(fileData.toString());
});
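Since the question says an asynchronous approach would be desirable, here is a minimal async sketch of the same idea using fs.promises (the directory path is the one from the question):
const fsp = require('fs').promises;
const path = require('path');

async function readCategories(dirPath) {
  const files = await fsp.readdir(dirPath);
  const jsonFiles = files.filter(file => path.extname(file) === '.json');
  // Read and parse all the files in parallel, producing one array of objects
  return Promise.all(
    jsonFiles.map(async file => JSON.parse(await fsp.readFile(path.join(dirPath, file), 'utf8')))
  );
}

readCategories('./sw_lbi/categories').then(categories => console.log(categories));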
I am using Node.js to download Azure Blob Storage files to my local machine. I can download a blob into my project path, but not anywhere else on the machine. I am using HTML, Express, and Node.js, currently working on localhost only. How do I download it?
Below is the code that I am using to download a blob file to a local folder:
app.get("/downloadImage", function (req, res) {
var fileName = req.query.fileName;
var downloadedImageName = util.format('CopyOf%s', fileName);
blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
});
});
I am able to download it to my project folder, but I want it to go to my Downloads folder. Please help me with this.
To download a file from Azure Blob Storage:
connectionString: connection string to the storage account
blobContainer: blob container name
sourceFile: name of the file in the container, e.g. sample-file.zip
destinationFilePath: path to save the file, e.g. ${appRoot}/download/${sourceFile}
const azure = require('azure-storage');

async function downloadFromBlob(connectionString, blobContainer, sourceFile, destinationFilePath) {
  logger.info('Downloading file from blob'); // 'logger' is assumed to be your app's logger
  const blobService = azure.createBlobService(connectionString);
  return new Promise((resolve, reject) => {
    blobService.getBlobToLocalFile(blobContainer, sourceFile, destinationFilePath, (error, serverBlob) => {
      if (!error) {
        logger.info(`File downloaded successfully. ${destinationFilePath}`);
        resolve(serverBlob);
      } else {
        logger.info(`An error occurred while downloading the file. ${error}`);
        reject(error);
      }
    });
  });
}
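A call might then look like this (the connection string, container, and paths are placeholders):
const appRoot = process.cwd();

downloadFromBlob(
  'DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...',
  'my-container',
  'sample-file.zip',
  `${appRoot}/download/sample-file.zip`,
).then(() => console.log('done'));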
According to the reference for the method blobService.getBlobToLocalFile, as below, the value of the parameter localFileName should be a local file path that includes the directory:
localFileName string The local path to the file to be downloaded.
So I created a directory named downloadImages and changed your code as below.
var downloadDirPath = 'downloadImages'; // or an absolute dir path like 'D:/downloadImages'
app.get("/downloadImage", function (req, res) {
  var fileName = req.query.fileName;
  var downloadedImageName = util.format('%s/CopyOf%s', downloadDirPath, fileName);
  blobService.getBlobToLocalFile(containerName, fileName, downloadedImageName, function (error, serverBlob) {
  });
});
It works for me: the image file is downloaded into my downloadImages directory, not just under the path where node app.js was started.
Note: if you deploy this to an Azure Web App later, you must use an absolute directory path like D:/home/site/wwwroot/<your directory for downloaded images>, because a relative directory path is always resolved against the path from which IIS started Node.
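One caveat: getBlobToLocalFile always saves on the machine where Node is running, so on a deployed server it can never reach the user's Downloads folder. To have the browser save the file, the blob has to be streamed through the HTTP response instead. A minimal sketch using getBlobToStream from the same azure-storage SDK, reusing the blobService and containerName from the question:
app.get("/downloadImage", function (req, res) {
  var fileName = req.query.fileName;
  res.setHeader('Content-Disposition', 'attachment; filename="CopyOf' + fileName + '"');
  // Pipe the blob into the response; the browser then saves it to Downloads
  blobService.getBlobToStream(containerName, fileName, res, function (error) {
    if (error) {
      console.error('Download failed:', error.message);
      res.end();
    }
  });
});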
I want to have persistent memory (storing the user's progress) in a .json file in %AppData%. I tried doing this according to this post, but it doesn't work. For testing purposes I'm only storing one object.
The code below doesn't work at all. If I use fs.open(filePath, "w", function(err, data) { ... instead of readFile(..., it does create a JSON file in %AppData%, but then nothing ever gets written to it; the file is always 0 bytes.
var nw = require('nw.gui');
var fs = require('fs');
var path = require('path');
var file = "userdata.json";
var filePath = path.join(nw.App.dataPath, file);
console.log(filePath); // <- This shows correct path in Application Data.
fs.readFile(filePath, function (err, data) {
  var idVar = "1";
  var json = JSON.parse(data);
  json.push("id :" + idVar);
  fs.writeFile(filePath, JSON.stringify(json));
});
If anyone has any idea where I'm messing this up, I'd be grateful.
EDIT:
Solved, thanks to kailniris.
I was simply trying to parse an empty file.
There is no JSON in the file you are trying to read. Before parsing the data, check whether the file is empty: if it is, create an empty JSON array, push the new data into it, and write it back to the file; otherwise, parse the JSON that is already there.
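A minimal sketch of that check, reusing filePath and idVar from the question (JSON.parse throws on an empty string, so the empty and missing cases fall back to a fresh array):
fs.readFile(filePath, 'utf8', function (err, data) {
  var idVar = "1";
  // Start from an empty array when the file is missing or empty
  var json = (!err && data.trim().length > 0) ? JSON.parse(data) : [];
  json.push("id: " + idVar);
  fs.writeFile(filePath, JSON.stringify(json), function (writeErr) {
    if (writeErr) console.error(writeErr);
  });
});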