Node.js: unzip a local file and delete it

I have a problem with Node.js. I have a script that downloads some zip files from an SFTP server; each zip file is a compressed CSV file. My task is to download each file, unzip it, and delete the zip.
I already have a working script that downloads all the files from the SFTP server. Now I would like to add a function that unzips all the files and keeps only the CSVs.
To do that I started working on a local script that opens a single file directly and tries to unzip it, but I can't figure out how.
This is the portion of code I wrote; once it works I would like to move it into a helper class that I can call from my script after the SFTP download completes.
Can anyone help me understand what I am doing wrong?
const logger = require("./utils/logger");
const path = require("path");
const fs = require("fs");
const unzipper = require("unzipper");
const { LOCAL_IN_DIR_PATH } = require("./utils/consts");
const unzipAll = async (pathToSearch) => {
  console.log("unzipAll");
  try {
    const compFiles = fs.readdirSync(pathToSearch).forEach(function (file) {
      if (file.endsWith(".zip")) {
        const path = LOCAL_IN_DIR_PATH + `/${file}`;
        fs.createReadStream(path).pipe(
          unzipper.Extract({ path: path })
        );
      }
    });
  } catch (err) {
    console.log(err);
  }
};
const run = async () => {
  try {
    const LOCAL_IN_DIR_PATH = path.resolve(__dirname, "IN");
    const result = await unzipAll(LOCAL_IN_DIR_PATH);
    console.log(result);
  } catch (error) {
    console.log(error);
  }
};
run();
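
For reference, here is a minimal sketch of how the helper could look once fixed, assuming the same unzipper module. The original code shadows the imported path module with a local const path, points the Extract target at the zip file itself instead of a directory, and never awaits the extraction or deletes the archive:

const path = require("path");
const fs = require("fs");
const unzipper = require("unzipper");

// Sketch: extract every .zip found in pathToSearch, then delete the archive.
const unzipAll = async (pathToSearch) => {
  const zips = fs.readdirSync(pathToSearch).filter((f) => f.endsWith(".zip"));
  for (const file of zips) {
    const zipPath = path.join(pathToSearch, file); // avoid shadowing the path module
    await fs
      .createReadStream(zipPath)
      .pipe(unzipper.Extract({ path: pathToSearch })) // extract into the folder, not onto the zip
      .promise(); // unzipper's Extract stream exposes promise() for completion
    fs.unlinkSync(zipPath); // delete the zip once its CSV has been extracted
  }
};

Called as await unzipAll(path.resolve(__dirname, "IN")), this leaves only the extracted CSVs in the IN directory.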

Related

How to pipe a file during download with Puppeteer?

Is it possible to pipe a file while downloading it with Puppeteer?
The code attached below shows a download with Puppeteer, while the second part shows how I extract a file during download.
I want to include part 2 in part 1 somehow.
const page = await browser.newPage(); // skipped other configs
const client = await page.target().createCDPSession(); // set directory of files
await client.send("Page.setDownloadBehavior", {
  behavior: "allow",
  downloadPath: process.cwd() + "\\src\\tempDataFiles\\rami",
});
// array of links from page
const fileUrlArray = await page.$$eval("selector", (files) => {
  return files.map((link) => link.getAttribute("href"));
});
// download files
const filtredFiles = fileUrlArray.filter((url) => url !== null);
for (const file of filtredFiles) {
  await page.click(`[href="${file}"]`);
}
This code works perfectly, but the files are zip files and I want to extract them before saving.
When I download a file without Puppeteer, the extraction looks like the next snippet.
(In this case I'm not able to use HTTPS requests yet due to lack of knowledge.)
The code that unzips the file directly during a download without Puppeteer (simple HTTP request):
const file = fs.createWriteStream(`./src/tempDataFiles/${store}/${fileName}.xml`);
const request = http.get(url, function (response) {
  response.pipe(zlib.createGunzip()).pipe(file);
  file.on("error", async (e) => {
    Log.error(`downloadAndUnZipFile error : ${e}`);
    await fileOnDb.destroy();
    file.end();
  });
  file.on("finish", () => {
    Log.success(`${fileName} download Completed`);
    file.close();
  });
});
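
One way to combine the two parts, sketched under the assumption that Puppeteer has finished writing the gzipped files into the downloadPath configured above, and that Node 15+ is available for stream/promises (waiting for downloads to complete is a separate concern, e.g. polling until Chromium's temporary .crdownload files disappear):

const fs = require("fs");
const path = require("path");
const zlib = require("zlib");
const { pipeline } = require("stream/promises");

// Hypothetical post-processing step: gunzip each downloaded archive, then remove it.
async function extractDownloads(downloadDir) {
  for (const name of fs.readdirSync(downloadDir)) {
    if (!name.endsWith(".gz")) continue;
    const src = path.join(downloadDir, name);
    const dest = path.join(downloadDir, name.replace(/\.gz$/, ""));
    await pipeline(
      fs.createReadStream(src),
      zlib.createGunzip(),
      fs.createWriteStream(dest)
    );
    fs.unlinkSync(src); // drop the archive once extracted
  }
}

Calling await extractDownloads(process.cwd() + "\\src\\tempDataFiles\\rami") after the click loop reuses the zlib pipeline from part 2 without having to intercept Puppeteer's download stream.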

Error uploading image to Firebase Storage in React Native [Firebase 9.6.2]

I'm trying to upload an image to Firebase Storage (a local photo from the source file directory, the same path as the .js file). The problem is that in Firebase Storage the image appears to be corrupted: every picture is 9 bytes. Authentication and Firestore work perfectly with my configuration file. This is the code:
const uploadPhoto = async () => {
  // console.log(image);
  // const uploadUri = image;
  // let filename = uploadUri.substring(uploadUri.lastIndexOf('/') + 1);
  const metadata = {
    // path: '../../firebase_image.jpg',
    contentType: 'image/jpeg'
  };
  const photog = `./photo.jpg`;
  console.log(photog);
  // console.log(uploadUri); // uploadUri is commented out above
  const storageRef = ref(storage, 'photogra.jpg'); // filename+
  uploadBytes(storageRef, photog, metadata).then((snapshot) => {
    console.log('Uploaded a blob or file!');
  });
}
I got this approach from a tutorial and it worked fine for me:
https://www.youtube.com/watch?v=H-yXO46WDak&lc=z22ph5dhssfqufkcxacdp430segloszlmvuqlp1seplw03c010c
Try this:
const uploadImageFirebase = async () => {
  const nameImage = new Date().toISOString();
  const img = await fetch(image);   // image: local URI of the picture
  const bytes = await img.blob();   // real binary contents, not a path string
  try {
    await uploadBytes(ref_con, bytes); // ref_con: a storage ref from ref(storage, ...)
  } catch (error) {
    console.log(error);
  } finally {
    //
  }
};
If you check the contents of your 9-byte file, it will likely be the string "photo.jpg".
Since you're passing "photo.jpg" to uploadBytes, it uploads that string as the contents of the new file. It does not know how to load the file at that path.
You will either need to pass a local File or Buffer (which you'll usually get from a file picker or something like that), or load the data from the file yourself and pass the contents to uploadBytes.
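
A minimal sketch of that second suggestion, assuming the image's local URI is in a hypothetical imageUri variable (fetch and blob() are available in React Native, as in the answer above):

const response = await fetch(imageUri); // imageUri: local file URI (hypothetical)
const bytes = await response.blob();    // actual binary data, not a path string
await uploadBytes(ref(storage, 'photo.jpg'), bytes);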

Directory creation not working with the basic-ftp module in Node.js

I am trying to upload files to an FTP server, but only a few entries go through; the rest are skipped, and no error is generated. I don't know where exactly I am going wrong. Is it a synchronization issue or an issue with the package itself? I even tried the jsftp package, which can also put a buffer on the server, but it does not work as expected either. Below are my code and output.
const unzipper = require("unzipper");
const ftp = require("basic-ftp");

const client = new ftp.Client();
await client.access({ ...options });
const zip = fs
  .createReadStream(path.join(filePath, `code.zip`))
  .pipe(unzipper.Parse({ raw: true, forceStream: true }));
for await (const entry of zip) {
  await client.cd("/");
  const type = entry.type; // 'Directory' or 'File'
  const size = entry.vars.uncompressedSize; // There is also compressedSize
  let fileArray = entry.path.split("/");
  if (size > 0) {
    let fileName = fileArray.pop();
    let dir = fileArray.splice(1).join("/");
    await client.uploadFrom(entry, dir + "/" + fileName);
  }
  if (type === 'Directory') {
    let dir = fileArray.splice(1).join("/");
    await client.ensureDir(`${dir}`);
    // await client.clearWorkingDir();
  }
}
console.log("Entry Read Finished");

The zip contains:

.gitignore
LICENSE
README.md
app/
app/bootstrap.php
app/config.php
composer.json
console.php
src/
src/SuperBlog/
src/SuperBlog/Command/
src/SuperBlog/Command/ArticleDetailComm
src/SuperBlog/Controller/
src/SuperBlog/Controller/ArticleControl
src/SuperBlog/Controller/HomeController
src/SuperBlog/Model/
src/SuperBlog/Model/Article.php
src/SuperBlog/Model/ArticleRepository.p
src/SuperBlog/Persistence/
src/SuperBlog/Persistence/InMemoryArtic
src/SuperBlog/Views/
src/SuperBlog/Views/article.twig
src/SuperBlog/Views/home.twig
src/SuperBlog/Views/layout.twig
web/
web/.htaccess
web/index.php
Console output:

Creating Directory /
Uploading .gitignore
File: '' '.gitignore'
Uploading LICENSE
File: '' 'LICENSE'
Uploading README.md
File: '' 'README.md'
Creating Directory /app/
Uploading bootstrap.php
File: 'app' 'bootstrap.php'
Uploading config.php
File: 'app' 'config.php'
Entry Read Finished
Can anyone suggest what is wrong with the code? The zip is perfectly fine; there is no error with it.
You can simplify this using basic-ftp's uploadFromDir, which will automatically create the required folder structure and upload the file contents. Here's a working example (tested successfully against the public FTP server https://dlptest.com/ftp-test/):
const unzipper = require("unzipper");
const ftp = require("basic-ftp");
const fs = require('fs');
const path = require('path');

(async () => {
  try {
    await fs.createReadStream(path.resolve('./testzip.zip'))
      .pipe(unzipper.Extract({ path: './tmp-extract' }))
      .promise();
    const client = new ftp.Client();
    await client.access({
      host: "<host>",
      user: "<user>",
      password: "<password>",
    });
    await client.uploadFromDir("./tmp-extract");
    console.log("upload successful");
  } catch (err) {
    console.log(err);
  }
})();
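
If you also want to clean up afterwards, two small additions seem reasonable (a sketch, assuming Node 14.14+ where fs.rmSync supports recursive removal; client.close is basic-ftp's own method):

await client.uploadFromDir("./tmp-extract");
client.close(); // release the control connection
fs.rmSync("./tmp-extract", { recursive: true, force: true }); // drop the temp extraction dir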

How to read file from createReadStream in Node.js?

I have a web application that can upload an Excel file. When a user uploads one, the app should parse it and return some of the rows it contains. So the application doesn't need to save the file to its filesystem; parsing the file and returning the rows is the whole job. But the code below, which I wrote this morning, saves the file to the server and then parses it. I think that wastes server resources.
I don't know how to read an Excel file with createReadStream. How can I parse the Excel file directly, without saving it? I am not familiar with fs; of course I could delete the file after the job finishes, but is there a more elegant way?
import { createWriteStream } from 'fs'
import path from 'path'
import xlsx from 'node-xlsx'

// some graphql code here...
async singleUpload(_, { file }, context) {
  try {
    console.log(file)
    const { createReadStream, filename, mimetype, encoding } = await file
    await new Promise((res) =>
      createReadStream()
        .pipe(createWriteStream(path.join(__dirname, '../uploads', filename)))
        .on('close', res)
    )
    const workSheetsFromFile = xlsx.parse(path.join(__dirname, '../uploads', filename))
    for (const row of workSheetsFromFile[0].data) {
      console.log(row)
    }
    return { filename }
  } catch (e) {
    throw new Error(e)
  }
},
Using the express-fileupload library, which provides a buffer representation of uploaded files (through the data property), combined with exceljs, which accepts buffers, will get you there.
See express-fileupload and ExcelJS:
// read from a file
const workbook = new Excel.Workbook();
await workbook.xlsx.readFile(filename);
// ... use workbook

// read from a stream
const workbook = new Excel.Workbook();
await workbook.xlsx.read(stream);
// ... use workbook

// load from buffer // this is what you're looking for
const workbook = new Excel.Workbook();
await workbook.xlsx.load(data);
// ... use workbook
Here's a simplified example:

const app = require('express')();
const fileUpload = require('express-fileupload');
const { Workbook } = require('exceljs');

app.use(fileUpload());

app.post('/', async (req, res) => {
  if (!req.files || Object.keys(req.files).length === 0) {
    return res.status(400).send('No files were uploaded.');
  }
  // The name of the input field (i.e. "myFile") is used to retrieve the uploaded file
  const workbook = new Workbook();
  await workbook.xlsx.load(req.files.myFile.data);
  // ... use the workbook, then respond
  res.sendStatus(200);
});

app.listen(3000)
var xlsx = require('xlsx')
// var workbook = xlsx.readFile('testSingle.xlsx')
var workbook = xlsx.read(fileObj);

You just need to use the xlsx.read method, which reads data you already have in memory (such as a buffer) instead of a file path.
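
A hedged completion of that idea, assuming the upload has first been collected into a buffer (xlsx.read accepts one when told the type):

const xlsx = require('xlsx');
// fileBuffer: Buffer collected from the upload stream (hypothetical)
const workbook = xlsx.read(fileBuffer, { type: 'buffer' });
const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
console.log(xlsx.utils.sheet_to_json(firstSheet)); // rows as objects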
You can add an event listener before you pipe the data, so you can do something with your file before it is uploaded. It looks like this:
async singleUpload(_, { file }, context) {
  try {
    console.log(file)
    const { createReadStream, filename, mimetype, encoding } = await file
    await new Promise((res) =>
      createReadStream()
        .on('data', (data) => {
          // do something with your data/file
          console.log({ data })
          // your code here
        })
        .pipe(createWriteStream(path.join(__dirname, '../uploads', filename)))
        .on('close', res)
    )
    return { filename }
  } catch (e) {
    throw new Error(e)
  }
},
See the Node.js stream documentation for the available events.
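
For the original goal of parsing without touching the filesystem at all, one sketch is to buffer the upload in memory and hand the buffer to node-xlsx (imported as xlsx in the question), whose parse accepts a buffer as well as a path; streamToBuffer is a hypothetical helper, and the resolver shape mirrors the question:

const streamToBuffer = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(chunk)); // accumulate the upload in memory
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', reject);
  });

async singleUpload(_, { file }, context) {
  const { createReadStream, filename } = await file;
  const buffer = await streamToBuffer(createReadStream());
  const workSheets = xlsx.parse(buffer); // node-xlsx: buffer in, parsed sheets out
  for (const row of workSheets[0].data) {
    console.log(row);
  }
  return { filename };
},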

Download and untar a file, then check its contents: async/await problem in Node.js

I am downloading a file in tar format with the request-promise module, then untarring it with the tar module, using async/await syntax.
const list = new Promise(async (resolve, reject) => {
  const filePath = "somedir/myFile.tar.gz";
  if (!fs.existsSync(filePath)) {
    const options = {
      uri: "http://tarFileUrl",
      encoding: "binary"
    };
    try {
      console.log("download and untar");
      const response = await rp.get(options);
      const file = await fs.createWriteStream(filePath);
      file.write(response, 'binary');
      file.on('finish', () => {
        console.log('wrote all data to file');
        // here is the untar process
        tar.x(
          {
            file: filePath,
            cwd: "lists"
          }
        );
        console.log("extracted");
      });
      file.end();
    } catch (e) {
      reject();
    }
    console.log("doesn't exist");
  }

  // here I am checking whether the file exists: if so, there is no need to download or extract it
  // (the try/catch block); then an array is created which holds the list content line by line
  if (fs.existsSync(filePath)) {
    const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);
    if (file) {
      file.map(name => {
        if (name === checkingName) {
          blackListed = true;
          return resolve(blackListed);
        }
      });
    }
    else {
      console.log("err");
    }
  }
});
The console.log output sequence is:
download and untar
file doesn't exist
UnhandledPromiseRejectionWarning: Error: ENOENT: no such file or directory, open '...lists/alreadyExtractedFile.list'
wrote all data to file
extracted
So the file lists/alreadyExtractedFile.list is being checked before it is created. My guess is that I am doing something wrong with async/await: as the console.logs show, the second checking block somehow runs before the file is written and untarred.
Please help me figure out what I am doing wrong.
Your problem is here:

const file = await fs.readFileSync("lists/alreadyExtractedFile.list").toString().match(/[^\r\n]+/g);

The readFileSync function doesn't return a promise, so you shouldn't await it:

const file = fs.readFileSync("lists/alreadyExtractedFile.list")
  .toString().match(/[^\r\n]+/g);

This should solve the issue. You also need to call resolve inside the new Promise() callback.
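
More importantly, the ordering problem comes from checking the file before the 'finish' handler has run. A sketch of one fix, assuming the same rp and tar modules as the question: wrap the write and untar in a promise and await it, so the existence check can only run after the extraction completes.

// Resolve only after the data is written and the archive is unpacked.
const downloadAndExtract = (filePath) =>
  new Promise((resolve, reject) => {
    const file = fs.createWriteStream(filePath);
    file.on('finish', () => {
      tar.x({ file: filePath, cwd: 'lists', sync: true }); // sync untar keeps the ordering simple
      resolve();
    });
    file.on('error', reject);
    rp.get({ uri: 'http://tarFileUrl', encoding: null }) // null encoding -> Buffer
      .then((response) => { file.end(response); })
      .catch(reject);
  });

// usage (inside an async function): the read below cannot run before extraction finishes
await downloadAndExtract('somedir/myFile.tar.gz');
const lines = fs.readFileSync('lists/alreadyExtractedFile.list', 'utf8').match(/[^\r\n]+/g);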
If you are writing a local utility, you can use sync methods wherever possible (in fs, tar, etc.).
This is a small example in which a small archive from the Node.js repository is downloaded asynchronously, written and unpacked synchronously, and then a file is read synchronously:
'use strict';

const fs = require('fs');
const rp = require('request-promise');
const tar = require('tar');

(async function main() {
  try {
    const url = 'https://nodejs.org/download/release/latest/node-v11.10.1-headers.tar.gz';
    const arcName = 'node-v11.10.1-headers.tar.gz';
    const response = await rp.get({ uri: url, encoding: null });
    fs.writeFileSync(arcName, response, { encoding: null });
    tar.x({ file: arcName, cwd: '.', sync: true });
    const fileContent = fs.readFileSync('node-v11.10.1/include/node/v8-version.h', 'utf8');
    console.log(fileContent.match(/[^\r\n]+/g));
  } catch (err) {
    console.error(err);
  }
})();
