Node.js fs.createWriteStream: 'wx' flag not working

I have this method, that starts saving an image from a stream:
saveFromStream() {
  const imageName = this.getImageName();
  console.log('flag of stream:', this.flag);
  this.response.data.pipe(fs.createWriteStream(this.dest + imageName), { encoding: 'binary', flags: this.flag })
  return new Promise((resolve, reject) => {
    this.response.data.on('end', () => {
      resolve()
    })
    this.response.data.on('error', () => {
      reject()
    })
  })
}
The "this.flag" value comes from a user config object. When I pass 'wx', I see that existing files are still being overwritten, and I don't understand why. As you can see, I'm passing the flag in the options object.
It works perfectly fine when I use fs.writeFile (which throws an 'EEXIST' error if the file exists), but not with fs.createWriteStream.
Any idea why?


How to access data passed from a dynamic route from Express.js to React.js

Here is my backend route. I'm getting a file name as a parameter from the URL and reading the file. How do I access the data sent from this route in my React frontend?
router.route("/list/:filename").get((req, res) => {
  fs.readFile("./api/assignment_data/" + req.params.filename + ".json", function read(err, data) {
    if (err) {
      throw err;
    }
    const content = data;
    foundFile => res.json(foundFile)
    console.log("sent")
  })
})
In my frontend I am doing something like this,
useEffect(() => {
  fetch("/list/:filename").then(res => {
    if (res.ok) {
      console.log("all ok")
      return res.json()
    }
  }).then(jsonRes => setMetrics(jsonRes))
})
I think you are doing everything right, as long as you send a real file name in place of :filename (like data.json). What I don't understand is the line foundFile => res.json(foundFile): it defines an arrow function that is never called, so no response is ever sent. Try replacing that line with res.status(200).json({ data }); that should work.
You are also using useEffect incorrectly: without a dependency array the effect runs after every render, and since it calls setMetrics, every run triggers another render, looping indefinitely. So do this:
useEffect(() => {
  fetch("/list/:filename").then(res => {
    if (res.ok) {
      console.log("all ok")
      return res.json()
    }
  }).then(jsonRes => setMetrics(jsonRes))
}, [])
This way it will only run once, when the component mounts.

Converting HTML to PDF buffer in Nodejs

I am trying to convert the HTML returned by the returnDefaultOfferLetter function into a PDF buffer (which I will use for sending attachments in a mail) using the html-pdf package. The problem is that it works on localhost, but on an AWS Elastic Beanstalk server it throws an assertion error. After some research I learned I need to specify phantomPath. I have tried everything I could, but haven't found a solution.
By the way, one week ago it was working on AWS, so I don't know what's wrong now. Help me find a solution, or suggest another method or package to convert HTML into a PDF buffer. (Please, don't ignore the buffer requirement.)
const htmlToBase64Pdf = (req, res) => {
  const promise = new Promise((resolve, reject) => {
    const offerLetterHTML = returnDefaultOfferLetter(req.body).toString("utf8");
    const pdfOptions = {
      format: "A3",
      phantomPath: "../../node_modules/phantomjs-prebuilt/bin/phantomjs",
    };
    pdf.create(offerLetterHTML, pdfOptions).toBuffer(function (err, buffer) {
      if (err) {
        // console.log("err", err);
        reject(err);
      } else {
        // console.log("buffer", buffer);
        const base64Attachment = buffer.toString("base64");
        resolve(base64Attachment);
      }
    });
  });
  promise
    .then((resp) => res.send(resp))
    .catch((e) => {
      res.send(e);
    });
};

How to stream a file in post/put request with error handling?

This is not a question about "what is the best way to refactor the following code"; it's about "how can I refactor the following code to have control over both of the exceptions".
I have the following code, which streams a file in a PUT request.
import fs from 'fs'
import got from 'got' // it doesn't really matter if it's `axios` or `got`

const client = got

async function sendFile(addressToSend: string, filePath: string) {
  const body = fs.createReadStream(filePath)
  body.on('error', () => {
    console.log('we cached the error in block-1')
  })
  try {
    const result = await client.put(addressToSend, {
      body,
    })
  } catch (e) {
    console.log('we cached the error in block-2')
  }
}
I'm trying to refactor this code in a way that lets me catch all the errors in a single place.
The above solution does not give me a way to test a failure of the stream. For example, if I pass a file that does not exist, the function will print both "we cached the error in block-1" and "we cached the error in block-2", but I don't have a way to re-throw that first error or use it in tests in any way.
Note:
I'm not sure the best way to solve it is the following, because when I pass a filePath that does not exist, the rej function is called twice, which is bad practice.
function sendFile(addressToSend: string, filePath: string) {
  return new Promise(async (res, rej) => {
    const body = fs.createReadStream(filePath)
    body.on('error', () => {
      console.log('we cached the error in block-1')
      rej('1')
    })
    try {
      const result = await client.put(addressToSend, {
        body,
      })
      res()
    } catch (e) {
      console.log('we cached the error in block-2')
      rej('2')
    }
  })
}
I don't like it that much, but this is the best I could think of:
function streamFilePut(client: Got, url: string, filePath: string) {
  const body = fs.createReadStream(filePath)
  const streamErrorPromise = new Promise((_, rej) => body.on('error', rej))
  const resultPromise = client.put(url, { body })
  return Promise.race([streamErrorPromise, resultPromise])
}

Correct way to use promise in promised loop

Hello,
I'm using Promises in a new Node project.
I want to insert into my MongoDB the names of the files on every branch of my git repository.
I use nodegit to manipulate the repo; every nodegit method returns a Promise. But I need to loop over all branch references to get all the files and branch names.
After that I can prepare my array for the database insert in the next promise.
The code looks like:
// List all branchs
.then((branchs) => {
  let promises = [];
  let allBranchFiles = [];
  branchs.map((branch) => {
    let q = repo.checkoutBranch(branch, config.checkoutOpts)
      .then(() => repo.getCurrentBranch())
      .then((ref) => {
        return new Promise((resolve, reject) => {
          fs.readdir('./myRepo/', (err, files) => {
            files.map((file) => {
              allBranchFiles.push({fileName: file, branch: branch});
            });
            resolve(allBranchFiles);
          });
        });
      });
    promises.push(q);
  });
  return Promise.all(promises);
})
That code finishes in one of two ways.
First:
{ Error: the index is locked; this might be due to a concurrent or crashed process
at Error (native) errno: -14 }
I'm sure no other process is using git in this repo!
Second:
All of my files get the same value for "branch", even though they are on separate branches.
Thank you for helping me, guys!
Cya.
I tried something and it works, but it's ugly.
I used a recursive function to do the job, and I'm hoping for something better to replace it.
Here is how I replaced the code from the question.
// List all branchs
.then((branchs) => {
  return new Promise((resolve, reject) => recursiv(branchs).then(() => resolve(allBranchFiles)));
})
And my function:
let recursiv = (branchs, index = 0) => {
  return new Promise((resolved, rejected) => {
    repo.checkoutBranch(branchs[index], config.checkoutOpts)
      .then(() => repo.getCurrentBranch())
      .then((user) => {
        return new Promise((resolve, reject) => {
          fs.readdir('./myRepo/', (err, files) => {
            files.map((file) => {
              allBranchFiles.push({fileName: file, branch: branchs[index]});
            });
            resolve();
          });
        });
      })
      .done(() => {
        if (index < branchs.length - 1) resolved(recursiv(branchs, index + 1));
        else resolved();
      });
  });
};
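The recursion can be replaced by chaining the per-branch work with reduce, which keeps the checkouts strictly sequential (avoiding both the index lock and the shared-branch confusion) without recursion. A sketch; repo, config, and the readdir step are the question's own pieces, and listFiles/db.insert in the comment are hypothetical names:

```javascript
// Run an async worker over items one at a time, collecting results.
// Each step only starts after the previous promise settles, so two
// checkouts never race for the git index lock.
function forEachSequential(items, worker) {
  return items.reduce(
    (chain, item) =>
      chain.then((acc) => worker(item).then((result) => acc.concat(result))),
    Promise.resolve([])
  );
}

// How it would slot into the question's flow (sketch; listFiles and
// db.insert are hypothetical stand-ins for the readdir/Mongo steps):
// forEachSequential(branchs, (branch) =>
//   repo.checkoutBranch(branch, config.checkoutOpts)
//     .then(() => listFiles('./myRepo/'))
//     .then((files) => files.map((file) => ({ fileName: file, branch })))
// ).then((allBranchFiles) => db.insert(allBranchFiles));
```

Because the directory is only read after its branch's checkout completes, every file record is tagged with the branch that is actually checked out at that moment.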

No end event when piping inside "open"

I am piping a download into a file, but want to make sure the file doesn't already exist first. I've put the code up here for easier exploration: https://tonicdev.com/tolmasky/streaming-piping-on-open-tester (it will show you the outputs; the code is also below inline).
It seems to work fine except for the done ("end") event. The file ends up on the hard drive fine, and each step is followed correctly. The structure is there to ensure no unnecessary "parallel" steps happen: if I just do got.stream(url).pipe(fs.createWriteStream({ flags: ... })), the download actually gets kicked off even when createWriteStream errors because the file is already there, which is undesirable for the network.
The code is the following:
var fs = require("fs");
var got = require("got");

await download("https://www.apple.com", "./index.html");

function download(aURL, aDestinationFilePath)
{
    return new Promise(function(resolve, reject)
    {
        fs.createWriteStream(aDestinationFilePath, { flags: "wx" })
            .on("open", function()
            {
                const writeStream = this;
                console.log("SUCCESSFULLY OPENED!");
                got.stream(aURL)
                    .on("response", function(aResponse)
                    {
                        const contentLength = +aResponse.headers["content-length"] || 0;
                        console.log(aResponse.headers);
                        console.log("STARTING DOWNLOAD! " + contentLength);
                        this.on("data", () => console.log("certainly getting data"))
                        this.pipe(writeStream)
                            .on("error", reject)
                            .on("end", () => console.log("DONE!"))
                            .on("end", resolve);
                    })
            })
            .on("error", function(anError)
            {
                if (anError.code === "EEXIST")
                {
                    console.log("oh");
                    resolve();
                }
                else
                    reject(anError);
            });
    });
}
According to the stream docs, readable.pipe returns the destination Writable stream, and the correct event emitted when a Writable is done would be Event: 'finish'.
