In a Firebase Cloud Function the bucket.upload promise resolves too early - Node.js

I wrote a function that works like this:
onNewZipFileRequested
{get all the necessary data}
.then{download all the files}
.then{create a zipfile with all those files}
.then{upload that zipfile} (*here is the problem)
.then{update the database with the signedUrl of the file}
Here is the relevant code:
[***CREATION OF ZIP FILE WORKING****]
}).then(() => {
    zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
        .pipe(fs.createWriteStream(tempPath))
        .on('finish', function () {
            console.log("zip written.");
            return bucket.upload(tempPath, { //**** problem ****
                destination: destinazionePath
            });
        });
}).then(() => {
    const config = {
        action: 'read',
        expires: '03-09-2391'
    }
    return bucket.file(destinazionePath).getSignedUrl(config)
}).then(risultato => {
    const daSalvare = {
        signedUrl: risultato[0],
        status: 'fatto',
        dataInserimento: zipball.dataInserimento
    }
    return event.data.ref.set(daSalvare)
})
On the client side, as soon as the app sees the status change and the new URL, a download button (pointing to the new URL) appears.
Everything is working, but if I try to download the file immediately... there is no file yet!
If I wait some time and retry, the file is there.
I noticed that the time I have to wait depends on the size of the zipfile.
The bucket.upload promise should resolve at the end of the upload, but apparently it fires too early.
Is there a way to know exactly when the file is ready?
I may have to create some very big files; it's not a problem if the process takes several minutes, but I need to know when it's over.
* EDIT *
There was an unnecessary nesting in the code. While it was not the error (results are the same before and after refactoring), it was causing some confusion in the answers, so I edited it out.
I'd like to point out that I update the database only after getting the signed URL, and I get that only after the upload (I could not otherwise), so to get any result at all the promise chain MUST work, and in fact it does. On the client side, when the download button appears (which happens when 'status' becomes 'fatto'), it is already linked to the correct signed URL, but if I press it too early the file is not there (Failed - No file). If I wait some seconds (the bigger the file, the longer I have to wait), then the file is there.
(English is not my mother language; if I have been unclear, ask and I will try to explain myself better.)

It looks like the problem could be that the braces are not aligned properly, causing a then statement to be embedded within another. Here is the code with the then statements separated:
[***CREATION OF ZIP FILE WORKING****]
}).then(() => {
    zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
        .pipe(fs.createWriteStream(tempPath))
        .on('finish', function () {
            console.log('zip written.')
            return bucket.upload(tempPath, {
                destination: destinazionePath
            })
        })
}).then(() => {
    const config = {
        action: 'read',
        expires: '03-09-2391'
    }
    return bucket.file(destinazionePath).getSignedUrl(config)
}).then(risultato => {
    const daSalvare = {
        signedUrl: risultato[0],
        status: 'fatto',
        dataInserimento: zipball.dataInserimento
    }
    return event.data.ref.set(daSalvare)
})
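That said, separating the then statements alone doesn't make the chain wait: the bucket.upload promise is returned from inside the 'finish' event handler, so the enclosing .then callback returns undefined and the chain moves on immediately. Here is a rough, untested sketch of one way to make the chain wait, reusing zip, tempPath and destinazionePath from the question, just to illustrate the shape:
}).then(() => {
    // Wrap the stream in a promise so the chain only continues
    // once the zip has been fully written to disk.
    return new Promise((resolve, reject) => {
        zip.generateNodeStream({ type: 'nodebuffer', streamFiles: true })
            .pipe(fs.createWriteStream(tempPath))
            .on('finish', resolve)
            .on('error', reject);
    });
}).then(() => {
    // Returned from the .then callback itself, so the next .then
    // only runs after the upload has completed.
    return bucket.upload(tempPath, { destination: destinazionePath });
}).then(() => {
    return bucket.file(destinazionePath).getSignedUrl({
        action: 'read',
        expires: '03-09-2391'
    });
})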

Related

How do I extract the contents of a variable and place them into a constant? Node.js

I'm trying to extract the contents of the variable topPost and place it into const options under url. I can't seem to get it to work. I'm using the snoowrap/Reddit API and image-downloader.
var subReddit = r.getSubreddit('dankmemes');
var topPost = subReddit.getTop({ time: 'hour', limit: 1 }).map(post => post.url).then(console.log);
var postTitle = subReddit.getTop({ time: 'hour', limit: 1 }).map(post => post.title).then(console.log);

const options = {
    url: topPost,
    dest: './dank_memes/photo.jpg'
}

async function downloadIMG() {
    try {
        const { filename, image } = await download.image(options)
        console.log(filename) // => /path/to/dest/image.jpg
    } catch (e) {
        console.error(e)
    }
}
the recommended formatting for the image downloader is as follows:
const options = {
    url: 'http://someurl.com/image.jpg',
    dest: '/path/to/dest'
}

async function downloadIMG() {
    try {
        const { filename, image } = await download.image(options)
        console.log(filename) // => /path/to/dest/image.jpg
    } catch (e) {
        console.error(e)
    }
}

downloadIMG()
So it looks like I have to have my URL formatted in between quotes, but I have no idea how to get the URL from var topPost and place it in between those quotes.
Any ideas would be greatly appreciated.
Thanks!
topPost is a Promise, not the final value.
Promises exist to make working with asynchronous data easy. Asynchronous data is data that arrives at some point in the future, not instantly, and that's why promises have a then method. When a Promise resolves to a value, the then callback is called.
In this case, the library will connect to Reddit and download data from it, which is not something that can be done instantly, so the code will continue running and will call the then callback later, when the data has finished downloading. So:
var subReddit = r.getSubreddit('dankmemes');

// First we get the top posts, and register a "then" callback to receive all these posts
subReddit.getTop({ time: 'hour', limit: 1 }).map(post => post.url).then((topPost) => {
    // When we have the top post URLs, we connect to Reddit again to get the top post titles.
    subReddit.getTop({ time: 'hour', limit: 1 }).map(post => post.title).then((postTitle) => {
        // Here you have both topPost and postTitle (both are arrays; you must access the first element)
        console.log("This console.log will be called last");
    });
});

// The script will continue running at this point, but it is still connecting to Reddit and downloading the data
console.log("This console.log will be called first");
With this code you have a problem. You first connect to Reddit to get the top post URL, and then you connect to Reddit again to get the post title. It's like pressing F5 in between: if a new post is added between those two queries, you will get the wrong title (and you are also consuming double the bandwidth, which is not optimal either). The correct way of doing this is to get both the title and the URL in the same query. How? Like this:
var subReddit = r.getSubreddit('dankmemes');

// We get the top posts, and map BOTH the url and the title
subReddit.getTop({ time: 'hour', limit: 1 }).map(post => {
    return {
        url: post.url,
        title: post.title
    };
}).then((topPostUrlAndTitle) => {
    // Here you have topPostUrlAndTitle[0].url and topPostUrlAndTitle[0].title
    // Note how topPostUrlAndTitle is an array, as you are actually asking for "all top posts",
    // although you are limiting it to only one.
});
BUT this is also weird to do. Why don't you just get the post data directly? Like so:
var subReddit = r.getSubreddit('dankmemes');

// We get the top posts
subReddit.getTop({ time: 'hour', limit: 1 }).then((posts) => {
    // Here you have posts[0].url and posts[0].title
});
There's a way to get rid of JavaScript callback hell with async/await, but I'm not going to go into it here, because for a newbie it is a bit difficult to explain why the code is still not synchronous even though it looks like it is.
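Still, just as a rough sketch (untested, and assuming getTop resolves to an array of posts as above), the async/await version would look something like this:
async function getTopPost() {
    var subReddit = r.getSubreddit('dankmemes');
    // await pauses this function (not the whole program) until the data arrives
    const posts = await subReddit.getTop({ time: 'hour', limit: 1 });
    // posts[0].url and posts[0].title are available here
    return { url: posts[0].url, title: posts[0].title };
}

getTopPost().then(topPost => console.log(topPost.url, topPost.title));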

JSON file can't be modified

I am currently creating a web app that uses a variable that I store in JSON format. My plan is to modify the value in the JSON every time there's a connection to a certain route. The problem is that it just won't write.
I have tried to use fs.writeFile and fs.writeFileSync but none of them seem to work.
// Code I have tried
const kwitansi = require('./no_kwitansi.json')

app.get('', async (req, res) => {
    kwitansi.no_kwitansi += await 1
    await fs.writeFile('../no_kwitansi.json', JSON.stringify(kwitansi, null, 2), function (e) {
        if (e) {
            throw new Error
        } else {
            console.log('Wrote to file')
        }
    })
    await console.log(kwitansi)
    await res.send(kwitansi)
})
// An example of my JSON file
{
    "no_kwitansi": 4
}
You may be trying to write to a place where you do not have permission. Note that you opened ./no_kwitansi.json, but you are trying to write to ../no_kwitansi.json (one directory up). If you are sure you want to replace the original file, remove the extra . in the write line.
If the error persists, you also need to be sure that you have the proper permissions to write the file. If you are using *nix or mac, you can check this link.
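As a rough sketch (untested), the handler could look like this once the read and write paths match. It uses fs.promises so the await actually waits, and builds an absolute path because require() resolves relative to the module while fs resolves relative to the working directory:
const fs = require('fs');
const path = require('path');
const kwitansi = require('./no_kwitansi.json');

app.get('', async (req, res) => {
    kwitansi.no_kwitansi += 1;
    // Write back to the same file that was required above
    const file = path.join(__dirname, 'no_kwitansi.json');
    await fs.promises.writeFile(file, JSON.stringify(kwitansi, null, 2));
    console.log('Wrote to file');
    res.send(kwitansi);
});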

How to handle sync browser emulations in node.js

I'm writing a script that is intended to load some stuff from .txt files and then perform multiple requests (in a loop) to a website with Nightmare, Node.js's browser emulator.
I have no problem with reading from the txt files and so on, but I can't manage to make it run sequentially and without exceptions.
function visitPage(url, code) {
    new Promise((resolve, reject) => {
        Nightmare
            .goto(url)
            .click('.vote')
            .insert('input[name=username]', 'testadmin')
            .insert('.test-code-verify', code)
            .click('.button.vote.submit')
            .wait('.tag.vote.disabled,.validation-error')
            .evaluate(() => document.querySelector('.validation-error').innerHTML)
            .end()
            .then(text => {
                return text;
            })
    });
}
async function myBackEndLogic() {
    try {
        var br = 0, user, proxy, current, agent;
        while (br < loops) {
            current = Math.floor(Math.random() * (maxLoops - br - 1));
            /* ...getting user and so on... */
            const response = await visitPage('https://example.com/admin/login', "code")
            br++;
        }
    } catch (error) {
        console.error('ERROR:');
        console.error(error);
    }
}

myBackEndLogic();
The error that occurs is:
UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'webContents' of undefined
So I have a few questions:
1) How do I fix the exception?
2) How do I make it actually run sequentially and visit the address every time? (In a previous attempt, which I didn't save, I fixed the exception, but the browser wasn't actually opening and the step was basically skipped.)
3) (Not so important) Is it possible to select a few elements with
.wait('.class1,.class2,.validation-error')
and save each value in a different variable, or just get the text from the first one that occurred? (If none of them occurred, return 0, for example.)
I see a few issues with the code above.
In the visitPage function, you are creating a wrapping Promise (and never actually returning it). That's fine, except you don't have to create the wrapping promise at all! It looks like Nightmare already returns a promise for you. As written, you're dropping any errors that promise produces by wrapping it. Instead, just use an async function!
async function visitPage(url, code) {
    return Nightmare
        .goto(url)
        .click('.vote')
        .insert('input[name=username]', 'testadmin')
        .insert('.test-code-verify', code)
        .click('.button.vote.submit')
        .wait('.tag.vote.disabled,.validation-error')
        .evaluate(() => document.querySelector('.validation-error').innerHTML)
        .end();
}
You probably don't want to wrap the content of this method in a 'try/catch'. Just let the promises flow :)
async function myBackEndLogic() {
    var br = 0, user, proxy, current, agent;
    while (br < loops) {
        current = Math.floor(Math.random() * (maxLoops - br - 1));
        const response = await visitPage('https://example.com/admin/login', "code")
        br++;
    }
}
When you run your method - make sure to include a catch! Or a then! Otherwise, your app may exit early.
myBackEndLogic()
.then(() => console.log('donesies!'))
.catch(console.error);
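As for question (3), one rough, untested sketch is to let .wait block on either selector and then check inside evaluate which element actually appeared (visitPageAndReport is just a hypothetical name; the selectors are the ones from your code):
async function visitPageAndReport(url, code) {
    return Nightmare
        .goto(url)
        .click('.vote')
        .insert('input[name=username]', 'testadmin')
        .insert('.test-code-verify', code)
        .click('.button.vote.submit')
        .wait('.tag.vote.disabled,.validation-error')
        .evaluate(() => {
            // Runs in the page: report whichever element showed up, or 0 for neither
            const err = document.querySelector('.validation-error');
            if (err) return { kind: 'error', text: err.innerHTML };
            const ok = document.querySelector('.tag.vote.disabled');
            if (ok) return { kind: 'ok', text: ok.innerHTML };
            return 0;
        })
        .end();
}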
I'm not sure if any of this will help with your specific issue, but hopefully it gets you on the right path :)

Download files asynchronously and parse them synchronously with Node JS

I have a gulp task that downloads a few JSON files from GitHub, then prompts the user for values to replace in those files. For example, there is an .ftpconfig that gets downloaded, and then the user is asked to enter hostname, username, password, and path.
Because each file first needs to be downloaded before it can be configured, and each file needs to be configured sequentially, I'm using quite a few nested callbacks. I'd like to change this "callback hell" system so that it utilizes async/await and/or promises instead, but I'm having a lot of difficulty understanding exactly why my code isn't working; it seems that promises fire their .then() functions asynchronously, which doesn't make sense to me.
My goals are as follows:
Download all config files asynchronously
Wait for all config files to finish downloading
Read existing settings from the config files
Prompt the user for changed settings in each config file synchronously
I've tried a number of approaches, none of which worked. I discarded the code I've used, but here's a rough recreation of the things I've tried:
Attempt #1:
return new Promise((resolve) => {
    // download files...
}).then((resolve) => {
    // configure first file...
}).then((resolve) => {
    // configure second file...
}).then((resolve) => {
    // configure third file...
});
Attempt #2:
const CONFIG_FILES = async () => {
    const bs_download = await generate_config("browsersync");
    const ftp_download = await generate_config("ftp");
    const rsync_download = await generate_config("rsync");
    return new Promise(() => {
        configure_json("browsersync");
    }).then(() => {
        configure_json("ftp");
    }).then(() => {
        configure_json("rsync");
    });
};
I'm sure I'm doing something very obviously wrong, but I'm not adept enough at JavaScript to see the problem. Any help would be greatly appreciated.
My gulp task can be found here:
gulpfile.js
gulp-tasks/config.js
Thanks to @EricB, I was able to figure out what I was doing wrong. It was mostly a matter of making my functions return promises as well.
https://github.com/JacobDB/new-site/blob/d119b8b3c22aa7855791ab6b0ff3c2e33988b4b2/gulp-tasks/config.js
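In outline, the working version looks something like the sketch below: generate_config and configure_json each return a promise, the downloads run in parallel, and the prompts run strictly one after another (the function bodies here are placeholders, not the real gulp task):
// Placeholder stand-ins for the real download / prompt logic
function generate_config(name) {
    // download the config file for `name`, resolve when it's on disk
    return Promise.resolve(name);
}

function configure_json(name) {
    // prompt the user, resolve when the file has been rewritten
    return Promise.resolve(name);
}

const CONFIG_FILES = async () => {
    // 1) download all config files in parallel and wait for all of them
    await Promise.all([
        generate_config("browsersync"),
        generate_config("ftp"),
        generate_config("rsync")
    ]);

    // 2) configure each file sequentially
    await configure_json("browsersync");
    await configure_json("ftp");
    await configure_json("rsync");
};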

Creating a file only if it doesn't exist in Node.js

We have a buffer we'd like to write to a file. If the file already exists, we need to increment an index on it, and try again. Is there a way to create a file only if it doesn't exist, or should I just stat files until I get an error to find one that doesn't exist already?
For example, I have files a_1.jpg and a_2.jpg. I'd like my method to try creating a_1.jpg and a_2.jpg, and fail, and finally successfully create a_3.jpg.
The ideal method would look something like this:
fs.writeFile(path, data, { overwrite: false }, function (err) {
if (err) throw err;
console.log('It\'s saved!');
});
or like this:
fs.createWriteStream(path, { overwrite: false });
Does anything like this exist in node's fs library?
EDIT: My question isn't if there's a separate function that checks for existence. It's this: is there a way to create a file if it doesn't exist, in a single file system call?
As your intuition correctly guessed, the naive solution with a pair of exists / writeFile calls is wrong. Asynchronous code runs in unpredictable ways, and in the given case it is:
Is there a file a.txt? — No.
(File a.txt gets created by another program)
Write to a.txt if it's possible. — Okay.
But yes, we can do that in a single call. We're working with the file system, so it's a good idea to read the developer manual on fs. And hey, here's an interesting part.
'w' - Open file for writing. The file is created (if it does not exist) or truncated (if it exists).
'wx' - Like 'w' but fails if path exists.
So all we have to do is just add wx to the fs.open call. But hey, we don't like fopen-like IO. Let's read up on fs.writeFile a bit more.
fs.writeFile(filename, data[, options], callback)
    filename String
    data String | Buffer
    options Object
        encoding String | Null default = 'utf8'
        flag String default = 'w'
    callback Function
That options.flag looks promising. So we try
fs.writeFile(path, data, { flag: 'wx' }, function (err) {
    if (err) throw err;
    console.log("It's saved!");
});
And it works perfectly for a single write. But I guess this code will still fail in some more bizarre ways if you try to solve your whole task with it. You get an atomic "check for a_#.jpg existence, and write there if it's free" operation, but all the rest of the fs state is not locked, and a_1.jpg may spontaneously disappear while you're already checking a_5.jpg. Most file systems are not ACID databases, and the fact that you're able to do at least some atomic operations is miraculous. It's quite possible that the wx flag won't even work on some platform. So for the sake of your sanity, use a database, finally.
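For illustration only, here is a rough sketch of the retry pattern from the question built on that wx flag (untested; all the caveats above about the rest of the fs state still apply):
const fs = require('fs');

// Try a_1.jpg, a_2.jpg, ... until a name that doesn't exist yet is found.
function writeUnique(data, i, callback) {
    i = i || 1;
    const fileName = 'a_' + i + '.jpg';
    fs.writeFile(fileName, data, { flag: 'wx' }, function (err) {
        if (err && err.code === 'EEXIST') {
            // Name already taken: bump the index and try again.
            return writeUnique(data, i + 1, callback);
        }
        callback(err, err ? null : fileName);
    });
}

// writeUnique(buffer, 1, (err, name) => console.log(err || 'saved as ' + name));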
Some more info for the suffering
Imagine we're writing something like memoize-fs that caches results of function calls to the file system to save us some network/CPU time. Could we open the file for reading if it exists, and for writing if it doesn't, all in a single call? Let's take a look at those flags. After a while of mental exercise we can see that a+ does what we want: if the file doesn't exist, it creates one and opens it both for reading and writing, and if the file exists it does so without clearing the file (as w+ would). But now we can use it neither in the (smth)File functions nor in the create(Smth)Stream functions. And that seems like a missing feature.
So feel free to file it as a feature request (or even a bug) on the Node.js GitHub, as the lack of an atomic asynchronous file system API is a drawback of Node. Though don't expect changes any time soon.
Edit. I would like to link to the articles by Linus and by Dan Luu on why exactly you don't want to do anything smart with your fs calls, because otherwise the claim was left mostly unsupported.
What about using the a option?
According to the docs:
'a+' - Open file for reading and appending. The file is created if it does not exist.
It seems to work perfectly with createWriteStream
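A minimal sketch of that (untested; note the stream option is called flags, plural, and the path here is just a placeholder):
const fs = require('fs');

// Creates the file if it doesn't exist, appends to it if it does.
const stream = fs.createWriteStream('/tmp/example.log', { flags: 'a+' });
stream.end('hello\n');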
This method is no longer recommended. fs.exists is deprecated. See comments.
Here are some options:
1) Have 2 "fs" calls. The first one is the "fs.exists" call, and the second is "fs.write / read, etc"
// Checks if the file exists.
// If it does, it just calls back.
// If it doesn't, the file is created first.
function checkForFile(fileName, callback) {
    fs.exists(fileName, function (exists) {
        if (exists) {
            callback();
        } else {
            fs.writeFile(fileName, '', { flag: 'wx' }, function (err) {
                callback();
            });
        }
    });
}

function writeToFile() {
    checkForFile("file.dat", function () {
        // It is now safe to write/read to file.dat
        fs.readFile("file.dat", function (err, data) {
            // do stuff
        });
    });
}
2) Or create an empty file first:
--- Sync:
//If you want to force the file to be empty then you want to use the 'w' flag:
var fd = fs.openSync(filepath, 'w');
//That will truncate the file if it exists and create it if it doesn't.
//Wrap it in an fs.closeSync call if you don't need the file descriptor it returns.
fs.closeSync(fs.openSync(filepath, 'w'));
--- Async:
var fs = require("fs");
fs.open(path, "wx", function (err, fd) {
    // handle error
    fs.close(fd, function (err) {
        // handle error
    });
});
3) Or use "touch": https://github.com/isaacs/node-touch
To do this in a single call you can use the fs-extra npm module.
After this the file will have been created as well as the directory it is to be placed in.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt'
fs.ensureFile(file, err => {
    console.log(err) // => null
});
Another way is to use ensureFileSync, which will do the same thing but synchronously.
const fs = require('fs-extra');
const file = '/tmp/this/path/does/not/exist/file.txt'
fs.ensureFileSync(file)
With async / await and Typescript I would do:
import * as fs from 'fs'
async function upsertFile(name: string) {
    try {
        // try to read the file
        await fs.promises.readFile(name)
    } catch (error) {
        // create an empty file, because it wasn't found
        await fs.promises.writeFile(name, '')
    }
}
Here's a synchronous way of doing it:
try {
    fs.truncateSync(filepath, 0);
} catch (err) {
    fs.writeFileSync(filepath, "", { flag: "wx" });
}
If the file exists it gets truncated; otherwise the truncate call throws and the file gets created.
This works for me.
// Use the fs promises API
const { access } = require('fs/promises');

// Returns true if the file exists
// (don't use fs.exists, which is deprecated)
const fexists = async (path) => {
    try {
        await access(path);
        return true;
    } catch {
        return false;
    }
}

// Wrapper for your main program
async function mainapp() {
    if (await fexists("./users.json")) {
        console.log("File is here");
    } else {
        console.log("File not here - so make one");
    }
}

// run your program
mainapp();
Just keep an eye on your async/awaits so everything plays nicely.
Hope this helps.
You can do something like this:
function writeFile(i) {
    i = i || 0;
    var fileName = 'a_' + i + '.jpg';
    fs.exists(fileName, function (exists) {
        if (exists) {
            writeFile(++i);
        } else {
            // data is the buffer you want to write, from the surrounding scope
            fs.writeFile(fileName, data, function (err) {
                if (err) throw err;
            });
        }
    });
}
