Firebase Storage - How to delete a file from storage with Node.js?

I want to delete a folder in Firebase Storage with Node.js, since this runs inside a Firebase Cloud Function.
For example:
storageRef.child(child1).child(child2).delete();
Something like this, but the Firebase documentation doesn't say anything about it.
One more question:
When initializing Storage, the Node.js documentation requires my admin service account JSON, but the Realtime Database doesn't need it. I wonder why?

Have a look at the Node.js client API Reference for Google Cloud Storage and in particular at the delete() method for a File.

You can do it like this using Node.js:

const firebase = require('firebase-admin');

async function deleteImageFromFirebase(imageName) {
  await firebase.storage().bucket().file("folderName/" + imageName).delete();
}
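Assuming an image called photo.jpg (a hypothetical name) sitting inside folderName, you would call it like this:

// Hypothetical usage from another async function
await deleteImageFromFirebase('photo.jpg');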
And like this on the client side:

// Create a reference to the file to delete
var desertRef = storageRef.child('images/desert.jpg');

// Delete the file
desertRef.delete().then(function() {
  // File deleted successfully
}).catch(function(error) {
  // Uh-oh, an error occurred!
});
View this info on the Firebase website:
How to delete files (Firebase Storage)

This might be late, but at least with the Admin SDK (which is basically what you need here) there is a newer API to delete a whole folder.
I tested deleting a folder with 2 pictures inside and it works. I then tried a folder-A with contents: folder-B + picture-A. Folder-B also had a picture-B inside; it still deleted folder-A with all of its contents.
Solution:
const bucket = admin.storage().bucket();
return bucket.deleteFiles({
  prefix: `posts/${postId}`
});
I couldn't find this in the official documentation (perhaps it's a really new API), but here is a really cool article where I found the solution:
Automatically delete your Firebase Storage Files from Firestore with Cloud Functions for Firebase
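For context, here is a minimal sketch of how deleteFiles could be wired into a Firestore trigger along the lines of that article; the posts collection and the postId wildcard are assumptions, not something from the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Hypothetical trigger: when a post document is deleted, remove everything under posts/<postId>/
exports.cleanupPostFiles = functions.firestore
  .document('posts/{postId}')
  .onDelete((snap, context) => {
    const bucket = admin.storage().bucket();
    return bucket.deleteFiles({ prefix: `posts/${context.params.postId}` });
  });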

import { storage } from "./firebaseClient";
import { bucket } from "./firebaseServer";

// Let's assume this is the URL of the image we want to delete
const downloadUrl = "https://storage.googleapis.com/storage/v1/b/<projectID>.appspot.com/o/<location>?"

// firebase delete function
const deleteImages = async ({ downloadUrl }) => {
  const httpsRef = storage.refFromURL(downloadUrl).fullPath;
  return await bucket
    .file(httpsRef)
    .delete()
    .then(() => "success")
    .catch(() => "error")
}

// call deleteImages inside an async function
const deleteStatus = await deleteImages({ downloadUrl: oldImage });
console.log(deleteStatus) //=> "success"

Related

Generating rss.xml for Angular 8 app locally works fine, but not on prod

I am trying to generate an rss.xml file for my Angular 8 app with SSR + Firebase + GCP, served on the same domain.
I've created an RssComponent which can be reached at the /rss route. There I call the getNews() method and receive an array of objects. Then I make an HTTP request to /api/rss, and in server.ts I handle that request:
app.post('/api/rss', (req, res) => {
  const data = req.body.data;
  const feedOptions = { /* defining options here */ };
  const feed = new RSS(feedOptions);
  data.forEach((item) => {
    feed.item({
      title: item.headingUa,
      description: item.data[0].dataUa,
      url: item.rssLink,
      guid: item.id,
      date: item.utcDate,
      enclosure: {url: item.mainImg.url.toString().replace('&', '&amp;'), type: 'image/jpeg'}
    });
  });
  const xml = feed.xml({indent: true});
  fs.chmod('dist/browser/rss.xml', 0o600, () => {
    fs.writeFile('dist/browser/rss.xml', xml, 'utf8', function() {
      res.status(200).end();
    });
  });
});
And finally, on response, I'm opening the freshly generated rss.xml file in the RssComponent. Locally everything works fine, but on Google Cloud Platform the file is not generated.
As explained in the Cloud Functions docs:
The only writeable part of the filesystem is the /tmp directory
Try changing the path to the file to the /tmp directory.
Nevertheless, using local files in a serverless environment is a really bad idea. You should assume the instance handling the next request will not be the same as the one that handled the previous request.
The best way to handle this is to avoid writing local files and instead store the generated file in GCP Storage or Firebase Storage, then retrieve it from there when needed.
This will ensure your functions are idempotent, and it will also comply with best practices.
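A minimal sketch of that approach; the bucket and object names below are placeholders, not something from the question:

const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
// Assumed bucket name; use your project's default bucket or any bucket you own
const bucket = storage.bucket('my-project.appspot.com');

async function publishRss(xml) {
  // If a temporary file is really needed, write it under /tmp (the only writable dir);
  // here we upload the XML string straight to Cloud Storage instead.
  await bucket.file('rss.xml').save(xml, {
    metadata: { contentType: 'application/rss+xml' }
  });
}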

Download file from GCF function

I am running a Node.js script using puppeteer on my local machine to download some assets from the Internet. I want that script to run as a Google Cloud Function.
I just want to know: is there any local space associated with GCF where we can save these files so they can be accessed later, or can we specify a cloud storage bucket URL where the download should be saved?
#!/usr/bin/env node
const { program } = require('commander');
const puppeteer = require('puppeteer');
program
  .option('-e, --email <email>', 'Login Email Address', process.env.LOOKER_EMAIL || '')
  .option('-p, --password <password>', 'Login Password', process.env.LOOKER_PASSWORD || '')
  .option('-d, --dashboard <id>', 'Dashboard To Download');
program.parse(process.argv);
const fs = require('fs');
const basePath = 'C:\\card\\';

(async () => {
  const loginEmail = program.email;
  const loginPassword = program.password;
  const dashboardId = program.dashboard;
  // used puppeteer to download some files
  const browser = await puppeteer.launch({
    headless: true
  });
  let pages = await browser.pages();
  const page = await browser.newPage();
  await page.setViewport({ width: 1920, height: 1080 });
  await page.goto(loginUrl);
  await page.waitForSelector(loginEmailSelector);
  await page.type(loginEmailSelector, loginEmail);
  await page.type(loginPasswordSelector, loginPassword);
  await Promise.all([
    page.waitForNavigation(),
    page.click(loginButtonSelector)
  ]);
  await page.goto(`https://somewebsite/${dashboardId}`);
  await page.waitForSelector(menuSelector, {
    visible: true
  });
  await page.click(menuSelector);
  await page.waitForSelector(downloadSelector, {
    visible: true
  });
  const ts = Date.now();
  const downloadLoc = basePath + ts + '\\';
  console.log('downloadLoc ', downloadLoc);
  await page._client.send('Page.setDownloadBehavior', {
    behavior: 'allow',
    downloadPath: downloadLoc
  });
  console.log(`your file's on the way!`);
})();
So here in the script I am just downloading the file to the C drive. I want this to be stored in some cloud storage if possible. Please let me know if you have any suggestions.
The concept of Cloud Functions assumes that code should be stateless, which means that any data should be stored outside the function. There is the possibility of using the /tmp directory, but only for temporary purposes. The recommended solution is Cloud Storage (reference).
However, Cloud Storage is not the only way to keep state. It is the best choice for binary objects, meaning files.
On the other hand, if those files contain structured data, you could choose one of Google's NoSQL databases, like Firestore, Datastore (actually Firestore in Datastore mode) or the Firebase Realtime Database. All of them have nice APIs for many languages, including of course Node.js. Additionally, if you plan to build larger solutions, it's even possible to use Bigtable for massive data and BigQuery if you need analytics. All of this depends on what you need.
What is nice and very convenient about the above-mentioned Google APIs is that in Cloud Functions there is no need to authenticate to the particular products, saving a lot of code and resources. All of these solutions are serverless, so you do not have to care about the servers underneath or about scaling when your solution grows. You also get extremely fast network speed between resources when everything stays inside GCP.
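A rough sketch of that pattern, assuming puppeteer is told to download into /tmp and the bucket name is passed in; the dashboards/ object prefix is an assumption:

const os = require('os');
const path = require('path');
const fs = require('fs');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();

async function uploadDownloads(bucketName) {
  // Point puppeteer's download behavior at /tmp instead of C:\card\, e.g.:
  //   downloadPath: os.tmpdir()
  // then copy whatever was downloaded into a Cloud Storage bucket.
  for (const name of fs.readdirSync(os.tmpdir())) {
    await storage.bucket(bucketName).upload(path.join(os.tmpdir(), name), {
      destination: `dashboards/${name}` // assumed object prefix
    });
  }
}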

Cloud function to export Firestore backup data. Using firebase-admin or #google-cloud/firestore?

I'm currently trying to build a cloud function to export my Firestore data to my Storage Bucket.
The only example I've found in the Firebase docs on how to do this is:
https://googleapis.dev/nodejs/firestore/latest/v1.FirestoreAdminClient.html#exportDocuments
EXAMPLE
const firestore = require('@google-cloud/firestore');

const client = new firestore.v1.FirestoreAdminClient({
  // optional auth parameters.
});

const formattedName = client.databasePath('[PROJECT]', '[DATABASE]');
client.exportDocuments({name: formattedName})
  .then(responses => {
    const response = responses[0];
    // doThingsWith(response)
  })
  .catch(err => {
    console.error(err);
  });
From that example, it seems that I need to install @google-cloud/firestore as a dependency of my cloud function.
But I was wondering if I can access these methods using only the firebase-admin package.
I thought of that because firebase-admin already has @google-cloud/firestore as a dependency:
> firebase-admin > package.json

"dependencies": {
  "@firebase/database": "^0.4.7",
  "@google-cloud/firestore": "^2.0.0", // <---------------------
  "@google-cloud/storage": "^3.0.2",
  "@types/node": "^8.0.53",
  "dicer": "^0.3.0",
  "jsonwebtoken": "8.1.0",
  "node-forge": "0.7.4"
},
QUESTION:
Is it possible to get an instance of the FirestoreAdminClient and use the exportDocuments method using just firebase-admin?
Or do I really need to install @google-cloud/firestore as a direct dependency and work with it directly?
The way you're accessing the admin client is correct as far as I can tell.
const client = new admin.firestore.v1.FirestoreAdminClient({});
However, you probably won't get any TypeScript/intellisense help beyond this point since the Firestore library does not actually define detailed typings for v1 RPCs. Notice how they are declared with any types: https://github.com/googleapis/nodejs-firestore/blob/425bf3d3f5ecab66fcecf5373e8dd03b73bb46ad/types/firestore.d.ts#L1354-L1364
Here is an implementation I'm using that allows you to do whatever operations you need, based on the template provided by Firebase here: https://firebase.google.com/docs/firestore/solutions/schedule-export
In my case I'm filtering out the Firestore collections that I don't want the scheduler to automatically back up:
const { Firestore } = require('@google-cloud/firestore')

const firestore = new Firestore()
const client = new Firestore.v1.FirestoreAdminClient()
const bucket = 'gs://backups-user-data'

exports.scheduledFirestoreBackupUserData = async (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  )

  const collectionsToExclude = ['_welcome', 'eventIds', 'analyticsData']

  const collectionsToBackup = await firestore.listCollections()
    .then(collectionRefs => {
      return collectionRefs
        .map(ref => ref.id)
        .filter(id => !collectionsToExclude.includes(id))
    })

  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or define a list of collection IDs:
      // collectionIds: ['users', 'posts']
      collectionIds: [...collectionsToBackup]
    })
    .then(responses => {
      const response = responses[0]
      console.log(`Operation Name: ${response['name']}`)
      return response
    })
    .catch(err => {
      console.error(err)
    })
}
firebase-admin just wraps the Cloud SDK and re-exports its symbols. You can use the wrapper, or use the Cloud SDK directly, or even a combination of the two if you want. If you want to use both, you have to declare an explicit dependency on @google-cloud/firestore in order to be able to import it directly into your code.
Here is the full explanation with code (I use it and it works very well) on how to do automated Firestore backups by combining Cloud Scheduler, Pub/Sub and Cloud Functions for Firebase: https://firebase.google.com/docs/firestore/solutions/schedule-export

Delete a file from firebase storage using download url with Cloud Functions

I have a collection of profiles in my Firestore db and a field named "profilePicture" with a downloadUrl as the value.
I'm using cloud functions and have been trying for a long time to figure out how to delete the profilePicture when the profile is deleted.
I know how to create a trigger when the profile is deleted and get the profile picture's downloadUrl, but how do I delete the file from storage with only the downloadUrl?
The Firebase Storage documentation provides a method refFromURL(url) that can be used on a Storage instance. It states the url argument can be:
A URL in the form:
1) a gs:// URL, for example gs://bucket/files/image.png
2) a download URL taken from object metadata.
Based on (2) above, it seems like an HTTP download URL should also work. However, it is probably better practice to store a path string, as the tokens on the HTTP URLs can get rotated by Firebase.
In Angular I use this to delete a file from Cloud Storage by its downloadURL:

constructor(private storage: AngularFireStorage) {}

onDeleteAttachment(downloadURL: string) {
  this.storage.storage.refFromURL(downloadURL).delete();
}
My understanding is that the Node.js SDK for Cloud Storage can't convert HTTP download URLs into file paths within a storage bucket. Instead, you should store the file path along with the download URL in the document. This makes it possible to build a File object that can be used to delete the image when it's time to do so.
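For illustration, a minimal sketch of that idea, assuming each profile document also stores the storage path in a profilePicturePath field; the collection and field names are assumptions, not from the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Hypothetical trigger: when a profile document is deleted, delete its picture by stored path
exports.deleteProfilePicture = functions.firestore
  .document('profiles/{profileId}')
  .onDelete((snap) => {
    const picturePath = snap.data().profilePicturePath; // e.g. "profilePictures/abc123.jpg"
    if (!picturePath) return null;
    return admin.storage().bucket().file(picturePath).delete();
  });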
For admin.storage.Storage there is no built-in method to get a reference from a URL, but you can extract the file path from the URL by removing the base URL and doing a few replacements on what is left.
I created a method for this task; it accepts a URL from your Storage project and returns the path:
function getPathStorageFromUrl(url:String){
  const baseUrl = "https://firebasestorage.googleapis.com/v0/b/project-80505.appspot.com/o/";
  let imagePath:string = url.replace(baseUrl,"");
  const indexOfEndPath = imagePath.indexOf("?");
  imagePath = imagePath.substring(0,indexOfEndPath);
  imagePath = imagePath.replace("%2F","/");
  return imagePath;
}
NOTE: You must replace baseUrl for every project. You can find it by opening any image in your Storage bucket and copying the URL from the browser, from the start up to and including the last slash '/'.
Ex:
Some image link in my storage:
https://firebasestorage.googleapis.com/v0/b/project-80505.appspot.com/o/RequestsScreenshot%2F-M6CA-2bG2aP_WwOF-dR__1i5056O335?alt=media&token=d000fab7
The base URL will be:
https://firebasestorage.googleapis.com/v0/b/project-80505.appspot.com/o/
Now, after getting the path, call file() to delete it from Storage:
const storage = admin.storage();
const imagePath:string = getPathStorageFromUrl(obj.imageUrl);
storage.bucket().file(imagePath).delete().catch((err) => console.error(err));
NOTE: There is no documentation explaining the format of the URL, which implies that the Firebase team might feel the need to change it some day, meaning this may stop working in the future if the format changes.
Config.js

import firebase from 'firebase/app'
import "firebase/firestore";
import "firebase/storage";

const firebaseConfig = {
  apiKey: "XXXX",
  authDomain: "XXXXX.firebaseapp.com",
  databaseURL: "https://XXXX-app-web.firebaseio.com",
  projectId: "XXXX",
  storageBucket: "XXXX-app-web.appspot.com",
  messagingSenderId: "XXXXXX",
  appId: "1:XXX:web:XXXX",
  measurementId: "G-XXXX"
};

firebase.initializeApp(firebaseConfig);

export const firestore = firebase.firestore();
export const storageRef = firebase.storage();

export default firebase;
Button.js
import React from 'react';
import { firestore, storageRef } from './Config';
// (Button is assumed to be imported from a UI library such as Material-UI)

function removeFile(id, downloadUrl) {
  const storageRefa = storageRef.refFromURL(downloadUrl);
  storageRefa.delete().then(() => {
    firestore.collection("All_Files").doc(id).delete().then((response) => {
      console.log('delete response', response)
    }).catch((error) => {
      console.log('delete error', error)
    })
  }).catch((error) => {
    console.log('delete error', error)
  });
}

export default function MediaCard(props) {
  return (
    <>
      <Button
        onClick={() => {
          removeFile(props.ID, props.downloadUrl)
        }}
        variant="contained"
        color="secondary"
      >
        Delete
      </Button>
    </>
  );
}
Mahmoud's answer needs a little edit. It works, but the replacements are done in a way that might fail if you have nested directories or filenames with spaces in your storage:
getPathStorageFromUrl(url:String){
  const baseUrl = "https://firebasestorage.googleapis.com/v0/b/project-80505.appspot.com/o/";
  let imagePath:string = url.replace(baseUrl,"");
  const indexOfEndPath = imagePath.indexOf("?");
  imagePath = imagePath.substring(0,indexOfEndPath);
  imagePath = imagePath.replace(/%2F/g,"/");
  imagePath = imagePath.replace(/%20/g," ");
  return imagePath;
}
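A simpler variant (my own sketch, not from the original answers) lets decodeURIComponent handle every percent-escape at once, so nested directories and spaces both work; the base URL below is a placeholder you must adapt to your project:

// Sketch: extract the object path from a Firebase Storage download URL
function getPathStorageFromUrl(url) {
  const baseUrl = "https://firebasestorage.googleapis.com/v0/b/<project-id>.appspot.com/o/";
  const encodedPath = url.replace(baseUrl, "").split("?")[0];
  // decodeURIComponent turns %2F into "/", %20 into " ", and so on
  return decodeURIComponent(encodedPath);
}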

Retrieving a Firebase storage image link via Cloud Function

How do I retrieve the download links to stored images in Firebase via a cloud function?
I've tried all kinds of variations, including the following one:

exports.getImgs = functions.https.onRequest((req, res) => {
  var storage = require('@google-cloud/storage')();
  var storageRef = storage.ref;
  console.log(storageRef);
  storageRef.child('users/user1/avatar.jpg').getDownloadURL().then(function(url) {
  });
});
It annoyed me, so I will put the solution here with a straightforward explanation for those who are looking for it.
First, install the Google Cloud Storage library from the command line in your functions folder:
npm install --save @google-cloud/storage
Cloud function code:
const gcs = require('@google-cloud/storage')({keyFilename: 'service-account.json'});
const bucket = gcs.bucket('name-of-bucket.appspot.com');
const file = bucket.file('users/user1/avatar.jpg');

return file.getSignedUrl({
  action: 'read',
  expires: '03-09-2491'
}).then(signedUrls => {
  console.log('signed URL', signedUrls[0]); // this will contain the picture's url
});
The name of your bucket can be found in the Firebase console under the Storage section.
The 'service-account.json' file can be created and downloaded from here:
https://console.firebase.google.com/project/_/settings/serviceaccounts/adminsdk
It should be stored locally in your Firebase project under the functions folder (or elsewhere, as long as you change the path in the code above).
That's it.
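To round it off, here is a rough sketch of the same idea wrapped in an HTTP function that returns the URL to the caller; it uses the newer new Storage() constructor of @google-cloud/storage, and the function name and file path are assumptions:

const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage'); // newer API style

const storage = new Storage({ keyFilename: 'service-account.json' });
const bucket = storage.bucket('name-of-bucket.appspot.com');

// Hypothetical HTTP endpoint returning a signed URL for a fixed file
exports.getImg = functions.https.onRequest(async (req, res) => {
  const [url] = await bucket.file('users/user1/avatar.jpg').getSignedUrl({
    action: 'read',
    expires: '03-09-2491'
  });
  res.status(200).send({ url });
});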
