What is an alternative to updating an image in a Firebase storage bucket other than deleting and re-uploading? - node.js

I'm storing blog posts in a Firebase database. A blog post can have an image. I have a storage bucket where I store the images, and I save the URL of the image with the post in the database. Now I'm working on editing posts, part of which involves uploading a different image. I don't want the images to pile up indefinitely in my storage bucket, so I'd prefer to replace the old image with the new one. This is how I'm doing it:
const bucket = fbAdmin.storage().bucket('***');
if (fields.filename) {
  await bucket.file(fields.filename).delete();
}
const result = await bucket.upload(files.image.path, {metadata: {contentType: files.image.type}});
In other words, I delete the old file then upload the new one.
But I've got to think there must be a simpler way of doing this. Rather than delete and upload, isn't there an update function? Something like:
bucket.update(files.image.path, {name: oldFileName});
Thanks

You can just write to the same path in the bucket without first deleting the file.
const bucket = fbAdmin.storage().bucket('***');
// Uploading to the existing object's name (via `destination`) overwrites it in place
const result = await bucket.upload(files.image.path, {
  destination: fields.filename,
  metadata: {contentType: files.image.type}
});
If the file already exists, this will overwrite it.
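If concurrent edits are a concern, the overwrite can be made race-safe with the same ifGenerationMatch precondition that appears in the upload sample further down this page. A minimal sketch, assuming the question's fields and files objects:

// Sketch: abort the overwrite if someone else changed the object in the meantime
const [metadata] = await bucket.file(fields.filename).getMetadata();
await bucket.upload(files.image.path, {
  destination: fields.filename,
  metadata: {contentType: files.image.type},
  // Only succeed if the live object's generation still matches what we read
  preconditionOpts: {ifGenerationMatch: Number(metadata.generation)},
});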

While Frank's solution does technically work, any metadata associated with the object is lost when it is overwritten. I was able to maintain the metadata, including custom metadata, with a similar approach:
const bucket = admin.storage().bucket('***');
const destObjPath = 'some-path.jpg';
const fileRef = bucket.file(destObjPath);

// Get the object's existing metadata
const existingMetadata = (await fileRef.getMetadata())[0];

// Delete checksum and other calculated metadata fields
delete existingMetadata.crc32c;
delete existingMetadata.etag;
delete existingMetadata.md5Hash;
delete existingMetadata.size;
delete existingMetadata.timeCreated;
delete existingMetadata.timeStorageClassUpdated;
delete existingMetadata.updated;

// Provide the copied metadata along with the object update
await bucket.upload(files.image.path, {
  destination: destObjPath,
  metadata: existingMetadata
});

Related

How to upload a base64 image URL to Firebase and retrieve its access token?

In a MERN + Firebase project, I have an image data string that I want to upload and then get the access token of that file.
The image data string is of the following form:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOoAAAClCAYAAABSmmH3AAAAAXNSR0IArs4c6QAABFJJREFUeF7t1oENg0AQA8F8/6XSAx8pVWTloQIzZwvOvfd+PAQI/LXAMdS/vo9wBH4ChqoIBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgEBQw0cSUQChqoDBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgGB8zzPDeQUkcC0wHnf11CnK+DlCwJ+fQtXknFewBd1vgIACgK+qIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCDwBaT9ke70WG4vAAAAAElFTkSuQmCC
This is the reference to where the file should be uploaded:
const imageRef: StorageReference = ref(
  storage,
  `/issueImages/${firebaseImageId}`
);
So far I have attempted to use the 'put' function with the imageRef, and when I try Firebase's uploadBytes() I have to upload the data as a Buffer, and even then I cannot find the access token in the metadata.
To upload a data URL to Firebase, you would use storageRef.putString(url, 'DATA_URL') (legacy) or uploadString(storageRef, url, 'DATA_URL') (modern) depending on the SDK you are using.
When you upload a file to a Cloud Storage bucket, it will not be issued an access token until a client calls its version of getDownloadURL(). So to fix your issue, you would call getDownloadURL() immediately after upload.
If Node is running on a client's machine, you would use:
// legacy syntax
import * as firebase from "firebase";

// reference to file
const imageStorageRef = firebase.storage()
  .ref(`/issueImages/${firebaseImageId}`);

// perform the upload
await imageStorageRef.putString(dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await imageStorageRef.getDownloadURL();
// modern syntax
import { getStorage, getDownloadURL, ref, uploadString } from "firebase/storage";

// reference to file
const imageStorageRef = ref(
  getStorage(),
  `/issueImages/${firebaseImageId}`
);

// perform the upload
await uploadString(imageStorageRef, dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await getDownloadURL(imageStorageRef);
If Node is running on a private server you control, you should opt to use the Firebase Admin SDK instead as it bypasses the rate limits and restrictions applied to the client SDKs.
As mentioned before, the download URLs aren't created automatically. Unfortunately for us, getDownloadURL is a feature of the client SDKs and the Admin SDK doesn't have it. So we can either let a client call getDownloadURL when it is needed or we can manually create the download URL if we want to insert it into a database.
Nico has an excellent write-up on how Firebase Storage URLs work, collating information from the Firebase Extensions GitHub and this StackOverflow thread. In summary, to create (or recreate) a download URL once a file has been uploaded, you can use the following function:
import { uuid } from "uuidv4";

// Original Credit: Nico (#nicomqh)
// https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens
// "file" is an instance of the File class from the Cloud Storage SDK
// executing this function more than once will revoke all previous tokens
async function createDownloadURL(file) {
  const downloadToken = uuid();

  await file.setMetadata({
    metadata: {
      firebaseStorageDownloadTokens: downloadToken
    }
  });

  return `https://firebasestorage.googleapis.com/v0/b/${file.bucket.name}/o/${encodeURIComponent(file.name)}?alt=media&token=${downloadToken}`;
}
This allows us to change the client-side code above into the following so it can run using the Admin SDK:
// assuming firebase-admin is initialized already
import { getStorage } from "firebase-admin/storage";

// reference to file
const imageStorageFile = getStorage()
  .bucket()
  .file(`/issueImages/${firebaseImageId}`);

// the Admin SDK has no DATA_URL helper, so decode the data URL ourselves
const [, contentType, base64Data] = dataUrl.match(/^data:(.+?);base64,(.+)$/);

// perform the upload with the decoded bytes and content type
await imageStorageFile.save(Buffer.from(base64Data, 'base64'), {
  metadata: { contentType }
});

// get the download URL
const imageStorageDownloadURL = await createDownloadURL(imageStorageFile);
In all of the above examples, a download URL is retrieved and saved to the imageStorageDownloadURL variable. You should store this value as-is in your database. However, if you instead want to store only the access token and reassemble the URL on an as-needed basis, you can extract the token from its ?token= parameter using:
const downloadToken = new URL(imageStorageDownloadURL).searchParams.get('token');
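Going the other way, the URL template from createDownloadURL above can be inverted to reassemble the URL from a stored token. A minimal sketch (assembleDownloadURL is a hypothetical helper name; the parameters are whatever you stored alongside the token):

// Hypothetical helper: rebuild the download URL from its stored parts
function assembleDownloadURL(bucketName, objectPath, downloadToken) {
  return `https://firebasestorage.googleapis.com/v0/b/${bucketName}/o/${encodeURIComponent(objectPath)}?alt=media&token=${downloadToken}`;
}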

Write a file and then upload it to cloud storage - NodeJS

I'm trying to upload a file to my bucket after it's written, but I'm not sure how to do it.
I can confirm that the code that writes the file is OK, as I tested it locally and it works normally.
bucket.upload doesn't seem to work, as the file is only saved locally.
bucket.file.save is also not working.
The file is saved at "./public/fileName.xlsx".
When I use:
storage.bucket("bucketName").file("bucketFileName").save("./public/fileName.xlsx")
There is indeed a file uploaded to storage, but its content is the path string that I'm passing inside .save().
So to summarize, my question is: how do I write a file and then upload it to my bucket?
ps: the file is an Excel worksheet
If you confirmed that the file is saved locally and just want to upload it to the bucket, you may refer to the sample code below:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Change to your bucket name
const bucketName = 'bucket-name';

// For a destination object that does not yet exist, 0 is the correct precondition
const generationMatchPrecondition = 0;

async function uploadFile(path, filename) {
  // Path where to save the file in Google Cloud Storage.
  const destFileName = `public/${filename}`;

  const options = {
    destination: destFileName,
    // Optional:
    // Set a generation-match precondition to avoid potential race conditions
    // and data corruptions. The request to upload is aborted if the object's
    // generation number does not match your precondition. For a destination
    // object that does not yet exist, set the ifGenerationMatch precondition to 0.
    // If the destination object already exists in your bucket, set instead a
    // generation-match precondition using its generation number.
    preconditionOpts: {ifGenerationMatch: generationMatchPrecondition},
  };

  // The `path` here is the location of the file that you want to upload.
  await storage.bucket(bucketName).upload(path, options);
  console.log(`${path} uploaded to ${bucketName}`);
}

uploadFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);
I added some comments to the sample code.
For more information, you may check this documentation.
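As for why the original attempt stored the path string: file.save() writes the data you pass it, not the file at that path. A minimal sketch of a save()-based alternative, assuming the same storage client and bucket name as the sample above:

const { readFile } = require('fs/promises');

async function saveFile(path, filename) {
  // Read the file's bytes first; save() uploads exactly the value it is given
  const contents = await readFile(path);
  await storage.bucket(bucketName).file(`public/${filename}`).save(contents);
}

saveFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);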

Is it safe to export Firestore data multiple times to the same storage folder? Could the overwrite break the exported data?

Here is how I'm exporting my Firestore data:
import { today } from "#utils/dates/today";
import * as admin from "firebase-admin";
admin.initializeApp({
credential: admin.credential.cert(
MY_SERVICE_ACCOUNT as admin.ServiceAccount
)});
const client = new admin.firestore.v1.FirestoreAdminClient();
const BUCKET = "gs://MY_PROJECT_ID.appspot.com/firestore-backup";
const PROJECT_ID = "MY_PROJECT_ID";
const DB_NAME = client.databasePath(PROJECT_ID, "(default)");
export const backupData = async () : Promise<void> => {
const todayDate = today(); // THIS IS YYYY-MM-DD STRING
// const hashId = generateId().slice(0,5);
const responses = await client.exportDocuments({
name: DB_NAME,
outputUriPrefix: `${BUCKET}/${todayDate}`,
collectionIds: []
});
const response = responses[0];
console.log(`Operation Name: ${response['name']}`);
return;
};
As you can see, I'm exporting to the following path:
/firestore-backup/YYYY-MM-DD/
If I'm going to back up multiple times on the same day, can I use the same date folder? Is it safe to do so? Or should I add a hash to the folder name to avoid overwriting the previous export?
PS: The overwrite on a single day is not a problem. I just don't want to break the exported data.
If you go to the bucket and check the exports, you'll see that the exported files seem to follow the same pattern every time. If we were to rely only on the write/update semantics of Cloud Storage, whenever there's a write to a location where a file already exists, the file is overwritten. Therefore, at first glance it doesn't seem like this would cause data corruption.
However, the assumption above relies on the internal behavior of the export operations, which may be subject to future change (leaving aside that I can't even guarantee it as of now). Therefore, the best practice would be to append a hash to the folder name to prevent any unexpected behavior.
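For instance, a minimal sketch of that approach, reusing the question's variables and swapping the commented-out generateId() for an inline random suffix (assumption: any short unique string will do; a timestamp works just as well):

// Unique folder per export: date plus a short random suffix, e.g. 2023-01-31-a1b2c
const hashId = Math.random().toString(36).slice(2, 7);
const responses = await client.exportDocuments({
  name: DB_NAME,
  outputUriPrefix: `${BUCKET}/${todayDate}-${hashId}`,
  collectionIds: []
});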
As an additional side note, it's worth mentioning that exports can incur significant costs depending on the size of your Firestore data.

How to access a specific file and its contents from within Google Cloud Storage

I need to access a file stored on Google Cloud Storage; this is not a file that is uploaded, but one created by a cloud function. I cannot figure out how to access this file using Node.
I have tried many of the things recommended on Stack Overflow:
google-cloud TypeError: gcs.bucket is not a function
Python GAE - How to check if file exists in Google cloud storage
How to read content of JSON file uploaded to google cloud storage using node js
Similarly, I have checked out the sample Google project, but that only uses read streams, and because I am not uploading the file, I couldn't figure out how to use it.
The closest I have gotten is modifying the code from the first link to get this:
var {Storage} = require('@google-cloud/storage');

var gcs = new Storage({
  keyFilename: path.join("_file.json"),
  projectId: 'ProjectId'
});

const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
This causes a TypeError:
TypeError: bucket is not a function
How can I access and read the JSON located within the file?
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
chosenBucket is not a function. It's a Bucket object. Do something like this:
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket.file('file.json');

const download_options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

await file.download(download_options);
See an example: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
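If you only need the JSON in memory rather than on disk, download() also works without a destination and resolves with the file contents as a Buffer. A minimal sketch, assuming the same gcs client as above:

const file = gcs.bucket("bucket_name").file('file.json');

// With no destination, download() resolves with [Buffer]
const [contents] = await file.download();
const data = JSON.parse(contents.toString('utf8'));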

Firebase Storage and Cloud Functions: how to load content from a URL and save it into storage?

I have a realtime database where a user can write the URL of an image.
When it is created/updated, I'm already able to trigger a cloud function that reads the URL of the image from the realtime database.
I now need to download the image from the web (so it's not an upload) and save it to Firebase Storage.
I cannot find a single example of fetching a web resource and storing it in Firebase Storage.
Can you please point me to the right solution?
My idea was to react to the create/update of the URL in the database, then fetch the resource (can I use the fetch npm package?) and then save the fetched content into the storage bucket, using the URL as the key.
But fetching and then saving the fetched data is what I'm not able to do right now.
Before someone closes this question because it's 'off-topic', here is my own solution.
const bucket = admin.storage().bucket();
const axios = require('axios');

const response = await axios.post(BASE_URL, data_to_post, config);
console.log("Response.status", response.status);

const cache_file_name = `page-cache/page-${pageNumber}.html`;
const cache_file_options = {
  metadata: {
    contentType: 'text/html'
  }
};

const cache_file = bucket.file(cache_file_name);
await cache_file.save(response.data, cache_file_options);
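For the original image case, the same save() pattern applies. A minimal sketch, assuming imageUrl was read from the realtime database trigger and that deriving the object key from the URL (as the question proposed) is acceptable:

const axios = require('axios');
const bucket = admin.storage().bucket();

// Fetch the raw bytes (responseType 'arraybuffer' avoids string decoding)
const response = await axios.get(imageUrl, { responseType: 'arraybuffer' });

// Hypothetical key scheme: the encoded source URL
const imageFile = bucket.file(`images/${encodeURIComponent(imageUrl)}`);
await imageFile.save(Buffer.from(response.data), {
  metadata: { contentType: response.headers['content-type'] }
});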
