Write file and then upload to cloud storage - Node.js

I'm trying to upload a file to my bucket after it's written, but I'm not sure how to do it.
I've confirmed that the code that writes the file is OK, as I tested it locally and it works normally.
bucket.upload doesn't seem to work, as the file is saved locally.
bucket.file.save is also not working.
The file is saved at "./public/fileName.xlsx".
When I use:
storage.bucket("bucketName").file("bucketFileName").save("./public/fileName.xlsx")
a file is indeed uploaded to the storage, but its content is the path string that I'm passing to .save().
So, to summarize my question: how do I write a file and then upload it to my bucket?
PS: the file is an Excel worksheet.

If you confirmed that the file is saved locally and just want to upload it to the bucket, you may refer to the sample code below:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Change to your bucket name
const bucketName = 'bucket-name';

async function uploadFile(path, filename) {
  // Path where to save the file in Google Cloud Storage.
  const destFileName = `public/${filename}`;

  // For a destination object that does not yet exist, set the
  // ifGenerationMatch precondition to 0. If the destination object already
  // exists in your bucket, set a generation-match precondition using its
  // generation number instead.
  const generationMatchPrecondition = 0;

  const options = {
    destination: destFileName,
    // Optional:
    // Set a generation-match precondition to avoid potential race conditions
    // and data corruption. The request to upload is aborted if the object's
    // generation number does not match your precondition.
    preconditionOpts: {ifGenerationMatch: generationMatchPrecondition},
  };

  // The `path` here is the location of the file that you want to upload.
  await storage.bucket(bucketName).upload(path, options);
  console.log(`${path} uploaded to ${bucketName}`);
}

uploadFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);
I added some comments in the sample code. For more information, you may check this documentation.

Related

How to upload a base64 image URL to Firebase and retrieve its access token?

In a MERN + Firebase project, I have an image data string that I want to upload and then get the access token of that file.
The image data string is of the following form:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOoAAAClCAYAAABSmmH3AAAAAXNSR0IArs4c6QAABFJJREFUeF7t1oENg0AQA8F8/6XSAx8pVWTloQIzZwvOvfd+PAQI/LXAMdS/vo9wBH4ChqoIBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgEBQw0cSUQChqoDBAIChho4kogEDFUHCAQEDDVwJBEJGKoOEAgIGGrgSCISMFQdIBAQMNTAkUQkYKg6QCAgYKiBI4lIwFB1gEBAwFADRxKRgKHqAIGAgKEGjiQiAUPVAQIBAUMNHElEAoaqAwQCAoYaOJKIBAxVBwgEBAw1cCQRCRiqDhAICBhq4EgiEjBUHSAQEDDUwJFEJGCoOkAgIGCogSOJSMBQdYBAQMBQA0cSkYCh6gCBgIChBo4kIgFD1QECAQFDDRxJRAKGqgMEAgKGGjiSiAQMVQcIBAQMNXAkEQkYqg4QCAgYauBIIhIwVB0gEBAw1MCRRCRgqDpAICBgqIEjiUjAUHWAQEDAUANHEpGAoeoAgYCAoQaOJCIBQ9UBAgGB8zzPDeQUkcC0wHnf11CnK+DlCwJ+fQtXknFewBd1vgIACgK+qIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCBgqIUryTgvYKjzFQBQEDDUwpVknBcw1PkKACgIGGrhSjLOCxjqfAUAFAQMtXAlGecFDHW+AgAKAoZauJKM8wKGOl8BAAUBQy1cScZ5AUOdrwCAgoChFq4k47yAoc5XAEBBwFALV5JxXsBQ5ysAoCDwBaT9ke70WG4vAAAAAElFTkSuQmCC
This is the reference to where the file should be uploaded:
const imageRef: StorageReference = ref(
  storage,
  `/issueImages/${firebaseImageId}`
);
So far, I have attempted to use the 'put' function with the imageRef. When I try using Firebase's uploadBytes(), I have to upload the data as a Buffer, and even then I cannot find the access token in the metadata.
To upload a data URL to Firebase, you would use storageRef.putString(url, 'DATA_URL') (legacy) or uploadString(storageRef, url, 'DATA_URL') (modern) depending on the SDK you are using.
When you upload a file to a Cloud Storage bucket, it will not be issued an access token until a client calls its version of getDownloadURL(). So to fix your issue, you would call getDownloadURL() immediately after upload.
If Node is running on a client's machine, you would use:
// legacy syntax
import * as firebase from "firebase";

// reference to file
const imageStorageRef = firebase.storage()
  .ref(`/issueImages/${firebaseImageId}`);

// perform the upload
await imageStorageRef.putString(dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await imageStorageRef.getDownloadURL();
// modern syntax
import { getStorage, getDownloadURL, ref, uploadString } from "firebase/storage";

// reference to file
const imageStorageRef = ref(
  getStorage(),
  `/issueImages/${firebaseImageId}`
);

// perform the upload
await uploadString(imageStorageRef, dataUrl, 'DATA_URL');

// get the download URL
const imageStorageDownloadURL = await getDownloadURL(imageStorageRef);
If Node is running on a private server you control, you should opt to use the Firebase Admin SDK instead as it bypasses the rate limits and restrictions applied to the client SDKs.
As mentioned before, the download URLs aren't created automatically. Unfortunately for us, getDownloadURL is a feature of the client SDKs and the Admin SDK doesn't have it. So we can either let a client call getDownloadURL when it is needed or we can manually create the download URL if we want to insert it into a database.
Nico has an excellent write-up on how Firebase Storage URLs work, where they collated information from the Firebase Extensions GitHub and this StackOverflow thread. In summary, to create (or recreate) a download URL once the file has been uploaded, you can use the following function:
import { uuid } from "uuidv4";

// Original Credit: Nico (@nicomqh)
// https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens

// "file" is an instance of the File class from the Cloud Storage SDK.
// Executing this function more than once will revoke all previous tokens.
async function createDownloadURL(file) {
  const downloadToken = uuid();

  await file.setMetadata({
    metadata: {
      firebaseStorageDownloadTokens: downloadToken
    }
  });

  return `https://firebasestorage.googleapis.com/v0/b/${file.bucket.name}/o/${encodeURIComponent(file.name)}?alt=media&token=${downloadToken}`;
}
This allows us to change the client-side code above into the following so it can run using the Admin SDK:
// assuming firebase-admin is initialized already
import { getStorage } from "firebase-admin/storage";

// reference to file
const imageStorageFile = getStorage()
  .bucket()
  .file(`/issueImages/${firebaseImageId}`);

// perform the upload
await imageStorageFile.save(dataUrl);

// get the download URL
const imageStorageDownloadURL = await createDownloadURL(imageStorageFile);
In all of the above examples, a download URL is retrieved and saved to the imageStorageDownloadURL variable. You should store this value as-is in your database. However, if you instead want to store only the access token and reassemble the URL on an as-needed basis, you can extract the token from its ?token= parameter using:
const downloadToken = new URL(imageStorageDownloadURL).searchParams.get('token');
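If you go the token-only route, reassembling the URL later is just string formatting against the same pattern used in createDownloadURL above. A sketch (the bucket and object names are made up for illustration):

```javascript
// Rebuild a Firebase Storage download URL from its parts.
function buildDownloadURL(bucketName, filePath, token) {
  const encodedPath = encodeURIComponent(filePath);
  return `https://firebasestorage.googleapis.com/v0/b/${bucketName}/o/${encodedPath}?alt=media&token=${token}`;
}

// Hypothetical values, for illustration only.
const url = buildDownloadURL("my-app.appspot.com", "issueImages/img.png", "abc123");

// The stored token round-trips back out of the URL:
const downloadToken = new URL(url).searchParams.get("token");
// downloadToken === "abc123"
```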

How to download file from Google Cloud Storage?

I'm following up on this article to download objects from GCP Cloud storage bucket: https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-nodejs
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadIntoMemory() {
  // Downloads the file into a buffer in memory.
  const contents = await storage.bucket(bucketName).file(fileName).download();
  return contents;
}

downloadIntoMemory().catch(console.error);
I'm currently getting buffer data in contents. This code is hooked up to an API on a Node.js backend, and the frontend is React with TypeScript. Calling the API gives me the data buffer. How can I use it to download the file instead of the data buffer?
I tried the above method while explicitly providing a file destination, but I'm still getting the following error: EISDIR: illegal operation on a directory, open '{file_path_which_i_was_set}'. Err: -21
As rightly pointed out by @John Hanley, you are referring to the documentation, where the code sample downloads an object into a buffer in memory. If you want to download an object from a bucket to a file, refer to this code sample, where an 'options' parameter has to be passed to the download() method. Note that the EISDIR error means the destination you set points at a directory; the destination must be a full file path, including the file name.
The code goes like this:
// The ID of your GCS bucket
const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
const fileName = 'your-file-name';

// The path to which the file should be downloaded
const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: destFileName,
  };

  // Downloads the file to the destination file path
  await storage.bucket(bucketName).file(fileName).download(options);

  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);

What is an alternative to updating an image in a Firebase storage bucket other than deleting and re-uploading?

I'm storing blog posts in a firebase database. Blog post can have an image. I have a storage bucket where I store the images, and I save the url to the image with the post in the database. Now I'm working on editing posts. Part of this involves uploading a different image. I don't want the images to pile up indefinitely in my storage bucket. I'd prefer to replace the old image with the new one. This is how I'm doing it:
const bucket = fbAdmin.storage().bucket('***');

if (fields.filename) {
  await bucket.file(fields.filename).delete();
}

const result = await bucket.upload(files.image.path, {metadata: {contentType: files.image.type}});
In other words, I delete the old file then upload the new one.
But I've got to think there must be a simpler way of doing this. Rather than delete and upload, isn't there an update function? Something like:
bucket.update(files.image.path, {name: oldFileName});
Thanks
You can just write to the same path in the bucket without first deleting the file.
const bucket = fbAdmin.storage().bucket('***');
const result = await bucket.upload(files.image.path, {metadata: {contentType: files.image.type}});
If the file already exists, this will overwrite it.
While Frank's solution does technically work, it removes any metadata associated with your object. I found that I was able to maintain the metadata, including custom metadata, by using a similar approach.
const bucket = admin.storage().bucket('***');
const destObjPath = 'some-path.jpg';
const fileRef = bucket.file(destObjPath);

// Get the object's existing metadata
const existingMetadata = (await fileRef.getMetadata())[0];

// Delete checksum and other calculated metadata fields
delete existingMetadata.crc32c;
delete existingMetadata.etag;
delete existingMetadata.md5Hash;
delete existingMetadata.size;
delete existingMetadata.timeCreated;
delete existingMetadata.timeStorageClassUpdated;
delete existingMetadata.updated;

// Provide the copied metadata along with the object update
await bucket.upload(files.image.path, {
  destination: destObjPath,
  metadata: existingMetadata
});

How to Upload PDF File on Azure Blob Storage via Node.js?

const blobServiceClient = await BlobServiceClient.fromConnectionString(connectionString);
const containerClient = await blobServiceClient.getContainerClient(container);
const blockBlobClient = containerClient.getBlockBlobClient(fileName);

const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
console.log(uploadBlobResponse);
console.log(`File uploaded successfully to cloud: ${uploadBlobResponse.requestId}`);
I am trying it like this, but blockBlobClient.upload() needs content. I converted the file to base64 and sent that as the content, but I am having an issue: the file is uploaded but corrupted. Any help, please?
Check the SDK: the upload method's signature is upload(HttpRequestBody, number, BlockBlobUploadOptions). The content is an HttpRequestBody; check what that parameter requires:
a Blob, string, ArrayBuffer, ArrayBufferView or a function which returns a new Readable stream whose offset is from data source beginning.
So maybe you could try uploadFile and just pass the file path to upload; I have tried it this way and it works.
Also, you could use uploadStream to upload a readable stream of the file.
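A sketch of the uploadFile route for a PDF (the helper name and the client-as-parameter shape are illustrative; the point is that uploadFile reads the file from disk itself, so there is no base64 round-trip to corrupt the bytes):

```javascript
// `blockBlobClient` is the BlockBlobClient obtained from
// containerClient.getBlockBlobClient(fileName), as in the question.
async function uploadPdfFromDisk(blockBlobClient, localPath) {
  // uploadFile streams the file from disk; setting the content type lets
  // browsers render the PDF instead of treating it as opaque binary.
  await blockBlobClient.uploadFile(localPath, {
    blobHTTPHeaders: { blobContentType: "application/pdf" },
  });
}
```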

How to access a specific file and its contents from within Google Cloud Storage

I need to access a file stored on Google Cloud Storage. This is not a file that was uploaded; instead, it is created by a cloud function. I cannot figure out how to access this file using Node.
I have tried many of the things recommended on Stackoverflow.
google-cloud TypeError: gcs.bucket is not a function
Python GAE - How to check if file exists in Google cloud storage
How to read content of JSON file uploaded to google cloud storage using node js
Similarly, I have checked out the sample Google project, but that only uses read streams, and because I am not uploading the file, I couldn't figure out how to use it.
The closest I have gotten is modifying the first link to get this:
var {Storage} = require('@google-cloud/storage');

var gcs = new Storage({
  keyFilename: path.join("_file.json"),
  projectId: 'ProjectId'
});

const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
This causes a TypeError:
TypeError: bucket is not a function
How can I access and read the json located within the file?
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
chosenBucket is not a function. It's a Bucket object. Do something like this:
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket.file('file.json');

const download_options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

await file.download(download_options);
See an example: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
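To answer the "read the JSON" half of the question: calling download() with no destination resolves to a one-element array holding the contents in memory, which you can parse directly. A minimal sketch (readJsonFile is an illustrative helper name; gcs is the Storage client created above):

```javascript
// `gcs` is an initialized Storage client from @google-cloud/storage.
async function readJsonFile(gcs, bucketName, fileName) {
  // With no destination option, download() resolves to [Buffer].
  const [contents] = await gcs.bucket(bucketName).file(fileName).download();
  return JSON.parse(contents.toString("utf8"));
}
```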
