How to access a specific file and its contents from within Google Cloud Storage - node.js

I need to access a file stored on Google Cloud Storage. This is not a file that I uploaded; it is created by a cloud function. I cannot figure out how to access this file using Node.
I have tried many of the things recommended on Stack Overflow.
google-cloud TypeError: gcs.bucket is not a function
Python GAE - How to check if file exists in Google cloud storage
How to read content of JSON file uploaded to google cloud storage using node js
Similarly, I have checked out the sample Google project, but it only uses read streams, and because I am not uploading the file, I couldn't figure out how to use that.
The closest I have gotten is modifying the first link to get this:
var {Storage} = require('@google-cloud/storage');
var gcs = new Storage({
  keyFilename: path.join("_file.json"),
  projectId: 'ProjectId'
});
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
This causes a TypeError:
TypeError: bucket is not a function
How can I access and read the JSON located within the file?

const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
chosenBucket is not a function. It's a Bucket object. Do something like this:
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket.file('file.json');
const download_options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};
await file.download(download_options);
See an example: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
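Since the question asks for the JSON contents rather than a local copy, here is a minimal sketch (assuming the same gcs client and names as above) that downloads the object into memory and parses it; download() without a destination resolves with a [Buffer]:
// Download into memory: with no destination, download() resolves with
// an array whose first element is a Buffer of the file's bytes.
const [contents] = await chosenBucket.file('file.json').download();
const data = JSON.parse(contents.toString('utf8'));
console.log(data);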

Related

Write file and then upload to cloud storage - NodeJS

I'm trying to upload a file to my bucket after it's written, but I'm not sure how to do it.
I confirm that the code to write the file is OK, as I tested it locally and it works normally.
bucket.upload doesn't seem to work, as the file is saved locally.
bucket.file.save is also not working.
The file is saved at "./public/fileName.xlsx".
When I use:
storage.bucket("bucketName").file("bucketFileName").save("./public/fileName.xlsx")
A file does indeed get uploaded to the storage, but its content is the path string that I'm passing inside .save().
So to sum up, my question is: how do I write a file and then upload it to my bucket?
PS: the file is an Excel worksheet.
If you confirmed that the file is saved locally and just want to upload it to the bucket, you may refer to the sample code below:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Change to your bucket name
const bucketName = 'bucket-name';

// For a destination object that does not yet exist, 0 is the right
// generation-match precondition; if the object already exists, use its
// generation number instead.
const generationMatchPrecondition = 0;

async function uploadFile(path, filename) {
  // Path where to save the file in Google Cloud Storage.
  const destFileName = `public/${filename}`;
  const options = {
    destination: destFileName,
    // Optional:
    // Set a generation-match precondition to avoid potential race conditions
    // and data corruptions. The request to upload is aborted if the object's
    // generation number does not match your precondition.
    preconditionOpts: {ifGenerationMatch: generationMatchPrecondition},
  };
  // The `path` here is the location of the file that you want to upload.
  await storage.bucket(bucketName).upload(path, options);
  console.log(`${path} uploaded to ${bucketName}`);
}

uploadFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);
Added some comments on the sample code.
For more information, you may check this documentation.
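Alternatively, if you want to keep the file.save() approach from the question, pass the file's contents rather than its path; a minimal sketch, assuming the same storage client and bucket name as above and the local path from the question:
const fs = require('fs/promises');

async function saveFile() {
  // Read the file's bytes first; save() uploads whatever value it is
  // given, which is why passing a path string uploaded the path itself.
  const contents = await fs.readFile('./public/fileName.xlsx');
  await storage.bucket(bucketName).file('public/fileName.xlsx').save(contents);
}

saveFile().catch(console.error);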

How to download file from Google Cloud Storage?

I'm following up on this article to download objects from a GCP Cloud Storage bucket: https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-nodejs
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadIntoMemory() {
  // Downloads the file into a buffer in memory.
  const contents = await storage.bucket(bucketName).file(fileName).download();
  return contents;
}

downloadIntoMemory().catch(console.error);
I'm currently getting buffer data in contents. I have this code hooked up to an API on a Node.js backend. I'm using React with TypeScript on the frontend. Calling the API gives me the data buffer. How can I use it to download the file instead of the data buffer?
I tried the above method explicitly providing a file destination, but I'm still getting the following error: EISDIR: illegal operation on a directory, open '{file_path_which_i_was_set}'. Err: -21
As rightly pointed out by @John Hanley, you are referring to the documentation, where the code sample downloads an object into memory (a buffer). If you want to download an object from a bucket to a file, refer to this code sample, where an 'options' parameter has to be passed to the download() method. The EISDIR error typically means the destination path points at a directory; it must include the target file name.
The code goes like this:
// The ID of your GCS bucket
const bucketName = 'your-unique-bucket-name';
// The ID of your GCS file
const fileName = 'your-file-name';
// The path to which the file should be downloaded
const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: destFileName,
  };
  // Downloads the file to the destination file path
  await storage.bucket(bucketName).file(fileName).download(options);
  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);
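To turn the in-memory buffer into an actual browser download, the backend can set download headers before sending the bytes; a sketch assuming an Express route (the route path and Content-Type are placeholders, and bucketName/fileName are as above):
const express = require('express');
const app = express();

app.get('/download', async (req, res) => {
  // download() without a destination resolves with [Buffer].
  const [contents] = await storage.bucket(bucketName).file(fileName).download();
  // Content-Disposition tells the browser to save the response as a
  // file instead of handing raw bytes to the calling code.
  res.set({
    'Content-Type': 'application/octet-stream',
    'Content-Disposition': `attachment; filename="${fileName}"`,
  });
  res.send(contents);
});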

Compress and write a file in another container with an Azure Blob Storage trigger in Node.js

I have to make an API call passing a compressed file as input. I have a working example on premises, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Function trigger. I have the code below, which works for a local file, but I don't know how to do the same with Azure Blob Storage and an Azure Function in Node.js.
const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();
const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');
input.pipe(def).pipe(output);
This code reads a file as a stream, compresses it, and writes another file as a stream.
What I would like to do is read the file any time I upload it to a container in Azure Blob Storage, then compress it and save it in a different container with a different name, then make an API call passing as input the compressed file saved in the other container.
I tried this code for compressing the incoming file
const fs = require("fs");
const zlib = require('zlib');
const {Readable, Writable} = require('stream');

module.exports = async function (context, myBlob) {
  context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
  // const fin = fs.createReadStream(context.bindingData.blobTrigger);
  const def = zlib.createDeflate();
  const s = Readable.from(myBlob.toString());
  context.log(myBlob);
  context.bindings.outputBlob = s.pipe(def);
};
The problem with this approach is that in the last line of the code
context.bindings.outputBlob = s.pipe(def)
I don't have the compressed file, while if I use
s.pipe(def).pipe(process.stdout)
I can read the compressed output.
As you can see above, I also tried to use fs.createReadStream(context.bindingData.blobTrigger), which contains the name of the uploaded file with the container name, but it doesn't work.
Any ideas?
Thank you.
This is the solution:
var input = context.bindings.myBlob;
var inputBuffer = Buffer.from(input);
var deflatedOutput = zlib.deflateSync(inputBuffer);
context.bindings.myOutputBlob = deflatedOutput;
https://learn.microsoft.com/en-us/answers/questions/500368/compress-and-write-a-file-in-another-container-wit.html
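For completeness, a sketch of how those lines fit into the blob trigger function (assuming an output binding named myOutputBlob is configured in function.json):
const zlib = require('zlib');

module.exports = async function (context, myBlob) {
  // The incoming blob arrives as bytes; normalize to a Buffer,
  // deflate it synchronously, and assign the result to the output
  // binding, which writes it to the other container.
  const inputBuffer = Buffer.from(context.bindings.myBlob);
  context.bindings.myOutputBlob = zlib.deflateSync(inputBuffer);
};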

Google Storage is not a constructor error

I am building an app, and my objective is that every time someone uploads an image to Firebase Storage, the Cloud Function resizes that image.
...
import * as Storage from '@google-cloud/storage'
const gcs = new Storage()
...
exports.resizeImage = functions.storage.object().onFinalize( async object => {
  const bucket = gcs.bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);
  ....
And when I try to deploy this function, I get this error:
Error: Error occurred while parsing your function triggers.
TypeError: Storage is not a constructor
I tried with "new Storage()" or just "Storage" and nothing works.
I'm a newbie around here, so if there's anything I forgot that you need to debug this, just let me know.
Thanks!
google-cloud/storage: 2.0.0
Node.js: v8.11.4
The API documentation for Cloud Storage suggests that you should use require to load the module:
const Storage = require('@google-cloud/storage');
This applied to versions of Cloud Storage prior to 2.x.
In 2.x, there was a breaking API change. You need to do this now:
const { Storage } = require('@google-cloud/storage');
If you would like TypeScript bindings, consider using Cloud Storage via the Firebase Admin SDK. The Admin SDK simply wraps the Cloud Storage module, and also exports type bindings to go along with it. It's easy to use:
import * as admin from 'firebase-admin'
admin.initializeApp()
admin.storage().bucket(...)
admin.storage() gives you a reference to the Storage object that you're trying to work with.
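Applied to the trigger in the question, the handler might look like this (a sketch; the resize logic itself is elided, as in the original):
import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'
import { dirname } from 'path'

admin.initializeApp()

exports.resizeImage = functions.storage.object().onFinalize(async object => {
  // Same fields as in the question, but the bucket comes from the Admin SDK.
  const bucket = admin.storage().bucket(object.bucket);
  const filePath = object.name;
  const fileName = filePath.split('/').pop();
  const bucketDir = dirname(filePath);
  // ... resize logic as before ...
});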
On Node 14, using CommonJS with Babel (although I don't think Babel was interfering here), this is how I eventually got it working on an older project, bumping GCS from 1.x to 5.x:
const Storage = require('@google-cloud/storage');
const gcs = new Storage({project_id});
Didn't see this captured anywhere on the web.
Just for knowledge purposes: if you are using the ES module approach and Node 12, the snippet below works. I couldn't get any of the other syntaxes working.
import storagePackage from '@google-cloud/storage';
const { Storage } = storagePackage;
const storage = new Storage();
If you are using Electron and the solutions above still don't work, try this:
const {Storage} = window.require('@google-cloud/storage');
const storage = new Storage({ keyFilename: "./uploader-credentials.json" });

Firebase Storage and Cloud functions: how to load content from url and save into storage?

I have a Realtime Database where users can write the URL of an image.
When it is created/updated, I'm already able to trigger a Cloud Function that reads the URL of the image from the Realtime Database.
I now need to download the image from the web (so it's not an upload) and save it to Firebase Storage.
I cannot find a single example of fetching a web resource and storing it in Firebase Storage.
Can you please point me to the right solution?
My idea was to react to the create/update of the URL in the database, then fetch it (can I use the fetch npm package?) and then save the fetched content into the storage bucket, using the URL as key.
But fetching + saving the fetched data is what I am not able to do right now.
Before someone closes this question because it is 'off-topic', I'll write up my own solution.
const bucket = admin.storage().bucket();
const axios = require('axios');

const response = await axios.post(BASE_URL, data_to_post, config);
console.log("Response.status", response.status);

const cache_file_name = `page-cache/page-${pageNumber}.html`;
const cache_file_options = {
  metadata : {
    contentType : 'text/html'
  }
};
const cache_file = bucket.file(cache_file_name);
await cache_file.save(response.data, cache_file_options);
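For the image case in the original question, the same pattern works with a GET request and a binary response type; a sketch (imageUrl and the destination name are placeholders):
const response = await axios.get(imageUrl, { responseType: 'arraybuffer' });
const image_file = bucket.file(`images/${encodeURIComponent(imageUrl)}`);
// Save the raw bytes, preserving the server-reported content type.
await image_file.save(Buffer.from(response.data), {
  metadata: { contentType: response.headers['content-type'] }
});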
