How to download a file from Google Cloud Storage? - node.js

I'm following this article to download objects from a GCP Cloud Storage bucket: https://cloud.google.com/storage/docs/downloading-objects#storage-download-object-nodejs
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadIntoMemory() {
  // Downloads the file into a buffer in memory.
  const contents = await storage.bucket(bucketName).file(fileName).download();
  return contents;
}

downloadIntoMemory().catch(console.error);
I'm currently getting buffer data in contents. I have this code hooked up to an API on a Node.js backend, and I'm using React with TypeScript on the frontend. Calling the API gives me the data buffer. How can I use it to download the file instead of getting the data buffer?
I tried the above method while explicitly providing a file destination, but I'm still getting the following error: EISDIR: illegal operation on a directory, open '{file_path_which_i_was_set}'. Err: -21
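For the frontend half of the question, one way (a sketch, not from the answers below; it assumes an Express backend and a hypothetical /api/download route) is to send the buffer with download headers and turn the response into a Blob on the React side:

// Backend (Express assumed): reuses storage, bucketName and fileName from above.
app.get('/api/download', async (req, res) => {
  const [contents] = await storage.bucket(bucketName).file(fileName).download();
  res.set({
    'Content-Disposition': `attachment; filename="${fileName}"`,
    'Content-Type': 'application/octet-stream',
  });
  res.send(contents);
});

// Frontend: fetch the response as a Blob and trigger a browser download.
const response = await fetch('/api/download');
const blob = await response.blob();
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = 'file.txt'; // hypothetical file name
a.click();
URL.revokeObjectURL(url);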

As rightly pointed out by @John Hanley, you are referring to the documentation where the code sample downloads an object into memory (a buffer). If you want to download an object from a bucket to a file, refer to this code sample, where an options object with a destination has to be passed to the download() method. (The EISDIR: illegal operation on a directory error usually means the destination path points to an existing directory; make sure it includes the target file name.)
The code goes like this:
// The ID of your GCS bucket
const bucketName = 'your-unique-bucket-name';

// The ID of your GCS file
const fileName = 'your-file-name';

// The path to which the file should be downloaded
const destFileName = '/local/path/to/file.txt';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function downloadFile() {
  const options = {
    destination: destFileName,
  };

  // Downloads the file to the destination file path
  await storage.bucket(bucketName).file(fileName).download(options);

  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);

Related

How to check existence of a soft-deleted file in an Azure blob container with node.js?

I have a file stored in an Azure blob directory "folder1/folder2/file.txt". This file was soft-deleted; I can see it in the Azure web console. I need a function which checks this file's existence.
I tried the library "azure-storage". It works perfectly with files that are NOT removed:
const blobService = azure.createBlobService(connectingString);
blobService.doesBlobExist(container, blobPath, callback);
Does anyone know how to use the same approach with soft-removed files?
I also tried the lib "@azure/storage-blob".
But I got stuck with the endless entities there (BlobServiceClient, ContainerItem, BlobClient, ContainerClient, etc.) and couldn't find a way to see a particular file in a particular blob directory.
Following this MS doc, I was able to restore the soft-deleted blobs and get their names with the below code snippet.
const { BlobServiceClient } = require('@azure/storage-blob');

const connstring = "DefaultEndpointsProtocol=https;AccountName=kvpstorageaccount;AccountKey=<Storage_Account_Key>;EndpointSuffix=core.windows.net";
if (!connstring) throw Error('Azure Storage Connection string not found');
const blobServiceClient = BlobServiceClient.fromConnectionString(connstring);

async function main() {
  const containerName = 'kpjohncontainer';
  const blobName = 'TextFile05.txt';
  const containerClient = blobServiceClient.getContainerClient(containerName);
  await undeleteBlob(containerClient, blobName);
}

main()
  .then(() => console.log(`done`))
  .catch((ex) => console.log(ex.message));

async function undeleteBlob(containerClient, blobName) {
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.undelete(); // restores the soft-deleted blob
  console.log(`undeleted blob ${blobName}`);
}
To check if the blob exists, and whether it exists in a soft-deleted state, I found the relevant code, but it's in C#, provided by @Gaurav Mantri. To achieve the same in NodeJS, refer here.
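A Node.js sketch of that existence check (assuming the containerClient from the snippet above; the helper name is mine, not from the linked answer): list the blobs with the includeDeleted option and inspect each item's deleted flag.

// Returns 'exists', 'soft-deleted', or 'not-found' for a blob path.
async function checkBlobState(containerClient, blobPath) {
  const iter = containerClient.listBlobsFlat({
    prefix: blobPath,
    includeDeleted: true, // also surfaces soft-deleted blobs
  });
  for await (const item of iter) {
    if (item.name === blobPath) {
      return item.deleted ? 'soft-deleted' : 'exists';
    }
  }
  return 'not-found';
}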

Write file and then upload to cloud storage - NodeJS

I'm trying to upload a file to my bucket after it's written, but I'm not sure how to do it.
I can confirm the code that writes the file is OK, as I tested it locally and it works normally.
bucket.upload doesn't seem to work, as the file is saved locally.
bucket.file.save is also not working.
The file is saved at "./public/fileName.xlsx".
When I use:
storage.bucket("bucketName").file("bucketFileName").save("./public/fileName.xlsx")
a file is indeed uploaded to the storage, but its content is just the path string that I'm passing inside .save().
So to summarize, my question is: how do I write a file and then upload it to my bucket?
PS: the file is an Excel worksheet.
If you confirmed that the file is saved locally and just want to upload it to the bucket, you may refer to the sample code below:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Change to your bucket name
const bucketName = 'bucket-name';

async function uploadFile(path, filename) {
  // Path where to save the file in Google Cloud Storage.
  const destFileName = `public/${filename}`;

  // Optional: set a generation-match precondition to avoid potential race
  // conditions and data corruption. The upload is aborted if the object's
  // generation number does not match the precondition. For a destination
  // object that does not yet exist, set ifGenerationMatch to 0. If the
  // destination object already exists, use its generation number instead.
  const generationMatchPrecondition = 0;

  const options = {
    destination: destFileName,
    preconditionOpts: {ifGenerationMatch: generationMatchPrecondition},
  };

  // The `path` here is the location of the file that you want to upload.
  await storage.bucket(bucketName).upload(path, options);
  console.log(`${path} uploaded to ${bucketName}`);
}

uploadFile('./public/fileName.xlsx', 'fileName.xlsx').catch(console.error);
I added some comments to the sample code.
For more information, you may check this documentation.
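As an aside, the reason the original .save("./public/fileName.xlsx") attempt stored the path string is that save() writes exactly the contents it is given, not the file at that path. A minimal sketch of using save() instead of upload() (file names taken from the question):

const fs = require('fs');

async function saveFile() {
  // Read the local file's bytes and save them as the object's contents.
  const contents = fs.readFileSync('./public/fileName.xlsx');
  await storage.bucket(bucketName).file('public/fileName.xlsx').save(contents);
}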

Compress and write a file in another container with an Azure Blob Storage trigger in nodejs

I have to make an API call passing a compressed file as input. I have a working example on premise, but I would like to move the solution to the cloud. I was thinking of using Azure Blob Storage and an Azure Function trigger. I have the below code that works for a local file, but I don't know how to do the same with Azure Blob Storage and an Azure Function in nodejs:
const zlib = require('zlib');
const fs = require('fs');

const def = zlib.createDeflate();
const input = fs.createReadStream('claudio.json');
const output = fs.createWriteStream('claudio-def.json');

input.pipe(def).pipe(output);
This code reads a file as a stream, compresses it, and writes another file as a stream.
What I would like to do is read the file any time I upload it to a container of Azure Blob Storage, then compress it and save it in a different container with a different name, and then make an API call passing as input the compressed file saved in the other container.
I tried this code for compressing the incoming file:
const fs = require("fs");
const zlib = require('zlib');
const { Readable, Writable } = require('stream');

module.exports = async function (context, myBlob) {
  context.log("JavaScript blob trigger function processed blob \n Blob:", context.bindingData.blobTrigger, "\n Blob Size:", myBlob.length, "Bytes");
  // const fin = fs.createReadStream(context.bindingData.blobTrigger);
  const def = zlib.createDeflate();
  const s = Readable.from(myBlob.toString());
  context.log(myBlob);
  context.bindings.outputBlob = s.pipe(def);
};
The problem with this approach is that in the last line of the code,
context.bindings.outputBlob = s.pipe(def)
I don't have the compressed file, while if I use
s.pipe(def).pipe(process.stdout)
I can read the compressed file.
As you can see above, I also tried to use fs.createReadStream(context.bindingData.blobTrigger), which contains the name of the uploaded file with the container name, but it doesn't work.
Any idea?
Thank you.
This is the solution (shown with the import and function signature it runs in):

const zlib = require('zlib');

module.exports = async function (context, myBlob) {
  var input = context.bindings.myBlob;
  var inputBuffer = Buffer.from(input);
  // deflateSync compresses the whole buffer in memory.
  var deflatedOutput = zlib.deflateSync(inputBuffer);
  context.bindings.myOutputBlob = deflatedOutput;
};
https://learn.microsoft.com/en-us/answers/questions/500368/compress-and-write-a-file-in-another-container-wit.html
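For the "different container with a different name" part, the binding paths live in the function's function.json. A sketch (container names are assumptions; the binding names match myBlob/myOutputBlob in the code above):

{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-container/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-container/{name}.deflate",
      "connection": "AzureWebJobsStorage"
    }
  ]
}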

How to Upload PDF File on Azure Blob Storage via Node.js?

const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
const containerClient = blobServiceClient.getContainerClient(container);
const blockBlobClient = containerClient.getBlockBlobClient(fileName);
const uploadBlobResponse = await blockBlobClient.upload(content, content.length);
console.log(uploadBlobResponse);
console.log(`File uploaded successfully to the cloud: ${uploadBlobResponse.requestId}`);
I am trying it like this, but blockBlobClient.upload() needs content. I converted the file to base64 and sent that as the content, but I'm having an issue: the file is uploaded but corrupted. Any help, please?
Check the SDK: the upload method's signature is upload(HttpRequestBody, number, BlockBlobUploadOptions). The content is an HttpRequestBody; check what that parameter requires:
Blob, string, ArrayBuffer, ArrayBufferView or a function which returns a new Readable stream whose offset is from data source beginning.
So maybe you could try uploadFile and just use the file path to upload; I have tried this way and it works.
Also, you could use uploadStream to upload a readable stream of the file.
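A sketch of both approaches (reusing blockBlobClient from the question; the PDF path is an assumption):

const fs = require('fs');

// Option 1: upload straight from a file path (Node.js only).
await blockBlobClient.uploadFile('./files/document.pdf', {
  blobHTTPHeaders: { blobContentType: 'application/pdf' },
});

// Option 2: upload from a readable stream (buffer size, then max concurrency).
const stream = fs.createReadStream('./files/document.pdf');
await blockBlobClient.uploadStream(stream, 4 * 1024 * 1024, 5, {
  blobHTTPHeaders: { blobContentType: 'application/pdf' },
});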

How to access a specific file and its contents from within Google Cloud Storage

I need to access a file stored in Google Cloud Storage. This is not a file that is uploaded; instead, it is created by a Cloud Function. I cannot figure out how to access this file using Node.
I have tried many of the things recommended on Stack Overflow:
google-cloud TypeError: gcs.bucket is not a function
Python GAE - How to check if file exists in Google cloud storage
How to read content of JSON file uploaded to google cloud storage using node js
Similarly, I have checked out the sample Google project, but that only uses read streams, and because I am not uploading the file, I couldn't figure out how to use it.
The closest I have gotten is modifying the first link to get this:
const path = require('path');
var {Storage} = require('@google-cloud/storage');

var gcs = new Storage({
  keyFilename: path.join("_file.json"),
  projectId: 'ProjectId'
});

const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
This causes a TypeError:
TypeError: bucket is not a function
How can I access and read the JSON located within the file?
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket('file.json');
chosenBucket is not a function. It's a Bucket object. Do something like this:
const chosenBucket = gcs.bucket("bucket_name");
var file = chosenBucket.file('file.json');

const download_options = {
  // The path to which the file should be downloaded, e.g. "./file.txt"
  destination: destFilename,
};

await file.download(download_options);
See an example: https://github.com/googleapis/nodejs-storage/blob/master/samples/files.js
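Since the question also asks how to read the JSON contents, note that download() without a destination resolves to the file's contents in memory; a small sketch (reusing the file object above):

const [contents] = await file.download();
const data = JSON.parse(contents.toString());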
