How to get the URL to a Google Cloud Storage file using gcloud-node? - node.js

Using the gcloud Node library, how do I get the URL for a file within a Cloud Storage bucket?
Consider the following instantiation of a file object:
let bucket = gcs.bucket(`aBucket`)
let cloudFile = bucket.file(`aFile`)
I would like to get the URL for downloading cloudFile.

You can use a variety of request URIs, including storage.googleapis.com/<bucket>/<object>.

If the file is public, then you can use the corresponding method:
cloudFile.publicUrl()
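A minimal sketch, assuming the current @google-cloud/storage client (the successor to gcloud-node), where File exposes both publicUrl() and getSignedUrl(); the bucket and file names here are placeholders:
const {Storage} = require('@google-cloud/storage');

async function getDownloadUrl() {
  const storage = new Storage();
  const cloudFile = storage.bucket('aBucket').file('aFile');

  // Public object: a plain URL of the form
  // https://storage.googleapis.com/aBucket/aFile
  console.log(cloudFile.publicUrl());

  // Private object: generate a time-limited signed URL instead
  const [signedUrl] = await cloudFile.getSignedUrl({
    action: 'read',
    expires: Date.now() + 60 * 60 * 1000, // valid for one hour
  });
  console.log(signedUrl);
}

getDownloadUrl().catch(console.error);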

Related

Unable to use data from Google Cloud Storage in App Engine using Python 3

How can I read the data stored in my Cloud Storage bucket of my project and use it in my Python code that I am writing in App Engine?
I tried using:
storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(source_blob_name)
But I am unable to figure out how to extract actual data from the code to get it in a usable form.
Any help would be appreciated.
Getting a file from a Google Cloud Storage bucket means that you are just getting an object. This concept abstracts the file itself away from your code. You will either need to store the file locally to perform any operation on it or, depending on the extension of your file, put that object into a file read stream or whatever method you need to read the file.
Here is a code example of how to read a file from App Engine (it assumes the App Engine cloudstorage client library inside a webapp2 handler):
import cloudstorage as gcs

def read_file(self, filename):
    self.response.write('Reading the full file contents:\n')
    gcs_file = gcs.open(filename)
    contents = gcs_file.read()
    gcs_file.close()
    self.response.write(contents)
You have a couple of options.
content = blob.download_as_string() --> Converts the content of your Cloud Storage object to String.
blob.download_to_file(file_obj) --> Updates an existing file_obj to include the Cloud Storage object content.
blob.download_to_filename(filename) --> Saves the object in a file. In the App Engine standard environment, you can store files in the /tmp/ directory.
Refer to this link for more information.

Downloading folders from Google Cloud Storage Bucket with NodeJS

I need to download folders from my Google Cloud Storage bucket with NodeJS. I read all the documentation and only found a way to download files, not folders. I need to download the whole folder so I can provide its files for the user to download.
Could someone help me?
As Doug said, Google Cloud Storage will show you the structure of different directories, but there are actually no folders within the buckets.
However, you can perform some workarounds within your code to recreate that very same folder structure yourself. For the workaround I came up with, you need to use a library such as shelljs, which will allow you to create folders on your system.
Following this GCP tutorial on Cloud Storage, you will find examples on, for instance, how to list or download files from your bucket.
Now, putting all this together, you can get the full path of the file you are going to download, parse it to separate the folders from the actual file, then create the folder structure using the method mkdir from shelljs.
For me, modifying the tutorial's method for downloading files looked something like this:
var shell = require('shelljs');
[...]

async function downloadFile(bucketName, srcFilename, destFilename) {
  // [START storage_download_file]
  // Imports the Google Cloud client library
  const {Storage} = require('@google-cloud/storage');

  // Creates a client
  const storage = new Storage();

  // Find the last separator index
  var index = srcFilename.lastIndexOf('/');
  // Get the folder path as a string using the previous separator
  var str = srcFilename.slice(0, index);
  // Recursively create the folder structure in the current directory
  shell.mkdir('-p', './' + str);

  // Path of the downloaded file
  var destPath = str + '/' + destFilename;

  const options = {
    destination: destPath,
  };

  // Downloads the file
  await storage
    .bucket(bucketName)
    .file(srcFilename)
    .download(options);

  console.log(
    `gs://${bucketName}/${srcFilename} downloaded to ${destPath}.`
  );
  // [END storage_download_file]
}
You will want to use the getFiles method of Bucket to query for the files you want to download, then download each one of them individually. Read more about how to use the underlying list API. There are no folder operations in Cloud Storage (as there are not actually any folders, there are just file paths that look like they're organized as folders).
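A minimal sketch of that approach, assuming the current @google-cloud/storage client; downloadFolder, the bucket name, and localDir are placeholders, not names from the answer:
const {Storage} = require('@google-cloud/storage');
const path = require('path');
const fs = require('fs');

async function downloadFolder(bucketName, prefix, localDir) {
  const storage = new Storage();
  // getFiles with a prefix returns every object whose name starts with it
  const [files] = await storage.bucket(bucketName).getFiles({prefix: prefix});
  for (const file of files) {
    const destination = path.join(localDir, file.name);
    // Recreate the "folder" part of the object name locally
    fs.mkdirSync(path.dirname(destination), {recursive: true});
    await file.download({destination: destination});
  }
}

downloadFolder('my-bucket', 'some/folder/', './downloads').catch(console.error);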

How to store files in firebase using node.js

I have a small assignment where I will have a URL to a document or a file like google drive link or dropbox link.
I have to use this link to store that file or doc in Firebase using Node.js. How should I start?
A little heads-up might help. What should I use? Please help, I'm stuck here.
The documentation for using the admin SDK is mostly covered in GCP documentation.
Here's a snippet of code that shows how you could upload an image directly to Cloud Storage if you have a URL for it. Any public link works, whether it's shared from Dropbox or somewhere else on the internet.
Edit 2020-06-01: The option to upload directly from a URL was dropped in v2.0 of the SDK (4 September 2018): https://github.com/googleapis/nodejs-storage/releases/tag/v2.0.0
const fileUrl = 'https://www.dropbox.com/some/file/download/link.jpg';
const opts = {
  destination: 'path/to/file.jpg',
  metadata: {
    contentType: 'image/jpeg'
  }
};

firebase.storage().bucket().upload(fileUrl, opts);
This example is using the default bucket in your application and the opts object provides file upload options for the API call.
destination is the path that your file will be uploaded to in Google Cloud Storage
metadata should describe the file that you're uploading (see more examples here)
contentType is the file MIME type that you are uploading
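Since upload-from-URL no longer exists in v2.0 and later, one possible workaround is to download the remote file yourself and stream it into the bucket. This is only a sketch under assumptions not in the answer above: it uses firebase-admin with a default bucket configured and plain https (no redirect handling), and uploadFromUrl is a made-up helper name:
const admin = require('firebase-admin');
const https = require('https');

// Assumes a default bucket, e.g. admin.initializeApp({storageBucket: 'my-app.appspot.com'})
admin.initializeApp();

function uploadFromUrl(fileUrl, destination, contentType) {
  return new Promise((resolve, reject) => {
    const file = admin.storage().bucket().file(destination);
    // Stream the remote file straight into the Cloud Storage object;
    // note https.get does not follow redirects (common with Dropbox links)
    https.get(fileUrl, (res) => {
      res
        .pipe(file.createWriteStream({metadata: {contentType: contentType}}))
        .on('error', reject)
        .on('finish', resolve);
    }).on('error', reject);
  });
}

uploadFromUrl('https://example.com/file.jpg', 'path/to/file.jpg', 'image/jpeg');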

Node.js: multi-part file upload via REST API

I would like to upload a file by invoking a REST endpoint in multi-part.
In particular, I am looking at this API: Google Cloud Storage: Objects: insert
I did read about using multer, however I did not find any complete example showing me how to perform this operation.
Could someone help me with that?
https://cloud.google.com/nodejs/getting-started/using-cloud-storage#uploading_to_cloud_storage
^^ This is a good example of how to use multer to upload a single image to Google Cloud Storage. Use multer to create a file stream for each file (storage: multer.memoryStorage()) and handle the file stream by sending it to your GCS bucket in your callback.
However, the link only shows an example for one image. If you want to handle an array of images, create a for loop where you create a stream for each file in your request, but only call next() after the for loop ends. If you keep the next(); in each loop iteration you will get the error: Error: Can't set headers after they are sent.
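A minimal sketch of the single-file case described in the linked tutorial, assuming Express, multer, and @google-cloud/storage; the route path, form field name, and bucket name are placeholders:
const express = require('express');
const multer = require('multer');
const {Storage} = require('@google-cloud/storage');

const app = express();
// Keep the upload in memory so it can be streamed to GCS from the buffer
const upload = multer({storage: multer.memoryStorage()});
const bucket = new Storage().bucket('YOUR_BUCKET_NAME');

app.post('/upload', upload.single('file'), (req, res, next) => {
  const blob = bucket.file(req.file.originalname);
  const stream = blob.createWriteStream({
    metadata: {contentType: req.file.mimetype},
  });
  stream.on('error', next);
  stream.on('finish', () => res.status(200).send('Upload complete'));
  // Write the in-memory buffer into the Cloud Storage write stream
  stream.end(req.file.buffer);
});

app.listen(8080);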
There is an example for uploading files with the nodejs client library and multer. You can modify this example and set the multipart option:
Download the sample code and cd into the folder:
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples/
cd nodejs-docs-samples/appengine/storage
Edit the app.yaml file and include your bucket name:
GCLOUD_STORAGE_BUCKET: YOUR_BUCKET_NAME
Then in the source code, you can modify the publicUrl variable according to Objects: insert example:
const publicUrl = format(`https://www.googleapis.com/upload/storage/v1/b/${bucket.name}/o?uploadType=multipart`);
Download a key file for your service account and set the environment variable:
Go to the Create service account key page in the GCP Console.
From the Service account drop-down list, select New service account.
Input a name into the Service account name field.
From the Role drop-down list, select Project > Owner.
Click Create. A JSON file that contains your key downloads to your computer. And finally export the environment variable:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/key/file
After that, you're ready to run npm start, go to the app's frontend, and upload your file.

Delete folder in Google Cloud Storage using nodejs gcloud api

I am using gcloud nodejs api to access Google Cloud Storage. I can save/delete/exists files individually, but I didn't find a way to delete a folder or even to list the files in a folder using gcloud nodejs api.
I have seen people say that the folder hierarchy in GCS is not a real tree structure, but just names. So I tried to use a wildcard to match the file name string, which did not succeed.
I wonder if there is any way to do it. If not, what tool should I use?
The code to list files in a directory should look something like this:
bucket.getFiles({ prefix: 'directoryName/' }, function(err, files) {})
And to delete:
bucket.deleteFiles({ prefix: 'directoryName/' }, function(err) {})
getFiles API documentation
deleteFiles API documentation
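In newer versions of the library both calls also return promises, so a minimal promise-based sketch (the bucket and prefix names are placeholders) could look like this:
const {Storage} = require('@google-cloud/storage');
const bucket = new Storage().bucket('my-bucket');

async function removeDirectory(prefix) {
  // List everything under the prefix, then delete it all in one call
  const [files] = await bucket.getFiles({prefix: prefix});
  console.log(`Deleting ${files.length} objects under ${prefix}`);
  await bucket.deleteFiles({prefix: prefix});
}

removeDirectory('directoryName/').catch(console.error);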
Instead of using gcloud nodejs api, there are two other ways to do this.
Use the googleapis package to access the standard JSON API and XML API of GCS. googleapis is a lower-level API tool which includes interacting with Google Cloud services. We can create/list/delete files on GCS. Documentation and examples:
https://cloud.google.com/storage/docs/json_api/v1/objects/delete
https://cloud.google.com/storage/docs/json_api/v1/objects/list
Use child_process to execute the gsutil command line tool. This is not a standard way of programmatically accessing the Google API, but still a viable solution. Wildcards are allowed when issuing the command. Note that it may not work on Google App Engine. Here is an example.
Nodejs
var exec = require('child_process').exec;
exec("gsutil rm gs://[bucketname]/[directory]/*", function(error, stdout, stderr) {});
As Stephen suggested, using the standard gcloud methods bucket.getFiles and bucket.deleteFiles is the most desirable approach. Since GCS doesn't have the concept of directories, manipulating multiple files should obviously be considered a bucket-level operation.
