Delete folder in Google Cloud Storage using nodejs gcloud api - node.js

I am using the gcloud Node.js API to access Google Cloud Storage. I can save/delete files and check their existence individually, but I haven't found a way to delete a folder, or even to list the files in a folder, using the gcloud Node.js API.
I have seen people say that the folder hierarchy in GCS is not a real tree structure, just names. So I tried to use a wildcard to match the file name string, which did not succeed.
I wonder if there is any way to do it. If not, what tool should I use?

The code to list files in a directory should look something like this:
bucket.getFiles({ prefix: 'directoryName/' }, function(err, files) {})
And to delete:
bucket.deleteFiles({ prefix: 'directoryName/' }, function(err) {})
getFiles API documentation
deleteFiles API documentation
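For illustration, here is a fuller sketch using the current @google-cloud/storage package (the successor to gcloud); the bucket name and prefix are placeholders and application default credentials are assumed:
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('my-bucket'); // placeholder bucket name

// List everything under the "directory" prefix, then delete it in one call.
bucket.getFiles({ prefix: 'directoryName/' }, (err, files) => {
  if (err) return console.error(err);
  files.forEach(file => console.log(file.name));

  bucket.deleteFiles({ prefix: 'directoryName/' }, (delErr) => {
    if (delErr) return console.error(delErr);
    console.log('All files under directoryName/ deleted');
  });
});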

Apart from the gcloud Node.js API, there are two other ways to do this.
Use the googleapis package to access the standard JSON API and XML API of GCS. googleapis is a lower-level client that covers Google Cloud services in general; with it we can create/list/delete objects on GCS. Documentation and examples:
https://cloud.google.com/storage/docs/json_api/v1/objects/delete
https://cloud.google.com/storage/docs/json_api/v1/objects/list
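For illustration, a minimal sketch of listing objects through the JSON API with a recent version of the googleapis package (the bucket name and prefix are placeholders, and application default credentials are assumed):
const { google } = require('googleapis');

async function listObjects() {
  // Authenticate with application default credentials.
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/devstorage.read_only'],
  });
  const storage = google.storage('v1');

  // objects.list maps directly onto the JSON API endpoint linked above.
  const res = await storage.objects.list({
    auth,
    bucket: 'my-bucket',        // placeholder bucket name
    prefix: 'directoryName/',   // placeholder "folder" prefix
  });
  (res.data.items || []).forEach(obj => console.log(obj.name));
}

listObjects().catch(console.error);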
Use child_process to execute the gsutil command-line tool. This is not a standard way of programmatically accessing the Google API, but it is still a viable solution. Wildcards are allowed when issuing the command. Note that this may not work on Google App Engine. Here is an example in Node.js:
var exec = require('child_process').exec;
// Remove every object under the given "directory" prefix.
exec("gsutil rm gs://[bucketname]/[directory]/*", function (error, stdout, stderr) {});
As Stephen suggested, using the standard gcloud methods bucket.getFiles and bucket.deleteFiles is the most desirable approach. Since GCS has no real concept of directories, manipulating multiple files naturally has to be treated as a bucket-level operation.

Related

Firebase Cloud Storage Do a WildCard Search Nodejs

Suppose I have this directory structure in Firebase Cloud Storage: gs://my-bucket/userData/, and inside this directory I have multiple files with similar names, like some-file-1.json, some-file-21.json, some-file-34.json.
My end goal is to get all these similarly named files.
I'm guessing to do that, I need to do a wildcard search, as described here. There it shows how to do it using the gcloud command line. How do I do that in Nodejs or JavaScript?
Google Cloud Storage APIs do not support wildcard searches of file names; however, you can use prefixes to get a list of files that start with the same name prefix, which I assume will suit the needs of your app. Here is an example implementation:
const { Storage } = require('@google-cloud/storage');
const bucket = new Storage().bucket('my-bucket');

async function readFiles() {
  // Lists every object whose name starts with the given prefix.
  const [files] = await bucket.getFiles({ prefix: 'userData/some-file-' });
  files.forEach(file => {
    console.log(file.name);
  });
}
In this example you will log all the file names that start with some-file-.

Jimp write image to Google cloud storage node js

I'm currently writing images to my filesystem using jimp.
For example: image.write('path')
This cannot be used on Google App Engine because of its read-only filesystem.
How can I write this to a Google Storage bucket? I've tried write streams but keep getting read-only errors, so I feel like Jimp is still writing to the drive.
Thanks heaps
As I understand it, you are modifying some images on your App Engine app and want to upload them to a bucket, but you didn't mention whether you are using the standard or flexible environment. In order to do this you will want to make your bucket publicly readable so it can serve files.
Following this Google Cloud Platform Node.js documentation you can see that to upload a file to a bucket you need to create an object first using:
const blob = bucket.file(yourFileName);
Then, using createWriteStream, you can upload the file.
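A minimal sketch of that approach, assuming image is an existing Jimp instance; the bucket and object names are placeholders:
const Jimp = require('jimp');
const { Storage } = require('@google-cloud/storage');

const bucket = new Storage().bucket('my-bucket'); // placeholder bucket name

// Render the Jimp image to an in-memory buffer, then stream that buffer
// into the bucket object; nothing is written to the local filesystem.
image.getBuffer(Jimp.MIME_PNG, (err, buffer) => {
  if (err) throw err;
  const blob = bucket.file('images/output.png'); // placeholder object name
  blob.createWriteStream({ metadata: { contentType: Jimp.MIME_PNG } })
    .on('error', console.error)
    .on('finish', () => console.log('Upload complete'))
    .end(buffer);
});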

Cloud Functions: how to upload additional file for use in code?

I need access to a .proto file in my code. Locally I just put it in the folder, but how do I get this file in deployed Firebase Functions?
const grpc = require('grpc');
const PROTO_PATH = __dirname + '\\protos\\prediction_service.proto';
exports.helloWorld = functions.https.onRequest((request, response) => {
  var tensorflow_serving = grpc.load(PROTO_PATH).tensorflow.serving;
  ...
});
You'd like to upload 3 files to deploy your Cloud Function:
index.js
package.json
prediction_service.proto
In order to do so via the Developer Console, you'll need to:
Go to the Google Cloud Developer Console > Cloud Functions > Create Function
In the "Source Code" field, choose either:
"ZIP upload" and select a zip file including your 3 files,
"ZIP from Cloud Storage" and select file location on GCS,
"Cloud Source repository" and provide your repo details
Fill in the remaining fields and click "Create"
Once deployed, in the source tab for your function you'll see the three files displayed.
Alternatively, you can use gcloud to deploy your files via the following command:
gcloud beta functions deploy <functionName> --source=SOURCE
where the source can be a ZIP file on Google Cloud Storage, a reference to a source repository, or a local filesystem path. I'd recommend having a look at the documentation for this command for full details.
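For example, deploying from a local directory that contains the three files might look like this (the function name and the HTTP trigger flag are illustrative):
gcloud beta functions deploy helloWorld --source=. --trigger-http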
When you are using Firebase Cloud Functions with TypeScript (your code is in functions/src/index.ts), you need to put the additional files in functions/lib.
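For example (assuming the .proto file sits next to your TypeScript sources), you could copy it into the compiled output before deploying:
cp functions/src/prediction_service.proto functions/lib/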
I find this way the easiest when it comes to Firebase Functions:
Put your .proto file into the functions folder of your Firebase project (where index.js and package.json are located).
Deploy your functions as normal with the CLI command firebase deploy --only functions
After deployment the file is automatically added to the project in the GCP console.
And you can access it in your Node.js project:
protobuf.load(__dirname + '/schema.proto')
While it is possible to use GCS, it's simpler to include the files in your source.
Put your package.json, index.js (or whatever file is specified in package.json's 'main' field) and other dependent files in a directory.
When you create your function, provide that directory including your other files via the ZIP upload or Cloud Source repository.
Your other files are available at path.join(__dirname, 'your/path/file.ext')
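Putting that together, a minimal sketch of loading the bundled .proto inside the deployed function (the directory layout mirrors the question and is illustrative):
const path = require('path');
const grpc = require('grpc');

// The .proto file deployed alongside index.js is available relative to __dirname.
const PROTO_PATH = path.join(__dirname, 'protos', 'prediction_service.proto');
const tensorflowServing = grpc.load(PROTO_PATH).tensorflow.serving;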

IFileProvider Azure File storage

I am thinking about implementing the IFileProvider interface with Azure File Storage.
What I am trying to find in the docs is whether there is a way to send the whole path to the file to the Azure API, like rootDirectory/sub1/sub2/example.file, or whether that should be mapped to some recursive function that takes the path and traverses the directory structure on file storage.
I just want to make sure I am not missing something and reinventing the wheel for something that already exists.
[UPDATE]
I'm using Azure Storage Client for .NET. I would not like to mount anything.
My intention is to have several IFileProviders which I could switch between based on the environment and other conditions.
So, for example, if my environment is Cloud, I would use an IFileProvider implementation that uses Azure File Storage through the Azure Storage Client. If the environment is MyServer, I would use the server's local file system. A third option would be the environment someOther with its own particular implementation.
Now, for all of them, IFileProvider operates with a path like root/sub1/sub2/sub3. For Azure File Storage, is there a way to send the whole path at once to get sub3 info/content, or should the path be broken into individual directories, getting a reference/content for each step?
I hope that clears the question.
To access a specific subdirectory nested under multiple subdirectories, you can pass the whole relative path to the GetDirectoryReference method to construct the CloudFileDirectory, as follows:
var fileshare = storageAccount.CreateCloudFileClient().GetShareReference("myshare");
var rootDir = fileshare.GetRootDirectoryReference();
var dir = rootDir.GetDirectoryReference("2017-10-24/15/52");
var items = dir.ListFilesAndDirectories();
To access a specific file under the subdirectory, you can likewise pass the whole relative path to the GetFileReference method to get the CloudFile instance:
var file=rootDir.GetFileReference("2017-10-24/15/52/2017-10-13-2.png");

Upload filestream to cloud-files with Node.js

I know that it's possible to upload files to my cloud-files account in Node.js, using the following module: node-cloudfiles.
But is it also possible to upload a filestream directly?
In my case I am downloading an image from a certain location in Node.js and want to upload this directly to my cloud-files account without saving the image temporarily on my server.
Of course it is possible. You can just read the documentation for the Rackspace Cloud Files API (http://docs.rackspacecloud.com/files/api/cf-devguide-latest.pdf) and implement the necessary parts yourself.
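A rough sketch of that do-it-yourself approach, piping a downloaded image straight into a PUT request against the Cloud Files API (the storage host, container path and auth token are placeholders you would obtain from a prior authentication call):
var https = require('https');

// Placeholder values; real ones come from authenticating against the
// Rackspace identity service first.
var options = {
  host: 'storage101.dfw1.clouddrive.com',
  path: '/v1/MossoCloudFS_xxxx/my-container/image.jpg',
  method: 'PUT',
  headers: { 'X-Auth-Token': 'your-auth-token', 'Content-Type': 'image/jpeg' }
};

// Download the image and pipe the response stream directly into the upload
// request, so the image never touches the local filesystem.
https.get('https://example.com/some-image.jpg', function (downloadRes) {
  options.headers['Content-Length'] = downloadRes.headers['content-length'];
  var uploadReq = https.request(options, function (uploadRes) {
    console.log('Upload status: ' + uploadRes.statusCode);
  });
  downloadRes.pipe(uploadReq);
});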
However, I'd suggest waiting until https://github.com/nodejitsu/node-cloudfiles/pull/11 gets merged into trunk; then the node-cloudfiles library will support uploading files using streams, so you won't have to create files before uploading.
