I would like to upload a file by invoking a REST endpoint with a multipart request.
In particular, I am looking at this API: Google Cloud Storage: Objects: insert
I have read about using multer; however, I did not find any complete example showing how to perform this operation.
Could someone help me with that?
https://cloud.google.com/nodejs/getting-started/using-cloud-storage#uploading_to_cloud_storage
^^ This is a good example of how to use multer to upload a single image to Google Cloud Storage. Configure multer with in-memory storage ( storage: multer.memoryStorage() ) and handle each uploaded file by streaming its buffer to your GCS bucket in your callback.
However, the link only shows an example for one image. If you want to handle an array of images, create a loop in which you create a stream for each file in your request, but only send the response (or call next()) once, after every file's stream has finished. If you call next() in each loop cycle you will get the error: Error: Can't set headers after they are sent. A sketch of this pattern follows.
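Here is a minimal sketch of that pattern, assuming an Express app, a form field named photos, and a bucket named YOUR_BUCKET_NAME (all assumptions you would adjust):

const express = require('express');
const Multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const app = express();
const bucket = new Storage().bucket('YOUR_BUCKET_NAME');
const multer = Multer({ storage: Multer.memoryStorage() });

app.post('/upload', multer.array('photos'), (req, res, next) => {
  if (!req.files || req.files.length === 0) {
    return res.status(400).send('No files uploaded.');
  }
  let pending = req.files.length;
  req.files.forEach((file) => {
    const blobStream = bucket
      .file(file.originalname)
      .createWriteStream({ metadata: { contentType: file.mimetype } });
    blobStream.on('error', next);
    blobStream.on('finish', () => {
      // Respond exactly once, after the last stream has finished, to avoid
      // "Can't set headers after they are sent".
      if (--pending === 0) res.status(200).send('All files uploaded.');
    });
    blobStream.end(file.buffer);
  });
});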
There is an example of uploading files with the Node.js client library and multer. You can modify this example to use the multipart upload endpoint:
Download the sample code and cd into the folder:
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples/
cd nodejs-docs-samples/appengine/storage
Edit the app.yaml file and include your bucket name:
GCLOUD_STORAGE_BUCKET: YOUR_BUCKET_NAME
Then, in the source code, you can modify the publicUrl variable according to the Objects: insert example:
const publicUrl = format(`https://www.googleapis.com/upload/storage/v1/b/${bucket.name}/o?uploadType=multipart`);
Download a key file for your service account and set the environment variable:
Go to the Create service account key page in the GCP Console.
From the Service account drop-down list, select New service account.
Input a name into the Service account name field.
From the Role drop-down list, select Project > Owner.
Click Create. A JSON file that contains your key is downloaded to your computer. Finally, export the environment variable:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/key/file
After that, you're ready to run npm start, go to the app's frontend, and upload your file.
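For reference, the upload handler in that sample looks roughly like the sketch below (treat the exact shape as an approximation of the sample, not a verbatim copy), with the publicUrl line swapped in as described above:

const format = require('util').format;

// bucket and multer are set up earlier in the sample's app.js.
app.post('/upload', multer.single('file'), (req, res, next) => {
  if (!req.file) {
    res.status(400).send('No file uploaded.');
    return;
  }
  const blob = bucket.file(req.file.originalname);
  const blobStream = blob.createWriteStream();
  blobStream.on('error', next);
  blobStream.on('finish', () => {
    // The modified URL, pointing at the multipart upload endpoint:
    const publicUrl = format(`https://www.googleapis.com/upload/storage/v1/b/${bucket.name}/o?uploadType=multipart`);
    res.status(200).send(publicUrl);
  });
  blobStream.end(req.file.buffer);
});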
Right now I have an Azure Function that runs Puppeteer to fill out some forms. On one page, I need to upload a PDF. Until now, I've been using a test PDF that's deployed to the same directory as the function, meaning I could do something similar to the Puppeteer documentation for waitForFileChooser:
const [fileChooser] = await Promise.all([
page.waitForFileChooser(),
page.click('#upload-file-button'),
]);
await fileChooser.accept(['/data/myfile.pdf']);
But I would like the uploaded file to be one chosen by the user, stored in a blob container.
I tried downloadToFile(userInput.filename); but the Azure file system is read-only. Is there a way around this? Can I pass a blob into a fileChooser like this, or make my directory read-write but clear it out at the end of the function?
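For what it's worth, the temp directory is generally writable in Azure Functions even though the rest of the deployment is read-only, so one option along the lines the question suggests is a sketch like this (the connection string variable, container/blob names, and cleanup are assumptions):

const os = require('os');
const path = require('path');
const fs = require('fs');
const { BlobServiceClient } = require('@azure/storage-blob');

async function uploadUserPdf(page, containerName, blobName) {
  // Download the user's blob into the writable temp directory.
  const tmpPath = path.join(os.tmpdir(), blobName);
  const blobClient = BlobServiceClient
    .fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING)
    .getContainerClient(containerName)
    .getBlobClient(blobName);
  await blobClient.downloadToFile(tmpPath);

  // Hand the temp file to Puppeteer's file chooser.
  const [fileChooser] = await Promise.all([
    page.waitForFileChooser(),
    page.click('#upload-file-button'),
  ]);
  await fileChooser.accept([tmpPath]);

  // Clear the file out at the end of the function, as suggested above.
  fs.unlinkSync(tmpPath);
}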
How can I read the data stored in my project's Cloud Storage bucket and use it in the Python code that I am writing in App Engine?
I tried using:
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(source_blob_name)
But I am unable to figure out how to extract the actual data and get it into a usable form.
Any help would be appreciated.
Getting a file from a Google Cloud Storage bucket means that you are getting an object; this concept abstracts the file itself away from your code. You will either need to store the file locally to perform any operation on it, or, depending on the file's type, wrap the object in a read stream or whatever method you need to read the file.
Here is a code example of how to read a file from App Engine, using the first-generation cloudstorage client library:

import cloudstorage as gcs

def read_file(self, filename):
    self.response.write('Reading the full file contents:\n')
    # Open the object, read everything into memory, then close the handle.
    gcs_file = gcs.open(filename)
    contents = gcs_file.read()
    gcs_file.close()
    self.response.write(contents)
You have a few options.
content = blob.download_as_string() --> returns the content of your Cloud Storage object as a byte string.
blob.download_to_file(file_obj) --> writes the Cloud Storage object's content into an existing file object.
blob.download_to_filename(filename) --> saves the object to a file on disk. On the App Engine standard environment, you can store files in the /tmp/ directory.
Refer to this link for more information.
I'm developing an Android app, a web server using Flask, and Firebase.
When the client app uploads an image file, the web server saves the image URL in the database.
The client app then gets the image URL and opens the image from Firebase Storage.
So neither my web server nor my app knows the file name.
However, to delete a file in storage, the file's name is needed; the only thing I can get is the file URL.
How can my web server delete a file using the file URL?
The following code deletes a file using its file name.
I want to change this code to delete a file using its file URL.
bucket = storage.bucket('<BUCKET_NAME>')

def deleteFile(imageName):
    try:
        # Delete the blob stored under profile_images/<imageName>.
        bucket.delete_blob('profile_images/' + imageName)
        return True
    except Exception as e:
        print(e)
        return False
The Storage client in the Firebase Admin SDK is a thin wrapper around the Node.js client for Google Cloud Storage. And as far as I know the latter doesn't have a way to map the download URL back to a File.
This means that you'll have to find the path on the client using FirebaseStorage.getReferenceFromUrl(), and then pass that path to your web server. That way the server-side code can use the path to create a reference to the File, as sketched below.
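For illustration, here is a minimal sketch of the server side with the Node.js Admin SDK (the function and parameter names are assumptions; filePath is the path the client resolved via getReferenceFromUrl()):

const admin = require('firebase-admin');
admin.initializeApp();

async function deleteFileByPath(filePath) {
  // filePath is e.g. 'profile_images/<imageName>', as sent by the client.
  await admin.storage().bucket('<BUCKET_NAME>').file(filePath).delete();
}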
I need to get access to a .proto file in my code. Locally I just put it in the folder, but how do I get this file from deployed Firebase Functions?
const functions = require('firebase-functions');
const grpc = require('grpc');

// Use forward slashes so the path also resolves on the deployed (Linux) runtime.
const PROTO_PATH = __dirname + '/protos/prediction_service.proto';

exports.helloWorld = functions.https.onRequest((request, response) => {
  var tensorflow_serving = grpc.load(PROTO_PATH).tensorflow.serving;
  ...
});
You'd like to upload 3 files to deploy your Cloud Function:
index.js
package.json
prediction_service.proto
In order to do so via the Developer Console, you'll need to:
Go to the Google Cloud Developer Console > Cloud Functions > Create Function
In the "Source Code" field, choose either:
"ZIP upload" and select a zip file including your 3 files,
"ZIP from Cloud Storage" and select file location on GCS,
"Cloud Source repository" and provide your repo details
Fill in the remaining fields and click "Create"
Once deployed, in the source tab for your function you'll see the three files displayed.
Alternatively, you can use gcloud to deploy your files via the following command:
gcloud beta functions deploy <functionName> --source=SOURCE
where SOURCE can be a ZIP file on Google Cloud Storage, a reference to a source repository, or a local filesystem path. I'd recommend having a look at the documentation for this command for full details.
When you are using Firebase Cloud Functions with TypeScript (your code is in functions/src/index.ts), you need to put the additional files in functions/lib, next to the compiled JavaScript; one way to automate this is sketched below.
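A minimal sketch, assuming a Unix-like build environment and that your .proto files live in functions/src/protos (both assumptions): extend the build script in functions/package.json so the files are copied into lib/ on every build:

"scripts": {
  "build": "tsc && cp -r src/protos lib/"
}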
I find this way the easiest when it comes to Firebase Functions:
Put your .proto file into the functions folder of your Firebase project (where index.js and package.json are located).
Deploy your functions as normal with the CLI command firebase deploy --only functions
The file is then automatically added to the deployed function's source, as you can see in the GCP console, and you can access it in your Node.js project:
const protobuf = require('protobufjs');
protobuf.load(__dirname + '/schema.proto')
While it is possible to use GCS, it's simpler to include the files in your source.
Put your package.json, index.js (or whatever file is specified in package.json's 'main' field) and other dependent files in a directory.
When you create your function, provide that directory including your other files via the ZIP upload or Cloud Source repository.
Your other files are available at path.join(__dirname, 'your/path/file.ext')
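Putting this together with the question's code, a minimal sketch (assuming the .proto file was included in a protos/ folder alongside index.js, per the steps above):

const path = require('path');
const functions = require('firebase-functions');
const grpc = require('grpc');

// __dirname points at the deployed function's source directory.
const PROTO_PATH = path.join(__dirname, 'protos', 'prediction_service.proto');

exports.helloWorld = functions.https.onRequest((request, response) => {
  const tensorflowServing = grpc.load(PROTO_PATH).tensorflow.serving;
  response.send('Loaded: ' + Object.keys(tensorflowServing).join(', '));
});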
I know that it's possible to upload files to my cloud-files account in Node.js, using the following module: node-cloudfiles.
But is it also possible to upload a filestream directly?
In my case I am downloading an image from a certain location in Node.js and want to upload it directly to my cloud-files account without temporarily saving the image on my server.
Of course it is possible - you can just read the documentation of the Rackspace Cloud Files API ( http://docs.rackspacecloud.com/files/api/cf-devguide-latest.pdf ) and implement the necessary parts yourself.
However, I'd suggest waiting until https://github.com/nodejitsu/node-cloudfiles/pull/11 gets merged into the trunk - then the node-cloudfiles library will support uploading files using streams, so you won't have to create files before uploading.
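If you do implement it yourself in the meantime, here is a minimal sketch of streaming a downloaded image straight into Cloud Files over its REST API without touching disk (the storage host, account path, container, and token are all assumptions; the real values come from the authentication endpoint):

const https = require('https');

https.get('https://example.com/image.jpg', (download) => {
  const upload = https.request({
    method: 'PUT',
    host: 'storage101.dfw1.clouddrive.com',            // from your auth response
    path: '/v1/MossoCloudFS_xxxx/my-container/image.jpg',
    headers: {
      'X-Auth-Token': 'YOUR_AUTH_TOKEN',               // from your auth response
      'Content-Type': download.headers['content-type'],
    },
  }, (res) => {
    console.log('Upload finished with status', res.statusCode);
  });
  // Pipe the download straight into the upload; nothing hits the filesystem.
  download.pipe(upload);
});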