I want to write a CLI app in Node.js that authenticates and uploads a file to Firebase. I installed the firebase npm package, but it is not well suited for use in Node.js (it is a client-side SDK).
How can I authenticate and upload a file to Firebase Storage?
There are two Node.js modules for Firebase: one for server-side Node.js code, and another for client-side Node.js code (e.g. on IoT devices).
If you're running the code on a server, you'll want to use the Firebase Admin SDK to access Cloud Storage. Note that this part of the Admin SDK is a fairly thin wrapper around the regular Node.js SDK for Cloud Storage, so I also recommend keeping the documentation for that package handy.
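For instance, a minimal sketch of an Admin SDK upload could look like this (the service account file, bucket name, and file paths are placeholders, not values from your project):

```js
// Minimal sketch: upload a local file with the Firebase Admin SDK.
// serviceAccount.json, the bucket name, and the paths are placeholders.
const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccount.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  storageBucket: 'your-project-id.appspot.com',
});

async function uploadFile(localPath, destination) {
  // bucket() returns the default bucket configured above;
  // upload() streams the local file to Cloud Storage.
  const bucket = admin.storage().bucket();
  await bucket.upload(localPath, { destination });
  console.log(`Uploaded ${localPath} to ${destination}`);
}

uploadFile('./data.csv', 'uploads/data.csv').catch(console.error);
```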
If you're running the code on a client, unfortunately the Node.js module does not have built-in support for uploading files to Storage. You'll have to look for another way, such as creating a custom API that your code calls and that uses a supported SDK for uploading the file to Storage.
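As a rough sketch of that custom-API idea (assuming an Express server with the multer package for multipart parsing; the endpoint and bucket names are hypothetical):

```js
// One possible shape for such a custom API: an Express endpoint that
// accepts a multipart upload and forwards it to Cloud Storage via the
// Admin SDK. The /upload route and bucket name are hypothetical.
const express = require('express');
const multer = require('multer');
const admin = require('firebase-admin');

admin.initializeApp(); // assumes GOOGLE_APPLICATION_CREDENTIALS is set

const app = express();
const upload = multer({ dest: '/tmp/' }); // buffer uploads to a temp dir

app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    await admin.storage().bucket('your-bucket-name').upload(req.file.path, {
      destination: req.file.originalname,
    });
    res.sendStatus(200);
  } catch (err) {
    console.error(err);
    res.sendStatus(500);
  }
});

app.listen(3000);
```

Your CLI would then authenticate against this API and POST the file to it, rather than talking to Storage directly.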
I'm creating a small upload site for a group of colleagues using App Engine - the site simply provides an interface for a user to upload a CSV file to a Cloud Storage bucket.
The app uses a React front-end service and a Node/Express back-end service.
I've got everything running perfectly fine locally (using a service account keyfile; I can confirm it's not using default/local credentials): selecting a file, processing it, and uploading it to the storage bucket from localhost all work as expected (200 response).
The interesting part is that when I deploy this to App Engine, the site appears to work as expected: when selecting and uploading a file, the console shows a 200 OK response (very quickly, I might add, compared to the local version), yet it doesn't upload anything.
I'm struggling to find backend logs that suggest why this isn't working; I can't seem to get my Node logs from Express to appear anywhere in Stackdriver.
Is there something I'm overlooking here?
I'm using the @google-cloud/storage npm package to access a bucket in GCP Storage. Everything works fine in the production environment, but I would like to use the local file system during local development, so that on the one hand no garbage ends up in the prod environment, and on the other hand I don't impact other developers.
If this is not achievable, what is the best way to use @google-cloud/storage during local development?
Some services have emulators:
https://cloud.google.com/sdk/gcloud/reference/beta/emulators
It appears there's an emulator with limited (!) Cloud Storage functionality available through Firebase, but I've not used it:
https://firebase.google.com/docs/emulator-suite/connect_storage
I'm unclear how you'd use the Cloud Storage emulator from a non-Firebase Storage SDK, but it should work, as the emulator implements the underlying API. Perhaps use ClientOptions to override the default service endpoint.
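A minimal sketch of that idea, assuming the Firebase Storage emulator is running on its default localhost:9199 (project and bucket names here are placeholders):

```js
// Point @google-cloud/storage at a local emulator instead of production.
// Assumes the Firebase Storage emulator is listening on localhost:9199.
const { Storage } = require('@google-cloud/storage');

const storage = process.env.NODE_ENV === 'production'
  ? new Storage() // real service, credentials from the environment
  : new Storage({
      apiEndpoint: 'http://localhost:9199', // emulator endpoint override
      projectId: 'demo-project',            // any id works against the emulator
    });

async function main() {
  await storage.bucket('demo-bucket').upload('./local-file.txt');
  console.log('uploaded');
}

main().catch(console.error);
```

I believe the library also honors the STORAGE_EMULATOR_HOST environment variable, which would avoid touching the code at all.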
I'm hosting a Node.js web app with Firebase, and I need to run a PowerShell script. I have installed the node module "node-powershell", which works perfectly locally; however, when deployed, it tells me that I need to install PowerShell (install it on the Firebase 'computer'). Is there any way to do this?
Firebase Hosting is a so-called static hosting service. This means it serves the content as-is; it does not interpret/execute that content in any way.
So most likely you're using the Cloud Functions integration with Firebase Hosting to run those Node scripts. And that turns this into a question of whether Cloud Functions can run PowerShell scripts.
I don't immediately see an answer there, although you could potentially upload the binary yourself if one is available for the platform Cloud Functions runs on (Debian). For an example of this, see: Can you call out to FFMPEG in a Firebase Cloud Function
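As a hedged sketch of what that could look like (the bundled binary, its flags, and the script name are all hypothetical; PowerShell may well be too large or otherwise unsuitable for the Functions environment):

```js
// Sketch: bundle an executable with the function's source and spawn it.
// The bin/pwsh path, its flags, and script.ps1 are hypothetical placeholders.
const functions = require('firebase-functions');
const { execFile } = require('child_process');
const path = require('path');

exports.runScript = functions.https.onRequest((req, res) => {
  // The binary must be deployed alongside index.js and be executable.
  const binary = path.join(__dirname, 'bin', 'pwsh');
  const script = path.join(__dirname, 'script.ps1');

  execFile(binary, ['-File', script], (err, stdout, stderr) => {
    if (err) {
      console.error(stderr);
      return res.status(500).send('script failed');
    }
    res.status(200).send(stdout);
  });
});
```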
I have an Azure node.js "App Service" already set up and running, working just fine. It's connected to an Azure DB, all that works great.
What I don't have is any sort of Storage / Blob service on my account whatsoever, and I'm having trouble finding documentation about the best way to set Blob Storage up to work with my App Service.
My goal is to be able to store and retrieve files, including primarily image files (.png, .jpg) and pdfs. I think Blob storage is what I'm looking for, and I'll want to set up an API on my node.js App Service for web clients to be able to upload and download files on the Storage service.
There are two Azure Storage blob npm packages you can use and add to your project dependencies:
azure-storage: https://www.npmjs.com/package/azure-storage
@azure/storage-blob: https://www.npmjs.com/package/@azure/storage-blob
The 1st package, azure-storage, is the most widely used Node.js SDK for Azure Storage. It supports blob/queue/file/table.
The 2nd package, @azure/storage-blob, is the latest released package for blob; it's more lightweight and based on the latest Azure Storage architecture. It supports async methods, promises, and HTTP pipeline injection.
You can go to their npm package or GitHub page for their samples.
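As a quick illustration of the newer package (the connection string, container, and file names below are placeholders):

```js
// Minimal sketch: upload a local file with @azure/storage-blob.
// The connection string, container, and blob names are placeholders.
const { BlobServiceClient } = require('@azure/storage-blob');

async function main() {
  const serviceClient = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  const containerClient = serviceClient.getContainerClient('images');
  await containerClient.createIfNotExists(); // no-op when it already exists

  // uploadFile is the Node.js-only convenience method for local files.
  const blobClient = containerClient.getBlockBlobClient('photo.png');
  await blobClient.uploadFile('./photo.png');
  console.log('upload complete');
}

main().catch(console.error);
```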
I have created a Cloud Foundry app to deploy a Node.js application on Bluemix.
I used the commands given in Getting Started to deploy my Node.js application from my local machine to Bluemix. The application deployed successfully and the app starts running.
Now I need to download the code (the entire project folder) I deployed. How can I do this?
It isn't recommended to write files to the filesystem in a Cloud Foundry app on Bluemix; it's purely a runtime. If you need to store files, try using the Object Storage offering on Bluemix (https://console.bluemix.net/catalog/services/Object-Storage), which will keep your files safe and allow you to access them, back them up, or do whatever else you need.