How to deploy all functions in a single command using gcloud - Node.js

I am aware of gcloud functions deploy hello --entry-point helloworld --runtime python37 --trigger-http, which deploys only the hello function.
But I have multiple functions in my project.
Is there a single command, like Firebase's firebase deploy --only functions -P default, that deploys all of them?

Right now it is not possible to deploy multiple functions with a single command. There is already an open issue requesting this, but it is quite old.
Besides tracking that issue, you could also file a new issue requesting the feature.
However, I've found two related questions on SO with similar issues, in which the solution was to create a small script to do this:
The first uses a .sh script.
The second uses a .py script. A minimal .sh sketch along those lines is shown below.
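For illustration only, here is a minimal .sh sketch in that spirit; the function names, entry points, runtime, and trigger are placeholders to adapt to your own project:
# Deploy each function in turn (names and flags are placeholders).
for fn in hello goodbye; do
  gcloud functions deploy "$fn" \
    --entry-point "$fn" \
    --runtime python37 \
    --trigger-http
done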
Hope this helps you!

Related

Is it possible to stream Cloud Build logs with the Node.js library?

Some context: Our Cloud Build process relies on manual triggers and about 8 substitutions to customize deploys to various firebase projects, hosting sites, and preview channels. Previously we used a bash script and gcloud to automate the selection of these substitution options, the "updating" of the trigger (via gcloud beta builds triggers import: our needs require us to use a single trigger, it's a long story), and the "running" of the trigger.
This bash script was hard to work with and improve, and through the import-run shenanigans actually led to some faulty deploys that caused all kinds of chaos: not great.
However, recently I found a way to pass substitution variables as part of a manual trigger operation using the Node.js library for Cloud Build (runTrigger with subs passed as part of the request)!
Problem: So I'm converting our build utility to Node, which is great, but as far as I can tell there isn't a native way to stream build logs from a running build to the console (except maybe with exec, but that feels hacky).
Am I missing something? Or should I be looking at one of the logging libraries?
I've tried my best scanning Google's docs and APIs (Cloud Build REST, the Node client library, etc.) but to no avail.
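For reference, the exec-style fallback mentioned above would amount to shelling out to the gcloud CLI, which can stream logs for an in-progress build; the build ID here is a placeholder:
# Hacky fallback: stream logs of a running build via the CLI (<BUILD_ID> is a placeholder)
gcloud builds log <BUILD_ID> --stream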

Why are Cloud Function Runtime Environment Variables being deleted on deploy?

I recently (2 days ago) upgraded the node runtime engine on our Cloud Functions instance from Node 10 to 12. (Not sure that is a factor, but it is a recent change.)
Since the upgrade I have been using the Cloud Functions project without trouble. Today is the first time I have done a deploy SINCE the deployment to change the node engine. After I did the deploy, ALL of the Runtime Environment Variables were deleted except one labeled FIREBASE_CONFIG.
As a test, I added another test environment variable via the Cloud Functions console UI. I refreshed the page to ensure the variable was there. Then, I ran another deploy, using this command:
firebase use {project_name} && firebase deploy --only functions:{function_name}
After the deploy completed, I refreshed the environment variables page and found that the test variable I had created was now missing.
I'm quite stumped. Any ideas? Thank you!
It is true that the Firebase CLI manages environment configuration and does not allow us to set the ENV variables of the runtime during deployment. This has been explained in other posts as well, like this one.
I guess you are aware of the difference between the Cloud Functions Runtime Variables and the Firebase Environment configuration, so I will just leave it here as a friendly reminder.
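As a quick illustration of the Firebase-side mechanism (the someservice.key name below is just an example):
# Firebase environment configuration (distinct from Cloud Functions runtime variables)
firebase functions:config:set someservice.key="SOME_VALUE"
firebase functions:config:get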
Regarding the actual issue (new deployments erasing previously set "Cloud Functions Runtime Variables"), I believe this must be something they have already fixed, because I tested with version 9.10.2 of the Firebase CLI and could not replicate the issue on my end.
I recommend checking the CLI version that you have (firebase --version) and, if you still experience the same issue, provide us with the steps you took.

Can Google App Engine (flexible environment) perform a build step defined in package.json just before deployment?

I couldn't find any documentation about build steps on the flexible environment. The only thing I found is that App Engine will run the start script from your package.json file after deployment, but is it possible to make it run the build script first? This is what Heroku does and I want to replicate it.
What you're looking for is the script called gcp-build, which performs a custom build step at deployment, just before starting the application. While this is only documented for the Standard Environment as of now (I've let the engineers know), there are multiple public resources confirming that it works in both environments. See the following links as reference:
Why does Google App Engine flex build step fail while standard works for the same code?
https://github.com/GoogleCloudPlatform/nodejs-docs-samples/tree/master/appengine/typescript
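For illustration, a package.json along these lines (the build and start commands are just example assumptions) would have App Engine run the gcp-build script during deployment, before start:
{
  "scripts": {
    "gcp-build": "npm run build",
    "build": "tsc",
    "start": "node dist/index.js"
  }
}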

Is it possible to deploy one Google Cloud Function to multiple projects?

I am a beginner to cloud. I have a GCP account with multiple projects in it, and I have a Cloud Function. Right now I am deploying the same function again and again, individually for each project, from the console. Is there any way I can deploy one Cloud Function to all projects by just looping over the project IDs, using Terraform or any other platform?
You can define your function, and everything else that repeats in each project, in a module and then use that module in each project definition. To do this you'll need to explicitly define each of your projects in the Terraform configuration. It might be worth doing if you can take advantage of other Terraform features, e.g. state tracking, keeping infrastructure as code, transparency, and reusability, and if you expect the infrastructure to grow in complexity without everything becoming confusing.
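For illustration, a hypothetical Terraform sketch of that loop, assuming a local module at ./modules/cloud_function that wraps the function resources and exposes a project_id variable (all names and IDs below are placeholders):
# One module instance per project via for_each (requires Terraform >= 0.13)
variable "project_ids" {
  type    = set(string)
  default = ["project-a", "project-b", "project-c"]  # placeholder project IDs
}

module "my_function" {
  for_each   = var.project_ids
  source     = "./modules/cloud_function"  # assumed module wrapping the function resources
  project_id = each.value
}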
Otherwise, if you are not going to do anything complex and all you need is to deploy the same function to multiple projects, with nothing more elaborate planned for the foreseeable future, then Bash scripting with the GCP CLI tool is your Swiss Army knife. You can check this as a reference: https://cloud.google.com/functions/docs/quickstart
Assuming you have your function code in Google Cloud Source Repositories, and you just want to deploy the same code in all projects, you can create a simple Bash script to do so.
First, you need to list all the projects you have:
gcloud projects list --format 'value(projectId)'
Then, for each project, deploy the function (I'm assuming Node.js 12 and an HTTP trigger, but edit at your convenience):
for project in $(gcloud projects list --format 'value(projectId)'); do
  gcloud functions deploy <FUNCTION_NAME> \
    --source https://source.developers.google.com/projects/<PROJECT_ID>/repos/<REPOSITORY_ID>/ \
    --runtime nodejs12 \
    --trigger-http \
    --project $project
done
To do anything fancier, check the other answer from wisp.

Serverless: how to remove one function

I am using serverless to deploy my API on AWS.
Serverless allows you to deploy a single function:
sls deploy -f <function name>
But it doesn't allow you to remove a single function:
sls remove // will remove all functions.
Is there any way to remove a single function without impacting the other functions?
#justin.m.chase suggested:
Simply remove the function from serverless.yml, then run a full deploy:
sls deploy
The function is removed (Lambda + API Gateway). Perfecto!
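For illustration, a minimal serverless.yml sketch of that edit (function and handler names are placeholders); deleting the block for goodbye and running a full deploy removes its Lambda and API Gateway resources:
# serverless.yml (before the edit; names are placeholders)
functions:
  hello:
    handler: handler.hello
  goodbye:                    # delete this whole block, then run: sls deploy
    handler: handler.goodbye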
I know this is a bit old, but the deploy pain of Serverless is still a thing.
I recently developed a CLI that enables building microservices in AWS, taking advantage of the AWS SAM CLI (hence the name: Rocketsam).
The CLI enables per-function caching (no more full deploys of the microservice if only one function's code changed).
It also has additional features, such as splitting the template file per function, sharing code across functions, fetching logs, and more :)
https://www.npmjs.com/package/rocketsam
Currently the CLI supports building functions in Python 3.6 only, but it can easily be extended in the future depending on demand.
As Peter Pham said, remove the function from serverless.yml and do a full deploy:
sls deploy
If you try to delete the function manually in AWS it causes a lot of headaches.
I know this question is over a year old and has been closed, but the correct way to remove a single function is to specify it by name, which you almost had:
sls remove -f <function name>
