Is it possible to deploy one Google Cloud Function on multiple projects? - node.js

I am a beginner to the cloud. I have a GCP account with multiple projects in it, and I have a Cloud Function. Right now I am deploying the same function again and again, individually for each project, from the console. Is there any way I can deploy one Cloud Function to all projects by just looping over the project IDs, using Terraform or any other tool?

You can define your function, and everything else that repeats in each project, in a Terraform module and then use this module in each project definition. To do this you'll need to explicitly define each of your projects in the Terraform configuration. It might be worth doing if you can take advantage of other Terraform features, e.g. state tracking, keeping infrastructure as code, transparency, and reusability, and if you expect infrastructure complexity to grow without everything becoming confusing.
Otherwise, if you are not going to do anything complex, and all you need is to deploy the same function to multiple projects with nothing more elaborate planned for the foreseeable future, then Bash scripting with the gcloud CLI is your Swiss Army knife. You can use this as a reference: https://cloud.google.com/functions/docs/quickstart

Assuming you have your function code in Google Cloud Source Repositories and you just want to deploy the same code in all projects, you can create a simple Bash script to do so.
First, list all the projects you have:
gcloud projects list --format 'value(projectId)'
Then, for each project, deploy the function (I'm assuming Node.js 12 and an HTTP trigger, but edit at your convenience):
for project in $(gcloud projects list --format 'value(projectId)'); do
  gcloud functions deploy <FUNCTION_NAME> \
    --source https://source.developers.google.com/projects/<PROJECT_ID>/repos/<REPOSITORY_ID>/ \
    --runtime nodejs12 \
    --trigger-http \
    --project "$project"
done
To do anything fancier, check the other answer from wisp.

Related

Is it possible to stream Cloud Build logs with the Node.js library?

Some context: Our Cloud Build process relies on manual triggers and about 8 substitutions to customize deploys to various firebase projects, hosting sites, and preview channels. Previously we used a bash script and gcloud to automate the selection of these substitution options, the "updating" of the trigger (via gcloud beta builds triggers import: our needs require us to use a single trigger, it's a long story), and the "running" of the trigger.
This bash script was hard to work with and improve, and through the import-run shenanigans actually led to some faulty deploys that caused all kinds of chaos: not great.
However, recently I found a way to pass substitution variables as part of a manual trigger operation using the Node.js library for Cloud Build (runTrigger with subs passed as part of the request)!
Problem: I'm converting our build utility to Node, which is great, but as far as I can tell there isn't a native way to stream build logs from a running build to the console (except maybe with exec, but that feels hacky).
Am I missing something? Or should I be looking at one of the logging libraries?
I've tried my best scanning Google's docs and APIs (Cloud Build REST, the Node client library, etc.) but to no avail.
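For context, the gcloud-based flow described above looks roughly like this (a hedged sketch; the trigger name, config file, and build ID are placeholders, and the log streaming here goes through the gcloud CLI rather than the Node.js client library):
# Overwrite the single trigger definition (substitutions baked into the YAML)
gcloud beta builds triggers import --source=trigger.yaml
# Run the trigger against a branch
gcloud beta builds triggers run my-trigger --branch=main
# Stream the logs of the resulting build in the terminal
gcloud builds log --stream <BUILD_ID>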

Deploying docker image as IBM Cloud Action with credentials

I have a small NodeJS app I want to deploy to IBM Cloud as an "action". What I've been doing until now is just zipping the project files and creating/updating actions using the IBM Cloud CLI like this:
ibmcloud fn action create project-name C:\Users\myuser\Desktop\node-js-projects\some-project\test-folder.zip --kind nodejs:12
This was working great; however, I'm now testing a new project which has a much larger modules folder, and as such IBM Cloud won't accept it. I've turned my attention to using Docker, as the article below explains.
https://medium.com/weekly-webtips/adding-extra-npm-modules-to-ibm-cloud-functions-with-docker-fabacd5d52f1
Everything makes sense; however, I have no idea what to do with the credentials that the app uses. Since IBM Cloud seems to require you to run "docker push", I'm assuming it's not safe to include a .env file in the Docker image?
I know in IBM Cloud I can pass "parameters" to an action, but I'm not sure if that helps here. Can those params be accessed from code deployed this way?
Would really appreciate some help on this one. Hoping there's a straightforward, standard way of doing it that I've just missed. I'm brand new to Docker, so I'm still learning.
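For what it's worth, a common pattern (hedged sketch; the image, action, and parameter names below are placeholders) is to keep secrets out of the image entirely and bind them as default parameters on the action; for a Node.js action they then arrive in the params object passed to main:
# Build and push the image with no .env baked in
docker build -t <dockerhub-user>/my-action-image .
docker push <dockerhub-user>/my-action-image
# Create the action from the image and bind credentials as default parameters
ibmcloud fn action create project-name --docker <dockerhub-user>/my-action-image \
  --param DB_USER "$DB_USER" \
  --param DB_PASSWORD "$DB_PASSWORD"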

How to deploy all functions in a single command using gcloud

I am aware of gcloud functions deploy hello --entry-point helloworld --runtime python37 --trigger-http, which deploys only the hello function.
But I have multiple functions in my project.
Is there a single command to deploy all functions, like Firebase's firebase deploy --only functions -P default?
Right now it is not possible to deploy multiple functions with a single command. There is already an open issue requesting the same, but it is quite old.
Besides tracking the previous issue, you could also file a new issue requesting this feature.
However, I've found 2 related questions on SO with similar issues, in which the solution was to create a small script to do this:
First is a .sh script
Second is a .py script
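For example, a minimal sketch of such a shell script, assuming one directory per function under functions/, the directory name matching the function name, and an HTTP trigger (runtime, entry point, and flags would need adjusting to your project):
# Deploy every function found under functions/, one directory per function
for dir in functions/*/; do
  name=$(basename "$dir")
  gcloud functions deploy "$name" \
    --source "$dir" \
    --runtime python37 \
    --trigger-http
done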
Hope this helps you!

AWS Lambda Dev Workflow

I've been using AWS for a while now but am wondering how to go about developing with Lambda. I'm a big fan of serverless functions and letting Amazon handle the maintenance, and I've been using them for a while. My question: is there a recommended workflow for version control and development?
I understand there's the ability to publish a new version in Lambda, and that you can point to specific versions in a service that calls it, such as API Gateway. I see API Gateway also has some nice abilities to partition who calls which version, i.e. having a test API and slowly rolling updates out to, say, 10% of production API calls and scaling up from there.
However, this feels a bit clunky for an actual version control system. Perhaps the functions are coded locally, uploaded using the AWS CLI, and then everything is managed through a third-party version control system (GitHub, Bitbucket, etc.)? Can I deploy to new or existing versions of the function this way? That way I can maintain a separation between test and production functions.
Development also doesn't feel as nice through the editor in Lambda, not to mention that using custom packages requires uploading a package anyway. It seems local development is the better solution. I'm trying to understand others' workflows so I can improve mine.
How have you approached this issue in your experience?
I wrote roughly a dozen Lambda functions that trigger based on an S3 file-write event or a schedule, and make an HTTP request to an API to kick off data processing jobs.
I don't think there's any gold standard. From my research, there are various approaches and frameworks out there. I decided that I didn't want to depend on any framework like Serverless or Apex, because I didn't want to learn how to use those on top of learning about Lambda. Instead I built out improvements organically based on my needs as I was developing a function.
To answer your question, here's my workflow.
1. Develop locally and git commit changes.
2. Mock test data and test locally using mocha and chai.
3. Run a bash script that creates a zip file of the files to be deployed to AWS Lambda.
4. Upload the zip file to AWS Lambda (a rough sketch of steps 3 and 4 follows below).
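A rough sketch of steps 3 and 4, assuming the AWS CLI is configured, dependencies are already installed, and the function name and file layout are placeholders:
# Step 3: zip the function code plus its already-installed dependencies
zip -r function.zip index.js node_modules
# Step 4: upload the new code to an existing Lambda function
aws lambda update-function-code \
  --function-name my-function \
  --zip-file fileb://function.zip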
You can have version control for your Lambda using AWS CodeCommit (much simpler than using an external git repository system, although you can do either). Here is a tutorial for setting up a CodePipeline with commit/build/deploy stages: https://docs.aws.amazon.com/codepipeline/latest/userguide/tutorials-simple-codecommit.html
That example deploys to an EC2 instance, so for the deploy portion for a Lambda, see here.
If you set up a pipeline you can have an initial commit stage, then a build stage that runs your unit tests and packages the code, and then a deploy stage (and potentially more stages if required). It's a very organized way of deploying Lambda changes.
I would suggest you have a look at SAM. SAM is a command-line tool and a framework that helps you develop your serverless applications. Using SAM, you can test your applications locally before uploading them to the cloud. It also supports blue/green deployments and CI/CD workflows, starting automatically from GitHub.
https://github.com/awslabs/aws-sam-cli
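For reference, a typical SAM loop looks roughly like this (the function's logical ID and the event file are placeholders):
# Build the application defined in template.yaml
sam build
# Invoke a single function locally with a sample event
sam local invoke MyFunction --event event.json
# Package and deploy to AWS (interactive the first time)
sam deploy --guided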

Serverless: how to remove one function

I am using Serverless to deploy my API on AWS.
Serverless allows you to deploy a single function:
sls deploy -f <function name>
But it doesn't allow you to remove a single function:
sls remove  # will remove all functions
Is there any way to remove a single function without impacting the other functions?
@justin.m.chase suggested:
Simply remove the function from serverless.yml, then run a full deploy:
sls deploy
The function is removed (Lambda + API Gateway). Perfecto!
I know this is a bit old, but the deploy pain of Serverless is still a thing.
I recently developed a CLI that makes it easier to build microservices in AWS, taking advantage of the AWS SAM CLI (hence the name: Rocketsam).
The CLI enables caching per function (no more full deploys of the microservice when only one function's code has changed).
It also has additional features such as splitting the template file per function, sharing code across functions, fetching logs, and more :)
https://www.npmjs.com/package/rocketsam
Currently the CLI supports building functions in Python 3.6 only, but it can easily be extended in the future depending on demand.
As Peter Pham said, remove the function from serverless.yml and do a full deploy:
sls deploy
If you try to delete the function manually in AWS it causes a lot of headaches.
I know this question is over a year old and has been closed, but the correct way to remove a single function is to specify it by name, which you almost had:
sls remove -f <function name>
