deploying functions v2 to different functions based on git branch [closed] - node.js

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 days ago.
Context
I'm creating a Firebase Functions (v2) web API and want to export the code under a different function name depending on the current GitHub branch I am on (the function is deployed through GitHub Actions).
e.g.:
say I merge a PR into prod: it should deploy to function-id-ts.a.run.app, but if I push my changes to dev branch x, it should deploy the function to dev-x-function-id-ts.a.run.app
My plan was to achieve this by passing an env variable to the script (either through the GitHub Action or a local .env file, depending on where I am deploying from) and simply setting exports[process.env.ENV_VARIABLE] = onRequest({options}, api)
Problem
If I do this, deploying to Firebase/GCP will implicitly delete any functions that are not included in index.js. This means that if I deploy to dev-x-function, my existing function will be deleted.
deletion docs
With implicit function deletion, firebase deploy parses index.js and removes from production any functions that have been removed from the file.
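One thing worth noting against that: the implicit deletion described above applies to full functions deploys. When the deploy is restricted with an --only filter, the CLI only acts on the named functions and leaves the others in place (function and project names below are placeholders):

```shell
# Deploy just the branch-specific function; with an --only filter the CLI
# does not touch (or delete) functions that are not listed.
firebase deploy --only functions:dev-x-api --project my-project
```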
Questions
Is this an accepted/recommended method of managing different environments (dev, staging, prod)?
If this is not the way I should be doing it, how should I approach this?
If this is the way I should be doing it, how would I go about deploying the functions?
NB: If required, I am more than happy to access/manage this through GCP (I currently manage the deployed function and view its logs there, and just use Firebase as a proxy so it integrates nicely with the rest of the tools I am using).

Related

Opinion on a better platform - Bitbucket vs Gitlab [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed yesterday.
I have seen many pros and cons of both Bitbucket and GitLab. I would like to know if Bitbucket supports a decentralized workflow like GitLab does.
I am trying to design a workflow like Development -> Master -> Staging -> Production. Two questions:
Is it possible in Bitbucket to create a production branch independently of Master, like in GitLab?
Also, which platform provides a secure workflow with minimal coding errors from the Dev to Production environments?

How to best design an API testing implementation for an existing application [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 months ago.
I currently have an existing, functional back-end Node application running a Postgres DB with Sequelize as the ORM, using Express JS for the API endpoints.
I need an implementation that can repeatably generate a Docker container with a local database built from those existing Sequelize models and seeded with reusable test data, so that I can send mock requests to my existing endpoints and test their functionality and their associated database functions/transformations with Jest, as I do for unit tests.
I am thinking I first need a script that generates that Docker container, sources some environment variables for the specifications, creates the Postgres DB with the tables I defined in Sequelize, and then inserts the test data. Additionally, I need to mock API requests to the existing Express endpoints. I am just unsure exactly how to organize, write, and configure these technologies to use my existing Node application with Jest. Any ideas on how to devise and implement such a solution? Thanks!

Multi-language lambda functions within the same directory [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 3 years ago.
I have a few Lambdas written in TypeScript and a few others in Java. Should I place both packages in a single directory, or maintain a separate directory per language? We use Terraform for deploying infra and Jenkins for CI/CD. I'm also thinking about sharing common code between the Lambda functions; I'm not sure how that works if we keep all the Lambdas in the same directory.
I would suggest a few things here:
Group your code in different repositories. This gives you better code management and a smaller package per Lambda, which matters because Lambda has a limit on how much you can upload.
It is better to keep different languages in different repos, as they are different runtimes and hence the settings will be different.
Also, if you make a change to the TypeScript code, there shouldn't be any need to touch the Java functions, and vice versa.
If you have some common code, I would suggest looking at AWS Lambda layers (https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html). They give you the ability to share code/binaries/executables across multiple functions.
Hope this helps.

Release Pipelines for Hundreds of SaaS tenants [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
Our application is currently an ASP.NET Core application hosted on Azure and our code and pipelines are hosted on Azure Dev Ops. The app is pretty simple with just a Web Application and Azure SQL Database.
We currently have a large number of tenants that we would like to deploy to after each release.
We currently have 3 Build Pipelines (which are triggered off dev, test and master branches):
- Dev
- Test
- Production
Where I currently get lost is where to put the individual tenants; our current path was to make a "Release" pipeline for every tenant. Is this the best way to do this? Should we be using stages instead?
I'm a bit confused as to why you have separate build pipelines for dev, test, and production.
You might consider consolidating all of your pipelines (build and release) into a single YAML pipeline. Under this approach, you'd have one build stage, which you would capture as a YAML stage template, and make use of expressions/conditions to account for the variances between the various environments. Alternatively, if the build process varies widely between environments, you could have a separate build stage template for each and include the appropriate one based on the branch that triggered the build.
For releases, you could capture each as a YAML template and then include them at the end of the pipeline using the new deployment job element that has been added to the YAML schema.
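A rough shape of that single pipeline, with made-up tenant names and template paths:

```yaml
# azure-pipelines.yml sketch; one build stage template, then one deploy
# stage template included per tenant.
stages:
  - template: templates/build-stage.yml
  - template: templates/deploy-stage.yml
    parameters:
      tenant: tenantA
      environment: production
  - template: templates/deploy-stage.yml
    parameters:
      tenant: tenantB
      environment: production
```

Inside deploy-stage.yml, the stage would use a `deployment:` job (rather than a plain `job:`) so each tenant rollout is tracked against an environment.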
Hopefully this gets you closer to a solution or at least gives you some things to think about.

Automating Azure Question for generating infrastructure [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I want to make sure I am taking the right approach.
I am building virtual environments in Azure on a regular basis, anywhere from 3 to 5 servers at a time. Each server needs 1 of 4 different resource profiles (RAM/CPU/...). Obviously, I could script out each VM and just use PowerShell to deploy each individual VM each time.
What I really want, though, is a utility or webpage where I can say I need to create x servers, give the specifications for them, see how much it will cost, and have it start building them.
Is there any tool like this or what would be the best approach to this?
You could automatically create Azure resources from a Resource Manager template. You create a template file that deploys the resources and a parameters file that supplies parameter values to the template.
You can also edit and deploy the template in the Azure portal: search for Template -> Deploy from a custom template -> Build your own template in the editor. You can reuse the template after you save it, and there is plenty of guidance and sample templates covering whatever you want to deploy.
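Once the template and parameters file exist, the repeatable part of the deploy is a single CLI call (all names below are placeholders):

```shell
# Deploy an ARM template plus its parameters file into a resource group.
az deployment group create \
  --resource-group my-rg \
  --template-file vm-batch.template.json \
  --parameters @vm-batch.parameters.json
```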
