AWS Lambda Dev Workflow - Node.js

I've been using AWS for a while now and am a big fan of serverless functions, letting Amazon handle the maintenance. My question: is there a recommended workflow for version control and development with Lambda?
I understand Lambda lets you publish new versions of a function, and that a calling service such as API Gateway can point to a specific version. API Gateway also has some nice abilities to partition which callers hit which version, e.g. keeping a test API alongside production, or slowly rolling an update out to, say, 10% of production API calls and scaling up from there.
However, this feels a bit clunky as an actual version control system. Perhaps the functions are coded locally, uploaded using the AWS CLI, and everything is managed through a third-party version control system (GitHub, Bitbucket, etc.)? Can I deploy to new or existing versions of the function this way? That way I could maintain a separation of test and production functions.
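For illustration, the CLI flow I'm imagining looks something like this (my-fn and the prod alias are placeholders):

    # Push new code to the function's $LATEST
    aws lambda update-function-code --function-name my-fn --zip-file fileb://function.zip

    # Freeze the current code as an immutable numbered version
    aws lambda publish-version --function-name my-fn

    # Repoint the alias that production API Gateway targets
    # (weighted alias routing can also shift e.g. 10% of traffic to a new version)
    aws lambda update-alias --function-name my-fn --name prod --function-version 5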
Development also doesn't feel as nice in the inline Lambda editor, and using custom packages requires uploading a bundle anyway. Local development seems like the better solution. I'm trying to understand other people's workflows so I can improve mine.
How have you approached this issue in your experience?

I wrote roughly a dozen Lambda functions that trigger on S3 file-write events or on a schedule and make an HTTP request to an API to kick off data processing jobs.
I don't think there's any gold standard. From my research, there are various approaches and frameworks out there. I decided I didn't want to depend on a framework like Serverless or Apex, because I didn't want to learn those tools on top of learning Lambda itself. Instead I built out improvements organically based on my needs as I was developing a function.
To answer your question, here's my workflow.
Develop locally and git commit changes.
Mock test data and test locally using mocha and chai.
Run a bash script that packages the files to be deployed into a zip file.
Upload the zip file to AWS Lambda (the last two steps are sketched below).
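For illustration, those packaging and upload steps can be as small as this (the function name and file list are placeholders for your own):

    # Bundle the handler and its dependencies
    zip -r build.zip index.js lib/ node_modules/

    # Push the bundle to the existing function's code ($LATEST)
    aws lambda update-function-code --function-name my-fn --zip-file fileb://build.zip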

You can have version control on your Lambda using AWS CodeCommit (much simpler than using an external git repository system, although you can do either). Here is a tutorial for setting up a CodePipeline with commit/build/deploy stages: https://docs.aws.amazon.com/codepipeline/latest/userguide/tutorials-simple-codecommit.html
That example deploys to an EC2 instance, so for the deploy portion for a Lambda, see here.
If you set up a pipeline, you can have an initial commit stage, then a build stage that runs your unit tests and packages the code, and then a deploy stage (and potentially more stages if required). It's a very organized way of deploying Lambda changes.
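As a sketch, the build stage's buildspec.yml for a Node.js Lambda might look something like this (the runtime version and test command are assumptions about your project):

    version: 0.2
    phases:
      install:
        runtime-versions:
          nodejs: 16
        commands:
          - npm ci
      build:
        commands:
          - npm test    # fail the build (and the pipeline) if unit tests fail
    artifacts:
      files:
        - '**/*'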

I would suggest you have a look at SAM. SAM is a command-line tool and a framework that helps you develop your serverless application. Using SAM, you can test your applications locally before uploading them to the cloud. It also supports blue/green deployments and CI/CD workflows, starting automatically from GitHub.
https://github.com/awslabs/aws-sam-cli
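The day-to-day loop with the SAM CLI is roughly this (assuming a template.yaml in the project root; MyFunction and event.json are placeholders):

    sam build                                          # resolve dependencies and stage the code
    sam local invoke MyFunction --event event.json     # run one function locally against a sample event
    sam deploy --guided                                # package and deploy the stack to AWS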

Related

Is it possible to stream Cloud Build logs with the Node.js library?

Some context: Our Cloud Build process relies on manual triggers and about 8 substitutions to customize deploys to various firebase projects, hosting sites, and preview channels. Previously we used a bash script and gcloud to automate the selection of these substitution options, the "updating" of the trigger (via gcloud beta builds triggers import: our needs require us to use a single trigger, it's a long story), and the "running" of the trigger.
This bash script was hard to work with and improve, and through the import-run shenanigans actually led to some faulty deploys that caused all kinds of chaos: not great.
However, recently I found a way to pass substitution variables as part of a manual trigger operation using the Node.js library for Cloud Build (runTrigger with subs passed as part of the request)!
Problem: So I'm converting our build utility to Node, which is great, but as far as I can tell there isn't a native way to stream build logs from a running build to the console (except maybe with exec, but that feels hacky).
Am I missing something? Or should I be looking at one of the logging libraries?
I've tried my best scanning Google's docs and APIs (Cloud Build REST, the Node client library, etc.) but to no avail.
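For reference, the trigger call that works for me looks roughly like this (the project, trigger id, branch, and substitution keys are placeholders):

    // Run a manual trigger with per-run substitutions
    const {CloudBuildClient} = require('@google-cloud/cloudbuild');
    const client = new CloudBuildClient();

    async function runWithSubs() {
      const [operation] = await client.runBuildTrigger({
        projectId: 'my-project',
        triggerId: 'my-trigger-id',
        source: {
          branchName: 'main',
          substitutions: {_SITE: 'site-a', _CHANNEL: 'preview-123'},
        },
      });
      // The long-running operation resolves with the finished Build,
      // which carries a logUrl you can print, but not a live log stream.
      const [build] = await operation.promise();
      console.log(build.logUrl);
    }
    runWithSubs();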

Deploy Node.js application to AWS Elastic Beanstalk from Bitbucket using AWS CodePipeline

I'm writing this post after exhausting all the options I could try.
Scenario:
I want to deploy a lightweight backend logic written in Node.js (using JavaScript, Express.js, MongoDB).
Platforms considered:
AWS (preferred)
Heroku
Bitbucket Pipelines (least preferred)
Currently it's deployed on Heroku (and running as expected), but I want to deploy it on our preferred platform, i.e., AWS. Also, I want the whole deployment process to be automated (using Continuous Delivery/Deployment).
To achieve that, I created:
A fresh Node.js environment (v16.0, platform v5.6.0) on Elastic Beanstalk
A new CodePipeline:
Source: Bitbucket
Repository (cloned from here)
Change detection options: Start the pipeline on source code change
Output artifact format: CodePipeline default
Build stage: skipped
Deploy provider: AWS Elastic Beanstalk (with correct region, app and env name)
Outcome on every try:
On the first try, when the environment gets created, we're greeted with the default placeholder page.
After subsequent commits, error pages start to show up, with status codes in the 400s and 500s.
To overcome those problems, I scoured all the blog posts and support articles available on these issues and followed the steps thoroughly, but with no success in the end.
Some of the (many) solutions I tried:
Create a Procfile with the required params (a sample follows this list)
Write app.js/index.js outside the src/ (dumb thing, I know)
Switch to immutable deployments (as mentioned here)
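For reference, the Procfile attempt was a one-liner along these lines (the entry point is illustrative):

    web: node src/index.js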
I seriously need help resolving this issue. Any help or guidance towards some solution would be greatly appreciated.

Is it nonsense to not use Amplify API (and use the AWS API Gateway SDK instead)?

I would like to know if I am the only one thinking of not using Amplify's API and instead using the classic SDK. The reasons are:
1st: I am a beginner to AWS and development, so the fact that Amplify hides the complexity isn't helping me learn what's going on behind the scenes, or control and understand everything well.
2nd: Can I modify the API without changing the locally generated project files (the files generated when running the "amplify add auth" command)?
I am very confused about it, and can't really find a guide on how to modify the local files after making changes to the API through the AWS platform.

AWS: How to reproduce a Node.js project?

I need help from someone familiar with AWS and web servers. Currently I'm walking through this tutorial trying to get started with Node.js and AWS: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/create_deploy_nodejs.html
I'm trying to figure out how to do the equivalent of a "git clone" of a traditional project, but for an AWS project (e.g. if I wanted to work on my existing AWS project on a different machine).
I read some EB CLI documentation (https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/eb3-cmd-commands.html). I tried the "eb clone env-name". However, this actually created a separate environment on AWS within my application, which isn't what I wanted. It also only added a .gitignore and a .elasticbeanstalk folder to my directory, none of my source code for my AWS application.
I'm confused about what the standard process is for working with AWS projects. In particular, how can I start working on my existing AWS project from another machine? (Is there any way to pull my source code from an AWS project?) Is there any way I can view my code on AWS?
Side note: in the past I worked with Google Apps Script in the cloud, which used the clasp CLI for pushing and pulling code. This was very intuitive: it was literally clasp pull to pull code from the cloud and clasp push to push it back.
Elastic Beanstalk isn't a code repo. It's a way to host applications in a simplified way, without having to configure the compute resources. Compare this to something like EC2 where all the networking and web server configuration is manual.
You can still use git to manage your source code, and there's git CLI integration with Elastic Beanstalk too. Once you've got your source code working, you bundle it up into a .zip file and upload it to EB. You can also use AWS CodeBuild to watch git repos, build source code into bundles, and automatically deploy it to Elastic Beanstalk.
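If you use the EB CLI, that bundle-and-upload step is handled for you; a minimal flow looks like this (the app, platform, region, and environment names are examples):

    # One-time setup in the project directory
    eb init my-app --platform node.js --region us-east-1

    # Zip the current commit and deploy it to the target environment
    eb deploy my-env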
If you are looking for a way to host source code on AWS, AWS CodeCommit is the managed git solution.
You should take a look at the Amplify Framework by AWS: https://aws-amplify.github.io/docs/ – here's a walkthrough that will get you where you are heading faster. Sure, it mentions teams, but the result can be applied to single developers too: https://aws-amplify.github.io/docs/cli/multienv?sdk=js
Since you mentioned "view my code on AWS", you should have a look here: https://aws.amazon.com/cloud9/ – this will walk you through setting up an account, repos, and working with your code in the cloud.
Good luck!

How to do continuous deployment - C++ on AWS

I have the source on a repository server. The application runs on an AWS instance. I could write a script that logs in, pulls, compiles, and copies the new binary to its destination.
But how do I copy the new binary while the application is running? What's the usual way to do this? Do I have to stop the application to make an update? How does continuous deployment work then?
I'm using Linux; the application is in C++.
You would have to restart the application after copying the binary. I would highly recommend that you use one of the frameworks for continuous building/integration to make this less painful though, for example Jenkins.
It will not only help with the actual deployment process, but can also run tests for you and only deploy if the tests succeed. There is also a plugin for AWS integration.
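In its simplest form, the deploy step Jenkins runs on the instance is just a stop/copy/start script along these lines (the systemd service name and paths are assumptions):

    # Swap in the freshly built binary and restart the service
    sudo systemctl stop myapp
    sudo cp build/myapp /usr/local/bin/myapp
    sudo systemctl start myapp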
