I'm currently trying to create a testing environment for the cloud functions independently. I wanted to run the emulator within the functions folder, but there does not seem to be a way to use the root directory as the functions directory.
Is there a proper way to change it?
If not, I would love to hear suggestions on how to properly test the functions when only the contents of the functions folder are available.
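For context, this is a minimal sketch of the firebase.json I would expect to adjust, assuming this is the Firebase emulator and that the source field under functions is where the functions directory is configured (the emulator port is just a placeholder):

{
  "functions": {
    "source": "functions"
  },
  "emulators": {
    "functions": {
      "port": 5001
    }
  }
}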
I have developed multiple event-driven cloud functions using Node.js in GCP. Each cloud function has its own package.json and index.js, and there are also several custom modules (utility files) that I created as part of each cloud function.
But these custom modules are redundant across cloud functions, i.e., they are repeated in every cloud function.
Now my question is: if I want to make a change in any one of the utility files, I must make the same change in every cloud function individually. I also noticed that the package.json files share many common dependencies, but at different versions.
To avoid this, I am thinking of grouping these individual cloud functions into one, with a single package.json and a single copy of the custom modules that all cloud functions can reference.
I tried my best to find a useful resource on how to implement this, but couldn't find one.
I referred to this document, which mentions that it's possible to have multiple function entry points (defined here).
Can anyone let me know how to group multiple functions into one?
Current project structure:
Root-directory
  - dir_cloudFunction_A
    - index.js
    - myModule.js
    - package.json
  - dir_cloudFunction_B
    - index.js
    - myModule.js
    - package.json
Currently I have two cloud functions under a root directory: cloud function A is in the dir_cloudFunction_A directory and cloud function B is in the dir_cloudFunction_B directory.
Each cloud function has an index.js, a myModule.js, and a package.json.
Right now each directory contains its own identical copy of myModule.js. The problem with this approach is that whenever I need to change myModule.js, I must make sure the change is applied to every copy, wherever it is referenced.
To avoid this issue, it was suggested that I keep a single shared myModule.js in one place and use it from any number of cloud functions under the root directory.
But I'm not sure how I can do this.
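For illustration, this is roughly the single index.js I imagine ending up with: one shared myModule.js next to it and one exported entry point per cloud function (the function and module method names here are placeholders):

// index.js (single deployment source; names are placeholders)
const myModule = require('./myModule');

// Entry point for what is currently cloud function A (event-driven signature)
exports.cloudFunctionA = (event, context) => {
  myModule.doSomething(event);
};

// Entry point for what is currently cloud function B
exports.cloudFunctionB = (event, context) => {
  myModule.doSomethingElse(event);
};

My understanding is that each function could then be deployed with its own --entry-point while sharing the same package.json and myModule.js, but I'm not sure whether that is the intended approach.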
I have an AWS Lambda API that uses a Lambda layer with some helper functions.
When deployed, AWS forces a path for the layer that looks like /opt/nodejs/lib/helpers/awsGatewayResponses. Locally, however, I have a different folder structure, which (on my machine) makes the path layers/api-layer/nodejs/lib/helpers/awsGatewayResponses (because I don't want a local folder layout of /opt/nodejs/lib/...).
I'm now setting up some tests with Jest, and I've run into the issue that I would have to change the imports from the /opt/nodejs/lib/helpers/... format to layers/api-layer/nodejs/lib/helpers/..., otherwise I get import errors. I don't want to make that change, since it would no longer match the actual deployed environment.
I'm looking for something that rewrites my paths to layers/api-layer/nodejs/lib/helpers/... only when I'm running tests. Any ideas on how I can do some kind of dynamic import? I want to run the tests automatically on GitHub on commits.
Thanks in advance! Please let me know if I need to elaborate.
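One idea I've been considering (I'm not sure it's the right approach) is Jest's moduleNameMapper, so that the deployed /opt/nodejs/... paths are rewritten to my local layer folder only while tests run. A rough sketch of what I mean:

// jest.config.js -- rough sketch; the local layer path is just my current layout
module.exports = {
  moduleNameMapper: {
    // Rewrite the deployed layer path to the local folder structure when running tests
    '^/opt/nodejs/(.*)$': '<rootDir>/layers/api-layer/nodejs/$1',
  },
};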
I currently have a Firebase project with multiple Cloud Functions defined. Each of these functions resides in its own folder with two files: index.js and package.json. As far as I've been able to tell, it is possible to import all of these functions into the index.js file within the default functions folder and export them from there. However, this approach leads to the deployment of all the functions on the same instance. Is there a way to force them to deploy on their own instances? Thank you!
Firebase parses the index.js and creates separate containers for each exported function.
So each exported function in Cloud Functions runs on its own container instance(s) already. Separately exported functions will never run on the same container instance.
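For example, with an index.js along these lines (function names and triggers are just illustrative), functionA and functionB are deployed as two separate Cloud Functions, each running on its own instances:

// index.js -- each exported function becomes its own Cloud Function
const functions = require('firebase-functions');

exports.functionA = functions.https.onRequest((req, res) => {
  res.send('Handled by functionA instances');
});

exports.functionB = functions.pubsub.topic('my-topic').onPublish((message) => {
  console.log('Handled by functionB instances', message.json);
});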
We have a very simple use case: we want to share code with all of our lambdas, and we don't want to use webpack.
We can't put relative paths in the package.json files in the lambda folders, because when you run sam build twice it DELETES the shared code, and I have no idea why.
Answer requirements:
Be able to debug locally
Be able to run unit tests on business logic (without having to be ran in an AWS sandbox)
Be able to run tests in sam local start-api
Be able to debug the code in the container via sam local invoke
sam build works
sam deploy works
Runs in AWS Lambda in the cloud
TL;DR
Put your shared code in a layer
When referencing shared code in the lambda layer, use a ternary operator in the require(). Check an environment variable that is only set when running in the AWS environment. In this case, we added a short AWS variable in the SAM template; you could also use one of the environment variables AWS defines automatically, but they are not as short. This lets you debug locally outside of the AWS stack and run very fast unit tests against the business logic.
let math = require(process.env.AWS ? '/opt/nodejs/common' : '../../layers/layer1/nodejs/common');
let tuc = require(process.env.AWS ? 'temp-units-conv' : '../../layers/layer1/nodejs/node_modules/temp-units-conv');
You shouldn't need the ternary require like that anywhere except inside the lambda function code itself.
Here's a working example that we thought we'd post so that others will have a much easier time of it than we did.
It is our opinion that AWS should make this much easier.
https://github.com/blmille1/aws-sam-layers-template.git
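For reference, the relevant pieces of the SAM template look roughly like the sketch below. This is not copied from the linked repo; the resource names, runtime, and paths are illustrative:

# template.yaml (trimmed sketch)
Globals:
  Function:
    Runtime: nodejs18.x
    Environment:
      Variables:
        AWS: "true"            # present when running via SAM or in the cloud, so require() picks /opt/nodejs
    Layers:
      - !Ref CommonLayer

Resources:
  CommonLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      ContentUri: layers/layer1/
      CompatibleRuntimes:
        - nodejs18.x

  MathFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: src/math/
      Handler: app.handler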
Gotchas
The following gotcha has been avoided in this solution. I am mentioning it because it looked like a straightforward solution, and it took a lot of time before I finally abandoned it.
It is very tempting to add a folder reference in the lambda function's package.json:
// ...
"dependencies": {
  "common": "file:../../layers/layer1/nodejs/common"
},
// ...
If you do that, it will work for the first sam build. However, the second time you run sam build, your shared code folder and all of its subdirectories will be DELETED. When sam build runs, it creates an .aws-sam folder; if that folder already exists, it performs an npm cleanup, and I think that is what triggers the deletion of the shared code.
I'm currently developing a small game that will rely on a lot of Azure Functions to execute functions from time to time. I followed a tutorial on MSDN (https://learn.microsoft.com/en-us/azure/azure-functions/functions-develop-vs#configure-the-project-for-local-development) explaining that I had to create a new project to host a function, but I already have 6 different functions so far and I don't really want to create 6 different projects.
Moreover, all these functions (developed in JavaScript) have a lot of code in common, so I created a common JavaScript file with some helper functions. Now that I have multiple projects, I can't use it anymore without copy/pasting it into every project.
Finally, to be able to develop the game properly, all the functions must run in parallel on my development machine, and I don't really want to open 6 (or more, in the future) PowerShell instances to host them.
Is there a way to host multiple functions in the same project and deploy them easily to Azure?
That's what Function Apps are for. Each Function App may contain multiple functions, which are deployed together.
You mention JavaScript, but the linked tutorial is in C#. Regardless, you can put multiple functions into the same app: as subfolders under the same root (where the host.json file is), or as static methods in the same C# project. Each function has its own function.json file. All functions can share the same code.
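For example, a single Function App might be laid out roughly like this (folder and helper names are illustrative), with each JavaScript function requiring a shared module; a minimal sketch:

// Function App layout (illustrative):
//
//   host.json
//   Shared/helpers.js            <- common helper code
//   FunctionOne/function.json
//   FunctionOne/index.js
//   FunctionTwo/function.json
//   FunctionTwo/index.js
//
// FunctionOne/index.js -- an HTTP-triggered function using the shared helpers
const helpers = require('../Shared/helpers');

module.exports = async function (context, req) {
  // helpers.buildGreeting is a hypothetical shared helper
  context.res = { body: helpers.buildGreeting(req.query.name) };
};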