An example of a unit test for a time-triggered Azure Function

I am new to Azure Functions.
I have implemented a time-triggered Azure Function and would like to write unit tests for it.
I am using SpecFlow and NUnit for my test cases.
However, I am unable to find a proper example of how to stub a time-triggered function.
Can someone point me to a suitable example?
Thanks.

I wouldn't call it a unit test anymore, but you can trigger non-HTTP functions by calling the following admin endpoint of the function app:
POST <ROOT_URL>/admin/functions/<FUNCTION_NAME>
Note that you need to specify the system key in the x-functions-key header when making a request to a deployed function app.
More info in the docs.
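For example, a minimal C# sketch of calling that endpoint; the ROOT_URL, FUNCTION_NAME, and key placeholders stay exactly that, placeholders:

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class AdminTriggerExample
{
    static async Task Main()
    {
        using var client = new HttpClient();
        var request = new HttpRequestMessage(HttpMethod.Post,
            "<ROOT_URL>/admin/functions/<FUNCTION_NAME>");
        // The endpoint expects a JSON body; an empty input is enough for a timer trigger.
        request.Content = new StringContent("{ \"input\": \"\" }", Encoding.UTF8, "application/json");
        // The system key is required when calling a deployed function app.
        request.Headers.Add("x-functions-key", "<SYSTEM_KEY>");
        var response = await client.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}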
Alternative
What I usually try to do is put as much of the business logic as possible in a separate class, which is easily testable, and call that class from the function.
Personally, I don't think you should test whether the trigger works; that's the responsibility of the Azure Functions runtime. It's fine to test this in a larger-scoped integration test, but not in a fast and frequently executed unit test.
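As a hedged sketch of that split (the class name and schedule below are made up for illustration):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

// Plain class holding the business logic; it has no Azure types, so it is trivially unit-testable.
public class ReportGenerator
{
    public string BuildDailyReport() => "report contents";
}

public static class DailyReportFunction
{
    [FunctionName("DailyReport")]
    public static void Run([TimerTrigger("0 0 6 * * *")] TimerInfo timer, ILogger log)
    {
        // The function is a thin shell: all the real work happens in the testable class.
        var report = new ReportGenerator().BuildDailyReport();
        log.LogInformation(report);
    }
}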

Get the business logic out of the function itself, and instead have the function call libraries.
Add tests for those libraries.
You don't need to do anything specific to Azure Functions in order to test your code.
If you are attempting to do integration testing, then follow Marc's advice.
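Continuing the sketch in the previous answer, the unit test then targets the plain class directly and never touches the Functions runtime (class and test names are illustrative):

using NUnit.Framework;

[TestFixture]
public class ReportGeneratorTests
{
    [Test]
    public void BuildDailyReport_ReturnsContents()
    {
        // No timer, no runtime, no stubbing of the trigger needed.
        var result = new ReportGenerator().BuildDailyReport();
        Assert.That(result, Is.Not.Empty);
    }
}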

Related

Azure durable functions and synchronous read?

I'm trying to use Azure Durable Functions to implement a minimal server that accepts data via a POST and returns the same data via a GET - but I don't seem to be able to generate the GET response.
Is it simply not possible to return the data via a simple GET response? What I do NOT want to happen is:
GET issued
GET response returns with second URL
Client has to use second URL to get result.
I just want:
GET issued
GET response with requested data.
Possible?
I'm not really sure whether Durable Functions are meant to be used as a 'server'. Durable Functions are part of the 'serverless' model, and therefore I'm not sure whether it's possible (in a clean way).
To my knowledge, Durable Functions are used to orchestrate long-lasting processes, for example the orchestration of a batch job. Using Durable Functions, it's possible to create an 'async HTTP API' to check on the status of processing a batch of items (see the Durable Functions documentation). I've written a blog post about Durable Functions; feel free to read it (https://www.luminis.eu/blog/azure-functions-for-your-long-lasting-logic/).
But as for your use case, I think you can create two separate Azure Functions: one for posting your data, using an Azure Blob storage output binding, and a second function with an HTTP GET trigger that, depending on your data, uses a blob input binding. No need to use Durable Functions for it :)!
Not really a direct answer to your question, but hopefully a solution to your problem!
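As a rough sketch of that two-function approach (the routes, container name, and binding setup here are assumptions for illustration, not a prescription):

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class StoreAndFetch
{
    // POST: persist the request body to a blob through an output binding.
    [FunctionName("StoreData")]
    public static async Task<IActionResult> Store(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = "data/{id}")] HttpRequest req,
        [Blob("data/{id}", FileAccess.Write)] Stream output)
    {
        await req.Body.CopyToAsync(output);
        return new OkResult();
    }

    // GET: read the same blob back through an input binding.
    [FunctionName("FetchData")]
    public static IActionResult Fetch(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "data/{id}")] HttpRequest req,
        [Blob("data/{id}", FileAccess.Read)] string stored)
    {
        return stored == null ? (IActionResult)new NotFoundResult() : new OkObjectResult(stored);
    }
}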

Do Azure Functions have a request pipeline of some kind?

I would like to add some common authentication code to a collection of HttpTrigger Azure Functions (v3), which I'm using as an API. I know about the service-side auth associated with AuthorizationLevel.Function, but that won't work for me. The type of auth I need to do is relatively simple: just check a specific HTTP header for a specific value.
In ASP.NET, this kind of thing can be done in an HttpModule. Do Azure Functions have a similar request pipeline of some kind?
As far as I can tell from the documentation, it looks like new Function instances can call Startup.Configure() before calling the target method, if the project is appropriately configured. However, those calls are intended to support Dependency Injection, and don't have access to the HttpRequest object.
Obviously, I could just put an isAuthorized(request) call at the beginning of each API entry point (as sketched below), but that feels clunky, repetitive, and potentially error-prone. Is there a better way?
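For reference, a minimal sketch of that per-function check (the header name, expected value, and IsAuthorized helper are hypothetical, not an established pattern):

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class ApiFunctions
{
    // Hypothetical helper: check a specific HTTP header for a specific value.
    private static bool IsAuthorized(HttpRequest req) =>
        req.Headers.TryGetValue("X-Api-Token", out var value) && value == "expected-secret";

    [FunctionName("GetWidgets")]
    public static IActionResult GetWidgets(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "widgets")] HttpRequest req)
    {
        // The repetitive guard the question is trying to avoid.
        if (!IsAuthorized(req)) return new UnauthorizedResult();
        return new OkObjectResult(new[] { "widget1", "widget2" });
    }
}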

How to create serverless tests using Jest for these scenarios?

I am new to serverless and NodeJS. Could you please guide me on how I can create automated test cases for:
Lambda-to-Lambda invoke
API Gateway to Lambda invoke
DynamoDB insertion test
Please help. Thanks in advance.
If you want a full end-to-end test of a Lambda function, you will have to handle that outside the function itself.
If you use unit testing tools, you will be able to run them locally or even inside the function, but you won't have the ability to actually query the function and go through the whole process.
I'd create a second Lambda function with any unit test library, like mocha, and write functional tests that invoke the first Lambda function, through API Gateway, with a simple HTTP request package (like request).
EDIT:
Here's more clarification on each one of your points:
1) Lambda-to-Lambda invoke
If by Lambda-to-Lambda you mean you want to call another function WITHOUT using API Gateway, then I guess you're planning to use the AWS SDK to trigger the function.
If that is the case, it's like any other test: you create a test function that uses the SDK to trigger the second Lambda and then checks the result of the SDK call. It will indicate whether the invocation succeeded, and may even give you the result.
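As an illustration of that pattern, here is a minimal sketch using the AWS SDK; it's shown in C# with NUnit for consistency with the rest of this page, but the same shape applies with Jest and the JavaScript SDK (the function name and payload are made up):

using System.IO;
using System.Threading.Tasks;
using Amazon.Lambda;
using Amazon.Lambda.Model;
using NUnit.Framework;

[TestFixture]
public class LambdaInvokeTests
{
    [Test]
    public async Task InvokeTarget_ReturnsSuccess()
    {
        var client = new AmazonLambdaClient();
        var response = await client.InvokeAsync(new InvokeRequest
        {
            FunctionName = "target-function", // hypothetical function name
            Payload = "{\"input\":\"test\"}"
        });

        // 200 means the invocation itself succeeded; inspect the payload for the result.
        Assert.That(response.StatusCode, Is.EqualTo(200));
        var body = new StreamReader(response.Payload).ReadToEnd();
        Assert.That(body, Is.Not.Empty);
    }
}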
2) API Gateway to Lambda invoke
If you are looking to test whether the connection between API Gateway and Lambda works, I'd say: why bother? It's a set-it-up-once-and-use kind of deal. But if you still want to test this, it will be similar to item 1), with the exception that instead of using the SDK, you'd use the API Gateway URL.
You can use an npm package such as axios or request to make a request to that URL and check whether the content is what you expected.
I'd even say you can run the test in the Lambda function and call the very same Lambda function; no need to create separate Lambdas.
3) DynamoDB insertion
This one is the easiest: just create a test that writes something into DynamoDB. Then, to know whether the test passes, read the table and check that what you wrote is there.
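A minimal sketch of that write-then-read check, again in C# with the AWS SDK for .NET (the table and attribute names are made up; the same pattern carries over to Jest with the JavaScript SDK):

using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;
using NUnit.Framework;

[TestFixture]
public class DynamoInsertionTests
{
    [Test]
    public async Task Insert_ThenRead_RoundTrips()
    {
        var client = new AmazonDynamoDBClient();
        var key = new Dictionary<string, AttributeValue> { ["Id"] = new AttributeValue("test-1") };

        // Write an item...
        await client.PutItemAsync("MyTable", new Dictionary<string, AttributeValue>(key)
        {
            ["Payload"] = new AttributeValue("hello")
        });

        // ...then read it back to confirm the insertion happened.
        var result = await client.GetItemAsync("MyTable", key);
        Assert.That(result.Item["Payload"].S, Is.EqualTo("hello"));
    }
}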
If you're on the fence between testing libraries, I'd suggest going for mocha and chai.
If I can help you answering something more specific, let me know.

Pass parameters from a C# function app to a JavaScript function app in Azure

I need to set up an application in Azure and make two functions communicate (one written in C# and one written in JavaScript).
The C# part analyzes an XML feed, extracts the data into objects, and finally sends them to the JavaScript function as parameters.
I read that we could establish communication between the two functions using HTTP calls, but is it possible to do it with parameters?
If not, would you have any suggestions for achieving something like this properly? I'm getting started with Azure and don't have enough visibility to know what is recommended in such a situation.
Thank you for your advice.
Yes, this is absolutely possible. How you do this is up to you. If you look at the default HTTP trigger templates, you can see that they take parameters (for example, as query string parameters). You can find more examples in the HTTP and webhook recipes documentation.
You can use other trigger types for cross-function communication as well. Take a look at this documentation for related best practices: https://learn.microsoft.com/en-us/azure/azure-functions/functions-best-practices#cross-function-communication
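For instance, a rough sketch of the calling side in C#, passing an extracted value to the JavaScript function as a query string parameter (the URL, route, parameter name, and key are placeholders, not actual values from your apps):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class FeedForwarder
{
    static readonly HttpClient client = new HttpClient();

    // Send a value extracted from the XML feed to the JavaScript function's HTTP trigger.
    static async Task SendAsync(string title)
    {
        var url = "https://<JS_FUNCTION_APP>.azurewebsites.net/api/ProcessItem"
                  + "?code=<FUNCTION_KEY>&title=" + Uri.EscapeDataString(title);
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();
    }
}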

Multithreading using WCF

Hi, I am new to the whole programming thing. I have been given a task to multithread 4 stored procedures, where each thread runs asynchronously so that the user can get output quickly, and I have to do it using WCF. Can anyone help me out with this? Initially, I am trying to take each procedure and measure how long it takes to execute using ParameterizedThreadStart, but I am not sure how to go about it.
Considering you are new to the whole programming thing, you can follow these very basic steps to get things done.
Create a new WCF service.
Add 4 methods, each calling one stored procedure.
Add the parameters required by the stored procedures to the methods. For example, if your stored procedure is MySP(varchar name), then your WCF method will be MySP(string name).
Now deploy your service in IIS, a Windows service, a console app, or wherever you want.
Create a client application; again, it could be anything, a console app, a WinForms app, etc.
Add a reference to your service.
Instantiate the service class and call the Async versions of the methods. By Async I mean you'll see all four methods with Async appended; for example, you will find your MySP(string name) method as MySPAsync(string name). There will also be a MySPCompleted event; subscribe to it.
Now all of your methods run asynchronously; whenever they finish execution they'll call your subscribed handlers.
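A bare-bones sketch of such a service contract (MySP comes from the example above; the body is a placeholder rather than real data access):

using System.ServiceModel;

[ServiceContract]
public interface IStoredProcService
{
    // One operation per stored procedure; the client proxy generated from this
    // contract exposes MySPAsync(...) and a MySPCompleted event.
    [OperationContract]
    string MySP(string name);
}

public class StoredProcService : IStoredProcService
{
    public string MySP(string name)
    {
        // Placeholder: open a SqlConnection and execute the MySP stored procedure here.
        return "result for " + name;
    }
}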
I hope this helps you get started :)
There are a couple of different ways to do this. At the highest level, you can place each service request at its own service endpoint: this could mean defining endpoints for each method or, if you are hosting in IIS, placing each service in its own website. At a lower level, you could define callbacks for each method so that WCF will not block while the method calls are taking place.
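For the callback approach, a rough sketch of a duplex contract (the names are illustrative; this also requires a duplex-capable binding such as netTcpBinding or wsDualHttpBinding):

using System.ServiceModel;

public interface ISpCallback
{
    [OperationContract(IsOneWay = true)]
    void OnCompleted(string result);
}

[ServiceContract(CallbackContract = typeof(ISpCallback))]
public interface ISpService
{
    // One-way call returns immediately; the result comes back through the callback.
    [OperationContract(IsOneWay = true)]
    void RunMySP(string name);
}

public class SpService : ISpService
{
    public void RunMySP(string name)
    {
        var callback = OperationContext.Current.GetCallbackChannel<ISpCallback>();
        // Execute the stored procedure, then push the result back without blocking the client.
        callback.OnCompleted("result for " + name);
    }
}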
