How can I get an email when my APIs go down? - node.js

I have a few APIs written in Node.js using Express, which are hosted on AWS EC2. What can I do to get an email whenever my APIs go down? Can I use an AWS service like CloudWatch to send an email when my API crashes?

Consider using CloudWatch Synthetics to create canaries: configurable scripts that run on a schedule to monitor your endpoints and APIs. Canaries follow the same routes and perform the same actions as a customer, which makes it possible for you to continually verify your customer experience even when you don't have any customer traffic on your applications. By using canaries, you can discover issues before your customers do.
You can monitor canary events with EventBridge and send custom notifications, including email.
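For an Express API, a canary can be as simple as requesting a health-check endpoint. Here is a minimal sketch of one (the hostname, path, and step name are placeholders, and the Synthetics/SyntheticsLogger modules only exist inside the CloudWatch Synthetics Node.js runtime):

```javascript
const synthetics = require('Synthetics');
const log = require('SyntheticsLogger');

const checkApi = async function () {
  const requestOptions = {
    hostname: 'api.example.com', // placeholder: your EC2 host or load balancer
    method: 'GET',
    path: '/health',             // placeholder: any cheap endpoint on your Express app
    port: 443,
    protocol: 'https:',
  };
  // Fails the canary if the request errors, times out, or returns a bad status,
  // which EventBridge can pick up and turn into an email notification.
  await synthetics.executeHttpStep('check-health', requestOptions);
  log.info('API responded successfully');
};

exports.handler = async () => {
  return await checkApi();
};
```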

I'm just a beginner, but isn't SNS the solution you would want for this situation?
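SNS can cover the delivery part: a topic with an email subscription, and something publishing to it when the API is unhealthy. As a rough sketch (topic ARN and region are placeholders), the Node process itself can publish before it dies; note this only catches crashes the process can still report, so it complements rather than replaces external monitoring like the canary above:

```javascript
const { SNSClient, PublishCommand } = require('@aws-sdk/client-sns');

const sns = new SNSClient({ region: 'us-east-1' }); // placeholder region

process.on('uncaughtException', async (err) => {
  try {
    // Registering this handler keeps the process alive just long enough to publish.
    await sns.send(new PublishCommand({
      TopicArn: 'arn:aws:sns:us-east-1:123456789012:api-down-alerts', // placeholder topic
      Subject: 'API crashed',
      Message: `Uncaught exception: ${err.stack}`,
    }));
  } finally {
    process.exit(1);
  }
});
```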

Related

Real time email feature using AWS services

I am trying to use AWS services to implement a real-time email-sending feature for one of my projects. For example, someone uses my app to schedule a reminder, and then an email is sent to them at (or near) the time they scheduled.
I know of AWS services such as CloudWatch rules (cron) and DynamoDB Streams (TTL-based), but those are not a perfect fit for such a feature. Can anyone please suggest a better way to implement it?
Any type of guidance is acceptable.
-- Thanks in advance.
Imagine your service at huge scale. At such scale, there are likely to be multiple messages going off every minute. Therefore, you could create:
A database that stores the reminder times (this could be DynamoDB or an Amazon RDS database)
An AWS Lambda function that is configured to trigger every minute
When the Lambda function is triggered, it should check the database to retrieve all reminders that should be sent for this particular minute. It can then use Amazon Simple Email Service (SES) to send emails.
If the number of emails to be sent is really big, then rather than having the Lambda function call SES in series, it could put a message into an Amazon SQS queue for each email to be sent. The SQS queue could then trigger another Lambda function that sends the email via SES. This allows the emails to be sent in parallel.
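A minimal sketch of that fan-out step (the table name, key schema, and queue URL environment variable are assumptions), using the AWS SDK for JavaScript v3:

```javascript
const { DynamoDBClient } = require('@aws-sdk/client-dynamodb');
const { DynamoDBDocumentClient, QueryCommand } = require('@aws-sdk/lib-dynamodb');
const { SQSClient, SendMessageCommand } = require('@aws-sdk/client-sqs');

const db = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const sqs = new SQSClient({});

// Runs once a minute (e.g. from an EventBridge/CloudWatch Events schedule).
exports.handler = async () => {
  // Truncate the current time to the minute, e.g. "2024-01-01T09:30"
  const minute = new Date().toISOString().slice(0, 16);

  // Assumes a "Reminders" table keyed (or indexed) on the minute the reminder is due
  const { Items = [] } = await db.send(new QueryCommand({
    TableName: 'Reminders',
    KeyConditionExpression: 'dueMinute = :m',
    ExpressionAttributeValues: { ':m': minute },
  }));

  // One queue message per reminder; a second Lambda consumes the queue and calls SES,
  // so the emails go out in parallel rather than in series.
  await Promise.all(Items.map((reminder) =>
    sqs.send(new SendMessageCommand({
      QueueUrl: process.env.EMAIL_QUEUE_URL,
      MessageBody: JSON.stringify(reminder),
    }))
  ));
};
```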

Microservices in Azure

I understand that Microservices is about independent loosely coupled services. I have read https://en.wikipedia.org/wiki/Microservices.
When it comes to Azure, I understand there are many components like Azure Service Fabric and AKS, and there is also the option of deploying containers within Azure VMs using Docker or other containerization tools. However, since microservices are about developing atomic, individually scalable services, can this also be achieved by deploying each service as an Azure Web API app within an App Service plan and configuring auto-scale based on performance metrics? (Though each API app may not be individually scalable, they can still be individually manageable in terms of deployment, configuration, etc.)
Can someone please suggest if this thought process is correct?
Microservices aren't a platform or technology, so if you can make small, independently deployable services then they are microservices. Sure, some tech helps, but it depends on your situation.
If you only need a few services you probably don't need anything complex. Make sure services are well modeled, own their own data, and ideally have a good monitoring and deployment pipeline set up. Design for service failure where possible.
Do you need to scale each part independently? Ideally you should be able to, but do the services have very different requirements? You could have many small App Service plans, but that comes at the cost of unused resources, so split when you need to.
This question, and of course the answers, are going to be opinion based, but generally when thinking in terms of microservices, don't think in terms of things like loads of APIs and VMs. Instead think in terms of: when I upload an image, it needs to be resized and the table updated with a URL for the thumbnail; or when record XXX is updated in the database, run XXX to create a report or update Azure Search. Each service just knows how to do a single thing, e.g. resize an image.
Now one could say: I have a system, a repository library, and some function libraries. When an image is posted, I upload it, then call this, then that, etc.
With microservices, you would instead just add the image to a queue. Create an Azure Function with a queue trigger that resizes the image and saves both the large version and the thumbnail to storage. It would then either update the database or, in true microservice fashion, add a message to another queue with the new info; another function would watch that queue and insert into the database.
You can use the DB queue from anything. You can use the blob queue from anything. Your main API does not care how images are handled. You could change your functions one day to save to Dropbox instead of Azure Blob Storage. All really easy, with no rebuild of the API, because the API does not care.
A good example I use it for is email and SMS. My systems don't know how to send an email or an SMS; they only know how to add to a queue. My microservices, SendEmail and SendSMS, do know how to do it (there's a sketch of this after this answer), and I can change how and with whom I send that content really easily. Tomorrow I could change from Twilio to SendGrid without ever telling the API that I've done it.
On a more complex note: I have an approval flow. At the moment an approval sends an email or SMS to either the user or the admin, and that can change over time. So I have an SMS service, an email service, and an approval service. When an approval happens, it just adds a config message to the queue; the rest is done by a Logic App that knows to send an email to XXX and an SMS to XXX and then update the database. My API is just a POST that creates a queue message.
Basically, what I am saying here is: to get started, maybe when porting an existing app, start with the workflow stuff, like sending an email, resizing an image, creating a report, creating a PDF, emailing 50 subscribers, etc. Take all that code out and put it into its own microservice that just knows how to do one thing. Then, as you grow in confidence, create a workflow from all of these services with Logic Apps and let Azure take care of the rest; that's what they want to do.
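As an illustration of that SendEmail idea, here is a minimal sketch of a queue-triggered Azure Function in Node.js (the queue name, connection setting, SendGrid usage, and message shape are all assumptions): the API only enqueues a message, and this one service owns how email actually gets sent.

```javascript
const { app } = require('@azure/functions');
const sgMail = require('@sendgrid/mail');

sgMail.setApiKey(process.env.SENDGRID_API_KEY);

// Azure Functions Node.js v4 programming model: runs once per message on the queue.
app.storageQueue('SendEmail', {
  queueName: 'send-email',           // assumed queue name the API writes to
  connection: 'AzureWebJobsStorage', // app setting holding the storage connection string
  handler: async (message, context) => {
    // The API only enqueued { to, subject, body }; only this service knows the provider.
    await sgMail.send({
      to: message.to,
      from: 'noreply@example.com',   // placeholder sender
      subject: message.subject,
      text: message.body,
    });
    context.log(`Email sent to ${message.to}`);
  },
});
```

Swapping SendGrid for Twilio, SES, or anything else then only touches this function; the API never changes.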

Is it not possible to schedule aws-sns?

I am developing an iOS and Node.js application, and I want to use push notifications. I finally decided to use Amazon's SNS service.
I want to send a push notification to all registered devices at a specific time, one at a time. I do not know if Amazon's SNS service can do that. Is it possible? If not, would I be better off using Firebase?
This can be done using Amazon CloudWatch Events: CloudWatch Events can invoke an SNS topic on a user-defined schedule.
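A minimal sketch of wiring that up with the AWS SDK for JavaScript v3 (the rule name, schedule, and topic ARN are placeholders); note the SNS topic's access policy must also allow events.amazonaws.com to publish to it:

```javascript
const {
  EventBridgeClient,
  PutRuleCommand,
  PutTargetsCommand,
} = require('@aws-sdk/client-eventbridge'); // CloudWatch Events is exposed via EventBridge

const events = new EventBridgeClient({});

async function scheduleDailyPush(topicArn) {
  // Create (or update) a rule that fires every day at 09:00 UTC
  await events.send(new PutRuleCommand({
    Name: 'daily-push-0900',
    ScheduleExpression: 'cron(0 9 * * ? *)',
  }));

  // Point the rule at the SNS topic that fans out the push notifications
  await events.send(new PutTargetsCommand({
    Rule: 'daily-push-0900',
    Targets: [{ Id: 'sns-push', Arn: topicArn }],
  }));
}

scheduleDailyPush('arn:aws:sns:us-east-1:123456789012:MyPushTopic') // placeholder ARN
  .catch(console.error);
```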

How can I detect and get an email notification of traffic in Azure API Management

I have a question regarding Azure API Management again :).
I am using API Management as an API gateway that talks HTTPS to the Azure Storage REST API directly.
Is there any way I can get an email notification when there are massive numbers of requests or high response latency?
Thanks for reading :)
You can configure alert notifications either in the portal or via the REST API or .NET SDK to monitor for specific Azure Storage Metrics that you want.
See https://azure.microsoft.com/en-us/documentation/articles/insights-receive-alert-notifications/ for more details.
For massive requests, you might want to consider using "TotalRequests" or "TotalBillableRequests" in a specific time period.
For high response latency, you can track "AverageE2ELatency" or "AverageServerLatency" in a specific time period.
See https://azure.microsoft.com/en-us/documentation/articles/storage-monitoring-diagnosing-troubleshooting/#monitoring-performance for more details on these specific metrics and how they relate to performance monitoring.
Hope this helps.
Sriprasad's answer makes sense for configuration from the Storage side. From the API Management side, you cannot currently set a notification on any event other than the built-in ones (subscription requests, new subscriptions, application gallery requests, new issues/comments, approaching user subscription quota limits).
You can use the log-to-eventhub policy to log a message to an event hub for every request and consume it in a custom or third-party solution like App Insights or Runscope to fire an alert.
Refer
https://azure.microsoft.com/en-us/documentation/articles/api-management-log-to-eventhub-sample/
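If you go the log-to-eventhub route, the consumer that fires the alert is up to you. A minimal sketch of one in Node.js using the @azure/event-hubs SDK (the event hub settings, latency threshold, and the shape of the logged message are assumptions; the actual payload depends on what your policy emits):

```javascript
const { EventHubConsumerClient } = require('@azure/event-hubs');

// Assumed: connection string and hub name come from environment settings, and the
// APIM policy logs JSON like { "latencyMs": 1234, "path": "/container/blob" }.
const client = new EventHubConsumerClient(
  '$Default',                             // consumer group
  process.env.EVENTHUB_CONNECTION_STRING,
  process.env.EVENTHUB_NAME
);

const LATENCY_THRESHOLD_MS = 2000;

client.subscribe({
  processEvents: async (events) => {
    for (const event of events) {
      const entry = event.body; // shape depends on your log-to-eventhub policy
      if (entry.latencyMs > LATENCY_THRESHOLD_MS) {
        // Replace with your own notifier (SendGrid, a Logic App, SES, ...)
        console.warn(`Slow request on ${entry.path}: ${entry.latencyMs} ms`);
      }
    }
  },
  processError: async (err) => {
    console.error(err);
  },
});
```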
If your requirement is to get reports/metrics from API Management, you can use the management REST APIs for APIM.
https://msdn.microsoft.com/en-us/library/dn781421.aspx
Specifically, you might want to look at reportByAPI (which gives you useful metrics in the response, like callcounts and apiTimeAvg), based on which you can set up alerts/email notifications.
https://msdn.microsoft.com/en-us/library/dn781421.aspx#ReportByAPI

Multiple Azure queues

I am trying to work out the best implementation/approach to the following problem.
I have customers using our WinForms application, which has a plugin that connects to an Azure queue at pre-configured intervals to check if there are invoices awaiting the connecting customer. If there are, the plugin downloads the invoices into the customer's local DB. There are lots of customers using this application, so all of them will connect to the queue, and they will all need to download their own invoices.
How I thought of implementing this was by having a named queue for each customer (the customer GUID identifies the queue), so all customers use the same account name/key to connect. The problem with this is that each customer has the account name/key in the DLL, which they can retrieve via reflection (smart customers). Is there a way I can encrypt the name/key, or is there a better solution somebody can suggest?
I think the only secure option is to stand up a web service somewhere that acts as a front-end to the queues. Otherwise, as you said, you're leaking the account key to the client, which would allow any customer to read/change/delete any data in the account.
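A minimal sketch of that front end, in Node.js/Express to match the rest of this page (the authentication middleware, queue naming convention, and message shape are all assumptions): the account key stays on the server, and each customer can only read the queue tied to their own identity.

```javascript
const express = require('express');
const { QueueClient } = require('@azure/storage-queue');

const app = express();

// Hypothetical middleware that verifies the customer and sets req.customerId
const authenticateCustomer = require('./auth');

app.get('/invoices', authenticateCustomer, async (req, res) => {
  // Queue-per-customer convention from the question: queue name derived from the customer GUID
  const queueClient = new QueueClient(
    process.env.STORAGE_CONNECTION_STRING, // account key never leaves the server
    `invoices-${req.customerId}`
  );

  const { receivedMessageItems } = await queueClient.receiveMessages({ numberOfMessages: 32 });

  const invoices = [];
  for (const msg of receivedMessageItems) {
    invoices.push(JSON.parse(msg.messageText)); // assumes invoices are queued as JSON
    await queueClient.deleteMessage(msg.messageId, msg.popReceipt);
  }

  res.json(invoices);
});

app.listen(3000);
```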
