How to log and audit Azure Functions code and configuration changes

How can code and configuration changes to Azure Functions be logged and audited? I'm trying to create ways to track and guard against malicious insiders making unauthorized changes to the functionality of Azure Functions. In AWS I can create a CloudTrail trail that logs all write events to Lambda functions and writes them to an S3 bucket; the events are also visible in the Event history section of the CloudTrail console. However, I can't seem to find a way to do something similar for Azure Functions, especially in Azure Stack. I've scoured the Activity Log and Azure Monitor to no avail. Any help or ideas would be greatly appreciated. Thanks!

Azure has a new feature called Change Analysis
https://aka.ms/changeanalysis
If you are logged in you probably can go directly here
https://portal.azure.com/?feature.customportal=false#blade/Microsoft_Azure_ChangeAnalysis/ChangeAnalysisBladeV2
This feature is also incorporated into the Activity Log, where you can view the changes that were made. The only caveat is that it covers just the last 14 days; Microsoft is working on allowing export to Log Analytics so the history can go back further.
You can also create alerts on the Activity Log. The updates you are referring to should each produce an Activity Log event:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/activity-log-alerts
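As a concrete starting point, such an Activity Log alert can be wired up from the CLI. The sketch below is illustrative only: the resource group, action group, subscription ID, and function app name are all placeholders, and it assumes the Azure CLI is installed and logged in.

```shell
# Placeholder names throughout; adjust to your environment.
# 1. An action group that emails the people who should be notified.
az monitor action-group create \
  --resource-group my-rg \
  --name secops-ag \
  --short-name secops \
  --action email secops secops@example.com

# 2. An Activity Log alert that fires on successful write operations
#    against a specific function app.
az monitor activity-log alert create \
  --resource-group my-rg \
  --name functionapp-write-alert \
  --scope "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Web/sites/my-function-app" \
  --condition "category=Administrative and operationName=Microsoft.Web/sites/write" \
  --action-group secops-ag
```

The alert scope can also be widened to a resource group or the whole subscription if you want to catch changes to any function app.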

Related

How to monitor the command lines that are executed in gcp at the logs level?

Currently I want monitoring so I can know who is executing what at the gcloud level, for example whether someone runs:
gcloud iam service-accounts list.
The objective is to have a control in place in case an attacker or another person manages to get in and enumerate the service accounts. I want to be able to view this through the Logs Explorer and then create a sink towards the SIEM.
Can this be done?
Every time someone (or something, e.g. Terraform) makes changes to your GCP environment or performs some sensitive access, audit records are automatically recorded and are immutable, meaning they cannot be deleted or otherwise hidden. These audit records are written to GCP Cloud Logging and can be viewed and reviewed using the Cloud Logging explorer tools. Should you need it, you can also set up alerts or other triggers that fire automatically when certain log records (audit activities) are detected. The full documentation for GCP Audit Logs can be found here:
https://cloud.google.com/logging/docs/audit
Rather than try and repeat that information, let me encourage you to review that article in depth.
For the specific question on gcloud, it helps to realize that everything in GCP happens through an API. This means that when you execute a gcloud command (anywhere), an API request to perform the task is sent to GCP, and it is at that point that GCP writes the audit records into the log.
As far as sinking the audit trail written to Cloud Logging to a SIEM, that is absolutely possible. My recommendation is to split the overall puzzle into parts: for part 1, prove to yourself that the audit records you care about are being written to Cloud Logging; for part 2, prove to yourself that any and all Cloud Logging records can (with filters) be exported out of Cloud Logging to an external SIEM or to GCP Cloud Storage for long-term storage.
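To make both parts concrete, here is a rough sketch with gcloud. The project and topic names are placeholders, and note that `ListServiceAccounts` is an ADMIN_READ Data Access audit log, which is not enabled by default and must first be turned on for the IAM service.

```shell
# Placeholder project/topic names; assumes gcloud is authenticated.
# Part 1: check whether the audit record you care about is being written.
gcloud logging read \
  'protoPayload.methodName="google.iam.admin.v1.ListServiceAccounts"' \
  --project my-project \
  --limit 10 \
  --format json

# Part 2: forward matching audit entries to a Pub/Sub topic a SIEM can consume.
gcloud logging sinks create siem-sink \
  pubsub.googleapis.com/projects/my-project/topics/siem-topic \
  --log-filter 'logName:"cloudaudit.googleapis.com"'
```

The same filter syntax works in the Logs Explorer UI, so you can validate it there before creating the sink.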

What is the best way to document and keep a change log with azure?

We're looking for a lightweight solution that keeps track of our reasons for, and execution of, changes made to our Azure tenant. No approval process is necessary, but we would like a tracking system that lets us easily and quickly catch up on the history and existing state.
By default, everything you do to an Azure resource is recorded in the Azure Activity Log; you can learn more about it from here. But I would also recommend enabling diagnostic logging to your default Log Analytics workspace, which then becomes part of your Azure Monitor Logs. Learn more about diagnostic logging from here.
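A diagnostic setting like that can be created from the CLI as well. In this sketch the resource and workspace IDs are placeholders, and the log category shown (AppServiceHTTPLogs) is just an example; the valid categories depend on the resource type.

```shell
# Placeholder IDs and category; assumes the Azure CLI is logged in.
az monitor diagnostic-settings create \
  --name send-to-law \
  --resource "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Web/sites/my-app" \
  --workspace "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.OperationalInsights/workspaces/my-workspace" \
  --logs '[{"category": "AppServiceHTTPLogs", "enabled": true}]'
```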

Where to find throttled events of Logic App in Azure Portal?

I wanted to monitor Azure Logic Apps with the help of Azure Monitor alerts. In the alerts, I came across a metric, Run Throttled Events, which has been showing some numbers in recent days. But I couldn't find the actual events anywhere to resolve the issue. Is it possible to view the actual run-throttled events in the Azure Portal?
You will need to set up diagnostic logging for Logic Apps; see here.
When you are done with the setup and an initial run-through of the logs, and you want to look at more advanced queries over this log data, go here.
Specifically on throttling, you need to see this. Also take a look at the limits set for Logic Apps from here as well.
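Once diagnostics are flowing into a Log Analytics workspace, a query along these lines can surface the individual run events. This is only a sketch: the workspace GUID is a placeholder, and the column names (`status_s`, `resource_workflowName_s`) are assumptions based on the common AzureDiagnostics schema, so verify them against your own data.

```shell
# Placeholder workspace GUID; column names are assumptions to verify.
az monitor log-analytics query \
  --workspace <workspace-guid> \
  --analytics-query 'AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where Category == "WorkflowRuntime"
| project TimeGenerated, resource_workflowName_s, status_s
| order by TimeGenerated desc'
```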

Azure deleted & created resources

I want to have a control in Azure over newly created and deleted resources.
I need a query to know "who" created or deleted a resource in Azure and "when".
Is this possible? How can I write this query?
Whenever a resource is created or deleted, information about that operation is stored in Azure Activity Logs. You should be able to find the information by querying that.
Another alternative would be to make use of Azure Event Grid and subscribe to subscription-level events. You can subscribe to the Microsoft.Resources.ResourceWriteSuccess (resource creation/update) and Microsoft.Resources.ResourceDeleteSuccess (resource deletion) events and take action on them in near real time.
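A subscription-level Event Grid subscription for those two event types might look like the following sketch; the endpoint URL and subscription ID are placeholders.

```shell
# Placeholder endpoint and subscription ID.
az eventgrid event-subscription create \
  --name resource-audit-sub \
  --source-resource-id "/subscriptions/<sub-id>" \
  --endpoint "https://example.com/audit-hook" \
  --included-event-types Microsoft.Resources.ResourceWriteSuccess Microsoft.Resources.ResourceDeleteSuccess
```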
Within the Azure Portal, you can view these types of events from the past 90 days in the Activity Log blade.
For access to events occurring more than 90 days in the past, you need to pre-emptively set up log archival as detailed in the Export the Azure Activity Log article.
If you are planning to use the Activity Log export feature, please make sure you use the new diagnostic settings feature on the Azure subscription to export Activity Logs. It offers multiple improvements over the older mechanisms, such as log profiles or the Activity Log solution (Log Analytics):
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/activity-log-collect
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings-template
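For a quick ad-hoc answer to the "who" and "when" question, the Activity Log can also be queried from the CLI. This is a sketch: the JMESPath filter matches on lowercase 'write'/'delete' in the operation name, which may need adjusting for the exact casing of the operations in your log.

```shell
# Lists successful write/delete operations from the last day, with caller and time.
az monitor activity-log list \
  --offset 1d \
  --status Succeeded \
  --query "[?contains(operationName.value, 'write') || contains(operationName.value, 'delete')].{who: caller, when: eventTimestamp, what: operationName.value, resource: resourceId}" \
  --output table
```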

Can I set the Azure WebJob Dashboard Status myself?

I'm using the Azure WebJobs dashboard for monitoring my jobs. I'm not happy with how far I have to drill into the interface to determine what's happening. I'd like to leverage the "Status" field on the WebJob details page to show when a particular invocation needs attention, and to mark an invocation as a failure in cases where I consider it one, even if it didn't blow up.
I've searched through the Azure WebJobs docs and the features of the Azure WebJobs SDK Extensions package with no luck (though I don't doubt I might have missed it). Is manually setting this field possible?
As far as I know, the Azure WebJobs Dashboard does not let us set the Status field ourselves. If you'd like to display WebJob run details without clicking into the interface, you could call the WebJobs API to get the job run history, retrieve output or error information from the logs by requesting output_url or error_url, and then create a custom dashboard populated with the output and error details.
No, you can't set it yourself.
The Kudu APIs may not give you enough detail for individual function instances.
Consider putting a feature request on https://github.com/Azure/azure-webjobs-sdk/
There has been some more investment in exposing a logging API directly over the storage account.
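For reference, pulling run history out of the Kudu WebJobs API can be sketched like this. The app name, job name, and deployment credentials are placeholders; each returned run entry carries the output_url and error_url fields mentioned above.

```shell
# Placeholder site, job, and credentials; Kudu authenticates with the
# site's deployment credentials over basic auth.
curl -u '$my-app:DEPLOYMENT_PASSWORD' \
  "https://my-app.scm.azurewebsites.net/api/triggeredwebjobs/my-job/history"
```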
