Azure Data Factory Log Analytics Access

We have a Log Analytics workspace 'XYZ' in one subscription, to which logs from all Azure services are sent, including from all other subscriptions.
We have our Data Factory solution in another subscription, where one user has Owner access. Its logs are also stored in 'XYZ'.
The challenge we are facing is that this user wants to access the Data Factory logs, but we cannot grant access because the Log Analytics workspace also contains logs from other services, such as Backup.
Is there a way to grant this user access only to the Data Factory logs?

You can grant users and groups only the amount of access they need to work with monitoring data in a workspace by using role-based access control (RBAC). For more detail, you can refer to this documentation.
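One concrete way to apply this, assuming the workspace's access control mode is set to "Use resource or workspace permissions", is to grant the user a reader role scoped to the Data Factory resource rather than to the shared workspace; resource-context queries then return only the logs belonging to that resource. A minimal sketch with the Azure SDK for Python, where every ID and name is a placeholder:

```python
# Sketch only: assign the built-in Reader role on the Data Factory resource,
# so the user can run resource-context log queries against it without having
# any rights on the shared Log Analytics workspace. All IDs are placeholders.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<data-factory-subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

# Scope the assignment to the Data Factory resource, not the workspace.
scope = (
    f"/subscriptions/{subscription_id}"
    "/resourceGroups/<adf-resource-group>"
    "/providers/Microsoft.DataFactory/factories/<adf-name>"
)

# acdd72a7-3385-48ef-bd42-f606fba81ae7 is the built-in Reader role definition.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/acdd72a7-3385-48ef-bd42-f606fba81ae7"
)

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment names are GUIDs
    {
        "role_definition_id": role_definition_id,
        "principal_id": "<user-object-id>",
        "principal_type": "User",
    },
)
```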

Related

Unable to link storage account to Log Analytics workspace

We are using Fluent Bit to send application logs to an Azure Log Analytics workspace. The application log does appear in the workspace as a table under the Logs blade, in the Custom Logs category. So far so good.
Because the maximum retention period of a Log Analytics workspace is limited to 730 days, I thought linking a storage account for the type Custom logs & IIS logs under Linked storage accounts would solve the problem for me. My understanding is that once a storage account is linked for Custom logs & IIS logs, all custom logs will be written into the nominated storage account instead of the default storage account that comes with the creation of the Log Analytics workspace. Is this understanding correct?
Secondly, after clicking the Custom logs & IIS logs item and selecting a storage account from the pop-up blade, the Azure portal reported the message "Successfully linked storage account". However, the Linked storage accounts view still reports "No linked storage accounts".
Browsing the target storage account, no logs appear to have been written to it.
Update 1
Storage account network configuration.
Update 2
The answer is accepted as it is technically correct. However, a few steps/details are missing from the documentation. In order to map a customer-owned storage account to a Log Analytics workspace, one must build resources to match the following diagram:
Create an AMPLS (Azure Monitor Private Link Scope) resource.
Link the AMPLS resource to your LA workspace.
Create a private endpoint on the target VNet for the AMPLS resource.
Create the storage account.
Create private endpoints (blob type) on the target VNet.
Link the storage account to the LA workspace.
We need to follow a few prerequisites before linking the storage account to the workspace.
The storage account should be in the same region as the Log Analytics workspace.
You need to grant permissions for other services to access the storage account.
Allow Azure Monitor to access the storage account. If you chose to allow only selected networks to access your storage account, you should select the exception "Allow trusted Microsoft services to access this storage account".
For the rest of the configuration information, refer to MS Docs.
By following the above documentation, I was able to link the storage account successfully.
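For reference, the same link can also be created programmatically. A minimal sketch using the azure-mgmt-loganalytics package, assuming its LinkedStorageAccounts operations, with all resource names as placeholders (the AMPLS/private endpoint setup from the update above must already be in place):

```python
# Sketch: link a customer-owned storage account to a Log Analytics workspace
# for the CustomLogs data source type (the 'Custom logs & IIS logs' item in
# the portal). Names and IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient
from azure.mgmt.loganalytics.models import LinkedStorageAccountsResource

subscription_id = "<subscription-id>"
client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

storage_account_id = (
    f"/subscriptions/{subscription_id}/resourceGroups/<storage-rg>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client.linked_storage_accounts.create_or_update(
    resource_group_name="<workspace-rg>",
    workspace_name="<workspace-name>",
    data_source_type="CustomLogs",
    parameters=LinkedStorageAccountsResource(storage_account_ids=[storage_account_id]),
)
```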

Does an Azure subscription owner have access to an Azure database in it?

This article says that an Azure subscription owner has access to all the resources in the subscription. However, to get access to an Azure database, one must either be a user in the database or be part of the Azure AD admin group.
Can a subscription owner access the database regardless of the SQL security? If so, how?
The article you refer to gives a very high-level overview of the RBAC roles provided in Azure.
It is important to distinguish the built-in roles that give access to the resources (the management plane) from those that give access to the resource data (the data plane).
Many built-in roles do give users access to data, for example for Storage and Key Vault.
As for databases, it all depends on the type of database engine you refer to. Each has its own particularities in terms of roles and permissions.
Access to SQL Database is managed right in the SQL server. This link provides additional details on how this is done: SQL Database
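To make the data-plane point concrete for SQL Database: even a subscription Owner's Azure AD token is only accepted if a matching user exists inside the database (or the principal is the server's Azure AD admin). A rough sketch in Python using pyodbc with an Azure AD access token; the server and database names are placeholders:

```python
# Sketch: connect to Azure SQL with an Azure AD token (the data plane).
# Owner on the subscription (the management plane) is not enough by itself;
# the signed-in principal must exist as a user in the database or be the
# server's Azure AD admin. Server/database names are placeholders.
import struct

import pyodbc
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://database.windows.net/.default").token
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)

SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC attribute for passing an access token
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<database>",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)
print(conn.execute("SELECT SUSER_SNAME()").fetchval())  # who the database sees
```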
Other modern database engines, such as Cosmos DB, come with their own Azure built-in roles (just like Key Vault or Storage). See this link for a better idea of the roles and permissions assigned to each role: Role-based access control in Azure Cosmos DB

How to monitor read/write activities on Azure Blob Storage

I need to figure out how to log/retrieve information about who (which Azure AD user) has read or written blobs in our Azure Blob Storage.
I know you can turn on logging at the storage account level using this:
I can see in the logs the different API calls that have been performed on the blobs, but when I opened some of the blobs myself via the Azure portal, I could not see that activity recorded in the logs. Any ideas on how to monitor this? I need it for auditing purposes.
When you enable Storage Analytics in the portal, you will have a $logs container in your blob storage holding the storage logs.
When you are using Azure AD authentication, you need to configure version 2.0 logs and use the UserPrincipalName column to identify the user, and parse the JSON in the AuthorizationDetail.action column to identify the action the user performed on storage, e.g. Microsoft.Storage/storageAccounts/blobServices/containers/read for listing the blobs in a container.
You will not capture OAuth/Azure AD authenticated requests with log format 1.0.
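As a rough illustration of pulling those entries out of $logs for auditing, here is a sketch with the azure-storage-blob package; the connection string is a placeholder, and the exact column positions should be checked against the Storage Analytics log format reference:

```python
# Sketch: scan the $logs container and print version 2.0 entries for
# OAuth (Azure AD) authenticated requests. The connection string is a
# placeholder; field positions follow the Storage Analytics log format.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
logs = service.get_container_client("$logs")

# Storage Analytics writes hourly, semicolon-delimited log blobs under $logs/blob/...
for blob in logs.list_blobs(name_starts_with="blob/"):
    text = logs.download_blob(blob.name).readall().decode("utf-8")
    for line in text.splitlines():
        fields = line.split(";")
        # fields[0] is the log format version; only 2.0 entries carry the
        # user principal name for Azure AD (OAuth) authenticated requests.
        if fields and fields[0] == "2.0" and "OAuth" in line:
            print(line)
```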
On Azure Storage UserVoice there is also a request for integration with Log Analytics to simplify log monitoring; the private preview should start this month.

Issue in Azure pipeline using Azure Data Factory to pull data from SQL Server to Azure Blob

The client 'abc#abc.com' with object id 'abcabcabcabcabc' does not
have authorization to perform action
'Microsoft.Resources/deployments/write' over scope
'/subscriptions/abcabcabc/resourcegroups/abc-01-east/providers/Microsoft.Resources/deployments/publishing-123123123123'
I was trying to create a pipeline using Azure Data Factory to pull data from SQL Server to Azure Blob, but I am facing the above issue when trying to use my integration runtime, which already exists in my Azure portal.
At present I have the Data Factory Contributor role assigned to me; what other roles do I need to avoid this issue?
I had a similar issue as a contributor on an ADF. With this role, you seem to be able to open the ADF UI, but the moment you try to publish anything, you get the above error. Making me a Data Factory Contributor on that ADF alone didn't help.
What did help was making me a Data Factory Contributor at the resource group level. So go to the resource group that contains the ADF, go to IAM and add yourself as a Data Factory Contributor.
I also noticed that you need to close the Data Factory UI before the IAM changes take effect.
Azure's roles are a bit of a mystery to me, so it would be useful if someone could explain how and why this works.
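For completeness, the same assignment can be scripted instead of done through the IAM blade. A sketch with the Azure SDK for Python, looking the built-in role up by name; the names are placeholders, and whoever runs it needs permission to create role assignments (e.g. Owner or User Access Administrator on the resource group):

```python
# Sketch: grant 'Data Factory Contributor' at the resource group scope,
# which is what resolved the deployments/write error above. Placeholders only.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)

scope = f"/subscriptions/{subscription_id}/resourceGroups/<adf-resource-group>"

# Look up the built-in role by name instead of hard-coding its GUID.
role = next(iter(
    client.role_definitions.list(scope, filter="roleName eq 'Data Factory Contributor'")
))

client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment names are GUIDs
    {
        "role_definition_id": role.id,
        "principal_id": "<user-object-id>",
        "principal_type": "User",
    },
)
```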
Steps
1 - Register an enterprise app in your Azure Active Directory
2 - Create a key (client secret) for the app and save the value somewhere
3 - Go to your Azure SQL Database through the management console and run:
CREATE USER [your application name] FROM EXTERNAL PROVIDER;
4 - Change the linked service authentication method to Service Principal and use the application ID and key in the form
For more information:
https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-sql-database
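A sketch of what step 4 can look like when done through the azure-mgmt-datafactory package rather than the ADF UI, wiring the app's ID and key into an Azure SQL linked service; every ID, name and connection string here is a placeholder:

```python
# Sketch: create/update an Azure SQL linked service that authenticates with
# the service principal registered in steps 1-3. All values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
    SecureString,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

linked_service = AzureSqlDatabaseLinkedService(
    connection_string=SecureString(
        value="Data Source=<server>.database.windows.net;Initial Catalog=<database>;"
    ),
    service_principal_id="<application-client-id>",               # from step 1
    service_principal_key=SecureString(value="<app-key-value>"),  # from step 2
    tenant="<tenant-id>",
)

client.linked_services.create_or_update(
    "<adf-resource-group>",
    "<data-factory-name>",
    "AzureSqlDatabaseLS",
    LinkedServiceResource(properties=linked_service),
)
```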

Moving data from one Azure Data Lake to another ADLS that belongs to a different tenant using Data Factory

I am moving data from an Azure Data Lake to another Azure Data Lake Store that belongs to another subscription (tenant) using Data Factory.
I am getting an "invalid credentials" error when deploying the linked service for the sink Data Lake.
Is what I am doing actually possible? If it is, kindly point me to some reference.
You can do this. For each data lake you need a separate service principal, each in the same subscription as the data lake. You can then create the two separate connections to the data lakes using each service principal.
You need both the service principal application ID and a key for it.
Note that the service principal not only needs access rights to the folder that you want to copy from or to, but also to the root folder of the data lake. I do not know if it also requires access rights to all directories between the root and the source/destination directory.
Note also that, regardless of whether you are reading or writing data to this data lake, the service principal needs execute rights as well.
This isn't a complete answer; it just deals with the service principal side of configuring a connection, which is what gave me the most trouble.
Leave a comment if you feel more information would be useful.
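For the connection side, here is a rough sketch of checking one of the service principals against its Data Lake Store (Gen1) account with the azure-datalake-store package; tenant, application ID, key and store name are placeholders, and the same check is repeated with the other tenant's service principal for the second store:

```python
# Sketch: authenticate to an ADLS Gen1 account with its service principal and
# verify it can actually list the folder to be copied. Placeholders throughout.
from azure.datalake.store import core, lib

creds = lib.auth(
    tenant_id="<tenant-id-of-this-data-lake>",
    client_id="<service-principal-application-id>",
    client_secret="<service-principal-key>",
    resource="https://datalake.azure.net/",
)

adls = core.AzureDLFileSystem(creds, store_name="<data-lake-store-name>")

# The principal needs execute on the folders leading down from the root,
# plus read/write on the folder being copied from or to.
print(adls.ls("/folder/to/copy"))
```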
If you have not read it already, please refer to https://learn.microsoft.com/en-us/azure/data-factory/data-factory-azure-datalake-connector and search for "Service principal authentication (recommended)". If you have, I am presuming the error is because you have not granted the identity the appropriate permissions on the ADLS folders. Without the exact error you are seeing, I cannot say whether it is the source or the sink. In short, can you provide more details of your context first and then the error you are seeing?
Thanks,
Sachin Sheth
Program Manager,
Azure Data Lake.
