Grant access to Azure Data Lake Gen2 using a parameterized script

We are trying to grant read/write access to many folders in our Azure Data Lake Gen2 containers. Although we can do this through the UI, it's quite tedious and has to be repeated for every environment. Has anyone found a better way, using PowerShell, to automate or at least parameterize this process of granting access to Azure Data Lake Gen2 containers and avoid granting access manually?
Unfortunately I couldn't get this to work from the following link or other documentation, as it covers Gen1, although it's very similar to what I need to do for Gen2.
https://www.sqlchick.com/entries/2018/3/17/assigning-data-permissions-for-azure-data-lake-store-part-3

According to my test, we can use PowerShell to manage Azure Data Lake Gen2 permissions. For more details, please refer to the documentation.
Install the required modules
Install-Module PowerShellGet -Repository PSGallery -Force
Install-Module Az.Storage -Repository PSGallery -RequiredVersion 1.9.1-preview -AllowPrerelease -AllowClobber -Force
Besides, please note that installing these modules requires:
.NET Framework 4.7.2 or later
PowerShell 5.1 or later
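A quick sanity check of both prerequisites before installing (release number 461808 corresponds to .NET Framework 4.7.2):
# PowerShell version (needs 5.1 or higher)
$PSVersionTable.PSVersion
# Installed .NET Framework release (461808 or higher means 4.7.2 or later)
Get-ItemPropertyValue 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release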
Script
Connect-AzAccount
# Resource group and storage account that host the Data Lake Gen2 container
$groupName=""
$accountName=""
$account = Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
# Container (file system) and directory whose ACL we want to change
$filesystemName = "test"
$dirname = "template/"
$id = "<the Object ID of the user, group, or service principal>"
# Read the current ACL, append an rw- entry for the principal, and write it back
$dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$acl = New-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $id -Permission "rw-" -InputObject $dir.ACL
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Acl $acl
# Verify the result
$dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$dir.ACL

Thanks Jim Xu for providing the script above. I'm just complementing the code with the following items:
Get all folders from the container
Assign the ACL to all folders
Propagate the ACL to all subfolders
$groupName="resource group name"
$accountName="storage account name"
$account= Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
$filesystemName = "container name"
$id = (Get-AzADGroup -DisplayName '<type user / group name here>').Id
$items = Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $filesystemName
foreach ($item in $items) {
    $dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/"
    # -DefaultScope marks the entry as a default ACL, so newly created children inherit it
    $acl = New-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $id -Permission "rwx" -InputObject $dir.ACL -DefaultScope
    # Update ACL on the folder itself
    Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
    # Propagate ACL to all existing child items
    Set-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
}
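To address the original ask about parameterizing this across environments, one option is to save the logic above as a script, say Grant-LakeAcl.ps1 (the file and parameter names here are hypothetical), and invoke it once per environment. A sketch, not tested:
param(
    [Parameter(Mandatory)][string]$GroupName,      # resource group name
    [Parameter(Mandatory)][string]$AccountName,    # storage account name
    [Parameter(Mandatory)][string]$FilesystemName, # container name
    [Parameter(Mandatory)][string]$AadGroupName    # AAD group to grant rwx
)
$account = Get-AzStorageAccount -ResourceGroupName $GroupName -Name $AccountName
$ctx = $account.Context
$id = (Get-AzADGroup -DisplayName $AadGroupName).Id
foreach ($item in Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $FilesystemName) {
    $dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $FilesystemName -Path "$($item.Path)/"
    $acl = New-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $id -Permission "rwx" -InputObject $dir.ACL -DefaultScope
    Update-AzDataLakeGen2Item -Context $ctx -FileSystem $FilesystemName -Path "$($item.Path)/" -Acl $acl
    Set-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $FilesystemName -Path "$($item.Path)/" -Acl $acl
}
Each environment then becomes a single call, e.g.:
.\Grant-LakeAcl.ps1 -GroupName rg-dev -AccountName stdev -FilesystemName data -AadGroupName 'Data Engineers'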

Related

Want all the details of blob storage in one script

Script:
$storageAccountKey = (Get-AzStorageAccountKey -ResourceGroupName "RD-RDIPCloudMigration-AppResources-devtest" -AccountName "rdipstoragegen2").Value[0]
$ctx = New-AzStorageContext -StorageAccountName "rdipstoragegen2" -StorageAccountKey $storageAccountKey
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" | Export-Csv "test.csv" -NoTypeInformation
Using this I am only able to get details at one level; I have to run the script again at each level to get the details of the files inside each folder.
Kindly help me find one script that gets all the details, including the files inside the folders.
There is a -Recurse switch on the Get-AzDataLakeGen2ChildItem cmdlet that:
"Indicates if will recursively get the Child Item. The default is false."
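Applied to the script above, a single recursive call should therefore cover every level in one pass (untested):
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem "rdipdata" `
    -Path "eval/raw/rdipclinicaltrial/data/warehouse/integrated/clntrl_reference_use.db" `
    -Recurse | Export-Csv "test.csv" -NoTypeInformation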

Upload a file to Azure Data Lake Storage using PowerShell

The code below works when run manually, but what I'm trying to do is find a way to run it in an automated way through a job, logging in to my Azure account without exposing the account key in the code.
Any advice?
Connect-AzAccount
Select-AzSubscription -SubscriptionId <subId>
# Either authenticate with the signed-in account...
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -UseConnectedAccount
# ...or with the account key
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -StorageAccountKey <accountKey>
$filesystemName = "DataFiles"
New-AzStorageContainer -Context $ctx -Name $filesystemName
# Create a directory
$dirname = "my-directory/"
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
# Upload a file to the directory
$localSrcFile = "upload2.txt"
$destPath = $dirname + (Get-Item $localSrcFile).Name
New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $destPath -Source $localSrcFile -Force
I think it would be better to put your PowerShell script in a DevOps pipeline, in Azure Pipelines for instance. A service principal would be needed: you configure the service connection in Azure DevOps, and your service principal should be Contributor on the resource group where the Data Lake Storage account is located.
See link: Manage service connections
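As a sketch of what the automated, key-free login could look like (untested; the environment variable names holding the service principal's credentials are hypothetical and would come from the job's secret store or pipeline variables):
# Build a credential from the service principal's client ID and secret
$secret = ConvertTo-SecureString $env:SP_CLIENT_SECRET -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential($env:SP_CLIENT_ID, $secret)
Connect-AzAccount -ServicePrincipal -Tenant $env:SP_TENANT_ID -Credential $cred
# -UseConnectedAccount relies on the signed-in principal's RBAC roles,
# so no storage account key appears anywhere in the script
$ctx = New-AzStorageContext -StorageAccountName <Accounttest> -UseConnectedAccount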

File uploaded with <no name> using Set-AzStorageBlobContent

I am trying to upload a file from my local machine to Azure Blob storage using PowerShell.
Code
$date = Get-Date -Format "dd-MM-yyyy"
$Ctx = New-AzStorageContext -StorageAccountName abc -StorageAccountKey 'xxxxxx=='
Set-AzStorageBlobContent -Container jmeter -Blob "Result/Load1/$date/" -File "C:\Result.jtl" -Context $Ctx -Force
The hierarchy was created successfully inside the container jmeter, e.g. Result/Load1/13-03-2020, but the file was uploaded with no name.
As mentioned in the comment, the blob name ends with a slash, so the file is uploaded as a blob with an empty name; it should be -Blob "Load1/$date/Result.jtl".
Set-AzStorageBlobContent -Container test1 -Blob "Load1/$date/Result.jtl" -File "C:\Users\joyw\Desktop\Result.jtl" -Context $Ctx -Force

Upload data from Azure Automation Account Powershell

I have some code which pulls the results of a Kusto query out of Azure Monitor, and I need to upload the data to a blob storage account for long-term retention.
I can pull out the data and display it on the Azure Automation screen when I run it through the test pane, but it doesn't upload to blob storage.
I think the error is here:
$SearchResult
$StorageAccountName = Get-AutomationVariable -Name "AccessKey"
$StorageAccountKey = Get-AutomationVariable -Name "StorageAccName"
foreach ($sr in $SearchResult){
    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
        -StorageAccountKey $StorageAccountKey
    $ContainerName = "Data"
    New-AzureStorageContainer -Name $ContainerName -Context $ctx -Permission Blob
    $BlobName = "$sr"
    Set-AzureStorageBlobContent -Container $ContainerName -Blob $BlobName `
        -Context $ctx
}
The full script is below
https://pastebin.com/embed_iframe/RyLJZVKW
Basically it authenticates using some stored variables and then runs a query which returns the results below (everything up to that part works), but then I'd like to upload the data to blob storage.
The output example can be found at:
https://pastebin.com/embed_iframe/fEF6NsnK
If there's a better way of getting a Kusto query result stored straight to blob storage, I'd be happy to consider that. Thanks everyone :)
Your Set-AzureStorageBlobContent call seems to be missing the -File parameter. You probably get a message complaining about this in the job Error stream.
Assuming that you want to send the data stored in the $sr variable, something like this should work (not tested):
$tempFile = New-TemporaryFile
$sr | Out-File -FilePath $tempFile.FullName
Set-AzureStorageBlobContent ... -File $tempFile.FullName
Remove-Item $tempFile
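An alternative sketch (also untested): instead of one blob per result row, export the whole result set to a single CSV and upload it once. Note that container names must be lowercase, so "Data" would be rejected anyway:
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
New-AzureStorageContainer -Name "data" -Context $ctx -Permission Blob -ErrorAction SilentlyContinue  # ignore if it already exists
$tempFile = New-TemporaryFile
$SearchResult | Export-Csv -Path $tempFile.FullName -NoTypeInformation
Set-AzureStorageBlobContent -Container "data" -Blob "kusto-$(Get-Date -Format 'yyyy-MM-dd').csv" -File $tempFile.FullName -Context $ctx
Remove-Item $tempFile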

Transfer multiple files and folders to Azure storage

I have a storage account where I host a static website. I want to create a task that transfers my build files into that storage.
$context = New-AzureStorageContext -StorageAccountName $env:prSourceBranchName -StorageAccountKey "e4Nt0i1pcsYWnzLKw4PRdwu+************/qMj7fXyJK6lS4YTlPCOdbFlEG2LN9g2i5/yQ=="
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
When I run the code, I can see the files that need to be uploaded, but when I check my $web container (created by the static website feature), it's empty.
Check out the second example at https://learn.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0.
Can someone explain why nothing is happening?
I want to do this > https://learn.microsoft.com/en-us/cli/azure/storage/blob?view=azure-cli-latest#az-storage-blob-upload-batch but through AzureRM.
Remove ls from your last line of code, the one used to upload to blob storage:
Change:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
To:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
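Separately, since the goal is the static website, the files need to end up in the $web container, and depending on the module version a plain pipe may not preserve the folder structure in the blob names. A sketch (untested) that computes each blob name relative to the build root:
$root = "$env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build"
Get-ChildItem -Path $root -File -Recurse | ForEach-Object {
    # Blob name = path relative to the build root, with URL-style separators
    $blobName = $_.FullName.Substring($root.Length).TrimStart('\', '/') -replace '\\', '/'
    # '$web' is single-quoted so PowerShell doesn't expand it as a variable
    Set-AzureStorageBlobContent -File $_.FullName -Container '$web' -Blob $blobName -Context $context -Force
}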
