How to create a folder structure inside an Azure Storage container using PowerShell

I have created the storage account along with a container in Azure using ARM templates. But now I want to create a folder structure inside the container using a PowerShell script.
For example:
Folder1>SubFolder1>
Folder2>SUbFolder2>
…… etc
Can anyone suggest how to do this?

There are no real directories inside an Azure Storage container. A container is flat: you create the container, and then blobs directly inside it.
You can, however, include '/' in the blob name to simulate a hierarchy.
E.g.:
account/container/2020/09/24/sample.txt
where "2020/09/24/sample.txt" is the blob name.
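The '/'-in-the-name convention is purely client-side: tools like the portal derive a virtual folder tree by splitting blob names on '/'. A minimal sketch of that logic in Python (the blob names are made-up examples):

```python
# Derive the virtual "folders" implied by flat blob names,
# the way Storage Explorer and the portal render them.
blob_names = [
    "2020/09/24/sample.txt",
    "2020/09/25/other.txt",
    "readme.txt",
]

virtual_folders = set()
for name in blob_names:
    parts = name.split("/")[:-1]          # drop the file name itself
    for depth in range(1, len(parts) + 1):
        virtual_folders.add("/".join(parts[:depth]) + "/")

print(sorted(virtual_folders))
# → ['2020/', '2020/09/', '2020/09/24/', '2020/09/25/']
```

Note that "readme.txt" contributes no folder at all: a blob with no '/' simply sits at the container root.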

You can't create directories inside a container.
All blobs must reside in a blob container, which is simply a logical grouping of blobs. An account can contain an unlimited number of containers, and each container can store an unlimited number of blobs. You can, however, include '/' in the blob name ("folder/1.txt") to simulate a folder structure.
This SO thread may help you in your scenario
Creating an Azure Blob Hierarchy

You can create the folder structure using the PowerShell script below. It uses the Az.Storage Data Lake Gen2 cmdlets, so the storage account needs the hierarchical namespace enabled.
# Connect to the storage account
$storageAccount = Get-AzStorageAccount -ResourceGroupName "<ResourceGroupName>" -AccountName "<storage account>"
$ctx = $storageAccount.Context
# Container (file system) name
$filesystemName = "<Container>"
# Directories to be created
$folders1 = @('folder1/folder2/folder3', 'folder4/folder5/folder6')
foreach ($folder in $folders1) {
    $dirname = ""
    foreach ($segment in $folder.Split("/")) {
        # Build the cumulative path one level at a time
        $dirname = $dirname + $segment + "/"
        $dirname
        # Check whether the directory already exists
        $present = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -ErrorAction SilentlyContinue
        if (-not $present) {
            # Create the directory
            New-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Directory
            Write-Host "The directory $dirname is created"
        }
        else {
            Write-Host "Folder named $dirname already exists"
        }
    }
}
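The inner loop's job is just to expand each path into its cumulative prefixes so every level gets created in order. That expansion, sketched in Python:

```python
# Expand 'folder1/folder2/folder3' into the cumulative directory
# paths that must exist, shallowest first.
def cumulative_dirs(path):
    segments = path.split("/")
    return ["/".join(segments[:i + 1]) + "/" for i in range(len(segments))]

print(cumulative_dirs("folder1/folder2/folder3"))
# → ['folder1/', 'folder1/folder2/', 'folder1/folder2/folder3/']
```

Creating parents before children matters on a hierarchical-namespace account, where each directory is a real item rather than just a name prefix.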

Related

Iterate the files in an ADLS2 Azure Datalake Directory given a SAS url

I'd like to download the files from an ADLS2 storage blob directory. I have only a SAS URL to the said directory, and I would like to recursively download all the files in that directory.
It is very clear how to do this given the storage credentials, and there are many examples that show how, but I couldn't find any that uses a SAS URL.
Any clues or documentation links would be much appreciated!
I reproduced this in my environment and got the expected results with the code below, which is adapted from Roger Zander's blog:
function DownloadBlob {
    param (
        [Parameter(Mandatory)]
        [string]$URL,
        [string]$Path = (Get-Location)
    )
    $uri = $URL.split('?')[0]
    $sas = $URL.split('?')[1]
    # Turn the container SAS URL into a List Blobs request
    $newurl = $uri + "?restype=container&comp=list&" + $sas
    $body = Invoke-RestMethod -Uri $newurl
    $xml = [xml]$body.Substring($body.IndexOf('<'))
    $files = $xml.ChildNodes.Blobs.Blob.Name
    foreach ($file in $files) {
        $file
        # Recreate the folder structure locally, then download each blob
        New-Item (Join-Path $Path (Split-Path $file)) -ItemType Directory -ErrorAction SilentlyContinue | Out-Null
        (New-Object System.Net.WebClient).DownloadFile($uri + "/" + $file + "?" + $sas, (Join-Path $Path $file))
    }
}
Then call the DownloadBlob function and pass it the SAS URL; the files are downloaded to the local machine with their folder structure intact.
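The trick in the function above is that a container SAS URL can be turned into a List Blobs request by inserting `restype=container&comp=list` before the SAS token. The same URL surgery sketched in Python, against a made-up SAS URL and an abridged sample of the XML that the listing endpoint returns:

```python
import xml.etree.ElementTree as ET

def build_list_url(sas_url):
    # Split the container SAS URL into base URI and SAS token,
    # then rebuild it as a List Blobs request.
    uri, sas = sas_url.split("?", 1)
    return f"{uri}?restype=container&comp=list&{sas}"

def list_blob_names(list_xml):
    # Parse the <EnumerationResults> XML the listing endpoint returns.
    root = ET.fromstring(list_xml)
    return [b.findtext("Name") for b in root.iter("Blob")]

sample = """<?xml version="1.0"?>
<EnumerationResults><Blobs>
  <Blob><Name>docs/a.txt</Name></Blob>
  <Blob><Name>docs/b.txt</Name></Blob>
</Blobs></EnumerationResults>"""

print(build_list_url("https://acct.blob.core.windows.net/mycontainer?sv=2020&sig=abc"))
print(list_blob_names(sample))
# → ['docs/a.txt', 'docs/b.txt']
```

Each name from the listing is then appended to the base URI (plus the SAS token again) to download the individual blob, which is exactly what the PowerShell function does with WebClient.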

Azure PowerShell script to compare files in the blob with local destination and download files which is not available in the destination

Could you please help me with an Azure PowerShell script to compare files in the blob container with a local destination and download the files which are not available in the destination.
I tried the following, but wasn't able to get it working.
$blobNames = Get-Content
For ($i=0; $i -lt $blobNames.Length; $i++) {
    $blob = $blobNames[$i]
    Write-Host "Downloading $blob. Please wait."
    Get-AzStorageBlobContent -Blob $blob -Container $containerName -Destination $destination -Context $context -Verbose
}
PowerShell script to compare files in the blob container with a local destination and download the files which are not available in the destination:
I've created a script and it works for me:
$ContainerName = '<containername>'
$destination_path = 'C:\Users\xxxx\Desktop\blobcheck' # path to download to
$Ctx = New-AzureStorageContext '<storageaccount>' -StorageAccountKey '<accesskey>'
$Blobs = Get-AzureStorageBlob -Container $ContainerName -Context $Ctx
# Names of the files already present in the destination
$localfiles = (Get-ChildItem -LiteralPath $destination_path).Name
For ($i=0; $i -lt $Blobs.Length; $i++) {
    if ($localfiles -contains $Blobs[$i].Name) {
        Write-Host "Already present"
    }
    else {
        $blob = $Blobs[$i].Name
        Write-Host "Downloading $blob. Please wait."
        Get-AzureStorageBlobContent -Blob $blob -Container $ContainerName -Destination $destination_path -Context $Ctx -Verbose
    }
}
Compared and downloaded the files which are not present in my local folder:
Files uploaded in my local folder (destination_path):
It prompts before overwriting if the file already exists. If the prompt is not needed, you can simply add the -Force switch to Get-AzureStorageBlobContent.
Note: Get storage access key from Azure Portal:
Go to <Storageaccount> -> Access keys
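Stripped of the Azure calls, the comparison itself is just a set difference between the blob names and the local file names; sketched in Python with made-up file names:

```python
# Determine which blobs still need downloading: a plain set
# difference between the blob listing and the local listing.
blob_names = {"report1.csv", "report2.csv", "report3.csv"}
local_names = {"report1.csv", "report3.csv"}

to_download = sorted(blob_names - local_names)
print(to_download)
# → ['report2.csv']
```

Using a set (or `-contains` against a list in PowerShell) avoids the bug of comparing a single name against a whole collection with `-eq`, which silently never matches.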

Azure File Storage deletion calculation

I am trying to delete files out of Azure File Storage that are older than 30 days (31 on the first day of the month).
I have a basic list-and-remove script that works. My main question is: how do I write the "older than" date calculation?
$resourceGroupName=""
$storageAccName=""
$fileShareName=""
$directoryPath=""
## Function to list directories and files
Function GetFiles
{
    Write-Host -ForegroundColor Green "Listing directories and files.."
    ## Get the storage account context
    $ctx = (Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccName).Context
    ## List directories
    $directories = Get-AzStorageFile -Context $ctx -ShareName $fileShareName
    ## Loop through directories
    foreach ($directory in $directories)
    {
        Write-Host -ForegroundColor Magenta " Directory Name: " $directory.Name
        $files = Get-AzStorageFile -Context $ctx -ShareName $fileShareName -Path $directory.Name | Get-AzStorageFile
        ## Loop through all files and display
        foreach ($file in $files)
        {
            Write-Host -ForegroundColor Yellow $file.Name
        }
    }
}
GetFiles
$context = ""
Remove-AzStorageFile -ShareName "name" -Path "path" -Context $context
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
We do have AZ CLI commands to delete files older than XX days.
# Delete old files block
$filelist = az storage file list -s $myshare --account-name $accountName --account-key $accountKey
$fileArray = $filelist | ConvertFrom-Json
foreach ($file in $fileArray | Where-Object {$_.properties.lastModified.DateTime -lt ((Get-Date).AddDays(-31))})
{
    $removefile = $file.name
    if ($null -ne $removefile)
    {
        Write-Host "Removing file $removefile"
        az storage file delete -s $myshare -p $removefile --account-name $accountName --account-key $accountKey
    }
}
Reference SO thread: Use Azure Cli to Delete Old Files in Azure file share - Stack Overflow
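The "older than 30 days, plus one on the first of the month" cutoff the question asks about is plain date arithmetic; a sketch in Python (the retention numbers come from the question, not from any Azure setting):

```python
from datetime import date, timedelta

def cutoff(today):
    # Keep 30 days normally, 31 on the first day of the month.
    days = 31 if today.day == 1 else 30
    return today - timedelta(days=days)

print(cutoff(date(2023, 6, 15)))  # → 2023-05-16
print(cutoff(date(2023, 6, 1)))   # → 2023-05-01
```

Any file whose last-modified timestamp is earlier than the cutoff is eligible for deletion, which is exactly the `-lt ((Get-Date).AddDays(-31))` comparison in the script above, generalized to the two-tier rule.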

Upload multiple folders from local storage to Azure as new containers with folder contents

We have Azure Blob Storage Accounts with 100s of containers. The file structure is something like below:
container_01
|
--somemedia.jpg
--anothermedia.jpg
container_02
|
--secondcontainersmedia.jpg
--andSoOn
--AndSoOnAndSoOn
My client wants to download all of the containers to local storage so that, if necessary, they can be re-uploaded to Azure. After doing some research I found this blog post. Updating the script from there to suit my needs (just moving from AzureRM to Az and using my own connection string and local path), I came up with the following script for downloading the files.
$destination_path = 'C:\Storage Dump Test'
$connection_string = '[Insert Connection String]'
$storage_account = New-AzStorageContext -ConnectionString $connection_string
$containers = Get-AzStorageContainer -Context $storage_account
Write-Host 'Starting Storage Dump...'
foreach ($container in $containers)
{
Write-Host -NoNewline "Processing: $($container.Name)..."
$container_path = $destination_path + '\' + $container.Name
if(!(Test-Path -Path $container_path ))
{
New-Item -ItemType directory -Path $container_path
}
$blobs = Get-AzStorageBlob -Container $container.Name -Context $storage_account
Write-Host -NoNewline ' Downloading files...'
foreach ($blob in $blobs)
{
$fileNameCheck = $container_path + '\' + $blob.Name
if(!(Test-Path $fileNameCheck ))
{
Get-AzStorageBlobContent `
-Container $container.Name -Blob $blob.Name -Destination $container_path `
-Context $storage_account
}
}
Write-Host ' Done.'
}
Write-Host 'Download complete.'
So now I have a directory on my local storage with hundreds of folders containing media items. I need to create a PS script (or find some other way) to basically do the opposite-- take all the folders in that directory, create containers using the names of the folders, and upload the items within each folder to the appropriate container.
How should I start going about this?
You'd have a lot more success, much quicker, using azcopy instead of the Az cmdlets. To copy:
azcopy copy '<local-file-path>' 'https://<storage-account-name>.<blob|dfs>.core.windows.net/<container-name>/<blob-name>'
It can also create containers:
azcopy make 'https://mystorageaccount.blob.core.windows.net/mycontainer'
azcopy can download an entire container without you having to specify each file. Use --recursive
See: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
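For the reverse direction the question asks about (one container per local folder), the same two azcopy commands can be driven from a loop. A sketch in Python that only builds the command lines to run (the account name and the `C:\Storage Dump Test` root are placeholders from the question; note that container names must be lowercase letters, digits and hyphens):

```python
def upload_commands(folder_names, account):
    # One 'azcopy make' plus one recursive 'azcopy copy' per local
    # folder; the folder name becomes the container name.
    cmds = []
    for name in folder_names:
        url = f"https://{account}.blob.core.windows.net/{name}"
        cmds.append(f"azcopy make '{url}'")
        cmds.append(f"azcopy copy 'C:\\Storage Dump Test\\{name}\\*' '{url}' --recursive")
    return cmds

for cmd in upload_commands(["container_01", "container_02"], "myaccount"):
    print(cmd)
```

In practice you would enumerate the folder names with `Get-ChildItem -Directory` (or `os.listdir`) and run the emitted commands with a SAS token or after `azcopy login`.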

How to rename application pool that already has application assigned to it?

I have an application pool with a lot of applications assigned to it, and IIS won't let me rename it.
Besides deleting it and creating a new application pool, is there any way to give my application pool a new name? I don't want to go and reassign every application in it.
Assign the applications to another pool, rename the pool you wanted renamed, then re-assign the applications back to it.
IIS doesn't support any other options.
This was the simplest way that I could work it out, although I can't believe this isn't easier.
Import-Module WebAdministration
$oldName = "OldAppPool"
$newName = "NewAppPool"
if (-not (Test-Path IIS:\AppPools\TempPool)) {
    New-WebAppPool TempPool
}
# Move every app off the old pool onto the temp pool
foreach ($site in Get-ChildItem IIS:\Sites) {
    $apps = $site | Get-ChildItem | Where-Object { $_.ApplicationPool -eq $oldName }
    foreach ($app in $apps) {
        $path = ("IIS:\Sites\{0}\{1}" -f $site.name, $app.name)
        $path
        Set-ItemProperty $path applicationPool TempPool
    }
}
# Rename the now-empty pool
Set-ItemProperty "IIS:\AppPools\$oldName" -Name name -Value $newName
# Move the apps back onto the renamed pool
foreach ($site in Get-ChildItem IIS:\Sites) {
    $apps = $site | Get-ChildItem | Where-Object { $_.ApplicationPool -eq "TempPool" }
    foreach ($app in $apps) {
        $path = ("IIS:\Sites\{0}\{1}" -f $site.name, $app.name)
        $path
        Set-ItemProperty $path applicationPool $newName
    }
}
Remove-WebAppPool TempPool
No, there isn't.
Either put up with the name, or create a new App Pool and assign the applications one-by-one.
If you need to repeat it on multiple servers, you can even automate it with ADSI and JavaScript or VBScript:
http://msdn.microsoft.com/en-us/library/ms525389(v=vs.90).aspx
I've created a similar script to automate this job.
It is a bit different from the other answer here:
It works for websites in addition to web applications;
It works for all pools: with and without assigned applications.
Powershell script:
Import-Module WebAdministration

Function Rename-AppPool([String]$oldName="", [String]$newName="") {
    if ($oldName -eq "") {
        Write-Warning "Parameter 'oldName' was not provided."
        return
    }
    if ($newName -eq "") {
        Write-Warning "Parameter 'newName' was not provided."
        return
    }
    if (-not (Test-Path "IIS:\AppPools\$oldName")) {
        Write-Warning "There is no pool with name '$oldName' to rename. Operation stopped."
        return
    }
    if (Test-Path "IIS:\AppPools\$newName") {
        Write-Warning "Pool with name '$newName' already exists. Operation stopped."
        return
    }
    Write-Output "Renaming app pool '$oldName' to '$newName'"
    # Collect every site and application that uses the old pool
    $pathsOfPools = New-Object System.Collections.ArrayList
    $listOfSites = Get-ChildItem "IIS:\Sites"
    foreach ($site in $listOfSites) {
        if ($site.applicationPool -eq $oldName) {
            $path = ("IIS:\Sites\{0}" -f $site.name)
            $pathsOfPools.Add($path) | Out-Null
        }
        $apps = $site | Get-ChildItem
        foreach ($app in $apps) {
            if ($app.applicationPool -eq $oldName) {
                $path = ("IIS:\Sites\{0}\{1}" -f $site.name, $app.name)
                $pathsOfPools.Add($path) | Out-Null
            }
        }
    }
    $tempGuid = [Guid]::NewGuid()
    $tempName = $tempGuid.Guid
    if ($pathsOfPools.Count -gt 0) {
        $pathsOfPools
        New-WebAppPool $tempName | Out-Null
        Write-Output "Temp app pool '$tempName' has been created"
        Write-Output "Changing apps to Temp pool"
        foreach ($path in $pathsOfPools) {
            Set-ItemProperty $path applicationPool $tempName
        }
    }
    Set-ItemProperty "IIS:\AppPools\$oldName" -Name name -Value $newName
    Write-Output "Application pool name has been changed"
    if ($pathsOfPools.Count -gt 0) {
        Write-Output "Changing apps to New pool"
        foreach ($path in $pathsOfPools) {
            Set-ItemProperty $path applicationPool $newName
        }
        Remove-WebAppPool $tempName
        Write-Output "Temp pool has been removed"
    }
}

Rename-AppPool "OldName" "NewBetterName"
Rename-AppPool "OldName" "NewBetterName"
Yes, there is an option. Create a dummy app pool or make use of the DefaultAppPool. Associate the existing site with the DefaultAppPool, then stop the original app pool and rename it.
Finally, associate the site back with the renamed app pool.
