How can I get a list of directories in my container?
I can use Get-AzureStorageBlob to get all the blobs and filter them down to the distinct name prefixes, but that might be slow with millions of blobs.
Is there a proper way of achieving this in PowerShell?
There's no concept of directories, only containers and blobs. A blob name may contain delimiters that look like directories, and listings may be filtered on those prefixes.
If you choose to store millions of blobs in a container, then you'll be searching through millions of blob names, even with delimiter filtering, whether you use PowerShell, an SDK, or direct REST calls.
As for a "proper" way: there is no proper way. Only you can decide how you organize your containers and blobs, and where (or whether) you store metadata for more efficient searching (such as in a database).
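That said, if you do want the delimiter filtering done by the service rather than client-side, the blob REST API's hierarchical listing is exposed through the .NET client that the storage context wraps. A minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage client that ships with the Azure.Storage module (the method signature comes from that library; adjust if your module version differs). With useFlatBlobListing set to $false, the service groups names on the "/" delimiter and returns one CloudBlobDirectory entry per top-level prefix instead of every blob:
$context   = New-AzureStorageContext -ConnectionString '[XXXXX]'
$container = (Get-AzureStorageContainer -Name '[XXXXX]' -Context $context).CloudBlobContainer

$token = $null
do {
    # Arguments: prefix, useFlatBlobListing, blobListingDetails, maxResults, token, options, operationContext
    $segment = $container.ListBlobsSegmented('', $false, 'None', 5000, $token, $null, $null)
    $token   = $segment.ContinuationToken
    $segment.Results |
        Where-Object { $_ -is [Microsoft.WindowsAzure.Storage.Blob.CloudBlobDirectory] } |
        ForEach-Object { $_.Prefix }
} while ($null -ne $token)
The client then only receives one entry per top-level prefix (plus any root-level blobs), rather than millions of individual names.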
The other answer is correct that there is nothing out of the box, since folders don't really exist; there are only blob names that contain a folder-like path.
Using a regex in PowerShell you can find the top-level folders. As mentioned, this may be slow if there are millions of items in your account, but for a small number it may work for you.
$context = New-AzureStorageContext -ConnectionString '[XXXXX]'
$containerName = '[XXXXX]'
$blobs = Get-AzureStorageBlob -Container $containerName -Context $context

$folders = New-Object System.Collections.Generic.List[System.Object]
foreach ($blob in $blobs)
{
    # Only consider blob names exactly one level deep, e.g. "folder/file.txt"
    if ($blob.Name -match '^[^\/]*\/[^\/]*$')
    {
        # Take everything before the first "/" as the folder name
        $folder = $blob.Name.Substring(0, $blob.Name.IndexOf("/"))
        if (!$folders.Contains($folder))
        {
            $folders.Add($folder)
        }
    }
}

foreach ($folder in $folders)
{
    Write-Host $folder
}
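If the blobs can be nested more than one level deep, note that the regex above only matches names with exactly one slash. A shorter variant (same idea, still enumerating every blob, so the same performance caveat applies) is to take the first path segment of every delimited name and de-duplicate it:
Get-AzureStorageBlob -Container $containerName -Context $context |
    Where-Object { $_.Name -match '/' } |
    ForEach-Object { ($_.Name -split '/')[0] } |
    Sort-Object -Unique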
Related
My PowerShell is VERY rusty, so please bear with me. I've been tasked with bulk tagging Azure resources based on CSV data, specifically Azure VMs. The CSV has 3 headers (VMName, TagName, TagValue). I've tried to automate this task with PowerShell and no matter how I format the code, I keep falling short. Can someone help me clear this up, or perhaps point me in the direction of a known working PowerShell script that will help me accomplish this?
Connect-AzAccount -TenantId ''
Set-AzContext -Subscription ''
$Import = Import-Csv -Path '...\Tags.csv' |
    ForEach-Object {
        $_.psobject.properties |
            ForEach-Object { Set-Variable -Name $_.Name -Value $_.Tags }
        foreach ($Name in $Import) {
            $Tag = $_.Name.Tags
            $Tag.Add($_.Tags)
            Set-AzResource -ResourceId $Name.ResourceId -Tag -Force
        }
    }
I've tried a hash table and a fully customized script. It either only applies the Tag Name and not the Tag Value, and it needs both, or it shoots off error after error. Microsoft seems to want bulk tagging at the subscription and resource group level, so it's a bit difficult to get this right specific to resources. In the end, I want the script to read the server name in Row 1 Column 1, find that resource in Azure, and create the Tag using the Tag Name (Row 1 Column 2) and Tag Value (Row 1 Column 3).
After reproducing this on my end, I was able to get it working using New-AzTag. Below is the complete script that worked for me.
$Import = Import-Csv -Path 'Tags.csv'
foreach ($I in $Import) {
    $Resource = Get-AzResource -Name $I.VMName
    $TagName  = $I.TagName
    $TagValue = $I.TagValue
    New-AzTag -ResourceId $Resource.ResourceId -Tag @{"$TagName" = "$TagValue"}
}
RESULTS:
(Screenshots: the Tags.csv contents, and the tag showing on the resource in the portal.)
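One caveat, as far as I recall: New-AzTag replaces the entire tag set on the resource, so if the same VM appears on several CSV rows, or already has tags you want to keep, only the last write survives. Update-AzTag with -Operation Merge appends instead. A hedged variant that also groups the CSV rows per VM:
$Import = Import-Csv -Path 'Tags.csv'
foreach ($group in ($Import | Group-Object VMName)) {
    $resource = Get-AzResource -Name $group.Name -ResourceType 'Microsoft.Compute/virtualMachines'
    # Collect every TagName/TagValue row for this VM into one hashtable.
    $tags = @{}
    foreach ($row in $group.Group) { $tags[$row.TagName] = $row.TagValue }
    # Merge keeps whatever tags are already on the VM and adds/updates these.
    Update-AzTag -ResourceId $resource.ResourceId -Tag $tags -Operation Merge
}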
Is there a way to fetch all routing details from every subscription, with the route table name, subscription, next hop type, and address prefix?
I have tried 'Get-AzRouteTable -ResourceGroupName "" -Name "prod" | Get-AzRouteConfig | Export-Csv azureroutetable2.csv', but that only returns results from one specific environment. Is there a better way to do this?
Regards
Devbrat
According to the help for Get-AzRouteTable, it doesn't require a value for ResourceGroupName, so we should just be able to loop through your available contexts. If you have a large Azure network infrastructure that is logically organised by resource group, you may want to consider looping through resource groups too.
That being said, you can do something like this. It will create a CSV for each subscription in your current session. Change the . at the beginning of the path value to a full path if you would like.
$Contexts = Get-AzContext -ListAvailable
foreach ($Context in $Contexts) {
    [void](Set-AzContext -SubscriptionId $Context.Subscription.Id)
    $RouteConfig = Get-AzRouteTable -Name "prod" | Get-AzRouteConfig
    # If you are using PowerShell 5.1, add '-NoTypeInformation' to the end of this command.
    $RouteConfig | Export-Csv -Path ".\$($Context.Subscription.Name)_Routes.csv" -Encoding utf8 -NoClobber
}
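If you would rather end up with a single CSV containing the columns from the question (subscription, route table name, route, next hop type, address prefix), here is a sketch along the same lines; it drops the -Name filter so every route table is included, and the property names are taken from the PSRoute objects that Get-AzRouteConfig returns:
$report = foreach ($Context in Get-AzContext -ListAvailable) {
    [void](Set-AzContext -SubscriptionId $Context.Subscription.Id)
    foreach ($rt in Get-AzRouteTable) {
        foreach ($route in ($rt | Get-AzRouteConfig)) {
            [pscustomobject]@{
                Subscription  = $Context.Subscription.Name
                RouteTable    = $rt.Name
                RouteName     = $route.Name
                AddressPrefix = $route.AddressPrefix
                NextHopType   = $route.NextHopType
                NextHopIp     = $route.NextHopIpAddress
            }
        }
    }
}
$report | Export-Csv -Path .\AllRoutes.csv -Encoding utf8 -NoTypeInformation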
Can anyone suggest a reasonably practical and efficient way to load 1.2 million test items into a SharePoint Online list?
Background: We've decided to build a new application on top of SharePoint Online. Other application architecture options have all proved non-viable for various reasons. The application will use several SharePoint lists for persistence, one of which will be large, about 1.2 million items at peak. (Yes, we're planning ways to handle the 5000 item view limit.) To test viability of the architecture (including those view limit tactics) we need to create 1.2M test items in a list. Nothing we've tried has been practical:
Tried making POST calls to the REST API with 5 concurrent threads so it would finish in a reasonable time. This fails after a while with an HTTP 429 "Too many requests".
Tried uploading a spreadsheet with 1.2M rows. This fails at around 130K entries each time, and I don't see a practical way to upload or append to an existing list, or to append items from one list to another existing list.
Tried running a workflow (SharePoint 2013 variety, if that matters). This works but runs far too slowly single-threaded, and I'm hesitant to try multiple concurrent workflows because this is a shared environment and trashing the server would be way not good.
Thanks in advance for any pointers!
You could try using PnP PowerShell to add more than 1 million items.
$username = "amos#contoso.onmicrosoft.com"
$password = "password"
$cred = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $userName, $(convertto-securestring $Password -asplaintext -force)
Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/dev -Credentials $cred
$ListName ="testn"
for($i=0;$i -lt 1000001;$i++){
Add-PnPListItem -List $ListName -Values #{"Title" = "test";}
}
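Adding items one request at a time will still take a long time for 1.2M rows. If you can use the newer cross-platform PnP.PowerShell module, it supports request batching, which cuts the number of round trips dramatically. A hedged sketch (cmdlet names from PnP.PowerShell; authentication details depend on your tenant setup), flushing every 100 queued adds:
Connect-PnPOnline -Url https://contoso.sharepoint.com/sites/dev -Interactive
$ListName = "testn"

$batch = New-PnPBatch
for ($i = 0; $i -lt 1200000; $i++) {
    Add-PnPListItem -List $ListName -Values @{ "Title" = "test $i" } -Batch $batch
    if (($i + 1) % 100 -eq 0) {
        Invoke-PnPBatch -Batch $batch   # send the queued adds in one request
        $batch = New-PnPBatch
    }
}
Invoke-PnPBatch -Batch $batch           # flush the remainder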
The fastest way to load and process items from SPO using PnP PowerShell goes something like the following. The idea is not to collect the items into a variable at all, but to process them directly, page by page.
Get-PnPListItem -List "{ListName}" -Fields "Field1","Field2","Fieldn" -PageSize 5000 | % {$i=0}{
    $item = $_
    # Do your stuff with $item
    Write-Host $item.Id
}
Do not move the opening { to the next line. The two script blocks after % bind positionally to ForEach-Object's -Begin and -Process parameters; put the second block on a new line and PowerShell treats it as a separate statement. I know it looks weird, but it has to stay where it is.
I have "billing reader" access to several hundred subscriptions in an EA.
I'm trying to get a list of virtual machines and their sizes across all subscriptions.
So currently when I run a "Get-AzureRMSubscription" it shows me all the subscriptions (hundreds of them), but I'm not sure how to actually run a script against all of those subscriptions.
Would be great to get a "Get-AzureRMVM" across them all
Any suggestions?
Thanks in advance!
You can possibly do something like this:
$azureSubs = Get-AzureRMSubscription
$azureSubs | ForEach-Object {Select-AzureRMSubscription $_ | Out-Null; Get-AzureRMVM -WarningAction SilentlyContinue}
You are essentially setting an array variable to hold all your Azure subscriptions and piping it to the ForEach-Object cmdlet to iterate over every object in the array. Inside the loop, Select-AzureRMSubscription switches the context and Get-AzureRMVM lists all VMs in that subscription.
This is definitely not optimized for performance and there might be better solutions out there, but at least you can run it and forget it.
The reason for the Out-Null and -WarningAction is to suppress the outputs you do not need.
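Since the question is specifically about VMs and their sizes, here is a sketch that builds on the same loop and collects a flat report you can export; the property names are from the AzureRM VM and subscription objects, so adjust them if your module version differs:
$report = foreach ($sub in Get-AzureRMSubscription) {
    Select-AzureRMSubscription $sub | Out-Null
    Get-AzureRMVM -WarningAction SilentlyContinue | ForEach-Object {
        [pscustomobject]@{
            Subscription  = $sub.Name            # SubscriptionName in older module versions
            ResourceGroup = $_.ResourceGroupName
            VMName        = $_.Name
            Size          = $_.HardwareProfile.VmSize
        }
    }
}
$report | Export-Csv -Path .\AllVMs.csv -NoTypeInformation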
You didn't ask, but for classic resources we run the following script on a regular basis and store its output in a SQL database.
$subscriptions = Get-AzureSubscription
foreach ($sub in $subscriptions)
{
    $sub | Select-AzureSubscription
    Get-AzureService | % {
        Get-AzureDeployment -ServiceName $_.ServiceName
    } | % {
        New-Object -TypeName 'PSObject' -Property @{ 'ServiceName' = $_.ServiceName; 'Addresses' = $_.VirtualIPs.Address }
    } | sort Addresses | ft
}
% is ForEach-Object and ft is Format-Table, although some kind souls may come along and edit this, making it harder to reuse. You can add or remove properties in the New-Object property list to tailor your output as needed. Try it in one subscription to refine your needs, then create a script to make it easy to reuse.
We recently released Azure Resource Graph to support these types of searches across multiple subscriptions. See documentation here https://learn.microsoft.com/en-us/azure/governance/resource-graph/overview
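Resource Graph queries run across every subscription your account can read in a single call, so there is no per-subscription loop. A sketch using the Az.ResourceGraph module (Install-Module Az.ResourceGraph); the Resources table and type filter are documented, while the projected size property path is my assumption based on the VM resource schema:
Search-AzGraph -First 1000 -Query @"
Resources
| where type =~ 'microsoft.compute/virtualmachines'
| project name, resourceGroup, subscriptionId, vmSize = tostring(properties.hardwareProfile.vmSize)
"@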
I've got an existing set of Azure storage tables, one per client, that hold events in a multi-tenant cloud system.
For example, there might be 3 tables holding sign-in information:
ClientASignins
ClientBSignins
ClientCSignins
Is there a way to dynamically loop through these as part of either a copy operation or in something like a Pig script?
Or is there another way to achieve this result?
Many thanks!
If you keep track of these tables in another location, like Azure Storage, you could use PowerShell to loop through each of them and create a Hive table over each. For example:
foreach ($t in $tableList) {
    $hiveQuery = "CREATE EXTERNAL TABLE $($t.tableName)(IntValue int)
STORED BY 'com.microsoft.hadoop.azure.hive.AzureTableHiveStorageHandler'
TBLPROPERTIES(
""azure.table.name""=""$($t.tableName)"",
""azure.table.account.uri""=""http://$storageAccount.table.core.windows.net"",
""azure.table.storage.key""=""$((Get-AzureStorageKey $storageAccount).Primary)"");"

    # Upload the generated Hive query and run it as an HDInsight job
    Out-File -FilePath .\HiveCreateTable.q -InputObject $hiveQuery -Encoding ascii
    $hiveQueryBlob = Set-AzureStorageBlobContent -File .\HiveCreateTable.q -Blob "queries/HiveCreateTable.q" `
        -Container $clusterContainer.Name -Force

    $createTableJobDefinition = New-AzureHDInsightHiveJobDefinition -QueryFile /queries/HiveCreateTable.q
    $job = Start-AzureHDInsightJob -JobDefinition $createTableJobDefinition -Cluster $cluster.Name
    Wait-AzureHDInsightJob -Job $job

    # INSERT YOUR OPERATIONS FOR EACH TABLE HERE
}
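If you do not want to track the table names anywhere else, one hedged way to build $tableList is to enumerate the tables straight from the storage account and keep the ones matching your naming convention (classic Azure.Storage cmdlets, to match the rest of the script; each element gets a tableName property because the loop above reads $t.tableName):
$storageKey     = (Get-AzureStorageKey $storageAccount).Primary
$storageContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$tableList = Get-AzureStorageTable -Context $storageContext |
    Where-Object { $_.Name -like '*Signins' } |
    ForEach-Object { [pscustomobject]@{ tableName = $_.Name } }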
Research:
http://blogs.msdn.com/b/mostlytrue/archive/2014/04/04/analyzing-azure-table-storage-data-with-hdinsight.aspx
How can manage Azure Table with Powershell?
In the end I opted for a couple of Azure Data Factory custom activities written in C#, and now my workflow is:
Custom activity: aggregate the data for the current slice into a single blob file for analysis in Pig.
HDInsight: Analyse with Pig
Custom activity: disperse the data to the array of target tables from blob storage to table storage.
I did this to keep the pipelines as simple as possible and remove the need for any duplication of pipelines/scripts.
References:
Use Custom Activities In Azure Data Factory pipeline
HttpDataDownloader Sample