List of all subscriptions - SharePoint

How can I check all reports' subscriptions on SharePoint 2010?
I only know how to check the subscriptions on a specific report:

Unfortunately, there is no way for you to do this from the GUI. You are going to have to break out PowerShell to get this information.
NOTE: I haven't tested this code, kinda writing it from the hip, but the gist should help you out:
$spWeb = Get-SPWeb <Site reports are contained in>
$spRepList = $spWeb.Lists["<List containing reports Name>"]
# Create the web service proxy once, outside the loop
$reportServiceProxy = New-WebServiceProxy -URI <url to the reporting web service> -Namespace <Namespace of the service> -UseDefaultCredentials
$subscriptionList = @()
# Get all files in the list
$spFileList = $spRepList.RootFolder.Files
foreach($spFile in $spFileList)
{
    # Determine if the file is a report or a regular document
    if($spFile.Url -like "*.rdl")
    {
        $subscriptionList += $reportServiceProxy.ListSubscriptions($spWeb.Url)
        # From here you can write to a file or write to the screen. I will let you decide
        $subscriptionList | select Path, Report, Description, Owner, SubscriptionID, LastExecuted, Status | where {$_.Path -eq $spFile.Url}
    }
}
Hope this helps.
Dave

Related

Query Multi-Domain for SamAccountName using PowerShell

I'm trying to populate an employee ID column in a CSV file by querying Active Directory against another column in the CSV file called "OwnerEmail" (which is the user principal name). The problem is that the users in the OwnerEmail column do not all belong to the same domain. How can I query 2 domains at one time?
Table for reference:
Employee ID | OwnerEmail     | DeptNumber | Costs
            | test@mail.com  | 0894       | 4654.45
            | test2@mail.com | 3453       | 4994.15
This is what I have tried so far. The script isn't working and there are no error messages. Any ideas?
$Domains = 'us.domain.corp', 'uk.domain.corp'
$CSVImport | Select-Object @{
    Name = "employeeID"
    Expression = {
        foreach($user in $CSVImport)
        {
            foreach($Domain in $Domains){
                $user = (Get-ADUser -Filter "UserPrincipalName -eq '$($_.OwnerEmail)'" -Server $Domain -Properties 'SamAccountName').SamAccountName
            }
        }
    }
}, * | Select-Object employeeID, DepartmentNumber, OwnerEmail, @{Name="Costs"; Expression={"$ $($_.Cost)"}} | Export-Csv "$Env:temp/$OutputFile" -NoTypeInformation
How can I query 2 domains at one time?
There is no need to do this. You could query both at once with multithreading, but that seems like overkill. What I would recommend is to query all users at once, per domain. The code below may seem awfully complicated but should be pretty efficient. See the inline comments for details.
# Import the Csv
$CSVImport = Import-Csv path\to\thecsv.csv
# Create an LDAP filter to query all users at once
# This filter would look like this for example:
# (|(userPrincipalName=test@mail.com)(userPrincipalName=test2@mail.com))
$filter = "(|"
foreach($email in $CSVImport.OwnerEmail) {
    if(-not [string]::IsNullOrWhiteSpace($email)) {
        $filter += "(userPrincipalName=$email)"
    }
}
$filter += ")"
# For each Domain, use the same filter and get all existing users
'us.domain.corp', 'uk.domain.corp' | ForEach-Object { $map = @{} } {
    foreach($user in Get-ADUser -LDAPFilter $filter -Server $_) {
        # and store them in a hashtable where
        # the Keys are their `UserPrincipalName`
        # and the Values are the attribute of interest (`SamAccountName`)
        $map[$user.UserPrincipalName] = $user.SamAccountName
    }
}
# Now we can simply use a calculated property with `Select-Object`
$CSVImport | Select-Object @{N='EmployeeID'; E={ $map[$_.OwnerEmail] }}, * |
    Export-Csv "$Env:temp/$OutputFile" -NoTypeInformation

Need to apply an if condition based on a check in Powershell

I am new to PowerShell. I am getting the details of the Azure Data Factory linked services, but after the get I need to use contains to check whether an element exists. In Python I would just check if string in a list, but in PowerShell I'm not quite sure. Please check the code below.
$output = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName "xxxxxxxx" | Format-List
The output looks like this:
LinkedServiceName : abcdef
ResourceGroupName : ghijk
DataFactoryName : lmnopq
Properties : Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService
So now I try to do this:
if ($output.Properties -contains "Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService") {
Write-Output "test output"
}
But $output.Properties just gives the properties of that object. I need to check whether "Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService" exists in the output variable and then perform the required operations. Please help me with this.
The -contains operator requires a collection and an element. Here's a basic example of its proper use:
$collection = @(1,2,3,4)
$element1 = 5
$element2 = 3
if ($collection -contains $element1) {'yes'} else {'no'}  # prints 'no'
if ($collection -contains $element2) {'yes'} else {'no'}  # prints 'yes'
What you've done is ask PowerShell to look inside an object that isn't a collection for a [string] element whose value equals the type name of that same object.
What you need to do is inspect this object:
$output.Properties | format-list *
Then once you figure out what needs to be present inside of it, create a new condition.
$output.Properties.something -eq 'some string value'
...assuming that your value is a string, for example.
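For instance, here is a minimal sketch, assuming the goal is to branch on the .NET type name of Properties (note that the | Format-List in your first command should be dropped, since it replaces the real objects with formatting records):
# A minimal sketch, assuming you want to branch on the type of Properties.
# Note: no Format-List here; it would replace the objects with formatting records.
$output = Get-AzDataFactoryV2LinkedService -ResourceGroupName $ResourceGroupName -DataFactoryName "xxxxxxxx"
$typeName = 'Microsoft.Azure.Management.DataFactory.Models.AzureDatabricksLinkedService'
foreach ($ls in $output) {
    # Compare the full type name of the Properties object against the expected type
    if ($ls.Properties.GetType().FullName -eq $typeName) {
        Write-Output "test output for $($ls.LinkedServiceName)"
    }
}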
I would recommend watching some beginner tutorials.

How to remove azure file share old data from the azure storage account?

I have 3 months of old data stored in an Azure storage account. Now I want to remove the data if it is >= 30 days old.
The following script lists files and directories recursively in a file share and deletes the files older than 14 days. You may adjust the day limit as required.
Referred from the thread here.
Refer to this for the best practices before any delete activity.
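The script below assumes $accountName and $key are already set; a minimal sketch of those prerequisites (the resource group and account names are placeholders):
# Assumed prerequisites (placeholder names):
$accountName = '<storageAccountName>'
$key = (Get-AzStorageAccountKey -ResourceGroupName '<resourceGroupName>' -Name $accountName)[0].Value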
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $key
$shareName = "<shareName>"
$DirIndex = 0
$dirsToList = New-Object System.Collections.Generic.List[System.Object]
# Get share root Dir
$shareroot = Get-AzStorageFile -ShareName $shareName -Path . -Context $ctx
$dirsToList += $shareroot
# List files recursively and remove files older than 14 days
While ($dirsToList.Count -gt $DirIndex)
{
    $dir = $dirsToList[$DirIndex]
    $DirIndex ++
    $fileListItems = $dir | Get-AzStorageFile
    $dirsListOut = $fileListItems | where {$_.GetType().Name -eq "AzureStorageFileDirectory"}
    $dirsToList += $dirsListOut
    $files = $fileListItems | where {$_.GetType().Name -eq "AzureStorageFile"}
    foreach($file in $files)
    {
        # Fetch attributes of each file and output
        $task = $file.CloudFile.FetchAttributesAsync()
        $task.Wait()
        # Remove the file if it's older than 14 days
        if ($file.CloudFile.Properties.LastModified -lt (Get-Date).AddDays(-14))
        {
            ## Print the file LMT
            # $file | Select @{ Name = "Uri"; Expression = { $_.CloudFile.SnapshotQualifiedUri } }, @{ Name = "LastModified"; Expression = { $_.CloudFile.Properties.LastModified } }
            # Remove file
            $file | Remove-AzStorageFile
        }
    }
    # Debug log
    # Write-Host $DirIndex $dirsToList.Length $dir.CloudFileDirectory.SnapshotQualifiedUri.ToString()
}
Alternatively, a Delete activity can be configured in Azure Data Factory (ADF) by following the steps below. This requires linking your Azure storage account with ADF by giving the account name, file share name, and path if needed.
Deploy an ADF instance if one is not already configured, then open ADF and create a pipeline with any name.
The process is: get the metadata of the file shares in the selected storage account > loop through them > configure a Delete activity for the files inside them older than 30 days (or some x days).
1) Search for Get Metadata, select it, drag it into the pipeline area shown, and name it.
2) Then navigate to Dataset (this dataset points to the files in the storage account) and select New. Select Azure File Storage and name it.
3) Select the format as CSV.
4) Link your account by setting up the properties, and give the file path.
To get files older than or equal to 30 days, you can configure the end time as @adddays(utcnow(),-30).
5) Now we have to use a ForEach loop to iterate through the array of files. Drag and drop a ForEach loop from Activities and connect the arrowed line with Get Metadata. In Settings, tick the Sequential box to iterate over the files in sequential order.
6) childItems is the property of the Get Metadata output JSON which has the array of objects, so select dynamic content for Items, configure it as @activity('Get old files').output.childItems, and click Finish. (Here 'Get old files' is the name of the Get Metadata activity created previously.)
7) In the ForEach activity, edit the activities and set up a Delete activity as below.
8) Go to the dataset where we previously linked our file storage account. Create a file name parameter and link it to the Delete activity by adding dynamic content for the file path, selecting that file path as @dataset().FileName.
9) Then go to the Delete pipeline and add the file name as shown.
You may also add a logging setting to link an account and see the activity happening after the path is debugged.
Other reference: clean-up-files-by-built-in-delete-activity-in-azure-data-factory/

Powershell getting same values when using if in foreach

I'm trying to get metrics from Azure into Zabbix.
The issue is that the metric for a VM consists of 2 words: Percentage CPU, and Zabbix doesn't allow item keys consisting of 2 words. I also tried Percentage%20CPU but got errors in Zabbix, so I created the Zabbix key percentage_cpu.
So I decided, prior to sending data from Zabbix to Azure, to "translate" percentage_cpu to Percentage%20CPU. This works great if only that key is present, but the issue starts when I add another key (in this example, an SQL metric).
For the SQL metric all values are in one word, so there is no need to change anything, but then the metric for the VM is also assigned to SQL. I'm trying to avoid writing a separate file for every service.
$host_items = Get-ZabbixHostItems -url $zabbix_url -auth $zabbix_auth -zabbix_host $host_name
foreach ($host_item in $host_items)
{
    #$host_item_details = select-string -InputObject $host_item.key_ -Pattern '^(azure\.sql)\.(.*)\.(.*)\[\"(.*)\"\]$';
    $host_item_details = select-string -InputObject $host_item.key_ -Pattern '^(azure\.\w{2,})\.(.*)\.(.*)\[\"(.*)\"\,(.*)]$';
    #$host_item_details = select-string -InputObject $host_item.key_ -Pattern '^(azure)\.(.*)\.(.*)\.(.*)\[\"(.*)\"\,(.*)]$';
    $host_item_provider = $host_item_details.Matches.groups[1];
    $host_item_metric = $host_item_details.Matches.groups[2];
    $host_item_timegrain = $host_item_details.Matches.groups[3];
    $host_item_resource = $host_item_details.Matches.groups[4];
    $host_item_suffix = $host_item_details.Matches.groups[5];
    if ($host_item_metric='percentage_cpu')
    { $host_item_metric='Percentage%20CPU' }
    else
    { $host_item_metric = $host_item_details.Matches.groups[2]; }
    $uri = "https://management.azure.com{0}/providers/microsoft.insights/metrics?api-version={1}&interval={2}&timespan={3}&metric={4}" -f `
        $host_item_resource, `
        "2017-05-01-preview", `
        $host_item_timegrain.ToString().ToUpper(), `
        $($(get-date).ToUniversalTime().addminutes(-15).tostring("yyyy-MM-ddTHH:mm:ssZ") + "/" + $(get-date).ToUniversalTime().addminutes(-2).tostring("yyyy-MM-ddTHH:mm:ssZ")), `
        $host_item_metric;
    write-host $uri;
}
Output of the host items:
azure.sql.dtu_consumption_percent.pt1m["/subscriptions/111-222/resourceGroups/rg/providers/Microsoft.Sql/servers/mojsql/databases/test",common]
azure.vm.percentage_cpu.pt1m["/subscriptions/111-222/resourceGroups/rg/providers/Microsoft.Compute/virtualMachines/test",common]
When I run the code above I'm getting these URIs:
https://management.azure.com/subscriptions/111-222/resourceGroups/rg/providers/Microsoft.Sql/servers/mojsql/databases/test/providers/microsoft.insights/metrics?api-version=2017-05-01-preview&interval=PT1M&timespan=2018-08-11T07:38:05Z/2018-08-11T07:51:05Z&metric=Percentage%20CPU
https://management.azure.com/subscriptions/111-222/resourceGroups/rg/providers/Microsoft.Compute/virtualMachines/test/providers/microsoft.insights/metrics?api-version=2017-05-01-preview&interval=PT1M&timespan=2018-08-11T07:38:05Z/2018-08-11T07:51:05Z&metric=Percentage%20CPU
For the first link (SQL) the metric should be dtu_consumption_percent, but I'm getting the same metric for both links.
Second attempt:
if ($host_item_metric -eq 'percentage_cpu')
{$host_item_metric='Percentage%20CPU';}
else
{ $host_item_metric = $host_item_details.Matches.groups[2];}
write-host $host_item_metric
}
output: (original values)
dtu_consumption_percent
percentage_cpu
I had to use -like:
if ($host_item_metric -like 'percentage_cpu')
{ $host_item_metric='Percentage%20CPU'; }
else
{ $host_item_metric = $host_item_details.Matches.groups[2] }
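A likely explanation (my assumption, based on how Select-String works): Matches.groups[2] is a [System.Text.RegularExpressions.Group] object rather than a [string], so -eq against a string never matches, while -like converts its left operand to a string first. (The first attempt's = was an assignment, which is always truthy, so every item got Percentage%20CPU.) Taking .Value would give a plain string and let -eq behave as expected:
# Sketch: .Value yields a [string], so -eq compares as expected
$host_item_metric = $host_item_details.Matches.Groups[2].Value
if ($host_item_metric -eq 'percentage_cpu') { $host_item_metric = 'Percentage%20CPU' }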

A single cloud project for each environment?

Previously I have always had separate cloud projects for each environment, like this:
This poses some problems:
Maintaining multiple ServiceDefinition.csdef files
When building to a common output path, which ServiceDefinition.csdef is copied?
I am proposing using a single Cloud Project with multiple ServiceConfiguration files for each environment, and multiple profiles for publishing:
Pros:
Fewer maintenance issues (1 project and 1 ServiceDefinition.csdef)
A single ServiceDefinition.csdef is copied to the output folder
The problem I have now is that all environments need to have the same instance size, as this is defined in the ServiceDefinition.csdef.
Is there any way I can get around this problem?
Yes, we create multiple packages (one for each environment). We have different PowerShell scripts to patch things like the VM size (one example below):
param(
    [parameter(Mandatory=$true)]
    [string]$fileToPatch,
    [parameter(Mandatory=$true)]
    [string]$roleName,
    [parameter(Mandatory=$true)]
    [ValidateSet("ExtraSmall", "Small", "Medium", "Large", "ExtraLarge", "A5", "A6", "A7")]
    [string]$vmsize = 'Small'
)
# MAIN
$xml = New-Object System.Xml.XmlDocument
$xml.Load($fileToPatch)
$namespaceMgr = New-Object System.Xml.XmlNamespaceManager $xml.NameTable
$namespace = $xml.DocumentElement.NamespaceURI
$namespaceMgr.AddNamespace("ns", $namespace)
$xpathWorkerRoles = "/ns:ServiceDefinition/ns:WorkerRole"
$xpathWebRoles = "/ns:ServiceDefinition/ns:WebRole"
$Roles = $xml.SelectNodes($xpathWebRoles, $namespaceMgr) + $xml.SelectNodes($xpathWorkerRoles, $namespaceMgr)
$Roles | Where-Object { $_.name -eq $RoleName } | % { $_.vmsize = $vmsize; Write-Host 'Patched vmsize to' $vmsize 'for' $_.name }
$xml.Save($fileToPatch)
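For reference, a hypothetical invocation (the script, file, and role names here are placeholders):
.\Patch-VmSize.ps1 -fileToPatch .\ServiceDefinition.csdef -roleName MyWebRole -vmsize Medium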
