How can I retrieve files within a repo in Azure within a PowerShell script?

I need to (within a PowerShell script) determine all the possible controller file names within a repository. This is for a test that will hit every controller and verify it is functioning and not missing from the startup file. (MVC app)
Since all the files are DLLs I cannot simply ask for the files in the folder, nor do I want to hard-code the names. How can I get a listing of files within a certain folder in order to call each one to test within a PowerShell script?
Perhaps a better way to ask this is:
How can I list files within a folder that is inside a repo? (using a PowerShell script)

You could refer to this doc: Interacting with Azure Web Apps Virtual File System using PowerShell and the Kudu API. It uses the VFS API described in the wiki doc, and there is an API to list the files in a directory specified by path.
GET /api/vfs/{path}/
Lists files at directory specified by path.
And in that doc, under the heading Downloading a File from an App Service, there is a script to download the files; I use the path without $kuduPath to list files instead. You also need to follow Getting the Kudu REST API Authorisation header via PowerShell from the same article.
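A minimal sketch for building that authorisation header from the app's publishing credentials, assuming the Az PowerShell module is installed and that $resourceGroupName and $webAppName (placeholder names) identify your web app:
# A minimal sketch, assuming the Az PowerShell module; $resourceGroupName and $webAppName are placeholders
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -ResourceGroupName $resourceGroupName -Name $webAppName

# Take the MSDeploy profile's deployment credentials and turn them into a Basic auth header
$msDeployProfile = $publishingProfile.publishData.publishProfile | Where-Object { $_.publishMethod -eq "MSDeploy" }
$pair = "$($msDeployProfile.userName):$($msDeployProfile.userPWD)"
$kuduApiAuthorisationToken = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
With $kuduApiAuthorisationToken in hand, the listing call itself would then look like this.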
$kuduApiUrl="https://<webname>.scm.azurewebsites.net/api/vfs/site/wwwroot/"
Invoke-RestMethod -Uri $kuduApiUrl `
-Headers @{"Authorization"=$kuduApiAuthorisationToken;"If-Match"="*"} `
-Method GET `
-OutFile $localPath `
-ContentType "multipart/form-data"
The response lists all the files and folders in that directory.
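If you only need the file names inside the script (rather than writing the JSON to disk), you can drop -OutFile and let Invoke-RestMethod return the parsed listing, then filter it. A minimal sketch, assuming the $kuduApiUrl and $kuduApiAuthorisationToken variables from above, with the URL pointed at the folder you care about (for example site/wwwroot/bin/):
# A minimal sketch, reusing $kuduApiUrl and $kuduApiAuthorisationToken from above
$entries = Invoke-RestMethod -Uri $kuduApiUrl `
    -Headers @{ "Authorization" = $kuduApiAuthorisationToken } `
    -Method GET

# Each entry has name, size, mtime and mime properties; keep just the DLL names
$dllNames = $entries | Where-Object { $_.name -like "*.dll" } | Select-Object -ExpandProperty name
$dllNames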
Hope this could help you; if you still have other questions, please let me know.

Related

What are "variants" in Azure permissions

In the examples at https://learn.microsoft.com/en-us/powershell/microsoftgraph/find-mg-graph-command?view=graph-powershell-1.0, I see something called "variants", but I haven't found any explanation of "variants" when I search.
Specifically, I'm trying to work with our Azure team to set the proper permission for the email "markRead" function:
$graphApiPostUrl = "https://graph.microsoft.com/v1.0/admin/serviceAnnouncement/messages/markRead"
Find-MgGraphCommand -Uri $graphApiPostUrl | Format-Table -AutoSize
I requested the ServiceMessageViewpoint.Write permission, and they said they gave it to me, but I'm still getting a "401 Unauthorized" when I try it. I'm wondering if I need to request the variants as well? [I am able to list/enumerate the emails.]
My original question was here: Powershell - How to set token for GraphAPI to mark emails as read?
Variants don't have anything to do with Azure permissions. In this case, they are simply another word for PowerShell parameter sets.
get-help Find-MgGraphCommand -Full
OUTPUTS
Microsoft.Graph.PowerShell.Authentication.Models.IGraphCommand with the following properties:
1. Command: Name of command.
2. Module: Module in which a command is defined.
3. Method: The HTTP method a command makes.
4. Uri: The Microsoft Graph API URI a command calls.
5. OutputType: The return type of a command.
6. Permissions: Permissions needed to use a command. This field can be empty if the permissions are not yet
available in Graph Explorer.
7. Variants: The parameter sets of a command.
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_parameter_sets?view=powershell-7.2
PowerShell uses parameter sets to enable you to write a single function that can do different actions for different scenarios. Parameter sets enable you to expose different parameters to the user. And, to return different information based on the parameters specified by the user. You can only use one parameter set at a time.
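To make the term concrete, here is a small self-contained illustration of parameter sets (unrelated to Graph itself); each set is what Find-MgGraphCommand reports as a variant:
# A minimal standalone example of parameter sets
function Get-Thing {
    [CmdletBinding(DefaultParameterSetName = 'ById')]
    param(
        [Parameter(ParameterSetName = 'ById', Mandatory)]
        [int]$Id,

        [Parameter(ParameterSetName = 'ByName', Mandatory)]
        [string]$Name
    )
    # $PSCmdlet.ParameterSetName tells you which "variant" the caller used
    "You called the '$($PSCmdlet.ParameterSetName)' variant"
}

Get-Thing -Id 42       # uses the ById variant
Get-Thing -Name 'foo'  # uses the ByName variant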
You are using the command correctly; the issue is rather with the call to the Graph API itself.

Azure Artifacts - Download a specific version of maven artifact

We are using Azure DevOps for our CI/CD process. Many times, in order to do some custom builds, we need to download a specific Maven JAR from the artifact repo.
Is there a way (command line or API) to do the same?
Again, the question is about downloading a specific JAR from Azure Artifacts.
Azure Artifacts - Download a specific version of maven artifact
The answer is yes. We could use the REST API Maven - Download Package to download the specific jar from azure artifacts:
GET https://pkgs.dev.azure.com/{organization}/{project}/_apis/packaging/feeds/{feedId}/maven/{groupId}/{artifactId}/{version}/{fileName}/content?api-version=5.1-preview.1
First, we need to get the feedId. We could use the REST API Feed Management - Get Feeds to get the feedId:
GET https://feeds.dev.azure.com/{organization}/{project}/_apis/packaging/feeds?api-version=5.1-preview.1
Note: The project parameter must be supplied if the feed was created in a project. If the feed is not associated with any project, omit the project parameter from the request.
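For example, a small sketch to pull the feedId from that endpoint for an organization-scoped feed, assuming a personal access token with Packaging (read) scope in $pat; <OrganizationName> and <FeedName> are placeholders:
# A minimal sketch, assuming a PAT with Packaging (read) scope in $pat
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
$feeds = Invoke-RestMethod -Uri "https://feeds.dev.azure.com/<OrganizationName>/_apis/packaging/feeds?api-version=5.1-preview.1" -Headers $headers -Method Get
$feedId = ($feeds.value | Where-Object { $_.name -eq "<FeedName>" }).id
$feedId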
For the other parameters in the URL, we can get them from the overview of the package: select and open the package to view its coordinates.
Now, we have all the parameters, feedId, groupId, artifactId, version, fileName.
So, we could use the REST API with -OutFile $(Build.SourcesDirectory)\myFirstApp-1.0-20190818.032400-1.jar to download the package (inline PowerShell task):
$url = "https://pkgs.dev.azure.com/<OrganizationName>/_apis/packaging/feeds/83cd6431-16cc-480d-bb4d-a213e17b3a2b/maven/MyGroup/myFirstApp/1.0-SNAPSHOT/myFirstApp-1.0-20190818.032400-1.jar/content?api-version=5.1-preview.1"
$buildPipeline= Invoke-RestMethod -Uri $url -OutFile $(Build.SourcesDirectory)\myFirstApp-1.0-20190818.032400-1.jar -Headers @{
Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
} -Method Get
Since my maven feed is an organization scoped feed, I omit the project parameter from the URL.
I've found this way:
Open the feed page in Azure DevOps and click Connect to feed
Choose Maven, copy the URL from Project setup > repository/url of the pom.xml sample.
That should look like:
https://pkgs.dev.azure.com/YOUR-ORGANIZATION/_packaging/YOUR-FEED-NAME/maven/v1
Append the artifact info to that link:
https://pkgs.dev.azure.com/YOUR-ORGANIZATION/_packaging/YOUR-FEED-NAME/maven/v1/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar
Hint: compare it to the one from maven repository:
https://repo1.maven.org/maven2/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar
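To fetch that feed URL from a script you still need to authenticate against the feed. A minimal sketch, assuming a personal access token in $pat and using the example coordinates from above:
# A minimal sketch, assuming a PAT in $pat; the URL and artifact coordinates are the example values above
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }
Invoke-WebRequest -Uri "https://pkgs.dev.azure.com/YOUR-ORGANIZATION/_packaging/YOUR-FEED-NAME/maven/v1/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar" `
    -Headers $headers `
    -OutFile "logback-classic-1.2.3.jar"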

Substitute Service Fabric application parameters during deployment

I'm setting up my production environment and would like to secure my environment-related variables.
For the moment, every environment has its own application parameters file, which works well, but I don't want every dev in my team knowing the production connection strings and other sensitive stuff that could appear in there.
So I'm looking for every possibility available.
I've seen that in Azure DevOps, which I'm using at the moment for my CI/CD, there is some possible variable substitution (xml transformation). Is it usable in a SF project?
I've seen in another project something similar through Octopus.
Are there any other tools that would help me manage my variables by environment safely (and easily)?
Can I do that with my KeyVault eventually?
Any recommendations?
Thanks
EDIT: an example of how I'd like to manage those values; this is a screenshot from Octopus:
so something similar to this that separates and injects the values is what I'm looking for.
You can do XML transformation to the ApplicationParameter file to update the values in there before you deploy it.
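A minimal sketch of that idea as an inline PowerShell step run before packaging/deployment, assuming Cloud.xml is your ApplicationParameters file and "ConnectionString" is the parameter to overwrite (both names are placeholders):
# A minimal sketch; Cloud.xml and "ConnectionString" are placeholder names
$paramFile = Resolve-Path ".\ApplicationParameters\Cloud.xml"
[xml]$appParams = Get-Content $paramFile
$node = $appParams.Application.Parameters.Parameter | Where-Object { $_.Name -eq "ConnectionString" }
$node.Value = $env:PROD_CONNECTION_STRING   # e.g. a secret pipeline variable mapped to an environment variable
$appParams.Save($paramFile.Path)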
The other option is to use PowerShell to update the application and pass the parameters as arguments to the script.
The Start-ServiceFabricApplicationUpgrade command accepts a hashtable of parameters. Technically, the built-in task in VSTS/DevOps transforms the application parameters into a hashtable, so the script would be something like this:
#Get the existing parameters
$app = Get-ServiceFabricApplication -ApplicationName "fabric:/AzureFilesVolumePlugin"
#Create a temp hashtable and populate with existing values
$parameters = @{ }
$app.ApplicationParameters | ForEach-Object { $parameters.Add($_.Name, $_.Value) }
#Replace the desired parameters
$parameters["test"] = "123test" #Here you would replace with your variable, like $env:username
#Upgrade the application
Start-ServiceFabricApplicationUpgrade -ApplicationName "fabric:/AzureFilesVolumePlugin" -ApplicationParameter $parameters -ApplicationTypeVersion "6.4.617.9590" -UnmonitoredAuto
Keep in mind that the existing VSTS task also does other operations, like copying the package to SF and registering the application version in the image store, which you will need to replicate. You can copy the full script from the Deploy-FabricApplication.ps1 file in the Service Fabric project and adapt it with your changes. The other approach is to get the source for the VSTS task here and add your changes.
If you are planning to use Key Vault, I would recommend the application access the values directly in Key Vault instead of passing them to SF; this way, you can change the values in Key Vault without redeploying the application. In the deployment, you would only pass the Key Vault credentials/configuration.
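For example, reading a secret with the Az module (whether at deployment time or from a script running under a managed identity) might look like this; the vault and secret names are placeholders:
# A minimal sketch, assuming the Az module; vault and secret names are placeholders
Connect-AzAccount -Identity   # e.g. via the managed identity of the machine running the script
$connectionString = Get-AzKeyVaultSecret -VaultName "my-prod-vault" -Name "SqlConnectionString" -AsPlainText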

Grant permissions to folder in Sharepoint library using Powershell?

I have a Sharepoint Library, which I have a Powershell script dropping files into for processing. The Powershell script reaches out to Active Directory, and returns Group Membership information. The script then creates a folder for the group owner (if it doesn't exist) in my Library, using the group owners' name, and drops a .CSV of all the users contained in the specific group into that folder.
The need here, is to grant 'Read' permissions only to the owner of the group, which will be the name of the folder we are working in. Ideally the folder would be hidden, however I understand that there are limitations when working with Sharepoint.
For example:
John Doe, User: jdoe would be able to access Z:/jdoe/IT.csv but not
Z:/someuser/HR.csv
I have my Sharepoint Library mapped to Z:/ currently, to make my life easier for Powershell.
I executed Get-Command -Module Microsoft.SharePoint.PowerShell | ft Name and ran through the list of SharePoint commands.
I then stumbled across the Grant-SPObjectSecurity cmdlet, which I assume is what I would want to use on the PowerShell side to apply SharePoint permissions, at the time the folder is created, only to the user the folder is being created for.
The process from start to finish is: Powershell Script 'Get_Group_Members' executes, reading a text file containing an Active Directory Group name, per line. For each group found, the script identifies the owner of the group, creates a folder named with the owners AD name, and puts a .CSV file in the folder listing all members of the group. Then, I (for now anyway) manually initiate the next Script 'Import_CSV' which pulls all the information into a Sharepoint list for an unrelated process.
Hope that helps understand what's happening. Am I right in assuming I should handle this on the PowerShell side, as opposed to the SharePoint side? If so, am I heading in the right direction with Grant-SPObjectSecurity?
Thanks!
Update:
Following the link I provided in a comment below, here is what I came up with:
function GrantUserpermission($strOwnerName)
{
[Microsoft.SharePoint.SPUserCollection]$spusers=[Microsoft.SharePoint.SPUserCollection]$web.SiteUsers
[Microsoft.SharePoint.SPUser]$spuser=$spusers[$strOwnerName]
"Strowner name: " + $strOwnerName
# Get the SPWeb object and save it to a variable
$web = Get-SPWeb -identity $WebURL
if ($strOwnerName -ne $null)
{
$sproleass=new-object Microsoft.SharePoint.SPRoleAssignment([Microsoft.SharePoint.SPPrincipal]$spuser)
$folder.BreakRoleInheritance("true")
$sproleass.RoleDefinitionBindings.Add($web.RoleDefinitions["Contribute"])
$folder.RoleAssignments.Add($sproleass);
Write-Host "Permission provided for user ", $strOwnerName
}
else
{
Write-Host "User ""$userName"" was not found in this web!"
}
}
And here are the error(s) associated with my code:
Full code can be found here: http://pastebin.com/iBpj6V1U
Update #2
#apply permissions to folder
"Strowner name: " + $strOwnerName
function GrantUserpermission($strOwnerName)
{
$web = Get-SPWeb -identity $WebURL
[Microsoft.SharePoint.SPUser]$spuser=$web.EnsureUser($strOwnerName)
"Strowner name in Function: " + $strOwnerName
Updated code #2: http://pastebin.com/DzP1hVce
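Putting the pieces from Update #2 together, a minimal sketch of the corrected ordering (resolving the web before looking the user up), assuming $WebURL and $folder are set elsewhere in the script and using the Read role definition, since the requirement here is read-only access:
# A minimal sketch; $WebURL and $folder (the securable item for the owner's folder) are set elsewhere
function GrantUserPermission($strOwnerName)
{
    $web = Get-SPWeb -Identity $WebURL                       # resolve the web before looking up the user
    [Microsoft.SharePoint.SPUser]$spUser = $web.EnsureUser($strOwnerName)

    if ($spUser -ne $null)
    {
        $roleAssignment = New-Object Microsoft.SharePoint.SPRoleAssignment([Microsoft.SharePoint.SPPrincipal]$spUser)
        $roleAssignment.RoleDefinitionBindings.Add($web.RoleDefinitions["Read"])
        $folder.BreakRoleInheritance($true)                   # stop inheriting permissions from the library
        $folder.RoleAssignments.Add($roleAssignment)
        Write-Host "Permission granted to $strOwnerName"
    }
    else
    {
        Write-Host "User '$strOwnerName' was not found in this web!"
    }
}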
I ended up realizing that, if I am using PowerShell to get information into a .CSV and then ultimately into SharePoint, it doesn't make sense to waste time with intermediate files; I can tap directly into SharePoint via PowerShell.
Here's the code I had used to accomplish this: http://pastebin.com/xRyvXLCB
Special thanks to @TheMadTechnician

Enable Cache On Azure CDN

I am setting up Azure CDN, and having trouble setting the Cache-Control header.
I used Cloudberry Explorer to setup a sync between my server folders and the CDN. This is working well. All my files were copied to the CDN with no problem.
Under Tools > Http Headers > Edit Http Header I set the value for Cache-Control to be: public,max-age=604800
However, this does not appear to be having any effect (according to both Fiddler and Page Speed).
Any tips on setting the Cache-Control header for the Azure CDN would be GREATLY appreciated.
I had this issue myself and needed to update the Cache-Control header on thousands of files. To prevent caching issues in sites, I re-deploy these files with every release to a new path.
I was able to patch together some different suggestions online and ultimately landed on the following solution, which I currently use for deploying one of my production apps.
You need two files, and the script assumes they're in the same directory on your computer:
A text file with a listing of the files in the container (see example below)
The PowerShell script
The Text File (file-list.txt)
The file should be in the example format below with the full file path as deployed to the CDN container. Note this uses forward slashes, and should not include the container name since it will be included in the script. The name of this text file will be included in the PowerShell script below.
v12/app/app.js
v12/app/app.min.js
v12/app/app.min.js.map
v12/app/account/signup.js
v12/app/account/signup.min.js
... (and so on)
The Script (cdn-cache-control.ps1)
The full script is below. You'll need to replace the constants like STORAGE_ACCOUNT_NAME, STORAGE_KEY, and you may need to update the path to the Azure SDK DLL if you have a different version. There are also 2 possible implementations of $blobClient; I repurposed some of this code from a source online, and the un-commented one works for me.
The key difference between what I have here and what you'll find online is the inclusion of $blob.FetchAttributes(). Without explicitly calling this method, a majority of the blob properties like Content-Type, Last Modified Date, and others will be loaded into memory as empty/default values, then when $blob.SetProperties() is called these empty values will blow away the existing ones in the CDN, causing files to load without a Content-Type among other things.
Add-Type -Path "C:\Program Files\Microsoft SDKs\Azure\.NET SDK\v2.9\bin\Microsoft.WindowsAzure.StorageClient.dll"
$accountName = "STORAGE_ACCOUNT_NAME"
$accountKey = "STORAGE_KEY"
$blobContainerName = "STORAGE_CONTAINER_NAME"
$storageCredentials = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey -ArgumentList $accountName,$accountKey
$storageAccount = New-Object Microsoft.WindowsAzure.CloudStorageAccount -ArgumentList $storageCredentials,$true
#$blobClient = $storageAccount.CreateCloudBlobClient()
$blobClient = [Microsoft.WindowsAzure.StorageClient.CloudStorageAccountStorageClientExtensions]::CreateCloudBlobClient($storageAccount)
$cacheControlValue = "max-age=31556926"
echo "Setting cache control: $cacheControlValue"
Get-Content "file-list.txt" | foreach {
$blobName = "$blobContainerName/$_".Trim()
$blob = $blobClient.GetBlobReference($blobName)
$blob.FetchAttributes()
$blob.Properties.CacheControl = $cacheControlValue
$blob.SetProperties()
echo $blobName
}
It was tricky to find information about mass-setting the Cache-Control header but I've run this script for multiple production releases with great success. I've verified the configuration of the header as well, and routinely run Google's PageSpeed Insights against my site to verify.
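If you are on the newer Az PowerShell modules rather than the old StorageClient SDK, a similar loop can be written with Az.Storage, which fetches the blob's existing properties for you before you change them. A minimal sketch, reusing the placeholder names and file-list.txt from above:
# A minimal sketch of the same idea with Az.Storage; account, key and container names are placeholders
$ctx = New-AzStorageContext -StorageAccountName "STORAGE_ACCOUNT_NAME" -StorageAccountKey "STORAGE_KEY"
Get-Content "file-list.txt" | ForEach-Object {
    $blob = Get-AzStorageBlob -Container "STORAGE_CONTAINER_NAME" -Blob $_.Trim() -Context $ctx
    $blob.ICloudBlob.Properties.CacheControl = "max-age=31556926"
    $blob.ICloudBlob.SetProperties()      # persist the change; the other existing properties are preserved
    Write-Output $blob.Name
}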
