Copy-Item between servers with local administrator credentials

I am trying to copy a file between computers:
New-PSDrive -Name H -PSProvider FileSystem -Root $ServerDisk -Credential $Credential1 -Persist
New-PSDrive -Name G -PSProvider FileSystem -Root $LocalDir -Credential $Credential2 -Persist
Copy-Item -Force -Verbose -Container -Path "G:\$FileFolder\$ZipFileName" -Destination "H:\$ServerDest"
Remove-PSDrive H
Remove-PSDrive G
But the file does not copy.
What am I doing wrong?
As a test, I joined these computers to the same domain and used a domain admin account; with that setup, the script works and copies the files.
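For what it's worth, here is a minimal sketch of a variant that only maps the remote side; it assumes the local session can already read $LocalDir without a second credential, which may not hold in your setup:
# Sketch, assuming $ServerDisk is a UNC path (e.g. \\server\share) and the
# account running the script can read $LocalDir directly.
New-PSDrive -Name H -PSProvider FileSystem -Root $ServerDisk -Credential $Credential1
Copy-Item -Force -Verbose -Path (Join-Path $LocalDir "$FileFolder\$ZipFileName") -Destination "H:\$ServerDest"
Remove-PSDrive -Name H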

Related

Save files to an Azure file share in a subdirectory

Below is the runbook code I am using to save a file to an Azure file share, but I am unable to save it into a subdirectory.
#Set the context using the storage account name and key
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$s = Get-AzureStorageShare "X1" -Context $context
$ErrorLogFileName = "Test.csv"
$LogItem = New-Item -ItemType File -Name $ErrorLogFileName
$_ | Out-File -FilePath $ErrorLogFileName -Append
Set-AzureStorageFileContent -Share $s -Source $ErrorLogFileName
Here I have a folder structure like X1/X2, but I am unable to get there to save Test.csv; I am only able to save it to the X1 folder on the Azure file share. Any ideas?
Thanks in advance.
You can specify the -Path parameter for Set-AzureStorageFileContent.
For example, the file share is X1, and there is a folder X2 inside X1. Then you can use the command below:
Set-AzureStorageFileContent -Share "X1" -Source $ErrorLogFileName -Path "X2" -Context $context
By the way, the commands you're using are from the old module; consider using the latest Az PowerShell commands instead. For example, use Set-AzStorageFileContent instead of Set-AzureStorageFileContent (and if you switch to the Az module, replace all of the old commands with their new equivalents).
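For reference, a minimal sketch of the same upload with the Az module, reusing the variable names from the question above:
# Sketch using the Az.Storage module; assumes $storageAccountName,
# $storageAccountKey and $ErrorLogFileName are defined as in the question.
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Set-AzStorageFileContent -ShareName "X1" -Source $ErrorLogFileName -Path "X2/Test.csv" -Context $context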

Azure Invoke-AzVMRunCommand

I am wondering if there is a way to use Invoke-AzVMRunCommand to run a single command, rather than a PowerShell .ps1 file.
As an example, I want to execute a single command, "C:\app\app.exe -c exit", without needing to push a PowerShell script to the system.
I am able to do this via the Azure Portal "RunPowerShellScript" option and it works, but I would like to do it on multiple systems from the command line via Invoke-AzVMRunCommand. These systems do not share a common account that can be used.
According to Microsoft, here is the syntax:
Invoke-AzVMRunCommand -ResourceGroupName 'rgname' -VMName 'vmname' -CommandId 'RunPowerShellScript' -ScriptPath 'sample.ps1' -Parameter @{param1 = "var1"; param2 = "var2"}
I don't want to run a script, I merely want to be able to execute a command on the system. Is this possible?
There is no direct way of doing it, but you can write a script block, generate a file from it, run Invoke-AzVMRunCommand with that file, and delete the file afterwards if required.
# Target VM (example value)
$Server = "server01"
# Casting the script block to a string yields its text ("Get-Process")
[System.String]$ScriptBlock = {Get-Process}
$FileName = "RunScript.ps1"
# Write the command text to a temporary .ps1 file
Out-File -FilePath $FileName -InputObject $ScriptBlock -NoNewline
$vm = Get-AzVM -Name $Server
# Run the generated file on the VM (note: the parameter is -VMName, not -Name)
Invoke-AzVMRunCommand -ResourceGroupName $vm.ResourceGroupName -VMName $Server -CommandId 'RunPowerShellScript' -ScriptPath $FileName
# Clean up the temporary file
Remove-Item -Path $FileName -Force -ErrorAction SilentlyContinue
It's now possible to use the -ScriptString option; however, you need to ensure that your Az version supports it.
As of 2022-07-21, Azure Pipelines did not support it:
"A parameter cannot be found that matches parameter name 'ScriptString'."
See the documentation:
https://learn.microsoft.com/en-us/powershell/module/az.compute/invoke-azvmruncommand?view=azps-8.1.0
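Where -ScriptString is available, a minimal sketch for the original question might look like this (the resource group and VM names are placeholders):
# Sketch, assuming an Az.Compute version that exposes -ScriptString.
Invoke-AzVMRunCommand -ResourceGroupName 'rgname' -VMName 'vmname' -CommandId 'RunPowerShellScript' -ScriptString 'C:\app\app.exe -c exit'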

Grant access to Azure Data Lake Gen2 using a parameterized script

We are trying to grant read/write access to many folders in our Azure Data Lake Gen2 containers. Although we can do this through the UI, it's quite tedious and has to be repeated for all environments. Has anyone found a better way, using PowerShell, to automate or at least parameterize this process of granting access to Azure Data Lake Gen2 containers and avoid granting access manually?
Unfortunately I couldn't get this to work using the following link or other documentation, as it's for Gen1, but it's very similar to what I need to do for Gen2.
https://www.sqlchick.com/entries/2018/3/17/assigning-data-permissions-for-azure-data-lake-store-part-3
Based on my tests, we can use PowerShell to manage Azure Data Lake Gen2 permissions. For more details, please refer to the documentation.
Install the required modules
Install-Module PowerShellGet -Repository PSGallery -Force
Install-Module Az.Storage -Repository PSGallery -RequiredVersion 1.9.1-preview -AllowPrerelease -AllowClobber -Force
Besides, please note that installing the modules has some prerequisites:
.NET Framework 4.7.2 or greater
PowerShell 5.1 or higher
Script
Connect-AzAccount
$groupName=""
$accountName=""
$account= Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
$filesystemName = "test"
$dirname="template/"
$Id = "<the Object ID of user, group or service principal>"
$dir=Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$acl = New-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId $Id -Permission "rw-" -InputObject $dir.ACL
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname -Acl $acl
$dir=Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path $dirname
$dir.ACL
Thanks to Jim Xu for providing the script above. I'm just complementing the code with the following items:
Get all folders from the container
Assign ACL for all folders
Propagate ACL to all subfolders
$groupName="resource group name"
$accountName="storage account name"
$account= Get-AzStorageAccount -ResourceGroupName $groupName -Name $accountName
$ctx = $account.Context
$filesystemName = "container name"
$Id = (Get-AzADGroup -DisplayName '<type user / group name here>').Id
$items = Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $filesystemName
foreach ( $item in $items) {
$dir = Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/"
$acl = New-AzDataLakeGen2ItemAclObject -AccessControlType group -EntityId $Id -Permission "rwx" -InputObject $dir.ACL -DefaultScope
# Update ACL on blob item
Update-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
# Propagate ACL to child blob items
Set-AzDataLakeGen2AclRecursive -Context $ctx -FileSystem $filesystemName -Path "$($item.Path)/" -Acl $acl
}
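As a quick sanity check, here is a sketch that re-reads the ACL of the first top-level folder after the loop to confirm the new group entry:
# Sketch, reusing $ctx, $filesystemName and $items from the script above.
(Get-AzDataLakeGen2Item -Context $ctx -FileSystem $filesystemName -Path "$($items[0].Path)/").ACL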

Transfer multiple files and folders to Azure storage

I have a storage account where I host a static website. I want to create a task that transfers my build files into that storage account.
$context = New-AzureStorageContext -StorageAccountName $env:prSourceBranchName -StorageAccountKey "e4Nt0i1pcsYWnzLKw4PRdwu+************/qMj7fXyJK6lS4YTlPCOdbFlEG2LN9g2i5/yQ=="
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
When I run the code, I can see the files that need to be uploaded, but when I check my $web blob container (created by the static website) it's empty.
Check out the second example at https://learn.microsoft.com/en-us/powershell/module/azure.storage/set-azurestorageblobcontent?view=azurermps-6.13.0.
Can someone explain why nothing is happening?
I want to do the equivalent of https://learn.microsoft.com/en-us/cli/azure/storage/blob?view=azure-cli-latest#az-storage-blob-upload-batch, but through AzureRM.
Remove ls from the last line of your code, which uploads to blob storage:
Change:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build ls -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
To:
Get-ChildItem -Path $env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build -File -Recurse | Set-AzureStorageBlobContent -Container $env:prSourceBranchName -Context $context
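If you also want the upload-batch behavior of preserving the folder structure in blob names, a minimal sketch along these lines may help (the relative-path handling is an assumption, not tested against this pipeline):
# Sketch: upload each file with a blob name relative to the build folder,
# mimicking az storage blob upload-batch; assumes $context from above.
$root = "$env:System_DefaultWorkingDirectory/_ClientWeb-Build-CI/ShellArtifact/out/build"
Get-ChildItem -Path $root -File -Recurse | ForEach-Object {
    $blobName = $_.FullName.Substring($root.Length).TrimStart('\', '/')
    Set-AzureStorageBlobContent -File $_.FullName -Container $env:prSourceBranchName -Blob $blobName -Context $context -Force
}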

Download multiple PowerShell files from an Azure Storage container as a zip

How can I download all the PowerShell files from an Azure storage account container into a zip folder using PowerShell cmdlets?
As of now, the cmdlet below downloads a specific blob by its name:
$blob = Get-AzureStorageBlobContent -Container hemac -Blob "CreateAndDeploy-Eventhubs.ps1" -Context $ctx -Destination $f -Force
First, set a folder into which all the blobs will be downloaded. Provide the full path to a directory you wish to use for the downloaded blobs:
$DestinationFolder = "<C:\DownloadedBlobs>"
Create the destination directory, then download all the .ps1 blobs from the container:
New-Item -Path $DestinationFolder -ItemType Directory -Force
Get-AzureStorageBlob -Container hemac -Context $ctx | Where-Object { $_.Name -like "*.ps1" } | Get-AzureStorageBlobContent -Destination $DestinationFolder
Now zip the entire folder.
$folderToZip = "C:\DownloadedBlobs"
$rootfolder = Split-Path -Path $folderToZip
$zipFile = Join-Path -Path $rootfolder -ChildPath "ZippedFile.zip"
Then compress the folder using Compress-Archive (see the docs):
Compress-Archive -Path $folderToZip -DestinationPath $zipFile -Verbose
The zipped file will be in the same directory as the download folder
