I have this PowerShell script I am trying to run, to export a 'bacpac' file from a database in an Azure tenancy to an on-prem (local) folder.
# Load SMO Assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
# Define the source database
$sourceServer = "myserver-sql-server"
$sourceDB = "mydb-sql-db"
# Define the target file
$targetFile = "c:\temp\mydb.bacpac"
# Connect to the source database
$sourceServer = New-Object Microsoft.SqlServer.Management.Smo.Server $sourceServer
$sourceDB = $sourceServer.Databases[$sourceDB]
# Export the database to the target file
$sourceDB.ExportBacpac($targetFile)
The error I am getting is on the last line...
You cannot call a method on a null-valued expression.
At line:2 char:1
+ $sourceDB.ExportBacpac($targetFile)
    + CategoryInfo          : InvalidOperation: (:) [], RuntimeException
    + FullyQualifiedErrorId : InvokeMethodOnNull
The variables have values. Am I missing a parameter when calling 'ExportBacpac'?
This script worked. Note that it authenticates with an explicit connection string and exports through Microsoft.SqlServer.Dac.DacServices, rather than calling ExportBacpac on an SMO Database object; in the original script the SMO Server object most likely never authenticated against the Azure server, so the Databases lookup returned nothing, hence the null error.
Function Get-Bacpacs {
    Param(
        [string]$location
        , [string]$server
        , [string]$smolibrary
        , [string]$daclibrary
        , [string]$username
        , [string]$password
    )
    Process
    {
        $dt = Get-Date -uFormat "%Y%m%d"
        Add-Type -Path $smolibrary
        $scon = "Data Source=$server.database.windows.net;Initial Catalog=master;User ID=$username;Password=$password;"
        $servercon = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
        $servercon.ConnectionString = $scon
        $srv = New-Object Microsoft.SqlServer.Management.SMO.Server($servercon)
        foreach ($db in $srv.Databases | Where-Object {$_.Name -ne "master"})
        {
            $database = $db.Name
            $bak_scon = "Data Source=$server.database.windows.net;Initial Catalog=$database;Connection Timeout=0;User ID=$username;Password=$password;"
            if (!(Test-Path $location))
            {
                New-Item $location -ItemType Directory
            }
            $outfile = $location + $database + "_" + $dt + ".bacpac"
            Add-Type -Path $daclibrary
            $d_exbac = New-Object Microsoft.SqlServer.Dac.DacServices $bak_scon
            try
            {
                $d_exbac.ExportBacpac($outfile, $database)
            }
            catch
            {
                Write-Warning $_
                ### Other alerting can go here
            }
        }
    }
}
### This makes it easier for us who don't have really long screens! Location may vary.
$smo = "C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll"
$dac = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"
Get-Bacpacs -location "" -server "" -smolibrary $smo -daclibrary $dac -username "" -password ""
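For the single-database case in the original question, the same DacServices approach collapses to a few lines. Here is a minimal sketch, where the DAC DLL path, server name, user, and password are placeholders to adapt (the DLL location varies by SQL Server version):
# Minimal sketch: export one Azure SQL database to a local .bacpac (placeholders throughout).
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"
$conn = "Data Source=myserver-sql-server.database.windows.net;Initial Catalog=mydb-sql-db;User ID=myuser;Password=mypassword;"
$dacServices = New-Object Microsoft.SqlServer.Dac.DacServices $conn
$dacServices.ExportBacpac("c:\temp\mydb.bacpac", "mydb-sql-db")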
Hi, I am trying to upload published files from Azure Git artifacts to FTP, but I am randomly getting the error below.
It only happens for files, not for folders or subfolders.
Exception calling "UploadFile" with "2" argument(s): "The Content-Type header cannot be set to a multipart type for this request."
All the files are available in the artifacts.
Observations & tried:
All files are present in the artifacts
Getting the error only for files (folders and subfolders are created successfully)
Stopped the Web App and then tried to upload
Tried a Sleep between uploading two files
After all of these, the result is the same.
Can anyone please help me out?
Get-ChildItem -Path $target_directory
# Upload files recursively
write-host "target_directory - $target_directory"
Set-Location $target_directory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username, $password)
$files = Get-ChildItem -Path $target_directory -Recurse
$disputedFiles = @()      # files that failed the first upload attempt
$disputedFilesList = @()  # name/path records for the retry pass
write-host 'Uploading started............................'
foreach ($file in $files)
{
    write-host "file -" $file.FullName
    # Note: -match takes a regex, so the dot and backslash need escaping
    if ($file.FullName -match "web\.config" -or $file.FullName -match "\.pdb" -or $file.FullName -match "roslyn" -or $file.FullName -match "obj\\Debug") {
        write-host "ignoring" $file.FullName
        continue
    }
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    write-host "uri -" $uri.AbsoluteUri
    write-host '--------'
    if ($file.PSIsContainer)
    {
        write-host 'PSIsContainer - All dir/files within' $file
        get-childitem -path $file -Recurse
        try
        {
            $makeDirectory = [System.Net.WebRequest]::Create($uri);
            $makeDirectory.Credentials = New-Object System.Net.NetworkCredential($username, $password);
            $makeDirectory.Method = [System.Net.WebRequestMethods+FTP]::MakeDirectory;
            $makeDirectory.GetResponse();
            # folder created successfully
        }
        catch [Net.WebException]
        {
            try
            {
                # if there was an error returned, check if the folder already existed on the server
                $checkDirectory = [System.Net.WebRequest]::Create($uri);
                $checkDirectory.Credentials = New-Object System.Net.NetworkCredential($username, $password);
                $checkDirectory.Method = [System.Net.WebRequestMethods+FTP]::PrintWorkingDirectory;
                $response = $checkDirectory.GetResponse();
                $response.StatusDescription
                # folder already exists!
            }
            catch [Net.WebException]
            {
                # if the folder didn't exist, then it's probably a file perms issue, incorrect credentials, dodgy server name etc
            }
        }
        continue
    }
    try {
        write-host "Uploading to " $uri.AbsoluteUri " from " $file.FullName
        $webclient.UploadFile($uri, $file.FullName)
        start-sleep -Seconds 1
    }
    catch {
        ##[error]Error message
        Write-Host "##[error]Error in uploading " $file -ForegroundColor red
        Write-Host $_
        $disputedFiles += $file.FullName
        $_file = @{
            FileName = $file.FullName
            FTPPath  = $uri.AbsoluteUri
        }
        $_o = New-Object psobject -Property $_file;
        $disputedFilesList += $_o
    }
}
Write-Host "##[debug] Starting uploading the disputed files....."
foreach ($file in $disputedFilesList) {
    try {
        write-host "Uploading to " $file.FTPPath " from " $file.FileName
        $webclient.UploadFile($file.FTPPath, $file.FileName)
        start-sleep -Seconds 1
    }
    catch {
        write-host "##[error]Error(2) in uploading to " $file.FTPPath " from " $file.FileName
    }
}
Write-Host "##[debug] Ending uploading the disputed files....."
remove-item -path $target_directory\* -Recurse
get-childitem -path $target_directory
write-host "Directory Empty after cleanup"
$webclient.Dispose()
I'm writing a multithreaded script that was leveraging runspaces to multithread and save time, until I had to use the Exchange module. The Exchange module takes so long to load that the script becomes almost as slow as, or slower than, synchronous processing.
Below is the code I am working with; to repro, you'll need to populate the servers array and the Exchange URI. At the bottom, in quotes, you can see the errors I'm receiving, indicating that the Exchange module is not being loaded and the Exchange cmdlets are not accessible.
Is there a way to copy the module to each new runspace and initialize it only once, or another way to significantly save time and reap the benefits of multiple runspaces with Exchange PowerShell?
Get-Module | Remove-Module;
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew(); # timing harness, read at the end
#Region Initial sessionstate
$servers = @("XXXXX", "XXXXX", "XXXXX");
$uri = 'https://XXXXX/powershell?ExchClientVer=15.1';
$session = new-pssession -ConfigurationName Microsoft.Exchange -ConnectionUri $uri -Authentication Negotiate;
$sessionImported = Import-PSSession $session -AllowClobber;
$sessionState = [System.Management.Automation.Runspaces.InitialSessionState]::Create($sessionImported);
$modules = Get-Module;
foreach ($module in $modules)
{
    if ($module.ExportedCommands.Keys -contains "get-mailbox")
    {
        $exchangeModule = $module;
    }
}
$initialSessionState = [initialsessionstate]::CreateDefault()
$initialSessionState.ImportPSModule($exchangeModule)
#EndRegion Initial sessionstate
$RunspacePool = [runspacefactory]::CreateRunspacePool($initialSessionState);
$RunspacePool.Open();
$threads = New-Object System.Collections.ArrayList; # initialized once, outside the loop
foreach ($server in $servers)
{
    $PowerShell = [powershell]::Create($sessionState);
    $PowerShell.RunspacePool = $RunspacePool;
    [void]$PowerShell.AddScript({
        Param ($server)
        [pscustomobject]@{
            server = $server
        } | Out-Null
        Set-ADServerSettings -ViewEntireForest:$true;
        $mbs = Get-MailboxDatabase -Server $server -Status;
        return $mbs;
    }) # end of add script
    $PowerShell.AddParameter('server', $server) | Out-Null;
    $returnVal = $PowerShell.BeginInvoke();
    $temp = "" | Select PowerShell,returnVal;
    $temp.PowerShell = $PowerShell;
    $temp.returnVal = $returnVal;
    $threads.Add($Temp) | Out-Null;
} #foreach server
$completed = $false;
$threadsCompleted = New-Object System.Collections.ArrayList;
while ($completed -eq $false)
{
    $completed = $true;
    $threadsCompleted.Clear();
    foreach ($thread in $threads)
    {
        $endInvoke = $thread.PowerShell.EndInvoke($thread.returnVal); # note: EndInvoke blocks until this thread completes
        #$endInvoke;
        $threadHandle = $thread.returnVal.AsyncWaitHandle.Handle;
        $threadIsCompleted = $thread.returnVal.IsCompleted;
        if ($threadIsCompleted -eq $false)
        {
            $completed = $false;
        }
        else
        {
            $threadsCompleted.Add($threadHandle) | Out-Null;
        }
    }
    Write-Host "Threads completed count: " $threadsCompleted.Count;
    foreach ($threadCompleted in $threadsCompleted)
    {
        Write-Host "Completed thread handle -" $threadCompleted;
    }
    #Write-Host "";
    sleep -Milliseconds 1000;
} # while end
Write-Host "Seconds elapsed:" $stopwatch.Elapsed.TotalSeconds;
Write-Host "";
$temp.PowerShell.Streams.Error;
return;
Set-ADServerSettings : The term 'Set-ADServerSettings' is not recognized as the name of a cmdlet, function, script
file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct
and try again.
At line:9 char:3
+ Set-ADServerSettings -ViewEntireForest:$true;
+ ~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (Set-ADServerSettings:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException

Get-MailboxDatabase : The term 'Get-MailboxDatabase' is not recognized as the name of a cmdlet, function, script file,
or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and
try again.
At line:10 char:10
+ $mbs = Get-MailboxDatabase -Server $server -Status;
+        ~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (Get-MailboxDatabase:String) [], CommandNotFoundException
    + FullyQualifiedErrorId : CommandNotFoundException
So the answer is yes. I found out that you can use a method on InitialSessionState called ImportPSModule (which can accept a known module name for a static module, or a file path to a dynamic or custom module). The tricky thing with Exchange is that the module is created dynamically. There is hope, though: if you capture the module the way I do below, it has a property called "Path" which gives you the local file path where the module is temporarily stored on the machine after it is created, the first time during that particular session.
I also discovered that you can specify a list of commands you want to import (commented out below), which saves an enormous amount of time when importing the Exchange module if you only need a few commands. The progress bar for the Exchange command import only flashes when you filter to a few commands; if you were to blink, you'd miss it. If you use the command-filtering code, make sure to import the Get-Mailbox command, as that's how I identify which module is the on-prem Exchange one, and configure the URI for your environment.
The benchmark for the single-threaded version of this was 106 seconds; the fastest I've seen my code run with runspaces and multithreading is 9.8 seconds, so the potential processing-time savings are significant.
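For reference, the timing numbers above come from wrapping the whole run in a stopwatch, the same pattern the repro code in the question uses (a sketch; my actual timing code is not shown here):
$stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
# ... session import, runspace pool setup, thread dispatch and collection ...
Write-Host "Seconds elapsed:" $stopwatch.Elapsed.TotalSeconds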
$WarningPreference = "SilentlyContinue";
Get-Module | Remove-Module;
#Region Initial sessionstate
$uri = 'https://xxxxx/powershell?ExchClientVer=15.1';
$session = new-pssession -ConfigurationName Microsoft.Exchange -ConnectionUri $uri -Authentication Negotiate;
Import-PSSession $session -AllowClobber > $null;
#Import-PSSession $session -AllowClobber -CommandName Get-Mailbox, Get-MailboxDatabaseCopyStatus, Get-MailboxDatabase, Get-DatabaseAvailabilityGroup > $null;
$modules = Get-Module;
foreach ($module in $modules)
{
    if ($module.ExportedCommands.Keys -contains "get-mailbox")
    {
        $exchangeModule = $module;
    }
}
$maxThreads = 17;
$iss = [system.management.automation.runspaces.initialsessionstate]::CreateDefault()
[void]$iss.ImportPSModule($exchangeModule.Path)
$runspacePool = [runspacefactory]::CreateRunspacePool(1, $maxThreads, $iss, $host)
$runspacePool.Open()
#Region Get dag members
$PowerShell = [powershell]::Create($iss); # may not need a runspace here
$PowerShell.RunspacePool = $RunspacePool;
$PowerShell.AddCommand("get-databaseavailabilitygroup") | Out-Null;
$PowerShell.AddParameter("identity", $DagName) | Out-Null;
$returnVal = $PowerShell.Invoke(); # writes to screen
$servers = New-Object System.Collections.ArrayList;
$serversUnsorted = $returnVal.Servers | sort;
foreach ($server in $serversUnsorted)
{
    $servers.Add($server) | Out-Null;
}
#EndRegion Get dag members
#EndRegion Initial sessionstate
$threads = New-Object System.Collections.ArrayList; # collects the handles polled later
foreach ($server in $servers)
{
    $PowerShell = [powershell]::Create($iss);
    $PowerShell.RunspacePool = $RunspacePool;
    [void]$PowerShell.AddScript({
        Param ($server)
        [pscustomobject]@{
            server = $server
        } | Out-Null
        Set-ADServerSettings -ViewEntireForest:$true;
        $copyStatus = Get-MailboxDatabaseCopyStatus -Server $server;
        return $copyStatus;
    }) # end of add script
    $PowerShell.AddParameter('server', $server) | Out-Null;
    $returnVal = $PowerShell.BeginInvoke();
    $temp = "" | Select PowerShell,returnVal;
    $temp.PowerShell = $PowerShell;
    $temp.returnVal = $returnVal;
    $threads.Add($Temp) | Out-Null;
} #foreach server
# ... results are then collected with the EndInvoke polling loop from the question ...
I have written code that has to upload multiple files to an Azure web app using PowerShell.
I want to upload the folder saved in the $appdirectory variable.
$appdirectory="C:\scriptfolder\*"
$webappname="myapitestapp1"
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name $webappname -ResourceGroupName sibs -OutputFile null)
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse
foreach ($file in $files)
{
    $relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
    $uri = New-Object System.Uri("$url/$relativepath")
    if ($file.PSIsContainer)
    {
        #$uri.AbsolutePath + " is Directory"
        $ftprequest = [System.Net.FtpWebRequest]::Create($uri);
        $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
        $ftprequest.UseBinary = $true
        $ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
        $response = $ftprequest.GetResponse();
        $response.StatusDescription
        continue
    }
    "Uploading to " + $uri.AbsoluteUri
    $webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
It copies and uploads the files from the subfolders, but it does not upload the files in the root directory. I want to upload all folders and files using PowerShell.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
was wrong.
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace('\', '/')
This worked and stopped the root-level files from being skipped.
I'm not sure how to debug this, assuming it's not a problem with the cmdlet. I'm trying to replace the automated SQL export with an automation workflow, but I can't get Start-AzureSqlDatabaseExport to work; it keeps producing the following warning and error messages.
d4fc0004-0c0b-443e-ad1b-310af7fd4e2a:[localhost]:Client Session Id: 'c12c92eb-acd5-424d-97dc-84c4e9c4f914-2017-01-04
19:00:23Z'
d4fc0004-0c0b-443e-ad1b-310af7fd4e2a:[localhost]:Client Request Id: 'd534f5fd-0fc0-4d68-8176-7508b35aa9d8-2017-01-04
19:00:33Z'
Start-AzureSqlDatabaseExport : Object reference not set to an instance of an object.
At DBBackup:11 char:11
+
+ CategoryInfo : NotSpecified: (:) [Start-AzureSqlDatabaseExport], NullReferenceException
+ FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseExport
This seems similar to some other questions, but those seem to be unanswered or not applicable. I did have a similar procedure working in the PowerShell environment; I replaced that procedure with the automated export from Azure, which seems like a poor choice now! I've tried a number of variations, for example using sqlcontext and databasename instead of database.
Here's my code with sensitive parts replaced with ****:
workflow DBBackup {
    param(
        [parameter(Mandatory=$true)]
        [string] $dbcode
    )
    $cred = Get-AutomationPSCredential -Name "admindbcredentials"
    $VerbosePreference = "Continue"
    inlineScript {
        $dbcode = $using:dbcode
        $cred = $using:cred
        if ($dbcode -eq $null)
        {
            Write-Output "Database code must be specified"
        }
        Else
        {
            $dbcode = $dbcode.ToUpper()
            $dbsize = 1
            $dbrestorewait = 10
            $dbserver = "kl8p7d444a"
            $stacct = $dbcode.ToLower()
            $stkey = "***storagekey***"
            Write-Verbose "DB Server '$dbserver' DB Code '$dbcode'"
            Write-Verbose "Storage Account '$stacct'"
            $url = "https://$dbserver.database.windows.net"
            $sqlctx = New-AzureSqlDatabaseServerContext -ManageUrl $url -Credential $cred
            $stctx = New-AzureStorageContext -StorageAccountName $stacct -StorageAccountKey $stkey
            $dbname = "FSUMS_" + $dbcode
            $dt = Get-Date
            $timestamp = $dt.ToString("yyyyMMdd") + "_" + $dt.ToString("HHmmss")
            $bkupname = $dbname + "_" + $timestamp + ".bacpac"
            $stcon = Get-AzureStorageContainer -Context $stctx -Name "backups"
            $db = Get-AzureSqlDatabase -Context $sqlctx -DatabaseName $dbname
            Write-Verbose "Backup $dbname to $bkupname in storage account $stacct"
            Start-AzureSqlDatabaseExport $sqlctx -DatabaseName $dbname -StorageContainer $stcon -BlobName $bkupname
        }
    }
}
Try New-AzureRmSqlDatabaseExport instead. The command returns an export-status object; if you want a synchronous export, you can check the export status in a loop.
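A minimal sketch of that pattern, where the resource group, server, database, storage, and credential values are placeholders to adapt to your environment:
# Kick off the export (all names and keys below are placeholders).
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName "MyRG" `
    -ServerName "myserver" -DatabaseName "FSUMS_XXXX" `
    -StorageKeyType "StorageAccessKey" -StorageKey "***storagekey***" `
    -StorageUri "https://mystorage.blob.core.windows.net/backups/mydb.bacpac" `
    -AdministratorLogin "admin" `
    -AdministratorLoginPassword (ConvertTo-SecureString "***" -AsPlainText -Force)
# Poll the returned status link until the export finishes, for synchronous behavior.
do {
    Start-Sleep -Seconds 10
    $status = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    Write-Output $status.Status
} while ($status.Status -eq "InProgress")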
Adding the following lines corrected the problem:
In the workflow before inlineScript:
$cred = Get-AutomationPSCredential -Name "admincredentials"
(where admincredentials was an asset with my admin login credentials)
and inside the inlineScript:
Add-AzureAccount $cred
Select-AzureSubscription "My subscription"
Some runbooks don't seem to need this, but it's probably best to always include it.
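Putting it together, the placement looks like this (a sketch; I use a distinct $admincred variable here, rather than reusing $cred as above, so the Azure login does not clobber the database credential the workflow already fetches):
workflow DBBackup {
    param([parameter(Mandatory=$true)][string] $dbcode)
    # Before the inlineScript: fetch both credential assets.
    $cred = Get-AutomationPSCredential -Name "admindbcredentials"    # DB login
    $admincred = Get-AutomationPSCredential -Name "admincredentials" # Azure login
    inlineScript {
        # Inside the inlineScript: authenticate the Azure session first.
        Add-AzureAccount -Credential $using:admincred
        Select-AzureSubscription "My subscription"
        # ... the export logic from the question goes here ...
    }
}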
I have a requirement to download files from a SharePoint Online document library using PowerShell.
I've managed to get to the point where the download should happen, but no luck.
I know it's something to do with how I am using the stream/writer.
Any hints would be greatly appreciated.
*Edit
No error messages are thrown, just 0-length files in my local directory.
$SPClient = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
$SPRuntime = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$webUrl = Read-Host -Prompt "HTTPS URL for your SP Online 2013 site"
$username = Read-Host -Prompt "Email address for logging into that site"
$password = Read-Host -Prompt "Password for $username" -AsSecureString
$folder = "PoSHTest"
$destination = "C:\test"
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
$web = $ctx.Web
$lists = $web.Lists.GetByTitle($folder)
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(10000)
$result = $lists.GetItems($query)
$ctx.Load($Lists)
$ctx.Load($result)
$ctx.ExecuteQuery()
# Edited the foreach as per @JNK
foreach ($File in $result) {
    Write-host "Url: $($File["FileRef"]), title: $($File["FileLeafRef"]) "
    $binary = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($ctx, $File["FileRef"])
    $Action = [System.IO.FileMode]::Create
    $new = $destination + "\" + $File["FileLeafRef"]
    $stream = New-Object System.IO.FileStream $new, $Action
    $writer = New-Object System.IO.BinaryWriter($stream)
    $writer.write($binary)
    $writer.Close()
}
You could also utilize the WebClient.DownloadFile method, providing SharePoint Online credentials, to download the resource from SharePoint Online as demonstrated below.
Prerequisites
The SharePoint Online Client Components SDK has to be installed on the machine running the script.
How to download a file in SharePoint Online/O365 in PowerShell
Download-File.ps1 function:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
Function Download-File([string]$UserName, [string]$Password,[string]$FileUrl,[string]$DownloadPath)
{
if([string]::IsNullOrEmpty($Password)) {
$SecurePassword = Read-Host -Prompt "Enter the password" -AsSecureString
}
else {
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
}
$fileName = [System.IO.Path]::GetFileName($FileUrl)
$downloadFilePath = [System.IO.Path]::Combine($DownloadPath,$fileName)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")
$client.DownloadFile($FileUrl, $downloadFilePath)
$client.Dispose()
}
Usage
Download-File -UserName "username@contoso.onmicrosoft.com" -Password "password" -FileUrl "https://contoso.sharepoint.com/Shared Documents/SharePoint User Guide.docx" -DownloadPath "c:\downloads"
I was able to download the file successfully with the following relevant code snippet. You should be able to extend it for your situation.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$siteUrl = Read-Host -Prompt "Enter web URL"
$username = Read-Host -Prompt "Enter your username"
$password = Read-Host -Prompt "Enter password" -AsSecureString
$source = "/filepath/sourcefilename.dat" #server relative URL here
$target = "C:/detinationfilename.dat" #URI of the file locally stored
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
$ctx.Credentials = $credentials
[Microsoft.SharePoint.Client.FileInformation] $fileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($ctx,$source);
[System.IO.FileStream] $writeStream = [System.IO.File]::Open($target,[System.IO.FileMode]::Create);
$fileInfo.Stream.CopyTo($writeStream);
$writeStream.Close();
While the CSOM code above likely can be made to work, I find it easier to use the WebClient method.
(from http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell/)
I've used the code below to retrieve a bunch of files (metadata from CSOM queries) to a folder (it uses your $result collection; other params should be adjusted a bit):
#$siteUrlString site collection url
#$outPath path to export directory
$siteUri = [Uri]$siteUrlString
$client = new-object System.Net.WebClient
$client.UseDefaultCredentials = $true
if ( -not (Test-Path $outPath) ) {
    New-Item $outPath -Type Directory | Out-Null
}
$result | % {
    $url = new-object Uri($siteUri, $_["FileRef"])
    $fileName = $_["FileLeafRef"]
    $outFile = Join-Path $outPath $fileName
    Write-Host "Downloading $url to $outFile"
    try {
        $client.DownloadFile( $url, $outFile )
    }
    catch {
        # one simple retry...
        try {
            $client.DownloadFile( $url, $outFile )
        }
        catch {
            write-error "Failed to download $url, $_"
        }
    }
}
The trick here is the
$client.UseDefaultCredentials = $true
which authenticates the WebClient for you (as the current user).
The direct and almost shortest answer to the question is simply:
$url = 'https://the.server/path/to/the/file.txt'
$outfile = "$env:userprofile\file.txt"
Invoke-WebRequest -Uri $url -OutFile $outfile -Credential (Get-Credential)
This works at least in PowerShell 5.1...
So I gave up on this. It turned out to be much easier to write an SSIS script component to do the job.
I have awarded Soeren as he posted some code that will work for regular websites, but not for sodding SharePoint Online.
Thanks Soeren!
A short and easy approach to download a file from SharePoint Online, using just PowerShell and the SharePoint Online URL (no PnP PowerShell).
This approach can also be used to perform SharePoint REST queries, with just PowerShell and the SharePoint REST API.
# required MS dependencies
# feel free to download them from here https://www.microsoft.com/en-us/download/details.aspx?id=42038
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll' -ErrorAction Stop
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll' -ErrorAction Stop
# prepare passwords
$spCredential = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($user, $(ConvertTo-SecureString -AsPlainText $pass -Force))
# prepare and perform rest api query
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($targetSiteUrl)
$Context.Credentials = $spCredential
try {
    # this may return an error, but will still finish the context setup
    $Context.ExecuteQuery()
}
catch {
    write-host "TODO: fix executeQuery() err 400 bug" -ForegroundColor Yellow
}
$AuthenticationCookie = $Context.Credentials.GetAuthenticationCookie($targetSiteUrl, $true)
$WebSession = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$WebSession.Credentials = $Context.Credentials
$WebSession.Cookies.SetCookies($targetSiteUrl, $AuthenticationCookie)
$WebSession.Headers.Add("Accept", "application/json;odata=verbose")
Invoke-WebRequest -Uri $spFileUrl -OutFile $outputFilePath -WebSession $WebSession -errorAction Stop
Where:
$outputFilePath is the target output file in which you want to save the remote file.
$targetSiteUrl is the target SP site url.
$spFileUrl is the "[sharepoint file full url]".
$user is the plain-text SP user email.
$pass is the plain-text SP user password.
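A hypothetical set of values for those variables, to show the expected shapes (all placeholders; adjust to your tenant):
# Example placeholder values.
$targetSiteUrl  = "https://contoso.sharepoint.com/sites/team"
$spFileUrl      = "$targetSiteUrl/Shared Documents/report.xlsx"
$outputFilePath = "C:\downloads\report.xlsx"
$user = "user@contoso.onmicrosoft.com"
$pass = "password"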