Downloading a file from SharePoint Online with PowerShell

I have a requirement to download files from a SharePoint Online document library using PowerShell.
I've managed to get to the point where the download should happen, but no luck.
I know it's something to do with how I am using the stream/writer.
Any hints would be greatly appreciated.
*Edit
No error messages are thrown, just 0-length files in my local directory.
$SPClient = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
$SPRuntime = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
$webUrl = Read-Host -Prompt "HTTPS URL for your SP Online 2013 site"
$username = Read-Host -Prompt "Email address for logging into that site"
$password = Read-Host -Prompt "Password for $username" -AsSecureString
$folder = "PoSHTest"
$destination = "C:\\test"
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($webUrl)
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
$web = $ctx.Web
$lists = $web.Lists.GetByTitle($folder)
$query = [Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery(10000)
$result = $lists.GetItems($query)
$ctx.Load($Lists)
$ctx.Load($result)
$ctx.ExecuteQuery()
#Edited the foreach as per @JNK
foreach ($File in $result) {
Write-host "Url: $($File["FileRef"]), title: $($File["FileLeafRef"]) "
$binary = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($ctx,$File["FileRef"])
$Action = [System.IO.FileMode]::Create
$new = $destination + "\\" + $File["FileLeafRef"]
$stream = New-Object System.IO.FileStream $new, $Action
$writer = New-Object System.IO.BinaryWriter($stream)
$writer.write($binary)
$writer.Close()
}

You could also utilize the WebClient.DownloadFile method by providing SharePoint Online credentials to download the resource from SharePoint Online, as demonstrated below.
Prerequisites
The SharePoint Online Client Components SDK has to be installed on the machine running the script.
How to download a file in SharePoint Online/O365 in PowerShell
Download-File.ps1 function:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
Function Download-File([string]$UserName, [string]$Password,[string]$FileUrl,[string]$DownloadPath)
{
if([string]::IsNullOrEmpty($Password)) {
$SecurePassword = Read-Host -Prompt "Enter the password" -AsSecureString
}
else {
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
}
$fileName = [System.IO.Path]::GetFileName($FileUrl)
$downloadFilePath = [System.IO.Path]::Combine($DownloadPath,$fileName)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")
$client.DownloadFile($FileUrl, $downloadFilePath)
$client.Dispose()
}
Usage
Download-File -UserName "username@contoso.onmicrosoft.com" -Password "password" -FileUrl "https://contoso.sharepoint.com/Shared Documents/SharePoint User Guide.docx" -DownloadPath "c:\downloads"

I was able to download the file successfully with the following relevant code snippet. You should be able to extend it for your situation.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$siteUrl = Read-Host -Prompt "Enter web URL"
$username = Read-Host -Prompt "Enter your username"
$password = Read-Host -Prompt "Enter password" -AsSecureString
$source = "/filepath/sourcefilename.dat" #server-relative URL here
$target = "C:/destinationfilename.dat" #path of the locally stored file
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
$ctx.Credentials = $credentials
[Microsoft.SharePoint.Client.FileInformation] $fileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($ctx,$source);
[System.IO.FileStream] $writeStream = [System.IO.File]::Open($target,[System.IO.FileMode]::Create);
$fileInfo.Stream.CopyTo($writeStream);
$writeStream.Close();
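Applied to the loop in the question, a minimal sketch of the same idea (untested; variable names taken from the question's code) would be:
foreach ($File in $result) {
    # OpenBinaryDirect returns a FileInformation object; copy its Stream instead of
    # passing the object to a BinaryWriter (which is what produces the 0-length files)
    $fileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($ctx, $File["FileRef"])
    $localPath = Join-Path $destination $File["FileLeafRef"]
    $writeStream = [System.IO.File]::Open($localPath, [System.IO.FileMode]::Create)
    $fileInfo.Stream.CopyTo($writeStream)
    $writeStream.Close()
}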

While the CSOM code above likely can be made to work, I find it easier to use the web client method.
(from http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell/)
I've used the code below to retrieve a bunch of files (metadata from CSOM queries) to a folder (using your $result collection; the other params should be adjusted a bit):
#$siteUrlString site collection url
#$outPath path to export directory
$siteUri = [Uri]$siteUrlString
$client = new-object System.Net.WebClient
$client.UseDefaultCredentials=$true
if ( -not (Test-Path $outPath) ) {
New-Item $outPath -Type Directory | Out-Null
}
$result |% {
$url = new-object Uri($siteUri, $_["FileRef"])
$fileName = $_["FileLeafRef"]
$outFile = Join-Path $outPath $fileName
Write-Host "Downloading $url to $outFile"
try{
$client.DownloadFile( $url, $outFile )
}
catch{
#one simple retry...
try{
$client.DownloadFile( $url, $outFile )
}
catch{
write-error "Failed to download $url, $_"
}
}
}
The trick here is the
$client.UseDefaultCredentials=$true
which will authenticate the webclient for you (as the current user).
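Note that for SharePoint Online, default credentials generally won't authenticate the request (which is what the asker ran into). A hedged variant, assuming the CSOM assemblies are loaded and $username/$password are the credentials from the question, is to give the WebClient SharePointOnlineCredentials plus the forms-auth header instead, as in the earlier answer:
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $password)
# Tells SharePoint Online not to redirect the request to the forms login page
$client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")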

The direct and almost shortest answer to the question is simply:
$url = 'https://the.server/path/to/the/file.txt'
$outfile = "$env:userprofile\file.txt"
Invoke-WebRequest -Uri $url -OutFile $outfile -Credential (Get-Credential)
This works at least in PowerShell 5.1...

So I gave up on this. It turned out to be much easier to write an SSIS script component to do the job.
I have awarded Soeren, as he posted some code that will work for regular websites but not sodding SharePoint Online.
Thanks, Soeren!

A short and easy approach to download a file from SharePoint Online, using just PowerShell and the SharePoint Online URL (no PnP PowerShell).
This approach can also be used to perform SharePoint REST queries, with just PowerShell and the SharePoint REST API.
# required MS dependencies
# feel free to download them from here https://www.microsoft.com/en-us/download/details.aspx?id=42038
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll' -ErrorAction Stop
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll' -ErrorAction Stop
# prepare passwords
$spCredential = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($user, $(ConvertTo-SecureString -AsPlainText $pass -Force))
# prepare and perform rest api query
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($targetSiteUrl)
$Context.Credentials = $spCredential
try {
#this may return an error, but still will finish context setup
$Context.ExecuteQuery()
}
catch {
write-host "TODO: fix executeQuery() err 400 bug" -ForegroundColor Yellow
}
$AuthenticationCookie = $Context.Credentials.GetAuthenticationCookie($targetSiteUrl, $true)
$WebSession = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$WebSession.Credentials = $Context.Credentials
$WebSession.Cookies.SetCookies($targetSiteUrl, $AuthenticationCookie)
$WebSession.Headers.Add("Accept", "application/json;odata=verbose")
Invoke-WebRequest -Uri $spFileUrl -OutFile $outputFilePath -WebSession $WebSession -errorAction Stop
Where
$outputFilePath is the target output file in which you want to save the remote file.
$targetSiteUrl is the target SP site URL.
$spFileUrl is the full URL of the SharePoint file.
$user is the plain-text SP user email.
$pass is the plain-text SP user password.
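For example (hypothetical values; set these variables before running the snippet above), the same $WebSession can then be reused for REST calls:
$targetSiteUrl  = "https://contoso.sharepoint.com/sites/demo"
$user           = "jdoe@contoso.onmicrosoft.com"
$pass           = "password"
$spFileUrl      = "$targetSiteUrl/Shared Documents/Guide.docx"
$outputFilePath = "C:\downloads\Guide.docx"
# Once the snippet above has built $WebSession, it can also issue REST queries, e.g.:
Invoke-WebRequest -Uri "$targetSiteUrl/_api/web/lists" -WebSession $WebSession -ErrorAction Stop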

Related

Do we have CSOM code to retrieve all lists under a site collection?

I am trying hard to get details of all the lists from a SharePoint site collection and export them to CSV.
I am trying to run it remotely from my machine and it doesn't work. I am not an expert CSOM guy, so I'm asking for help, please.
I heard from someone that CSOM can run from anywhere and is able to get info from a SharePoint 2013 server.
import-module C:\powershell\SharePointPnPPowerShell2013\3.16.1912.0\sharepointpnppowershell2013.psd1 -DisableNameChecking
$username = "testing#domain.com"
$password="Password"
$secureStringPwd = $Password | ConvertTo-SecureString -AsPlainText -Force
$Credentials = New-Object System.Management.Automation.PSCredential -ArgumentList $Username, $secureStringPwd
$connection=Connect-PnPOnline -Url https://cys.test.domain.com -Credentials $Credentials
$siteurl = "https://cys.test.domain.com/sites/testsite"
function Get-DocInventory([string]$siteUrl) {
$site = New-Object Microsoft.SharePoint.SPSite "https://cys.test.domain.com"
$web = Get-SPWeb "https://cys.test.domain.com/sites/testsite"
foreach ($list in $web.Lists) {
foreach ($item in $list.Items) {
#foreach($version in $item.Versions){
$data = @{
"List Name" = $list.Title
"Created By" = $item["Author"]
"Created Date" = $item["Created"]
"Modified By" = $item["Editor"]
"Modified Date" = $item["Modified"]
"Item Name" = $item.File.Name
"URL"=$web.Site.MakeFullUrl("$($web.ServerRelativeUrl.TrimEnd('/'))/$($item.Url)");
}
New-Object PSObject -Property $data | Select "List Name", "Item Name", "Created By", "Created Date", "Modified By", "Modified Date", "URL"
}
#}
$web.Dispose();
}
}
Get-DocInventory | Export-Csv -NoTypeInformation -Path "d:\test\test.csv"
If you are using SharePoint 2013 on-premises, install the SharePoint 2013 CSOM components first:
SharePoint Server 2013 Client Components SDK
Then use this script to get list details in a site collection, including subsites:
#Load SharePoint CSOM Assemblies
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
#Variables for Processing
$SiteUrl= "http://sp/sites/devtest"
$username="administrator"
$password="******"
$domain="Contoso"
#Setup Credentials to connect
$Credentials = New-Object System.Net.NetworkCredential($username, $password, $domain)
Try {
#Function to Get all lists from the web
Function Get-SPOList($Web)
{
#Get All Lists from the web
$Lists = $Web.Lists
$Context.Load($Lists)
$Context.ExecuteQuery()
#Get all lists from the web
ForEach($List in $Lists)
{
#Get the List Name
Write-host $List.Title
Write-host $List.Created
}
}
#Function to get all webs from given URL
Function Get-SPOWeb($WebURL)
{
#Set up the context
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($WebURL)
$Context.Credentials = $Credentials
$Web = $context.Web
$Context.Load($web)
#Get all immediate subsites of the site
$Context.Load($web.Webs)
$Context.executeQuery()
#Call the function to Get Lists of the web
Write-host "Processing Web :"$Web.URL
Get-SPOList $Web
#Iterate through each subsite in the current web
foreach ($Subweb in $web.Webs)
{
#Call the function recursively to process all subsites underneath the current web
Get-SPOWeb($SubWeb.URL)
}
}
#Call the function to get all sites
Get-SPOWeb $SiteUrl
}
catch {
write-host "Error: $($_.Exception.Message)" -foregroundcolor Red
}
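Since the question's goal is a CSV export, a minimal sketch of one way to wire that in (an assumption on my part, not part of the original script) is to collect the details into objects at script scope instead of using Write-Host, then export them:
$script:results = @()
# Inside Get-SPOList, replace the two Write-Host lines with something like:
#     $script:results += [PSCustomObject]@{ Web = $Web.Url; List = $List.Title; Created = $List.Created }
Get-SPOWeb $SiteUrl
$script:results | Export-Csv -NoTypeInformation -Path "C:\temp\lists.csv"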

Upload files via FTP with subdirectories using PowerShell

I have written code that has to upload multiple files to an Azure web app using PowerShell.
I want to upload the folder saved in the $appdirectory variable.
$appdirectory="C:\scriptfolder\*"
$webappname="myapitestapp1"
$xml = [xml](Get-AzureRmWebAppPublishingProfile -Name $webappname -ResourceGroupName sibs -OutputFile null)
$xml = [xml]$xml
$username = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userName").value
$password = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@userPWD").value
$url = $xml.SelectNodes("//publishProfile[@publishMethod=`"FTP`"]/@publishUrl").value
Set-Location $appdirectory
$webclient = New-Object -TypeName System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$files = Get-ChildItem -Path $appdirectory -Recurse
foreach ($file in $files)
{
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
$uri = New-Object System.Uri("$url/$relativepath")
if($file.PSIsContainer)
{
#$uri.AbsolutePath + "is Directory"
$ftprequest = [System.Net.FtpWebRequest]::Create($uri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
$response.StatusDescription
continue
}
"Uploading to " + $uri.AbsoluteUri
$webclient.UploadFile($uri, $file.FullName)
}
$webclient.Dispose()
It's copying the files from the subfolders and uploading them, but it is not uploading the root directory. I want to upload all folders and files using PowerShell.
The line
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
was wrong. Changing it to
$relativepath = (Resolve-Path -Path $file.FullName -Relative).Replace('\', '/')
worked and stopped hiding files.
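For illustration (hypothetical paths, not from the question), here is what the two expressions produce when the current location is the app directory:
# Assuming the current location is C:\scriptfolder and the file is C:\scriptfolder\css\site.css
Set-Location 'C:\scriptfolder'
$file = Get-Item 'C:\scriptfolder\css\site.css'
# Original expression strips the leading ".\" and yields "css/site.css"
(Resolve-Path -Path $file.FullName -Relative).Replace(".\", "").Replace('\', '/')
# Corrected expression keeps it and yields "./css/site.css"
(Resolve-Path -Path $file.FullName -Relative).Replace('\', '/')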

How to prevent SharePoint Office 365 online from adding most-viewed links to the navbar?

SharePoint Office 365 automatically adds links to the most viewed pages. How can I prevent it? It is hard to hide every newly appearing link from Site Settings > Navigation Settings. Thank you!
You could delete the Recent node from the left navigation via the SharePoint CSOM API.
Prerequisites: SharePoint Online Client Components SDK
The following example demonstrates how to delete the Recent node from the left navigation in PowerShell:
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Function Get-Context([string]$Url,[string]$Username,[string]$Password){
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($url)
$ctx.Credentials = $credentials
return $ctx
}
Function Delete-NavigationNode([Microsoft.SharePoint.Client.Web]$Web,[string]$NodeTitle){
$ctx = $Web.Context
$nodes = $Web.Navigation.QuickLaunch
$ctx.Load($nodes)
$ctx.ExecuteQuery()
$node = $nodes.GetEnumerator() | where { $_.Title -eq $NodeTitle } | Select -First 1
$node.DeleteObject()
$ctx.ExecuteQuery()
}
$Url = "https://contoso.sharepoint.com/"
$Username = "jdoe#contoso.onmicrosoft.com"
$Password = ""
$ctx = Get-Context -Url $Url -Username $Username -Password $Password
Delete-NavigationNode -Web $ctx.Web -NodeTitle "Recent"
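Note that if no node with the given title exists, $node will be $null and DeleteObject() will fail; a simple guard inside Delete-NavigationNode could be:
# Only delete when the node was actually found
if ($node -ne $null) {
    $node.DeleteObject()
    $ctx.ExecuteQuery()
}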

Use PowerShell to retrieve all subsites for SharePoint Online

I'm trying to write a PowerShell script to retrieve all site collections, their groups, and subsites from our SharePoint Online tenancy. So far I've been able to get all sites, their groups, and the users in each group. I'm just having trouble getting the subsites and identifying any lists with unique permissions (and retrieving those permissions, hopefully!)
So far I have this to get the lists:
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$password = Read-Host -Prompt "Enter password" -AsSecureString
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("username", $password)
$siteURL = "https://site/sites/test"
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.Credentials = $credentials
$web = $ctx.Web
$lists = $web.Lists
$ctx.Load($lists)
$ctx.ExecuteQuery()
foreach($list in $lists)
{
Write-Host $list.Title
}
and this to get site groups and users:
Connect-SPOService -Url https://site-admin.sharepoint.com
$sites = Get-SPOSite -Detailed
foreach ($site in $sites)
{
Write-Host $site.Title
$siteGroups = Get-SPOSiteGroup -Site $site.Url
foreach ($group in $siteGroups)
{
$users = Get-SPOUser -Site $site.Url -Group $group.Title -Limit All |ft -wrap
$url = $site.Url
$groupName = $group.Title
Write-Host $groupName + ' ' + $group.Users
}
}
So for the lists I tried using $list.HasUniqueRoleAssignments to determine if the list returned has unique permissions but it always seems to return false even if there are unique permissions.
For the subsites I can get a count of the subsites, but how do I get the URL?
Thanks in advance!
EDIT: So I've been able to get the URL of the subsites using the client context code I used to get the lists:
$subs = $web.Webs
$ctx.Load($subs)
$ctx.ExecuteQuery()
foreach($sub in $subs)
{
Write-Host $sub.Url
}
Still stuck with getting permissions though...
For SharePoint Online (SPO), Microsoft released the SharePoint Online Management Shell, which contains the Get-SPOSite cmdlet to return one or more site collections.
Example:
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$AdminUrl = "https://tenant-admin.sharepoint.com/"
$UserName = "username#tenant.onmicrosoft.com"
$Password = "password"
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
$Credentials = New-Object -TypeName System.Management.Automation.PSCredential -argumentlist $userName, $SecurePassword
#Retrieve all site collection infos
Connect-SPOService -Url $AdminUrl -Credential $Credentials
$sites = Get-SPOSite
How to retrieve all sites via CSOM in PowerShell
Since the SharePoint Online Management Shell does not contain any cmdlets for working with sites, we will utilize the CSOM API for that purpose. The function below retrieves all the sites in a site collection:
function Get-SPOWebs(){
param(
$Url = $(throw "Please provide a Site Collection Url"),
$Credential = $(throw "Please provide a Credentials")
)
$context = New-Object Microsoft.SharePoint.Client.ClientContext($Url)
$context.Credentials = $Credential
$web = $context.Web
$context.Load($web)
$context.Load($web.Webs)
$context.ExecuteQuery()
foreach($web in $web.Webs)
{
Get-SPOWebs -Url $web.Url -Credential $Credential
$web
}
}
Example: print all sites for a site collection in SPO
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
$UserName = "username#tenant.onmicrosoft.com"
$Password = "password"
$SecurePassword = $Password | ConvertTo-SecureString -AsPlainText -Force
$SPOCredentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserName, $SecurePassword)
$AllWebs = Get-SPOWebs -Url 'https://tenant.sharepoint.com' -Credential $SPOCredentials
$AllWebs | %{ Write-Host $_.Title }
By combining both techniques you could achieve the desired results:
#Retrieve all site collection infos
Connect-SPOService -Url $AdminUrl -Credential $Credentials
$sites = Get-SPOSite
#Retrieve and print all sites
foreach ($site in $sites)
{
Write-Host 'Site collection:' $site.Url
$AllWebs = Get-SPOWebs -Url $site.Url -Credential $SPOCredentials
$AllWebs | %{ Write-Host $_.Title }
Write-Host '-----------------------------'
}

Using credentials to run Get-ChildItem on another server

I'm working on a script that uses Get-ChildItem on another server, but I need to change it so it uses the credentials of the local account on the other server to do that. When I was just using Active Directory, I was saving the task in our scheduler with my AD login, and it was fine on the other server, using the UNC path. But we recently decided to change it to the local login there, and I'm getting an error message when trying to use net use. Does anyone know of a good way to do this with the UNC path instead? Or any idea why the following is giving an error message?
function GetSecureLogin(){
$global:username = "stuff"
$global:password = get-content C:\filename.txt | convertto-securestring
}
function Cleanup([string]$Drive) {
try {
$deleteTime = -42
$now = Get-Date
#this is saying cannot find path '\\name.na.xxx.net\20xServerBackup\V' (name truncated)
Get-ChildItem -Path $Drive -Recurse -Force |Where-Object {$_.LastWriteTime -lt $limit} | Remove-Item -Force
}
Catch{
Write-Host "Failed"
}
}
#####################start of script####################
$share = '\\name.na.xxx.net\20xServerBackup\'
$TheDrive = '\\name.na.xxx.net\20xServerBackup\VMs\'
$global:password = ""
$global:username = ""
GetSecureLogin
net use $share $global:password /USER:$global:username
[array]$DriveArray = @($TheDrive)
try{
$i=0
for ($i = $DriveArray.GetLowerBound(0); $i -le $DriveArray.GetUpperBound(0); $i++) {
$tempDrv = $DriveArray[$i]
Cleanup $tempDrv
}
}
catch [Exception] {
Write-Host $_.Exception.Message
}
As you can see, I started using the example at this link (powershell unc path cred) with net use, but it's not doing the trick for using credentials to access the other server.
I got it to work this way, with New-PSDrive as @robert.westerlund suggests above:
$DestPath = split-path "$Drive" -Parent #this gives the format without a slash at the end and makes PowerShell *very happy*
New-PSDrive -Name target -PSProvider FileSystem -Credential $global:cred -Root "$DestPath" | Out-Null
$temp1 = Get-ChildItem -Path target:\VMs\ -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit}
Get-ChildItem -Path $Drive -Recurse -Force | Where-Object { $_.LastWriteTime -lt $limit} | Remove-Item -Force
Remove-PSDrive target
I had to add the cred part like this too:
$global:cred = new-object -typename System.Management.Automation.PSCredential -argumentlist $global:username, $global:password
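Putting the pieces together, a minimal end-to-end sketch of this approach (assumptions on my part: $limit is the cutoff date implied by the question's $deleteTime = -42, and the share path is the one from the question) might look like:
GetSecureLogin   # populates $global:username / $global:password as in the question
$global:cred = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $global:username, $global:password
$limit = (Get-Date).AddDays(-42)
$DestPath = '\\name.na.xxx.net\20xServerBackup'   # parent share, no trailing slash

# Map the share with the supplied credentials, delete old files, then remove the mapping
New-PSDrive -Name target -PSProvider FileSystem -Credential $global:cred -Root $DestPath | Out-Null
Get-ChildItem -Path target:\VMs\ -Recurse -Force |
    Where-Object { $_.LastWriteTime -lt $limit } |
    Remove-Item -Force
Remove-PSDrive target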
