WinSCP - How to download only folders/files 1 day old while excluding empty folders/files? [duplicate]

I am limited to PuTTY and WinSCP only.
I am trying to download log directories with log files. For example, I want to grab all log_files 6 days old or newer. log_dir2 and log_dir3, including the folders themselves, match the criteria, while log_dir1 and its files do not.
DIR/log_dir1/log_files % older than 6 days
DIR/log_dir2/log_files % meets criteria
DIR/log_dir3/log_files % meets criteria
My problem is that while the log_files of log_dir1 are not downloaded, the syntax I am currently using downloads the log_dir1 folder. Normally, not a big deal, but we are talking hundreds of log_dir folders (all empty as the files are older than 6 days). For reasons beyond my control, I cannot move or archive these old log directories with their log files.
My question is simply: how do I change my syntax so that folders older than 6 days are ignored, as well as their files?
get -filemask="*>6D" /DIR/* C:\temp
I have tried several different combinations of parameters and I have read the support page about Directory Masks and Path Masks. I cannot get any of them working (version issue?). Can anyone explain the syntax better than the help page does? I will update tomorrow with the current version of WinSCP that I am using.

The time constraint in a WinSCP file mask cannot be applied to directories.
But you can prevent WinSCP from creating the empty folders. Use the -rawtransfersettings switch with the ExcludeEmptyDirectories setting.
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
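If it helps, this is a rough sketch of how that command could be embedded in a winscp.com script file (the host name, credentials, and host key below are placeholders, not part of the original answer):
open sftp://username:password@example.com/ -hostkey="..."
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
exit
You would then run it with something like winscp.com /ini=nul /script=download.txt.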
This is the original answer, before WinSCP supported ExcludeEmptyDirectories. It might still be useful as a basis for implementations that have even more specific constraints.
You can implement this custom logic easily in a PowerShell script using the WinSCP .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "..."
}

$remotePath = "/remote/path"
$localPath = "C:\local\path"
$limit = (Get-Date).AddDays(-6)

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Enumerate files to download
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, $Null, [WinSCP.EnumerationOptions]::AllDirectories) |
    Where-Object { $_.LastWriteTime -gt $limit }

foreach ($fileInfo in $fileInfos)
{
    $localFilePath =
        [WinSCP.RemotePath]::TranslateRemotePathToLocal(
            $fileInfo.FullName, $remotePath, $localPath)

    # If the corresponding local folder does not exist yet, create it
    $localFileDir = Split-Path -Parent $localFilePath
    if (!(Test-Path -Path $localFileDir))
    {
        Write-Host "Creating local directory $localFileDir..."
        New-Item $localFileDir -ItemType directory | Out-Null
    }

    Write-Host "Downloading file $($fileInfo.FullName)..."
    # Download file
    $sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
    $transferResult = $session.GetFiles($sourcePath, $localFilePath)

    # Did the download succeed?
    if (!$transferResult.IsSuccess)
    {
        # Print error (but continue with other files)
        Write-Host ("Error downloading file $($fileInfo.FullName): " +
            $transferResult.Failures[0].Message)
    }
}

$session.Dispose()

Write-Host "Done."
Run the script (download.ps1) like:
powershell.exe -ExecutionPolicy Unrestricted -File download.ps1

Related

Recursively Report Folder statistics in ADLS Gen 1 - CLI or Powershell

I am trying to loop through the directory structure of an ADLS filesystem, reporting each directory's size and format until no further directory exists.
Since the folder structure contains TBs of data, I thought using Azure CLI with PowerShell would be the most efficient way to go about this.
But Gen 1's CLI command is limited to:
az dls fs list --account <storage_account> --path <path of the file>
Can I use this command in PowerShell to further loop through the rest of the folder structure? Or is there another way to loop through ADLS's folder structure without using the CLI command?
I am a newbie to Azure CLI, so apologies if this question is not very advanced.
We have tested this in our local environment; the observations below are based on our analysis.
You cannot get the accumulated directory size directly, irrespective of which tool you use (e.g. the portal, Storage Explorer, or PowerShell cmdlets).
If you use the Get-AzDataLakeStoreChildItemSummary cmdlet, it will give you the directory count (the number of directories in the container), the file count (the total number of files across all subdirectories), and the length (the total size consumed by all the files in the directories and subdirectories).
$listdir = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /
$listarray = $listdir
foreach ($listdir in $listarray)
{
    Write-Output ("Currently listing the summary for " + $listdir.Name);
    Get-AzDataLakeStoreChildItemSummary -Account '<DataLakeAccountName>' -Path /$($listdir.Name)
}
In the above screenshot, the directories named tried and testing have the value 0 since they do not contain any files or subdirectories.
The value for the directory trim is 6 since it has 2 files under the track1 subdirectory and 4 files under the jshfa subdirectory.
Alternatively, you can use the script below if you want to list the files under each subdirectory with their respective sizes.
$listdir = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /
$listarray = $listdir
foreach ($listdir in $listarray)
{
    Write-Output ("Currently listing the files under " + $listdir.Name + "; length of the directory is: " + $listdir.Length);
    $addvar = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /$($listdir.Name) | Select-Object -Property Length, Name, Type;
    $subdirect = $addvar;
    foreach ($addvar in $subdirect)
    {
        if ($addvar.Type -eq 'DIRECTORY')
        {
            $subfolderlist = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /$($listdir.Name)/$($addvar.Name);
            Write-Output $subfolderlist
        }
    }
    Write-Output $addvar
}
Below is the output of the above script. Here trim has 2 subdirectories (track1, track234); since track1 has two files, those are listed here, while the remaining directories and subdirectories do not contain any files, so their length value in the output below is 0.
Note:
If you have multiple levels of folders under a subdirectory, you need to append the directory path in the foreach loop each time and adjust the script accordingly.
Also, both the Azure CLI and the PowerShell cmdlets depend on the given -Path parameter.
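If you need to go deeper than one or two levels, a recursive function is one way to walk the whole tree. The sketch below is only an illustration (the account name, starting path, and output format are placeholders, not part of the original answer):
# Recursively walk an ADLS Gen1 tree and report the accumulated size per directory
function Get-AdlsDirectorySize
{
    param(
        [string] $Account,
        [string] $Path
    )

    [long] $total = 0
    foreach ($item in Get-AzDataLakeStoreChildItem -Account $Account -Path $Path)
    {
        $childPath = $Path.TrimEnd('/') + '/' + $item.Name
        if ($item.Type -eq 'DIRECTORY')
        {
            # add the accumulated size of the subdirectory
            $total += Get-AdlsDirectorySize -Account $Account -Path $childPath
        }
        else
        {
            # add the size of the file itself
            $total += $item.Length
        }
    }

    Write-Host ("{0} : {1} bytes" -f $Path, $total)
    return $total
}

Get-AdlsDirectorySize -Account '<DataLakeAccountName>' -Path '/' | Out-Null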

Blazor Wasm PWA IIS Deployment integrity error

I created a new Blazor WebAssembly PWA project (latest version, default template) and deployed it to IIS on Windows Server to try out PWA.
I installed the latest .NET Core Hosting Bundle.
After publishing it, I ran the script from the Microsoft Docs to rename the DLL files:
dir .\_framework\_bin | Rename-Item -NewName { $_.Name -replace ".dll\b",".bin" }
((Get-Content .\_framework\blazor.boot.json -Raw) -replace '.dll"','.bin"') | Set-Content .\_framework\blazor.boot.json
And the serviceworker renaming code too:
((Get-Content .\service-worker-assets.js -Raw) -replace '.dll"','.bin"') | Set-Content .\service-worker-assets.js
Then I deleted the compressed files as the docs say:
wwwroot\service-worker-assets.js.br
wwwroot\service-worker-assets.js.gz
wwwroot\_framework\blazor.boot.json.br
wwwroot\_framework\blazor.boot.json.gz
But I am still getting an error when I load the app:
What am I missing here?
I guess it has to do with the hashes and the renaming, but I can't find any solution in Blazor's GitHub issues.
As a result of your modifications to the blazor.boot.json file, the integrity check fails. service-worker-assets.js contains a list of files and their integrity hashes, which are calculated at publish time.
You can manually recalculate the hashes using Bash or PowerShell. Since you're using IIS, I'll provide the PowerShell script I used for a similar issue:
# make sure you're in the wwwroot folder of the published application
$JsFileContent = Get-Content -Path service-worker-assets.js -Raw
# remove JavaScript from contents so it can be interpreted as JSON
$Json = $JsFileContent.Replace("self.assetsManifest = ", "").Replace(";", "") | ConvertFrom-Json
# grab the assets JSON array
$Assets = $Json.assets
foreach ($Asset in $Assets) {
    $OldHash = $Asset.hash
    $Path = $Asset.url
    $Signature = Get-FileHash -Path $Path -Algorithm SHA256
    $SignatureBytes = [byte[]] -split ($Signature.Hash -replace '..', '0x$& ')
    $SignatureBase64 = [System.Convert]::ToBase64String($SignatureBytes)
    $NewHash = "sha256-$SignatureBase64"

    if ($OldHash -ne $NewHash) {
        Write-Host "Updating hash for $Path from $OldHash to $NewHash"
        # slashes are escaped in the js-file, but PowerShell unescapes them automatically,
        # we need to re-escape them
        $OldHash = $OldHash.Replace("/", "\/")
        $NewHash = $NewHash.Replace("/", "\/")
        $JsFileContent = $JsFileContent.Replace("""$OldHash""", """$NewHash""")
    }
}
Set-Content -Path service-worker-assets.js -Value $JsFileContent -NoNewline
This script iterates over all files listed inside of service-worker-assets.js, calculates the new hash for each file and updates the hash in the JavaScript file if it's different.
You have to execute the script with the published wwwroot folder as the current working directory.
I described this in more detail on my blog: Fix Blazor WebAssembly PWA integrity checks
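For reference, a hedged example of running it (the file and folder names below are just examples): save the script as, say, fix-hashes.ps1, then run it from the published wwwroot folder:
Set-Location C:\inetpub\MyBlazorApp\wwwroot
powershell.exe -ExecutionPolicy Bypass -File C:\scripts\fix-hashes.ps1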

Have to build 2 solutions, one per project

We have to build two solutions centrally on a TFS server. One solution is a framework; the other contains services, which should be built separately per project in order to deploy them later via script.
In addition, the framework assemblies are copied to a (base) project within the framework solution. All projects of the second solution refer to this 'base' project.
My problem is that I have no idea how to configure the solutions, projects, and builds to meet the requirements described above.
Please help.
Note: I don't want to put each service project into an msi in order to install it. I just want to deploy the Service out of a central drop-folder on the TFS server.
Team Build can build multiple solutions in the Build Process Template. Just click the [...] button behind the Projects to Build and add both solutions.
TFS redirects the output directory of your projects, which will probably break your script that copies the output from A to the "base project" of B. In order to turn off this redirection, set the Output location to AsConfigured.
Now TFS won't know how to copy your output to the Binaries folder, which serves as the source for the copy-to-drop-location action. To solve that, you'll need to write a PowerShell script and configure it as a post-build script.
The process to create a drop script is clearly documented on MSDN and a sample script is available from CodePlex.
##-----------------------------------------------------------------------
## <copyright file="GatherItemsForDrop.ps1">(c) http://TfsBuildExtensions.codeplex.com/. This source is subject to the Microsoft Permissive License. See http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx. All other rights reserved.</copyright>
##-----------------------------------------------------------------------
# Copy the binaries to the bin directory
# so that the build server can drop them
# to the staging location specified on the Build Defaults tab
#
# See
# http://msdn.microsoft.com/en-us/library/bb778394(v=vs.120).aspx
# http://msdn.microsoft.com/en-us/library/dd647547(v=vs.120).aspx#scripts
# Enable -Verbose option
[CmdletBinding()]
# Disable parameter
# Convenience option so you can debug this script or disable it in
# your build definition without having to remove it from
# the 'Post-build script path' build process parameter.
param([switch]$Disable)
if ($PSBoundParameters.ContainsKey('Disable'))
{
    Write-Verbose "Script disabled; no actions will be taken on the files."
}
# This script copies the basic file types for managed code projects.
# You can change this list to meet your needs.
$FileTypes = $("*.exe","*.dll","*.exe.config","*.pdb")
# Specify the sub-folders to include
$SourceSubFolders = $("*bin*","*obj*")
# If this script is not running on a build server, remind user to
# set environment variables so that this script can be debugged
if (-not $Env:TF_BUILD -and -not ($Env:TF_BUILD_SOURCESDIRECTORY -and $Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "You must set the following environment variables"
    Write-Error "to test this script interactively."
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_SOURCESDIRECTORY = "C:\code\FabrikamTFVC\HelloWorld"'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY - For example, enter something like:'
    Write-Host '$Env:TF_BUILD_BINARIESDIRECTORY = "C:\code\bin"'
    exit 1
}
# Make sure path to source code directory is available
if (-not $Env:TF_BUILD_SOURCESDIRECTORY)
{
    Write-Error ("TF_BUILD_SOURCESDIRECTORY environment variable is missing.")
    exit 1
}
elseif (-not (Test-Path $Env:TF_BUILD_SOURCESDIRECTORY))
{
    Write-Error "TF_BUILD_SOURCESDIRECTORY does not exist: $Env:TF_BUILD_SOURCESDIRECTORY"
    exit 1
}
Write-Verbose "TF_BUILD_SOURCESDIRECTORY: $Env:TF_BUILD_SOURCESDIRECTORY"
# Make sure path to binary output directory is available
if (-not $Env:TF_BUILD_BINARIESDIRECTORY)
{
    Write-Error ("TF_BUILD_BINARIESDIRECTORY environment variable is missing.")
    exit 1
}
if ([IO.File]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    Write-Error "Cannot create output directory."
    Write-Error "File with name $Env:TF_BUILD_BINARIESDIRECTORY already exists."
    exit 1
}
Write-Verbose "TF_BUILD_BINARIESDIRECTORY: $Env:TF_BUILD_BINARIESDIRECTORY"
# Tell user what script is about to do
Write-Verbose "Will look for and then gather "
Write-Verbose "$FileTypes files from"
Write-Verbose "$Env:TF_BUILD_SOURCESDIRECTORY and copy them to "
Write-Verbose $Env:TF_BUILD_BINARIESDIRECTORY
# Find the files
$files = gci $Env:TF_BUILD_SOURCESDIRECTORY -recurse -include $SourceSubFolders |
    ?{ $_.PSIsContainer } |
    foreach { gci -Path $_.FullName -Recurse -include $FileTypes }
if ($files)
{
    Write-Verbose "Found $($files.count) files:"
    foreach ($file in $files) {
        Write-Verbose $file.FullName
    }
}
else
{
    Write-Warning "Found no files."
}
# If binary output directory exists, make sure it is empty
# If it does not exist, create one
# (this happens when 'Clean workspace' build process parameter is set to True)
if ([IO.Directory]::Exists($Env:TF_BUILD_BINARIESDIRECTORY))
{
    $DeletePath = $Env:TF_BUILD_BINARIESDIRECTORY + "\*"
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY exists."
    if (-not $Disable)
    {
        Write-Verbose "Ready to delete $DeletePath"
        Remove-Item $DeletePath -recurse
        Write-Verbose "Files deleted."
    }
}
else
{
    Write-Verbose "$Env:TF_BUILD_BINARIESDIRECTORY does not exist."
    if (-not $Disable)
    {
        Write-Verbose "Ready to create it."
        [IO.Directory]::CreateDirectory($Env:TF_BUILD_BINARIESDIRECTORY) | Out-Null
        Write-Verbose "Directory created."
    }
}
# Copy the binaries
Write-Verbose "Ready to copy files."
if (-not $Disable)
{
    foreach ($file in $files)
    {
        Copy $file $Env:TF_BUILD_BINARIESDIRECTORY
    }
    Write-Verbose "Files copied."
}
A better solution would probably be to have 2 separate builds where the first build publishes the dependencies of the second project as a NuGet package. The Microsoft ALM Rangers have delivered a guide that explains how to set that up.
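A rough sketch of what that could look like as a post-build PowerShell step for the framework build (the nuspec path, version, and feed location are placeholders, not taken from the guide):
# pack the framework output and push it to an internal feed
# so that the services solution can consume it as a package
$version = "1.0.0"   # in practice, derive this from the build number
& nuget.exe pack "$Env:TF_BUILD_SOURCESDIRECTORY\Framework\Framework.nuspec" -Version $version -OutputDirectory $Env:TF_BUILD_BINARIESDIRECTORY
& nuget.exe push "$Env:TF_BUILD_BINARIESDIRECTORY\Framework.$version.nupkg" -Source "\\buildserver\NuGetFeed"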
One option is to host your own nuget feed: http://docs.nuget.org/create/hosting-your-own-nuget-feeds
By hosting your own feed, you can execute custom build activities within your build process that update your feed.
See this documentation for custom tfs build activities: http://nakedalm.com/creating-a-custom-activity-for-team-foundation-build/
See this documentation for adding powershell to your build process: http://blogs.technet.com/b/heyscriptingguy/archive/2014/04/21/powershell-and-tfs-the-basics-and-beyond.aspx
By hosting your own NuGet feed, your consuming solution can leverage your private feed and packages to deal with dependency management and versioning. By leveraging custom build activities, you can update your NuGet feed via .NET or PowerShell. You can also automate the deployment via your PowerShell scripts.
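On the consuming side, a minimal sketch (the feed name and UNC path are examples only) is to register the private feed once on the build machine and let the package restore pull the framework packages from it:
& nuget.exe sources Add -Name "InternalFeed" -Source "\\buildserver\NuGetFeed"
& nuget.exe restore "$Env:TF_BUILD_SOURCESDIRECTORY\Services\Services.sln"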

Automatic deployment of solutions with PowerShell

I have a folder, containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder, then use the Add-SPSolution. After that I want to do a check if the solutions are done deploying, before using the Install-SPSolution. Here is a snippet that I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | where {$_.extension -eq ".wsp"}
Write-Host 'DEPLOYING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Add-SPSolution -LiteralPath $my_file.FullName}
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment}
Is there a way to check if the deployment is finished, and it is ready to start installing the solutions?
You need to check the SPSolution.Deployed property value in a loop; a basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The Deploying SharePoint 2010 Solution Packages Using PowerShell article contains more details and this comment discusses a potential caveat.
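Applied to the snippet from the question (assuming $list is the list of .wsp files built there), the polling loop could replace the fixed 30-second sleep. This is only a sketch along the lines of the answer above; it waits for each solution's deployment job to finish:
Write-Host 'DEPLOYING SOLUTIONS...'
foreach ($my_file in $list) { Add-SPSolution -LiteralPath $my_file.FullName }

Write-Host 'INSTALLING SOLUTIONS...'
foreach ($my_file in $list)
{
    Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment
    # poll the Deployed property until the deployment timer job has completed
    do { Start-Sleep -Seconds 2 }
    while (!((Get-SPSolution $my_file.Name).Deployed))
}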

Change IIS Site Home Directory w/ Powershell

I can query the AD and find all the IIS sites and their virtual directories, now I need to be able to update those home directories and save the changes.
After I fetch the directory entry, I can display the site path using $site.Path; however, setting it doesn't seem to have any effect. It never changes the actual stored path.
I have tried $site.Path = <new path> and $site.Put( "Path", <new path> ), but neither of these seems to affect the stored path.
$site = $iis.psbase.children |
where {$_.keyType -eq "iiswebserver"} |
where {$_.psbase.properties.servercomment -eq $siteConfig.name };
$s = [ADSI]($site.psbase.path + "/ROOT");
$s.Path
# $s.Path = $siteConfig.path
# $s.Put("Path", $siteConfig.path )
$s.psbase.CommitChanges()
Import-Module WebAdministration
Set-ItemProperty 'IIS:\Sites\Default Web Site\' -name physicalPath -value $siteConfig.path
http://technet.microsoft.com/en-us/library/ee909471(WS.10).aspx
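For example, reading the value back after setting it confirms the change took effect (the site name and path below are examples, not from the original answer):
Import-Module WebAdministration
Set-ItemProperty 'IIS:\Sites\Default Web Site' -Name physicalPath -Value 'D:\Sites\MyApp'
(Get-Item 'IIS:\Sites\Default Web Site').physicalPath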
Ok, I tried this and it seems to work:
$s.psbase.properties.path[0] = $siteConfig.path
$s.psbase.CommitChanges()
Is there a better, cleaner way of handling this?
