PowerShell script that reads the last modified date of a folder - Excel

I need a script that reads the last modified date of a file and by whom it was modified, and outputs the results to Excel. So far I have only found a script that changes the modification date:
$a = Get-Date
$b = Get-ChildItem "C:\Intel" -Recurse | ? { !$_.PSIsContainer }
foreach ($i in $b)
{
    $i.LastWriteTime = $a
}
$b

You can easily get the LastWriteTime by checking the LastWriteTime property of a file.
Get-ChildItem * | Select-Object FullName, LastWriteTime
You can also check the owner of a file, which may or may not be the last person to modify it, depending on the file type. Some Office files change owner to the last person who wrote to them, but I don't know that this is reliable.
Get-ChildItem * | ForEach-Object { Get-Acl $_.FullName | Select-Object Owner }
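To cover the Excel part of the question, you can combine the two and export the results to a CSV file, which Excel opens directly. A minimal sketch (the folder and output paths are examples):
# Collect path, last-write time and owner for every file, then export to CSV
Get-ChildItem "C:\Intel" -Recurse | Where-Object { !$_.PSIsContainer } |
    ForEach-Object {
        [PSCustomObject]@{
            FullName      = $_.FullName
            LastWriteTime = $_.LastWriteTime
            Owner         = (Get-Acl $_.FullName).Owner
        }
    } |
    Export-Csv "C:\temp\files.csv" -NoTypeInformation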
NTFS doesn't log the last person to modify a file. You can either turn on auditing and check the system audit event log, or look into the FileSystemWatcher class and build a custom script that watches for changes to a folder. (Warning: this may cause performance issues.)
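If you go the FileSystemWatcher route, a minimal sketch for reference (the paths are examples; note it records what changed and when, not by whom):
# Watch a folder (and its subfolders) and log every change event
$watcher = New-Object System.IO.FileSystemWatcher "C:\Intel"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Changed -Action {
    Add-Content "C:\temp\changes.log" "$($Event.SourceEventArgs.FullPath) changed at $(Get-Date)"
} | Out-Null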


WinSCP - How to download only folders/files 1 day old while excluding empty folders/files?

I am limited to PuTTY and WinSCP only.
I am trying to download log directories containing log files. For example, I want to grab all log_files 6 days old or newer. log_dir2 and log_dir3, including the folders themselves, match the criteria, while log_dir1 and its files do not.
DIR/log_dir1/log_files % older than 6 days
DIR/log_dir2/log_files % meets criteria
DIR/log_dir3/log_files % meets criteria
My problem is that while the log_files of log_dir1 are not downloaded, the syntax I am currently using still downloads the log_dir1 folder itself. Normally not a big deal, but we are talking about hundreds of log_dir folders (all empty, as their files are older than 6 days). For reasons beyond my control, I cannot move or archive these old log directories or their files.
My question is simply: how do I change my syntax to ignore folders that are older than 6 days as well as the files?
get -filemask="*>6D" /DIR/* C:\temp
I have tried several different combinations of parameters and I have read the support page about Directory Masks and Path Masks, but I cannot get any of them working (a version issue?). Can anyone explain their syntax better than the help page does? I will update tomorrow with the current version of WinSCP that I am using.
A time constraint in a WinSCP file mask cannot be applied to directories.
But you can prevent WinSCP from creating the empty folders: use the -rawtransfersettings switch with the ExcludeEmptyDirectories setting.
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
This is the original answer, from before WinSCP supported ExcludeEmptyDirectories. It might still be useful as a basis for implementations with even more specific constraints.
You can implement this custom logic easily in a PowerShell script using the WinSCP .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "..."
}

$remotePath = "/remote/path"
$localPath = "C:\local\path"
$limit = (Get-Date).AddDays(-6)

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Enumerate files to download
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, $Null, [WinSCP.EnumerationOptions]::AllDirectories) |
    Where-Object { $_.LastWriteTime -gt $limit }

foreach ($fileInfo in $fileInfos)
{
    $localFilePath =
        [WinSCP.RemotePath]::TranslateRemotePathToLocal(
            $fileInfo.FullName, $remotePath, $localPath)

    # If the corresponding local folder does not exist yet, create it
    $localFileDir = Split-Path -Parent $localFilePath
    if (!(Test-Path -Path $localFileDir))
    {
        Write-Host "Creating local directory $localFileDir..."
        New-Item $localFileDir -ItemType directory | Out-Null
    }

    Write-Host "Downloading file $($fileInfo.FullName)..."
    # Download file
    $sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
    $transferResult = $session.GetFiles($sourcePath, $localFilePath)

    # Did the download succeed?
    if (!$transferResult.IsSuccess)
    {
        # Print the error (but continue with other files)
        Write-Host ("Error downloading file $($fileInfo.FullName): " +
            $transferResult.Failures[0].Message)
    }
}

$session.Dispose()

Write-Host "Done."
Run the script (download.ps1) like:
powershell.exe -ExecutionPolicy Unrestricted -File download.ps1

Blazor Wasm PWA IIS Deployment integrity error

I created a new Blazor WebAssembly PWA project (latest version, default template) and deployed it to IIS on Windows Server to try out PWA.
I installed the latest .NET Core Hosting Bundle.
After publishing it, I ran the script from the Microsoft Docs to rename the dll files:
dir .\_framework\_bin | rename-item -NewName { $_.name -replace ".dll\b",".bin" }
((Get-Content .\_framework\blazor.boot.json -Raw) -replace '.dll"','.bin"') | Set-Content .\_framework\blazor.boot.json
And the service worker renaming command too:
((Get-Content .\service-worker-assets.js -Raw) -replace '.dll"','.bin"') | Set-Content .\service-worker-assets.js
Then I deleted the compressed files, as the docs say:
wwwroot\service-worker-assets.js.br
wwwroot\service-worker-assets.js.gz
wwwroot\_framework\blazor.boot.json.br
wwwroot\_framework\blazor.boot.json.gz
But I am still getting an integrity error when I load the app.
What am I missing here?
I guess it has to do with the hashes and the renaming, but I can't find any solution in Blazor's GitHub issues.
As a result of your modifications to the blazor.boot.json file, the integrity checks fail. service-worker-assets.js contains a list of files and their integrity hashes, which are calculated at publish time.
You can manually recalculate the hashes using Bash or PowerShell. Since you're using IIS, I'll provide the PowerShell script I used for a similar issue:
# make sure you're in the wwwroot folder of the published application
$JsFileContent = Get-Content -Path service-worker-assets.js -Raw

# remove JavaScript from contents so it can be interpreted as JSON
$Json = $JsFileContent.Replace("self.assetsManifest = ", "").Replace(";", "") | ConvertFrom-Json

# grab the assets JSON array
$Assets = $Json.assets

foreach ($Asset in $Assets) {
    $OldHash = $Asset.hash
    $Path = $Asset.url
    $Signature = Get-FileHash -Path $Path -Algorithm SHA256
    $SignatureBytes = [byte[]] -split ($Signature.Hash -replace '..', '0x$& ')
    $SignatureBase64 = [System.Convert]::ToBase64String($SignatureBytes)
    $NewHash = "sha256-$SignatureBase64"

    if ($OldHash -ne $NewHash) {
        Write-Host "Updating hash for $Path from $OldHash to $NewHash"
        # slashes are escaped in the js file, but PowerShell unescapes them automatically,
        # so we need to re-escape them
        $OldHash = $OldHash.Replace("/", "\/")
        $NewHash = $NewHash.Replace("/", "\/")
        $JsFileContent = $JsFileContent.Replace("""$OldHash""", """$NewHash""")
    }
}

Set-Content -Path service-worker-assets.js -Value $JsFileContent -NoNewline
This script iterates over all files listed in service-worker-assets.js, calculates the new hash for each file, and updates the hash in the JavaScript file if it differs.
You have to execute the script with the published wwwroot folder as the current working directory.
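For example (the folder and script name here are illustrative):
# Run from the published wwwroot, with the script above saved under any name
cd C:\inetpub\MyApp\wwwroot
.\Update-ServiceWorkerHashes.ps1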
I described this in more detail on my blog: Fix Blazor WebAssembly PWA integrity checks

Using TeamCity to insert build number & perform string replacement operations

I am using TeamCity 9.1.7 on Windows Server 2012. As part of the build steps, I build a Node.js-based application using the command line. The output is a bunch of JavaScript and HTML files.
In the next step (after the build is over and the output is generated), I want to do the following:
Take the current build number from TeamCity and insert it into the index.html file (available in the output folder). I want to add a meta tag that tells me the build version.
In the same file (index.html), I want to perform a find-and-replace on a string, to add a timestamp to file references.
Find this <script src="bundle.js"></script> and
replace with <script src="bundle.js?time=getTime()"></script>
which will result in <script src="bundle.js?time=4324324324"></script>
Try the following:
Add a PowerShell step and run the following as source code:
$versionNumber = "%build.number%"
$filePath = "%teamcity.agent.work.dir%\path\file.txt"
(GC $filePath).Replace("<head>", "<head><meta http-equiv='X-Version-Number' content='$versionNumber'>").Replace("bundle.js", "bundle.js?time=getTime()") | Set-Content $filePath
This will read the file contents in, perform two replacements on them, and then write back to the file. (Note that getTime() is inserted here as literal text; see the revision below for a cache-buster that actually changes per build.)
Not sure what your file path is or what you want the header called, but you should be able to change this to suit your requirements.
Hope this helps
REVISION
To catch any exceptions, try wrapping the code in a try/catch block:
try {
    (GC $filePath).Replace("<head>", "<head><meta http-equiv='X-Version-Number' content='$versionNumber'>").Replace("bundle.js", "bundle.js?time=getTime()") | Set-Content $filePath
}
catch [System.Exception] {
    Write-Output $_
    Exit 1
}
To bust the cache, you could use the version number, as this increments each build and is thus unique:
try {
    (GC $filePath).Replace("<head>", "<head><meta http-equiv='X-Version-Number' content='$versionNumber'>").Replace("bundle.js", "bundle.js?v=$versionNumber") | Set-Content $filePath
}
catch [System.Exception] {
    Write-Output $_
    Exit 1
}
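If you want an actual numeric timestamp, as in the question's example, a variation on the same replacement (assumes .NET 4.6+ for ToUnixTimeSeconds):
# Append the current Unix time as a cache-buster
$timestamp = [DateTimeOffset]::UtcNow.ToUnixTimeSeconds()
(GC $filePath).Replace("bundle.js", "bundle.js?time=$timestamp") | Set-Content $filePath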

How to Get-Content a text file on multiple servers using PowerShell

I'm trying to see if I can read a specific txt file (the last x rows) on multiple servers.
I have tried this, but it's incorrect; can you please help me out?
Get-Content C:\Users\admin\server.txt | ForEach-Object {Get-Content "C:\project\file.log" | Select -Last 20}
The file is located in the same folder on each server.
The server.txt file has all the server names, to which I have access with my current user, like:
Server1
Server2
Server3
Thank you!
Okay, guessing your server.txt file contains a single plain server name per line, like Server1, Server2? What I see wrong is this:
1) Your ForEach-Object script block never uses the current item; inside the block, the current server name is available as the automatic variable $_ (or name it explicitly in a foreach loop).
2) Inside the loop, build the correct UNC path: for Server1 that is \\Server1\c$\project\file.log, using the administrative c$ share.
$server_names = Get-Content "C:\Users\admin\server.txt"
foreach ($server in $server_names) {
    Get-Content "\\$server\c$\project\file.log" | Select-Object -Last 20
}
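If PowerShell remoting is available on the servers, a sketch of an alternative that avoids the administrative share (assumes WinRM is configured):
$servers = Get-Content "C:\Users\admin\server.txt"
Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-Content "C:\project\file.log" | Select-Object -Last 20
}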

Automatic deployment of solutions with PowerShell

I have a folder containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder, then use Add-SPSolution. After that, I want to check whether the solutions are done deploying before using Install-SPSolution. Here is the snippet I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | where { $_.extension -eq ".wsp" }
Write-Host 'DEPLOYING SOLUTIONS...'
foreach ($my_file in $list) { Add-SPSolution -LiteralPath $my_file.FullName }
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach ($my_file in $list) { Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment }
Is there a way to check whether the deployment is finished and it is ready to start installing the solutions?
You need to check the SPSolution.Deployed property value in a loop. A basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The Deploying SharePoint 2010 Solution Packages Using PowerShell article contains more details and this comment discusses a potential caveat.
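Putting that together with the question's snippet, a sketch that replaces the fixed 30-second sleep (assuming each solution's file name matches its identity in the solution store):
Write-Host 'DEPLOYING SOLUTIONS...'
foreach ($my_file in $list) { Add-SPSolution -LiteralPath $my_file.FullName }
Write-Host 'INSTALLING SOLUTIONS...'
foreach ($my_file in $list)
{
    Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment
    # Install-SPSolution schedules a deployment timer job; wait for it to finish
    do { Start-Sleep 2 } while (!((Get-SPSolution $my_file.Name).Deployed))
}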
