I would like to build a PowerShell script that finds a string in a configuration file and uses the path found after the match to copy that particular folder to a specified location. After all the folders I need are copied, I want to zip the destination folder.
The config file is XML, and the specific lines where I want to find the path look like this:
<add key="TlgxDir" value="C:\myapplication\Tlgx" />
example of the config file: https://mega.nz/file/wfRQBLzD#S6DeYTSvDLeilG0Hl0fLwlO4rREhGQaj6G05dhbNchI
So, for example, my search value in the file "config.xml" is "TlgxDir". I want to copy the folder at the path that follows it ("C:\myapplication\Tlgx") to a specified folder (e.g. C:\temp\backup). After repeating this procedure for multiple folders, I want to zip the destination folder (backup.zip).
I already tried some things but I'm not very familiar with PowerShell...
Thanks in advance to anyone who can help me with this question :)
Just use the PowerShell XML parser:
$xml = [xml](Get-Content .\ui00.exe.config)
$xml.configuration.appSettings.add.where{ $_.key -eq 'TlgxDir' }.Value
C:\myapplication\Tlgx
OK, you can do something like this:
# Set target location
$targetPath = 'C:\temp\backup'
# Load the XML - replace [path] with the path to the XML document
$xml = New-Object -TypeName xml
$xml.Load([path])
# Find the node via XPath and store the path from its value attribute
$node = $xml.SelectSingleNode("//add[@key='TlgxDir']")
$sourceDirectory = $node.value
# Copy the data
Copy-Item -Path $sourceDirectory -Destination $targetPath -Recurse -Force -Confirm:$false
# Zip the target folder
Compress-Archive -Path $targetPath -DestinationPath C:\temp\backup.zip -Force -Confirm:$false
I am trying to loop through the directory structure of an ADLS filesystem, reporting each directory's size and format until no further directory exists.
Since the folder structure contains TBs of data, I thought using Azure CLI with PowerShell would be the most efficient way to go about this.
But Gen 1's CLI command is limited to:
az dls fs list --account <storage_account> --path <path of the file>
Can I use this command in PowerShell to further loop through the rest of the folder structure? Or is there another way to loop through the ADLS folder structure without using the CLI command?
I am a newbie to Azure CLI, so apologies if this question is not very advanced.
We have tested this in our local environment; the observations below are based on our analysis.
You cannot get an accumulated directory size directly, irrespective of which tool you use (e.g. the portal, Storage Explorer, or PowerShell cmdlets).
However, the Get-AzDataLakeStoreChildItemSummary cmdlet gives you the directory count (number of directories in the container), the file count (total files across all subdirectories), and the length (total size consumed by all the files in the directories and subfolders).
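For reference, a single call against the root of the container looks like this (the account name is a placeholder); the script below then applies it to each top-level directory:
Get-AzDataLakeStoreChildItemSummary -Account '<DataLakeAccountName>' -Path /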
$dirs = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /
foreach ($dir in $dirs)
{
    Write-Output ("Currently listing the files under " + $dir.Name)
    Get-AzDataLakeStoreChildItemSummary -Account '<DataLakeAccountName>' -Path /$($dir.Name)
}
In the output above, the directories named tried and testing show a value of 0, since they do not contain any files or subdirectories. The value for the directory trim is 6, since it has 2 files under its track1 subdirectory and 4 files under its jshfa subdirectory.
Alternatively, you can use the below script if you want to list the files under the subdirectories with their respective sizes.
$dirs = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /
foreach ($dir in $dirs)
{
    Write-Output ("Currently listing the files under " + $dir.Name + "; length of the directory is: " + $dir.Length)
    # Get the immediate children of this directory with their size, name and type
    $children = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /$($dir.Name) |
        Select-Object -Property Length, Name, Type
    foreach ($child in $children)
    {
        if ($child.Type -eq 'DIRECTORY')
        {
            # List the files one level further down
            $subfolderList = Get-AzDataLakeStoreChildItem -Account '<DataLakeAccountName>' -Path /$($dir.Name)/$($child.Name)
            Write-Output $subfolderList
        }
    }
    Write-Output $children
}
In the output of the above script, the directory trim has 2 subdirectories (track1, track234); track1 has two files, which are listed, while the remaining directories and subdirectories don't contain any files, hence their length value in the output is 0.
Note:
If you have multiple levels of folders under a subdirectory, you need to append the directory path in the foreach loop each time and adjust the script accordingly; a recursive sketch is shown below.
Also, both the Azure CLI and the PowerShell cmdlets depend on the given -Path parameter.
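As a sketch of that idea, built on the same cmdlets as above (the account name is again a placeholder), a recursive function walks the tree until no further directory exists:
function Get-AdlsTree {
    param(
        [string]$Account,
        [string]$Path = '/'
    )
    foreach ($item in Get-AzDataLakeStoreChildItem -Account $Account -Path $Path) {
        # Build the full path of this child item
        $itemPath = $Path.TrimEnd('/') + '/' + $item.Name
        Write-Output ("{0}  {1}  length: {2}" -f $item.Type, $itemPath, $item.Length)
        if ($item.Type -eq 'DIRECTORY') {
            # Recurse into the subdirectory
            Get-AdlsTree -Account $Account -Path $itemPath
        }
    }
}
Get-AdlsTree -Account '<DataLakeAccountName>'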
I am limited to PuTTY and WinSCP only.
I am trying to download log directories with log files. For example, I want to grab all log_files 6 days old or newer. log_dir2 and log_dir3, including the folders themselves, match the criteria, while log_dir1 and its files do not.
DIR/log_dir1/log_files % older than 6 days
DIR/log_dir2/log_files % meets criteria
DIR/log_dir3/log_files % meets criteria
My problem is that while the log_files of log_dir1 are not downloaded, the syntax I am currently using downloads the log_dir1 folder. Normally, not a big deal, but we are talking hundreds of log_dir folders (all empty as the files are older than 6 days). For reasons beyond my control, I cannot move or archive these old log directories with their log files.
My question is simply: how do I change my syntax to ignore folders that are older than 6 days as well as the files?
get -filemask="*>6D" /DIR/* C:\temp
I have tried several different combinations of parameters and I have read the support page about Directory Masks and Path Masks, but I cannot get any of them working (version issue?). Can anyone explain their syntax better than the help page does? I will update tomorrow with the current version of WinSCP that I am using.
A time constraint in a WinSCP file mask cannot be applied to directories.
But you can prevent WinSCP from creating the empty folders: use the -rawtransfersettings switch with the ExcludeEmptyDirectories setting.
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
This is the original answer, before WinSCP supported ExcludeEmptyDirectories. It might still be useful as a basis for implementations that have even more specific constraints.
You can implement this custom logic easily in PowerShell script with a use of WinSCP .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "..."
}

$remotePath = "/remote/path"
$localPath = "C:\local\path"
$limit = (Get-Date).AddDays(-6)

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Enumerate files to download
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, $Null, [WinSCP.EnumerationOptions]::AllDirectories) |
    Where-Object { $_.LastWriteTime -gt $limit }

foreach ($fileInfo in $fileInfos)
{
    $localFilePath =
        [WinSCP.RemotePath]::TranslateRemotePathToLocal(
            $fileInfo.FullName, $remotePath, $localPath)

    # If the corresponding local folder does not exist yet, create it
    $localFileDir = Split-Path -Parent $localFilePath
    if (!(Test-Path -Path $localFileDir))
    {
        Write-Host "Creating local directory $localFileDir..."
        New-Item $localFileDir -ItemType directory | Out-Null
    }

    Write-Host "Downloading file $($fileInfo.FullName)..."
    # Download the file
    $sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
    $transferResult = $session.GetFiles($sourcePath, $localFilePath)

    # Did the download succeed?
    if (!$transferResult.IsSuccess)
    {
        # Print error (but continue with other files)
        Write-Host ("Error downloading file $($fileInfo.FullName): " +
            $transferResult.Failures[0].Message)
    }
}

$session.Dispose()

Write-Host "Done."
Run the script (download.ps1) like:
powershell.exe -ExecutionPolicy Unrestricted -File download.ps1
I created a new Blazor WebAssembly PWA project (latest version, default template) and deployed it to IIS on Windows Server to try out PWA.
I installed the latest .NET Core Hosting Bundle.
After publishing it, I ran the script from the Microsoft docs to rename the .dll files:
dir .\_framework\_bin | Rename-Item -NewName { $_.Name -replace ".dll\b",".bin" }
((Get-Content .\_framework\blazor.boot.json -Raw) -replace '.dll"','.bin"') | Set-Content .\_framework\blazor.boot.json
And the serviceworker renaming code too:
((Get-Content .\service-worker-assets.js -Raw) -replace '.dll"','.bin"') | Set-Content .\service-worker-assets.js
Then I deleted the compressed files as the docs say:
wwwroot\service-worker-assets.js.br
wwwroot\service-worker-assets.js.gz
wwwroot\_framework\blazor.boot.json.br
wwwroot\_framework\blazor.boot.json.gz
But I am still getting an error when I load the app.
What am I missing here?
I guess it has to do with the hashes and the renaming, but I can't find any solution in Blazor's GitHub issues.
As a result of your modifications to the blazor.boot.json file, the integrity check fails. service-worker-assets.js contains a list of files and their integrity hashes, which are calculated at publish time.
You can manually recalculate the hashes using Bash or PowerShell. Since you're using IIS, I'll provide the PowerShell script I used for a similar issue:
# make sure you're in the wwwroot folder of the published application
$JsFileContent = Get-Content -Path service-worker-assets.js -Raw
# remove the JavaScript wrapper from the contents so it can be parsed as JSON
$Json = $JsFileContent.Replace("self.assetsManifest = ", "").Replace(";", "") | ConvertFrom-Json
# grab the assets JSON array
$Assets = $Json.assets
foreach ($Asset in $Assets) {
    $OldHash = $Asset.hash
    $Path = $Asset.url
    $Signature = Get-FileHash -Path $Path -Algorithm SHA256
    $SignatureBytes = [byte[]] -split ($Signature.Hash -replace '..', '0x$& ')
    $SignatureBase64 = [System.Convert]::ToBase64String($SignatureBytes)
    $NewHash = "sha256-$SignatureBase64"

    if ($OldHash -ne $NewHash) {
        Write-Host "Updating hash for $Path from $OldHash to $NewHash"
        # slashes are escaped in the js-file, but PowerShell unescapes them automatically,
        # so we need to re-escape them
        $OldHash = $OldHash.Replace("/", "\/")
        $NewHash = $NewHash.Replace("/", "\/")
        $JsFileContent = $JsFileContent.Replace("""$OldHash""", """$NewHash""")
    }
}
Set-Content -Path service-worker-assets.js -Value $JsFileContent -NoNewline
This script iterates over all files listed inside of service-worker-assets.js, calculates the new hash for each file and updates the hash in the JavaScript file if it's different.
You have to execute the script with the published wwwroot folder as the current working directory.
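For example, if you saved the script as Update-ServiceWorkerHashes.ps1 (the script name and publish path below are illustrative):
Set-Location 'C:\inetpub\myapp\wwwroot'        # hypothetical publish output folder
& 'C:\scripts\Update-ServiceWorkerHashes.ps1'  # hypothetical script location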
I described this in more detail on my blog: Fix Blazor WebAssembly PWA integrity checks
So I've got a UNC path like so:
\\server\folder
I want to get just the path to server, eg \\server.
Split-Path "\\server\folder" -Parent returns "". Anything I try which deals with the root, fails.
For example, Get-Item "\\server" fails too.
How can I safely get the path of \\server from \\server\folder in PowerShell?
By using the System.Uri class and querying its host property:
$uri = New-Object System.Uri("\\server\folder")
$uri.Host # add "\\" in front to get exactly what you asked
Note: For a UNC path, the root directory is the servername plus the share name part.
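Putting it together, a one-liner returns exactly the \\server prefix (the input path is just an example):
$unc = '\\server\folder'
# System.Uri parses the UNC path; its Host property is the server name
"\\$((New-Object System.Uri($unc)).Host)"   # -> \\server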
An example using regular expressions. The negative lookbehind (?<!\\) keeps the leading double backslash from matching, so only the trailing \share is removed:
'\\server\share' -replace '(?<!\\)\\\w+'
$fullpath = "\\server\folder"
# Split("\") yields "", "", "server", "folder" - index 2 is the server name
$parentpath = "\\" + $fullpath.Split("\")[2]
$parentpath
\\server
I can query the AD and find all the IIS sites and their virtual directories, now I need to be able to update those home directories and save the changes.
After I fetch the directory entry I can display the site path using $site.Path; however, setting it doesn't seem to have any effect. It never changes the actual stored path.
I have tried $site.Path = <new path> and $site.Put( "Path", <new path> ), but neither of these seems to affect the stored path.
$site = $iis.psbase.children |
    Where-Object { $_.keyType -eq "iiswebserver" } |
    Where-Object { $_.psbase.properties.servercomment -eq $siteConfig.name }
$s = [ADSI]($site.psbase.path + "/ROOT")
$s.Path
# $s.Path = $siteConfig.path
# $s.Put("Path", $siteConfig.path )
$s.psbase.CommitChanges()
Import-Module WebAdministration
Set-ItemProperty 'IIS:\Sites\Default Web Site\' -name physicalPath -value $siteConfig.path
http://technet.microsoft.com/en-us/library/ee909471(WS.10).aspx
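To verify the change took effect, you can read the property back (assuming the same site name as above):
Import-Module WebAdministration
# Read the physical path back to confirm the update
Get-ItemProperty 'IIS:\Sites\Default Web Site' -Name physicalPath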
OK, I tried this and it seems to work:
$s.psbase.properties.path[0] = $siteConfig.path
$s.psbase.CommitChanges()
Is there a better, cleaner way of handling this?