Blazor Wasm PWA IIS Deployment integrity error

I created a new Blazor WebAssembly PWA project (latest version, default template) and deployed it to IIS on Windows Server to try out PWA.
I installed the latest .NET Core Hosting Bundle.
After publishing it, I ran the script from the Microsoft Docs to rename the .dll files:
dir .\_framework\_bin | Rename-Item -NewName { $_.Name -replace ".dll\b",".bin" }
((Get-Content .\_framework\blazor.boot.json -Raw) -replace '.dll"','.bin"') | Set-Content .\_framework\blazor.boot.json
And the service worker renaming code too:
((Get-Content .\service-worker-assets.js -Raw) -replace '.dll"','.bin"') | Set-Content .\service-worker-assets.js
Then I deleted the compressed files, as the docs say (a one-line Remove-Item for this is shown after the list):
wwwroot\service-worker-assets.js.br
wwwroot\service-worker-assets.js.gz
wwwroot\_framework\blazor.boot.json.br
wwwroot\_framework\blazor.boot.json.gz
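For reference, those four files can be removed from the published wwwroot with a single command (paths assumed relative to wwwroot):
# Remove the stale compressed copies so the service worker doesn't serve outdated .br/.gz files.
Remove-Item .\service-worker-assets.js.br, .\service-worker-assets.js.gz, `
    .\_framework\blazor.boot.json.br, .\_framework\blazor.boot.json.gz -ErrorAction SilentlyContinue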
But I am still getting an integrity error when I load the app.
What am I missing here?
I guess it has to do with the hashes and the renaming, but I can't find any solution in Blazor's GitHub issues.

As a result of your modifications to the blazor.boot.json file, the integrity checks fail. service-worker-assets.js contains a list of files and their integrity hashes, which are calculated at publish time.
You can manually recalculate the hashes using Bash or PowerShell. Since you're using IIS, here is the PowerShell script I used for a similar issue:
# make sure you're in the wwwroot folder of the published application
$JsFileContent = Get-Content -Path service-worker-assets.js -Raw

# remove JavaScript from contents so it can be interpreted as JSON
$Json = $JsFileContent.Replace("self.assetsManifest = ", "").Replace(";", "") | ConvertFrom-Json

# grab the assets JSON array
$Assets = $Json.assets

foreach ($Asset in $Assets) {
    $OldHash = $Asset.hash
    $Path = $Asset.url
    $Signature = Get-FileHash -Path $Path -Algorithm SHA256
    $SignatureBytes = [byte[]] -split ($Signature.Hash -replace '..', '0x$& ')
    $SignatureBase64 = [System.Convert]::ToBase64String($SignatureBytes)
    $NewHash = "sha256-$SignatureBase64"

    if ($OldHash -ne $NewHash) {
        Write-Host "Updating hash for $Path from $OldHash to $NewHash"
        # slashes are escaped in the js-file, but PowerShell unescapes them automatically,
        # we need to re-escape them
        $OldHash = $OldHash.Replace("/", "\/")
        $NewHash = $NewHash.Replace("/", "\/")
        $JsFileContent = $JsFileContent.Replace("""$OldHash""", """$NewHash""")
    }
}

Set-Content -Path service-worker-assets.js -Value $JsFileContent -NoNewline
This script iterates over all files listed in service-worker-assets.js, calculates the new hash for each file, and updates the hash in the JavaScript file if it differs.
You have to execute the script with the published wwwroot folder as the current working directory.
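For example, if you save it as Update-ServiceWorkerHashes.ps1 (an assumed name) in the publish output folder, you could run it roughly like this; the publish-output path is a placeholder:
# Run from the published wwwroot so the relative asset URLs in service-worker-assets.js resolve.
Set-Location <publish-output>\wwwroot
powershell.exe -ExecutionPolicy Bypass -File ..\Update-ServiceWorkerHashes.ps1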
I described this in more detail on my blog: Fix Blazor WebAssembly PWA integrity checks

Related

WinSCP - How to download only folders/files 1 day old while excluding empty folders/files?

I am limited to PuTTY and WinSCP only.
I am trying to download log directories with log files. For example, I want to grab all log_files 6 days old or newer. log_dir2 and log_dir3, including the folders, match the criteria, while log_dir1 and its files do not.
DIR/log_dir1/log_files % older than 6 days
DIR/log_dir2/log_files % meets criteria
DIR/log_dir3/log_files % meets criteria
My problem is that while the log_files of log_dir1 are not downloaded, the syntax I am currently using still downloads the (empty) log_dir1 folder. Normally that's not a big deal, but we are talking hundreds of log_dir folders (all empty, as their files are older than 6 days). For reasons beyond my control, I cannot move or archive these old log directories with their log files.
My question is simply: how do I change my syntax to ignore folders that are older than 6 days as well as the files?
get -filemask="*>6D" /DIR/* C:\temp
I have tried several different combinations of parameters, and I have read the support page about Directory Masks and Path Masks, but I cannot get any of them working (version issue?). Can anyone explain their syntax better than the help page does? I will update tomorrow with the current version of WinSCP that I am using.
A time constraint in a WinSCP file mask cannot be applied to directories.
But you can prevent WinSCP from creating the empty folders: use the -rawtransfersettings switch with the ExcludeEmptyDirectories setting.
get -rawtransfersettings ExcludeEmptyDirectories=1 -filemask="*>6D" /DIR/* C:\temp
This is the original answer, before WinSCP supported ExcludeEmptyDirectories. It might still be useful as a basis for implementations that have even more specific constraints.
You can implement this custom logic easily in a PowerShell script using the WinSCP .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Sftp
    HostName = "example.com"
    UserName = "username"
    Password = "password"
    SshHostKeyFingerprint = "..."
}

$remotePath = "/remote/path"
$localPath = "C:\local\path"
$limit = (Get-Date).AddDays(-6)

$session = New-Object WinSCP.Session

# Connect
$session.Open($sessionOptions)

# Enumerate files to download
$fileInfos =
    $session.EnumerateRemoteFiles(
        $remotePath, $Null, [WinSCP.EnumerationOptions]::AllDirectories) |
    Where-Object { $_.LastWriteTime -gt $limit }

foreach ($fileInfo in $fileInfos)
{
    $localFilePath =
        [WinSCP.RemotePath]::TranslateRemotePathToLocal(
            $fileInfo.FullName, $remotePath, $localPath)

    # If the corresponding local folder does not exist yet, create it
    $localFileDir = Split-Path -Parent $localFilePath
    if (!(Test-Path -Path $localFileDir))
    {
        Write-Host "Creating local directory $localFileDir..."
        New-Item $localFileDir -ItemType directory | Out-Null
    }

    Write-Host "Downloading file $($fileInfo.FullName)..."
    # Download file
    $sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
    $transferResult = $session.GetFiles($sourcePath, $localFilePath)

    # Did the download succeed?
    if (!$transferResult.IsSuccess)
    {
        # Print error (but continue with other files)
        Write-Host ("Error downloading file $($fileInfo.FullName): " +
            $transferResult.Failures[0].Message)
    }
}

$session.Dispose()

Write-Host "Done."
Run the script (download.ps1) like:
powershell.exe -ExecutionPolicy Unrestricted -File download.ps1

Downloading a file from Adls gen1 using powershell is not working

I am trying to download a file from Azure Data Lake Store using a PowerShell script. The code is triggered from a runbook in the Azure cloud. It looks like Export-AdlStoreItem is not working as expected. I don't get any error messages or compilation errors. In fact, when this command is executed, a zero-byte file is generated in the destination. The name of that file is TemporaryFile2020-06-02_14-56-57.datc18371c2-d39c-4588-9af0-93aa3e136b01Segments
What is happening? Please help!
$LocalDwldPath = "T:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat"
$SourcePath = "Dataproviders/Landing/GCW/HPIndirect/Orders/AMS/gcw_hp_indirect_orders_ams_745_20200601_04_34_01.dat"
$PRODAdlsName = "itgdls01"

Export-AdlStoreItem -Account $PRODAdlsName -Path $("/" + $SourcePath.Trim()) -Destination $LocalDwldPath -Force -ErrorAction Stop

if( Test-Path $LocalDwldPath.Trim() )
{
    Get-Content -Path $LocalDwldPath.Trim() -ReadCount 1000 |% { $FileCount += $_.Count }
    Remove-Item $LocalDwldPath.Trim()
    Set-Content -Path $cntCaptureFile -Value $FileCount
    $TimeinSec = TimeStamp2
    Add-Content -Value "$TimeinSec Log: Identified file for getting count is $($SourcePath.Trim()) and the count is $FileCount" -Path $logfile
}
else
{
    $TimeinSec = TimeStamp2
    Add-Content -Value "$TimeinSec Error: Identified file for getting count is $($SourcePath.Trim()) and the count capture failed as local file is not found!" -Path $logfile
}
According to my research, if you want to download a file from Azure Data Lake with PowerShell, you can use the Export-AzDataLakeStoreItem command.
For example:
Export-AzDataLakeStoreItem -Account <> -Path '/test/test.csv' -Destination 'D:\myDirectory\test.csv'
For more details, please refer to the documentation.
The issue was with the local download path/destination
("T:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat").
The T:\ drive is a virtual/network drive mapped to an Azure file share.
Instead of T:\, I pointed the destination to a local drive ("F:\ICE_PROD_DATA_SOURCING\FILE_DOWNLOAD_PATH\TemporaryFile$($TimeinSec).dat") and it worked fine.
It's surprising that PowerShell didn't give any error message when it was unable to save the file to a network path backed by an Azure file share.
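If you hit something similar, a small pre-flight check around the call can surface the problem earlier. This is only a sketch using the variable names from the question; the directory and file-size checks are assumptions about where it fails:
# Hypothetical pre-flight and post-download checks around Export-AdlStoreItem (variable names from the question).
$destDir = Split-Path -Parent $LocalDwldPath
if (!(Test-Path -Path $destDir)) {
    throw "Destination directory '$destDir' is not reachable from this runbook worker."
}
Export-AdlStoreItem -Account $PRODAdlsName -Path $("/" + $SourcePath.Trim()) -Destination $LocalDwldPath -Force -ErrorAction Stop
# Fail fast if the download produced an empty file instead of silently counting zero lines.
if ((Get-Item $LocalDwldPath).Length -eq 0) {
    throw "Downloaded file '$LocalDwldPath' is empty; check the source path and the destination drive."
}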

Upload blob with Set-AzStorageBlobContent via pipeline and set ContentType property

Using the Az PowerShell module, I'm trying to enumerate a directory on disk and pipe the output to Set-AzStorageBlobContent to upload to Azure, while preserving the folder structure. This works great, except the ContentType property of all blobs is set to application/octet-stream. I'd like to set it dynamically based on the file extension of the blob being uploaded.
Here's example code for the base case:
Get-ChildItem $SourceRoot -Recurse -File |
Set-AzStorageBlobContent -Container $ContainerName -Context $context -Force
To set the ContentType, I need to add a Properties parameter to Set-AzStorageBlobContent with a value like @{ "ContentType" = "<content type>" }. The content type should be determined from the specific file extension being uploaded. I've written a separate pipelined function that can add a MimeType property to the file object, but I can't figure out how to reference that for the parameter in the pipeline. Example:
function Add-MimeType {
    [cmdletbinding()]
    param(
        [parameter(
            Mandatory = $true,
            ValueFromPipeline = $true)]
        $pipelineInput
    )

    Process {
        $mimeType = Get-MimeType $pipelineInput.Extension
        Add-Member -InputObject $pipelineInput -NotePropertyName "MimeType" -NotePropertyValue $mimeType
        return $pipelineInput
    }
}

function Get-MimeType(
    [string]$FileExtension
)
{
    switch ($FileExtension.ToLowerInvariant())
    {
        '.txt' { return 'text/plain' }
        '.xml' { return 'text/xml' }
        default { return 'application/octet-stream' }
    }
}
Get-ChildItem $SourceRoot -Recurse -File |
    Add-MimeType |
    Set-AzStorageBlobContent -Container $ContainerName -Properties @{"ContentType" = "$($_.MimeType)"} -Context $context -Force
It seems that $_ isn't usable in this context. Is there another way to accomplish this?
The reason I'd like to keep using the pipeline is that it appears to work much faster than calling the cmdlet in a ForEach-Object loop (where $_ does work); that slower version is sketched below for reference.
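This is roughly what the ForEach-Object variant looks like; the relative blob-name computation is my own assumption, since the pipelined form works the names out for me:
# Per-file upload via ForEach-Object; works, but is noticeably slower than the pipelined call.
# The blob name computation below is an assumption made to preserve the folder structure.
Get-ChildItem $SourceRoot -Recurse -File | ForEach-Object {
    $blobName = $_.FullName.Substring($SourceRoot.Length).TrimStart('\') -replace '\\', '/'
    Set-AzStorageBlobContent -File $_.FullName -Blob $blobName -Container $ContainerName -Context $context `
        -Properties @{ "ContentType" = (Get-MimeType $_.Extension) } -Force
}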
If you are open to completely different solutions, you can also use AzCopy.
You can upload your whole folder with one command, and AzCopy can also automatically guess the correct mime type based on the file extension. There is also support for Azure Pipelines, if that is part of your setup.
Command could look something like this:
# AzCopy v10 will automatically guess the content type unless you pass --no-guess-mime-type
azcopy copy 'C:\myDirectory' 'https://mystorageaccount.blob.core.windows.net/mycontainer' --recursive
# AzCopy v8 uses /Source and /Dest parameters and needs /SetContentType to detect the MIME type
AzCopy /Source:C:\myDirectory /Dest:https://mystorageaccount.blob.core.windows.net/mycontainer /S /SetContentType
Taken from the output of AzCopy.exe copy --help:
AzCopy automatically detects the content type of the files when uploading from the local disk, based on the file extension or content (if no extension is specified).
The built-in lookup table is small, but on Unix, it is augmented by the local system's mime.types file(s) if available under one or more of these names:
/etc/mime.types
/etc/apache2/mime.types
/etc/apache/mime.types
On Windows, MIME types are extracted from the registry. This feature can be turned off with the help of a flag. Please refer to the flag section.

Get dbname from multiple web.config files with powershell

I would like to issue a PowerShell command that returns the connection string (specifically, I am looking for the database name value) for all the web sites on a web server...
So I would like to see something like
site1 dbname=Northwind
site2 dbname=Fitch
site3 dbname=DemoDB
I have tried using the IIS PowerShell snap-in... I thought I was close with this:
PS C:\Windows\system32> Get-WebApplication | Get-WebConfiguration -filter /connectionStrings/*
but... after looking at the results... my answer doesn't appear to be in there.
I am very new to PowerShell, so excuse my ignorance and inexperience.
Any help appreciated!
thanks!
Hopefully, this will get you started. It assumes there is a web.config file at the web application's physical path; it does not recurse to find other web.config files in the web application. It also assumes your connection strings are in the connectionStrings configuration element.
Import-Module WebAdministration

Get-WebApplication |
    ForEach-Object {
        $webConfigFile = [xml](Get-Content "$($_.PhysicalPath)\Web.config")
        Write-Host "Web Application: $($_.path)"
        foreach ($connString in $webConfigFile.configuration.connectionStrings.add)
        {
            Write-Host "Connection String $($connString.name): $($connString.connectionString)"
            $dbRegex = "((Initial\sCatalog)|((Database)))\s*=(?<ic>[a-z\s0-9]+?);"
            $found = $connString.connectionString -match $dbRegex
            if ($found)
            {
                Write-Host "Database: $($Matches["ic"])"
            }
        }
        Write-Host " "
    }
This post may give you an idea to start with. Basically, load the web.config file as an XML document and then find the node where the connection string is.
Do something like $myFile = [xml](Get-Content web.config). You can then pipe that to Get-Member ($myFile | Get-Member -MemberType Property) to start working your way into the file and see which node has it. I'm not at a computer where I can show you screenshots to explain it more, but you can check out this chapter from the PowerShell.com "Master PowerShell" e-book, which explains working with XML very well.
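A minimal sketch of that approach, assuming a standard connectionStrings section in the web.config:
# Parse web.config as XML and list the connection strings (assumes a standard <connectionStrings> section).
$myFile = [xml](Get-Content .\web.config)
$myFile.configuration.connectionStrings.add | Select-Object name, connectionString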

Change IIS Site Home Directory w/ Powershell

I can query the AD and find all the IIS sites and their virtual directories; now I need to be able to update those home directories and save the changes.
After I fetch the directory entry, I can display the site path using $site.Path; however, setting it doesn't seem to have any effect. It never changes the actual stored path.
I have tried $site.Path = <new path> and $site.Put( "Path", <new path> ), but neither of these seems to affect the stored path.
$site = $iis.psbase.children |
where {$_.keyType -eq "iiswebserver"} |
where {$_.psbase.properties.servercomment -eq $siteConfig.name };
$s = [ADSI]($site.psbase.path + "/ROOT");
$s.Path
# $s.Path = $siteConfig.path
# $s.Put("Path", $siteConfig.path )
$s.psbase.CommitChanges()
Import-Module WebAdministration
Set-ItemProperty 'IIS:\Sites\Default Web Site\' -name physicalPath -value $siteConfig.path
http://technet.microsoft.com/en-us/library/ee909471(WS.10).aspx
Ok, I tried this and it seems to work:
$s.psbase.properties.path[0] = $siteConfig.path
$s.psbase.CommitChanges()
Is there a better, cleaner way of handling this?
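If the WebAdministration module from the earlier answer is available (IIS 7+), the same update can be done by site name rather than through ADSI. A sketch, assuming $siteConfig.name matches the IIS site name:
# Hypothetical: set the site's physical path via the IIS: drive instead of ADSI (assumes WebAdministration / IIS 7+).
Import-Module WebAdministration
Set-ItemProperty "IIS:\Sites\$($siteConfig.name)" -Name physicalPath -Value $siteConfig.path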
