I am working on an issue and I can't seem to get the syntax correct.
I have a directory containing a series of CSV files, each of which lists the virtual directories and paths from an old IIS 6 machine. I am recreating those on a new IIS 7.5 machine, and I am able to add them one directory at a time by going to the directory "IIS:\Sites\Atlanta" and running this command:
Import-Csv C:\Users\MIGXHZ700\Desktop\Atlanta.csv | Where-Object {$_.Path -match "\\"} | ForEach-Object {New-Item $_.Name -Type VirtualDirectory -physicalPath $_.Path}
For the life of me, I can't get the syntax right to run this in a script. I think it's just an issue with string concatenation, but I am not 100% sure. Here is where I am at with the script:
$dirs = ls C:\Users\[blah]\Desktop\*.csv | ForEach-Object {
    Import-Csv $_ |
        Where-Object {$_.Path -match "\\"} |
        ForEach-Object {New-Item 'IIS:\Sites\'+$_.Name -Type VirtualDirectory -physicalPath $_.Path}
}
It might also be an issue with running a ForEach-Object inside another ForEach-Object?
Thanks in advance for any help.
'IIS:\Sites\'+$_.Name is not a valid argument to New-Item, because the -Path parameter takes a string argument, but that's an expression. It's an expression that evaluates to a string representing the path of the item you want to create, but you need PowerShell to evaluate it first by enclosing it in parentheses:
New-Item ('IIS:\Sites\' + $_.Name) -Type VirtualDirectory -PhysicalPath $_.Path
BTW, what's your intention for $dirs? It will be assigned the output of the New-Item command, which will be an array of DirectoryInfo objects (the same as what you'd get from $dirs = Get-ChildItem IIS:\Sites\ after creating all those directories). Is that what you want?
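For completeness, the whole corrected loop might look like the sketch below. Note that deriving the site name from each CSV file's base name (e.g. Atlanta.csv belongs to the Atlanta site) is my assumption, based on the one-at-a-time command being run from inside IIS:\Sites\Atlanta:

# Sketch: recreate virtual directories from every CSV on the desktop.
# Assumes each CSV is named after the site it belongs to (e.g. Atlanta.csv).
Get-ChildItem C:\Users\[blah]\Desktop\*.csv | ForEach-Object {
    $site = $_.BaseName    # capture before $_ is rebound in the inner pipeline
    Import-Csv $_.FullName |
        Where-Object { $_.Path -match "\\" } |
        ForEach-Object {
            New-Item ("IIS:\Sites\$site\" + $_.Name) -Type VirtualDirectory -PhysicalPath $_.Path
        }
}

Capturing $site in a variable before the inner pipeline also answers the nested-ForEach concern: inside the inner ForEach-Object, $_ refers to the CSV row, not the file.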
For quite a while now, in my free time, I have been tackling a script that can batch-replace external link addresses in multiple Excel files within the script's folder. I have learned that you can't change external links via the usual PowerShell-to-Excel interaction, as these values are forced to read-only. However, there is a clever way to bypass that: convert the Excel file to a .zip archive, read and change the files inside, and then rename it back to the Excel format.
Through learning and digging around the web, I have compiled this script function that should create a backup, rename the file to an archive, replace the desired text within, and rename the file back afterwards.
function Update-ExcelLinks($xlsxFile, $oldText, $newText) {
    # Build BAK file name
    $bakFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".bak"
    # Build ZIP file name
    $zipFile = $xlsxFile -ireplace [regex]::Escape(".xlsb"), ".zip"
    # Create a temporary folder
    $parent = [System.IO.Path]::GetTempPath()
    [string] $guid = [System.Guid]::NewGuid()
    $tempFolder = Join-Path $parent $guid
    New-Item -ItemType Directory -Path $tempFolder
    # Uncomment the next line to create a backup before processing the XLSB file
    # Copy-Item $xlsxFile $bakFile
    # Rename the file to ZIP
    Rename-Item -Path $xlsxFile -NewName $zipFile
    # Not using Expand-Archive because it changes the ZIP format
    C:\7z\7za.exe x "$zipFile" -o"$tempFolder"
    # Replace old text with new text (-replace treats $oldText as a regex)
    $fileNames = Get-ChildItem -Path $tempFolder -Recurse -Include *.xml, *.bin.rels
    foreach ($file in $fileNames)
    {
        (Get-Content -ErrorAction SilentlyContinue $file.PSPath) |
            ForEach-Object { $_ -replace $oldText, $newText } |
            Set-Content $file.PSPath
    }
    # Changing the working folder because the 7-Zip -w option doesn't work
    Set-Location -Path $tempFolder
    # Not using Compress-Archive because it changes the ZIP format
    C:\7z\7za.exe u -r "$zipFile" *.*
    # Rename the file back to XLSB
    Rename-Item -Path $zipFile -NewName $xlsxFile
}
The problem is that the script successfully interacts with the desired file and renames it, but refuses to touch the external link information within the archive at the "'Excel File.zip'\xl\externalLinks_rels" directory. The change I am trying to make is to turn "/wk28/example_file_wk28.xlsb" into "/wk29/example_file_wk29.xlsb" by replacing the wk28 string with wk29 for each external link, and so on. Does anybody have experience in this field? I am only starting my scripting adventure and can't quite diagnose the problem within this script.
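As a first diagnostic step, it may be worth checking which files the -Include filter actually matches and whether the wk28 text is present where expected. A minimal sketch ($tempFolder is the variable from the function above; the externalLink1.bin.rels file name is my assumption about how .xlsb packages are typically laid out):

# List the files the -Include filter matches inside the extracted archive
Get-ChildItem -Path $tempFolder -Recurse -Include *.xml, *.bin.rels |
    Select-Object FullName

# Spot-check one external-link relationship part for the wk28 target
# (the file name is an assumption; adjust to whatever 7-Zip actually extracted)
Select-String -Path (Join-Path $tempFolder 'xl\externalLinks\_rels\externalLink1.bin.rels') -Pattern 'wk28'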
Below is the runbook code I am using to save the file to an Azure file share, but I am unable to save it in a subdirectory.
# Set the context using the storage account name and key
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$s = Get-AzureStorageShare "X1" -Context $context
$ErrorLogFileName = "Test.csv"
$LogItem = New-Item -ItemType File -Name $ErrorLogFileName
$_ | Out-File -FilePath $ErrorLogFileName -Append
Set-AzureStorageFileContent -Share $s -Source $ErrorLogFileName
Here I have a folder structure like X1/X2, but I am unable to get there and save Test.csv; I am only able to save it to the X1 folder on the Azure file share. Any ideas?
Thanks in advance.
You can specify the -Path parameter for Set-AzureStorageFileContent.
For example, the file share is X1, and there is a folder X2 inside X1. Then you can use the command below:
Set-AzureStorageFileContent -ShareName "X1" -Source $ErrorLogFileName -Path "X2" -Context $context
By the way, the commands you're using are old; please consider using the latest Az PowerShell module instead. For example, you can use Set-AzStorageFileContent instead of Set-AzureStorageFileContent (and if you switch to the Az module, replace all of the old commands with their new equivalents).
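For reference, a minimal Az-module sketch of the same upload (the share and folder names are taken from the question; note that the X2 directory must already exist on the share, since the cmdlet does not create it):

# Az.Storage sketch: upload Test.csv into the existing X2 folder of share X1
$context = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Set-AzStorageFileContent -ShareName "X1" -Source $ErrorLogFileName -Path "X2/Test.csv" -Context $context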
So I am facing a situation where my project, which is deployed on the Azure cloud, is getting high CPU usage. Most of the time it is 100%, but after restarting the app, CPU usage drops to 10-15% for a few hours. I did try the Kudu profiler, but it did not help: most of the time it shows some methods using 40% CPU when total CPU usage is 100%, but the same methods are at 2-3% when CPU usage is low.
A strange thing I noticed is that some API controller methods throw a CGI/502 error if they don't get a correct request BODY, even though they should throw a NullReferenceException because the method got the wrong body. Even more interesting: returning the CGI error takes more than 2 minutes instead of the roughly 2 seconds it usually takes on my web service on my local computer.
I went from the S1 to the S2 plan; same stuff. Even though it works a bit faster, Azure Insights shows the same 90-10% CPU usage.
First of all, I would suggest you write code to capture a crash dump of your server; you can refer to this link for setting it up.
Something like the below would help you do it in PowerShell:
# Folder where Windows Error Reporting (WER) will write the crash dumps
$dumpFolder = "C:\crash-dumps"
if (!(Test-Path $dumpFolder)) {
    mkdir $dumpFolder | Out-Null
}
# Enable WER LocalDumps collection
$dumpKey = "HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps"
if (!(Test-Path $dumpKey)) {
    New-Item -Path $dumpKey | Out-Null
}
# Collect full dumps (DumpType = 2) for dotnet.exe, keeping at most 3
$dumpKey = "HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\dotnet.exe"
New-Item -Path $dumpKey -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpFolder -Value $dumpFolder -PropertyType String -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpCount -Value 3 -PropertyType DWORD -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpType -Value 2 -PropertyType DWORD -Force | Out-Null
# Same settings for the IIS worker process (w3wp.exe)
$dumpKey = "HKLM:\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps\w3wp.exe"
New-Item -Path $dumpKey -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpFolder -Value $dumpFolder -PropertyType String -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpCount -Value 3 -PropertyType DWORD -Force | Out-Null
New-ItemProperty -Path $dumpKey -Name DumpType -Value 2 -PropertyType DWORD -Force | Out-Null
Based on the crash dumps, we can easily understand which part is causing the issue.
For a similar issue, you can track this request. Also try downgrading your application to V2.0.0 and see if it still causes the CPU spikes. If it does, then we need to look at your code, as mentioned in the comments.
I have a PowerShell script that removes files from a CSV list. However, I'm not sure how to export the results once the files have been deleted from the list, and how to mark them in green in an Excel spreadsheet. How would I approach this? Below is my PowerShell script:
$files = Get-Content "C:\test\remove.csv"
foreach ($file in $files) {
Remove-Item -Path $file -force
}
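In case it helps, here is a minimal sketch of one way to record each deletion and export the results. The ImportExcel module and its Export-Excel/New-ConditionalText cmdlets are an assumption on my part, not something the original script uses:

# Collect a result object per file so the outcome can be exported afterwards
$files = Get-Content "C:\test\remove.csv"
$results = foreach ($file in $files) {
    try {
        Remove-Item -Path $file -Force -ErrorAction Stop
        [pscustomobject]@{ File = $file; Status = 'Deleted' }
    } catch {
        [pscustomobject]@{ File = $file; Status = "Failed: $($_.Exception.Message)" }
    }
}

# A plain CSV export works out of the box
$results | Export-Csv "C:\test\remove-results.csv" -NoTypeInformation

# With the ImportExcel module (an assumption; install via Install-Module ImportExcel),
# deleted rows can be highlighted in green via conditional formatting:
$results | Export-Excel "C:\test\remove-results.xlsx" -ConditionalText (
    New-ConditionalText -Text 'Deleted' -BackgroundColor Green
)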
Automating some IIS stuff with PowerShell. I needed to add a net.msmq binding using the approach listed here:
Why Powershell's New-WebBinding commandlet creates incorrect HostHeader?
Where I add the binding using something like:
New-ItemProperty -Path 'IIS:\Sites\Default Web Site' -Name Bindings -Value @{protocol="net.msmq"; bindingInformation="server.domain.com"}
So now I need to automate removal of that binding (say the queue server changes). I have messed around with all the Collection cmdlets, and I cannot figure out a way to remove an item.
Get-ItemProperty -Path 'IIS:\Sites\Default Web Site' -Name bindings
will return a collection. I can iterate through with ForEach, but I cannot seem to find the magic command to remove an item once I find it.
Any thoughts?
This worked for me:
$prop = (Get-ItemProperty -Path 'IIS:\Sites\Default Web Site' -Name bindings).Collection | ? {$_.Protocol -ne "net.msmq"}
Set-ItemProperty 'IIS:\Sites\Default Web Site' -Name bindings -Value $prop
Remove-ItemProperty 'IIS:\Sites\DemoSite' -Name bindings -AtElement @{protocol="http"; bindingInformation="*:80:DemoSite2"}
Straight off TechNet.
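Adapting that to the net.msmq binding added above would look something like the sketch below, using the same protocol and bindingInformation values the binding was created with:

# Sketch: remove the net.msmq binding created earlier with New-ItemProperty
Remove-ItemProperty -Path 'IIS:\Sites\Default Web Site' -Name bindings -AtElement @{protocol="net.msmq"; bindingInformation="server.domain.com"}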