We are trying to bulk update our Azure Active Directory. We have an Excel CSV list of UserPrincipalNames for which we will update the Title, Department, and Office attributes.
# Get list of Clinical CMs
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Import-Csv $PATH

# Pass CMs into the function
ForEach ($UPN in $CMs) {
    # Do AD update task here
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}
The CSV:
User.1@domain.com
User.2@domain.com
User.3@domain.com
The Set-MsolUser command works on its own, but it is not working as intended inside this ForEach loop. Any help or insight is greatly appreciated.
As Jim Xu commented, here is my comment as an answer.
The input file you show us is not a CSV file; instead, it is a list of UPN values, each on its own line.
To read these values as a string array, the easiest thing to do is to use Get-Content:
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs = Get-Content -Path $PATH
Of course, although massive overkill, it can also be done using the Import-Csv cmdlet by supplying a header name:
$CMs = (Import-Csv -Path $PATH -Header upn).upn
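Putting the two together, a sketch of the corrected loop (this assumes the MSOnline module is loaded and you have already run Connect-MsolService):

```powershell
# Each line of the file is one UPN, so Get-Content yields a string array
$PATH = "C:\Users\cs\Documents\IT Stuff\Project\Azure AD Update\AD-Update-ClinicalCMs-Test.csv"
$CMs  = Get-Content -Path $PATH

# $UPN is now a plain string, which is what -UserPrincipalName expects
ForEach ($UPN in $CMs) {
    Set-MsolUser -UserPrincipalName $UPN -Title "Case Manager" -Department "Clinical" -Office "Virtual"
}
```

The original loop failed because Import-Csv without -Header produced objects whose first data line became the header, so $UPN was an object, not the UPN string itself.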
I'm trying to get the current list of Azure VMs on the first run of the script and store it to a Storage Account in a CSV file.
On the 2nd run, the current list should be compared with the existing CSV file in the Storage Account; any VMs that were decommissioned should be recorded and stored in a 2nd file in the Storage Account.
This works fine for me, but the issue is that when we create a new Azure VM, it also gets added to the decommission CSV list.
$Difference = Compare-Object $existingVmCsv $vmrecordFile -Property VmName -PassThru | Select-Object VmName,ResourceGroupName,SubscriptionName
I tried a couple of side indicators, but they didn't work:
$Difference = Compare-Object -ReferenceObject @($vmrecordFile | Select-Object) -DifferenceObject @($existingVmCsv | Select-Object) -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}
$Difference = Compare-Object -ReferenceObject $vmrecordFile -DifferenceObject $existingVmCsv -PassThru -Property VmName,ResourceGroupName,SubscriptionName | Where-Object {$_sideIndicator -eq "<="}
Thank you User Cpt.Whale - Stack Overflow. Posting your suggestion as an answer to help other community members.
It seems you have a typo in the syntax. Object property references should use a ".", like Where-Object { $_.sideIndicator -eq '<=' }
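A minimal, self-contained sketch of the corrected syntax (the VM objects below are made-up sample data standing in for the two CSV imports):

```powershell
# Sample data standing in for the two CSV imports
$vmrecordFile = @(
    [pscustomobject]@{ VmName = 'vm1'; ResourceGroupName = 'rg1'; SubscriptionName = 'sub1' }
    [pscustomobject]@{ VmName = 'vm2'; ResourceGroupName = 'rg1'; SubscriptionName = 'sub1' }
)
$existingVmCsv = @(
    [pscustomobject]@{ VmName = 'vm2'; ResourceGroupName = 'rg1'; SubscriptionName = 'sub1' }
    [pscustomobject]@{ VmName = 'vm3'; ResourceGroupName = 'rg2'; SubscriptionName = 'sub1' }
)

# Note the dot in $_.sideIndicator: "<=" keeps entries present only in the reference list
$Difference = Compare-Object -ReferenceObject $vmrecordFile -DifferenceObject $existingVmCsv `
        -Property VmName -PassThru |
    Where-Object { $_.sideIndicator -eq '<=' } |
    Select-Object VmName, ResourceGroupName, SubscriptionName

$Difference.VmName    # vm1, the VM present only in the reference list
```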
'<=' indicates that the property value appears only in the -ReferenceObject set.
References: powershell - compare two files and update the differences to 2nd file - Stack Overflow, Powershell : How to Compare Two Files, and List Differences | Dotnet Helpers (dotnet-helpers.com), and compare-object not working : PowerShell (reddit.com)
I receive an automatic weekly export from a system in .csv format. It contains a lot of usernames consisting of the users' initials (e.g. "fl", "nk"). A few of them have their first and last names, separated by a dot (e.g. firstname.lastname). These are the ones which have to be deleted from the .csv file.
My goal here is to write a PowerShell script which deletes all rows containing the character "." (dot) and then saves the same .csv file by overwriting it.
Since I'm very new to PowerShell, I'd highly appreciate a more detailed answer including the potential code. I tried various examples from similar issues which I found here, but none of them worked and/or I got error messages, mostly because my syntax wasn't correct.
Additional info. Here is a part of the table.
I tried this code:
Get-Content "D:\file.csv" | Where-Object {$_ -notmatch '\.'} | Set-Content "D:\File.csv" -Force -NoTypeInformation
As Mathias says, it is helpful to see what you have tried so we can help you come to a working result. It is easy to give you something like this:
$csv = Import-Csv -Path C:\Temp\temp.csv -Delimiter ";"
$newCSV = @()
foreach($row in $csv){
    if(!$row.username -or $row.username -notlike "*.*"){
        $newCSV += $row
    }
}
$newCSV | Export-Csv -Path C:\Temp\temp.csv -Delimiter ";" -NoTypeInformation
The above code eliminates rows that have a dot on the username field. It leaves rows with an empty username intact with the 'if(!$row.username' part. But I have no idea whether this is helpful since there is no example CSV file, also there is no way to know what you have tried so far ;)
Note that I always prefer using ";" as the delimiter, because opening the file in Excel will then already be correctly separated into columns. If the current file uses ',' as the delimiter, you will need to change that when importing the CSV.
You were very close! For this you don't need a loop, you just need to do it using the correct cmdlets:
(Import-Csv -Path 'D:\file.csv' -Delimiter ';') |
Where-Object { $_.Initials -notmatch '\.' } |
Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation
Get-Content simply reads a text file and returns the lines as a string array, whereas Import-Csv parses the structure and creates objects with properties taken from the header line.
The parentheses around the Import-Csv call are needed to ensure the importing/parsing of the file is completely done before piping the results through. Without them, the resulting file may end up completely empty, because you cannot read and overwrite the same file at the same time.
I am trying to modify some Excel files that link to other Excel files.
The files that are linked to are in sub-directories. I am going to move all the files to a root directory and then run a script to change the links.
I am able to find the links within each file but I am unable to modify them (see below)
Any ideas?
Thanks
P
# create the Excel COM object (needed before any workbooks can be opened)
$excel = New-Object -ComObject Excel.Application

# get all the Excel files in the directory
Get-ChildItem $sourceDir -Filter *.xl* |
ForEach-Object {
    Write-Host -ForegroundColor Yellow $_.FullName
    $workbook = $excel.Workbooks.Open($_.FullName)
    foreach ($link in $workbook.LinkSources(1))
    {
        Write-Host $link.Address
        # this gives me .... C:\temp\files\childfile1.xlsx etc
        # $link.Address seems to be read only?
        #$link.Address = $newLink
        # this doesn't seem to work either ...
        #$workbook.ChangeLink($link, $newlink, 1)
    }
    #$workbook.Save()
    $workbook.Close()
}
If you are modifying XLSX files, you can update the link without using Excel. Ultimately these are zip archives with a different extension. If you create a copy of the file with a .zip extension, you can use Expand-Archive to access the various files inside and update them accordingly, then Compress-Archive to generate a new Excel file.
In the archive, look for xl/workbook.xml, which identifies the sheets by name. xl/_rels/workbook.xml.rels can be used to translate from the sheet's relationship id to the worksheet file (Target) under xl/worksheets (e.g. "worksheets/sheet3.xml"), and from that you can infer the relationship file, which will be in xl/worksheets/_rels (e.g. xl/worksheets/_rels/sheet3.xml.rels).
Using the worksheet you can find the associated hyperlink based upon its cell reference, using the ref attribute, which gives you the r:id attribute. You can use this value to look up the appropriate Relationship by Id. You would then need to update its Target appropriately.
Of course, if you know your original link, and it is unique (or you are altering them all the same way), you could do a search and replace across the .rels files.
Once you've saved your changes, you just need to create the new file, which you can do using Compress-Archive. You'll need to compress to a file with a .zip extension and then rename it.
Here is an example based upon an XLSX with a link to Yahoo on the first sheet (note: the first few sheets normally have XML file names that match the sheet names, until sheets are renamed or reordered; don't count on that in production):
copy-item c:\temp\links.xlsx c:\temp\links.zip
expand-archive c:\temp\links.zip c:\temp\links_zip
$content = get-content c:\temp\links_zip\xl\worksheets\_rels\sheet1.xml.rels -Raw   # -Raw reads the whole file so the handle closes
$content | ForEach-Object { $_ -replace 'http://www.yahoo.com','http://www.google.com' } |
    set-content c:\temp\links_zip\xl\worksheets\_rels\sheet1.xml.rels
compress-archive c:\temp\links_zip\* c:\temp\links_alt.zip
remove-item c:\temp\links_zip -Recurse
rename-item c:\temp\links_alt.zip c:\temp\links_alt.xlsx   # final workbook: c:\temp\links_alt.xlsx
So I have this code:
$userprofile=Get-ChildItem Env:USERPROFILE
$localpath="$userprofile\some\path"
I would expect $localpath to contain:
c:\users\username\some\path
However what I get is:
System.Collections.DictionaryEntry\some\path
So, of course, something like cd $localpath fails. How would I accomplish what I need?
A convenient way to obtain the string value rather than the DictionaryEntry object (which is what Get-ChildItem actually returns) is to use the variable syntax: $Env:USERPROFILE rather than Get-ChildItem Env:USERPROFILE.
$localpath = "$env:USERPROFILE\some\path"
For more information:
PowerShell Environment Provider
about_Environment_Variables
Also, the Join-Path cmdlet is a good way to combine two parts of a path.
$localpath = Join-Path $env:USERPROFILE 'some\path'
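To see the difference between the two forms, a quick sketch (Windows values shown; the exact path will vary per machine):

```powershell
# Get-ChildItem on the Env: drive returns a DictionaryEntry object,
# so string interpolation renders its type name rather than its value
$entry = Get-ChildItem Env:USERPROFILE
$entry.GetType().Name        # DictionaryEntry
"$entry\some\path"           # System.Collections.DictionaryEntry\some\path

# The variable syntax yields the plain string value
$value = $env:USERPROFILE
"$value\some\path"           # e.g. C:\Users\username\some\path
```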
I have a PowerShell script that converts Office documents to PDF. I would like to multithread it, but cannot figure out how based on other examples I have seen. The main script (OfficeToPDF.ps1) scans through a list of files and calls a separate script for each file type/Office application (e.g. for .doc files, WordToPDF.ps1 is called to do the conversion). The main script passes one file name at a time to the child script (I did this for a couple of reasons).
Here is an example of the main script:
$documents_path = "C:\Documents\Test_Docs"
$pdf_out_path = "C:\Documents\Converted_PDFs"
$failed_path = "C:\Documents\Failed_to_Convert"
# Sets the root directory of this script
$PSScriptRoot = Split-Path -parent $MyInvocation.MyCommand.Definition
$date = Get-Date -Format "MM_dd_yyyy"
$Logfile = "$PSScriptRoot\logs\OfficeToTiff_$Date.log"
$word2PDF = "$PSScriptRoot\WordToPDF.ps1"
$arguments = "'$documents_path'", "'$pdf_out_path'", "'$Logfile'"
# Function to write to log file
Function LogWrite
{
Param ([string]$logstring)
$time = Get-Date -Format "hh:mm:ss:fff"
Add-content $Logfile -value "$date $time $logstring"
}
################################################################################
# Word to PDF #
################################################################################
LogWrite "*** BEGIN CONVERSION FROM DOC, DOCX, RTF, TXT, HTM, HTML TO PDF ***"
Get-ChildItem -Path $documents_path\* -Include *.docx, *.doc, *.rtf, *.txt, *.htm? -recurse | ForEach-Object {
$original_document = "$($_.FullName)"
# Verifies that a document exists before calling the convert script
If ($original_document -ne $null)
{
Invoke-Expression "$word2PDF $arguments"
#checks to see if document was successfully converted and deleted. If not, doc is moved to another directory
If(Test-Path -path $original_document)
{
Move-Item $original_document $failed_path
}
}
}
$original_document = $null
[gc]::collect()
[gc]::WaitForPendingFinalizers()
Here is the script (WordToPDF.ps1) that is called by the main script:
Param($documents, $pdf_out_path, $Logfile)
# Function to write to the log file
Function LogWrite
{
Param ([string]$logstring)
$time = Get-Date -Format "hh:mm:ss:fff"
Add-content $Logfile -value "$date $time $logstring"
}
$word_app = New-Object -ComObject Word.Application
$document = $word_app.Documents.Open($_.FullName)
$original_document = "$($_.FullName)"
# Creates the output file name with path
$pdf_document = "$($pdf_out_path)\$($_.BaseName).pdf"
LogWrite "Converting: $original_document to $pdf_document"
$document.SaveAs([ref] $pdf_document, [ref] 17)
$document.Close()
# Deletes the original document after it has been converted
Remove-Item $original_document
LogWrite "Deleting: $original_document"
$word_app.Quit()
Any suggestions would be appreciated.
Thanks.
I was just going to comment and link you to this question: Can PowerShell run commands in Parallel. I then noted the date of that question and its answers, and with PowerShell v3.0 there are some new features that might work better for you.
That question goes over the use of PowerShell jobs, which can work, but they require you to keep track of job status, so they can add a bit of extra coding to manage.
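For reference, a minimal self-contained sketch of the jobs approach (a trivial string stands in for the actual conversion work):

```powershell
# Start one background job per input file name
$inputs = 'a.doc', 'b.doc', 'c.doc'
$jobs = foreach ($f in $inputs) {
    Start-Job -ScriptBlock { param($path) "converted $path" } -ArgumentList $f
}

# The bookkeeping jobs require: wait for completion, collect output, clean up
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job
```

In your case the script block would invoke WordToPDF.ps1 with the file path instead of returning a string.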
PowerShell v3 opened the door a bit more with workflow, which is based on Windows Workflow Foundation. A good article on the basics of how this new command works can be found on the Scripting Guy's blog here. You can basically adjust your code to run your conversion via a workflow, and it will perform it in parallel:
workflow foreachfile {
    foreach -parallel ($f in $files) {
        # Put your code here that does the work
    }
}
From what I can find, the thread limit for this is 5 threads at a time. I am not sure how accurate that is, but the blog post here noted the limitation. However, given that the COM Application objects for Word and Excel can be very CPU intensive, running 5 threads at a time would probably work well.
I have a multithreaded PowerShell environment for indicator-of-compromise scanning on all AD devices, threaded 625 times with Gearman. http://gearman.org
It is open source and offers the option to go cross-platform. It threads with a server/worker flow and runs via Python. Extremely recommended by yours truly, someone who has abused threading in PowerShell. This isn't so much an answer as something I had never heard of but love and use daily. Pass it forward. Open source for the win :)
I have also used PSJobs before, and they are great up to a certain order of magnitude. Maybe it is my lack of .NET expertise, but PowerShell has some quirky, subtle memory nuances that at large scale can create some nasty effects.