I receive an automatic weekly export from a system in .csv format. It contains a lot of usernames consisting of the users' initials (e.g. "fl", "nk"). A few of them have their first and last names, separated by a dot (e.g. firstname.lastname). These are the ones that have to be deleted from the .csv file.
My goal here is to write a PowerShell script which deletes all rows containing the character "." (dot) and then saves the same .csv file by overwriting it.
Since I'm very new to PowerShell, I'd highly appreciate a more detailed answer including the potential code. I tried various examples from similar issues I found here, but none of them worked and/or I got error messages, mostly because my syntax wasn't correct.
Additional info. Here is a part of the table.
I tried this code:
Get-Content "D:\file.csv" | Where-Object {$_ -notmatch '\.'} | Set-Content "D:\File.csv"-Force -NoTypeInformation
As Mathias says, it is helpful to see what you have tried so we can help you come to a working result. It is easy to give you something like this:
$csv = Import-Csv -Path C:\Temp\temp.csv -Delimiter ";"
$newCSV = @()
foreach ($row in $csv) {
    if (!$row.username -or $row.username -notlike "*.*") {
        $newCSV += $row
    }
}
$newCSV | Export-Csv -Path C:\Temp\temp.csv -Delimiter ";" -NoTypeInformation
The above code eliminates rows that have a dot in the username field. It leaves rows with an empty username intact thanks to the '!$row.username' part. But I have no idea whether this is helpful since there is no example CSV file, and there is also no way to know what you have tried so far ;)
Note that I always prefer using ";" as the delimiter, because opening the file in Excel will then already be correctly separated. If the current file uses ',' as a delimiter, you will need to change that when importing the CSV.
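If you do need to convert, one minimal approach (paths below are placeholders) is to import with ',' and re-export with ';':
$rows = Import-Csv -Path 'C:\Temp\temp.csv' -Delimiter ','
$rows | Export-Csv -Path 'C:\Temp\temp.csv' -Delimiter ';' -NoTypeInformation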
You were very close! For this you don't need a loop, you just need to do it using the correct cmdlets:
(Import-Csv -Path 'D:\file.csv' -Delimiter ';') |
Where-Object { $_.Initials -notmatch '\.' } |
Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation
Get-Content simply reads a text file and returns the lines as a string array, whereas Import-Csv parses the structure and creates objects with properties taken from the header line.
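A quick illustration of that difference, assuming the semicolon-delimited file from the question with an Initials column:
# Get-Content returns plain strings, header line included:
Get-Content 'D:\file.csv' | Select-Object -First 2

# Import-Csv returns objects whose properties come from the header,
# so individual columns can be addressed by name:
(Import-Csv 'D:\file.csv' -Delimiter ';' | Select-Object -First 1).Initials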
The parentheses around Import-Csv are needed to ensure the importing/parsing of the file is completely done before piping the results through. Without them, the resulting file may end up completely empty, because you cannot read and overwrite the same file at the same time.
Related
I have a process where files containing data are generated in separate locations, saved to a networked location, and merged into a single file.
At the end of the process, I would like to check that all locations are present in that merged file, and be notified if not.
I am having trouble finding a way to identify that a string specific to each location isn't present, for use in an if statement, but it doesn't seem to be identifying the string correctly.
I have tried:
get-childitem -filter *daily.csv.ready \\x.x.x.x\data\* -recurse | where-object {$_ -notin 'D,KPI,KPI,1,'}
I know it's probably easier to do nothing if it is present, and perform the warning action if not, but I'm curious if this can be done in the reverse.
Thank you,
As Doug Maurer points out, your command does not search through the content of the files output by the Get-ChildItem command, because what that cmdlet emits are System.IO.FileInfo (or, potentially, System.IO.DirectoryInfo) instances containing metadata about the matching files (directories) rather than their content.
In other words: the automatic $_ variable in your Where-Object command refers to an object describing a file rather than its content.
However, you can pipe System.IO.FileInfo instances to the Select-String cmdlet, which indeed searches the input files' content:
Get-ChildItem -Filter *daily.csv.ready \\x.x.x.x\data\* -Recurse |
Where-Object { $_ | Select-String -Quiet -NotMatch 'D,KPI,KPI,1,' }
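If the goal is specifically to flag files in which the location string never appears at all, a variation on the same idea is to negate a plain -Quiet match and emit a warning for each offending file:
Get-ChildItem -Filter *daily.csv.ready \\x.x.x.x\data\* -Recurse |
    Where-Object { -not ($_ | Select-String -Quiet -SimpleMatch 'D,KPI,KPI,1,') } |
    ForEach-Object { Write-Warning "Location string not found in $($_.FullName)" }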
Problem:
Update a specific string within numerous configuration files that are found within the subfolders of a partial path using PowerShell.
Expanded Details:
I have multiple configuration files that need a specific string to be updated; however, I do not know the name of these files and must begin my search from a partial path. I must scan each file for the specific string. Then I must replace the old string with the new string, but I must make sure it saves the file with its original name and in the same location it was found. I must also be able to display the results of the script (number of files affected and their names/path). Lastly, this must all be done in PowerShell.
So far I have come up with the following on my own:
$old = "string1"
$new = "string2"
$configs = Get-ChildItem -Path C:\*\foldername\*.config -Recurse
$configs | %{ (Get-Content $_) -Replace $old, $new | Set-Content $_FullName }
When I run this, something seems to happen.
If the files are open, they will tell me that they were modified by another program.
However, nothing seems to have changed.
I have attempted various modifications of the below code as well. To my dismay, it only seems to be opening and saving each file rather than actually making the change I want to happen.
$configFiles = GCI -Path C:\*\Somefolder\*.config -Recurse
foreach ($config in $configFiles) {
    (GC $config.PSPath) | ForEach-Object {
        $_ -Replace "oldString", "newString"
    } | Set-Content $config.PSPath
}
To further exacerbate the issue, all of my attempts to perform a simple search for the specified string seem to be causing me issues as well.
Discussing with several others, and based on what I have learned via SO... the following code SHOULD return results:
GCI -Path C:\*\Somefolder\*.config -Recurse |
Select-String -Pattern "string" |
Select Name
However, nothing seems to happen. I do not know if I am missing something or if the code itself is wrong...
Some questions I have researched and tried that are similar can be found at the below links:
Replacing a text at specified line number of a file using powershell
Find and replacing strings in multiple files
PowerShell Script to Find and Replace for all Files with a Specific Extension
Powershell to replace text in multiple files stored in many folders
UPDATE:
It is possible that I am being thwarted by special characters such as + and /. For example, my string might be: "s+r/ng"
I have applied the escape character that PowerShell says to use, but it seems this is not helping either.
I will continue my research and continue making modifications. I'll be sure to notate anything that gets me to my goal or even a step closer. Thank you all in advance.
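For reference, a minimal sketch of the full task as described (scan each .config, replace in place, and report the affected files). The root path, filter, and strings below are placeholders, and [regex]::Escape prevents characters such as + from being treated as regex metacharacters:
# Placeholders: adjust the root folder, file filter, and strings to the real environment.
$old  = 's+r/ng'         # literal text to find (placeholder)
$new  = 'replacement'    # literal text to insert (placeholder)
$root = 'C:\*\Somefolder'

# Escape the search text so regex metacharacters (+, ., etc.) are matched literally.
$pattern = [regex]::Escape($old)

$changed = foreach ($file in Get-ChildItem -Path "$root\*.config" -Recurse -File) {
    $content = Get-Content -Path $file.FullName -Raw
    if ($content -match $pattern) {
        # Rewrite the file under its original name and location.
        $content -replace $pattern, $new | Set-Content -Path $file.FullName
        $file.FullName
    }
}

# Report how many files were changed and which ones.
"{0} file(s) updated" -f @($changed).Count
$changed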
This may be really ridiculous and simple but I am missing something. I have a very basic script block, but the output is doing something funky. Code Block is:
Get-Content C:\test.txt | ForEach-Object{(Get-ADComputer -Identity $_ -Properties description) | Select-Object name, description `
| Export-Csv -Path 'C:\test.csv' -NoTypeInformation -Force}
The strange thing is that if I comment out the Export-Csv cmdlet, the code works perfectly by grabbing everything in the text file and listing all the descriptions (as it should). However, when I include it, the export only lists the very last item in the text file. I have read a few other questions similar to this, but with no success. Like I said, I am sure it is something simple but I can't seem to find it.
Thank You!
I found it: I was missing -Append on Export-Csv, and that worked.
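For anyone hitting the same thing, here is a sketch of both fixes (the ActiveDirectory module is assumed, as in the original snippet): either add -Append so each iteration adds a row, or move Export-Csv outside the loop so the file is only written once.
# Fix 1: append each object to the CSV (Export-Csv -Append requires PowerShell 3.0+).
Get-Content C:\test.txt | ForEach-Object {
    Get-ADComputer -Identity $_ -Properties description |
        Select-Object name, description |
        Export-Csv -Path 'C:\test.csv' -NoTypeInformation -Append
}

# Fix 2: collect everything first and export once.
Get-Content C:\test.txt |
    ForEach-Object { Get-ADComputer -Identity $_ -Properties description } |
    Select-Object name, description |
    Export-Csv -Path 'C:\test.csv' -NoTypeInformation -Force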
Please help. I'm trying to figure out how to replace a string in PowerShell, but I don't know the rest of the string. I have this:
(Get-Content $file) -replace[regex]::Escape('file='*''),('file='+$_.BaseName) | Set-Content $file
I don't know what comes after file=
I tried my code, but it replaces it multiple times instead of just once.
So I'm trying to replace file=* with file=$_.BaseName.
Thanks for looking.
Just an FYI for anyone using the latest version of PowerShell Community Extensions (http://pscx.codeplex.com): there is a new command called Edit-File that handles this sort of thing nicely (it works hard to preserve the file's original encoding):
Get-Item test.txt | Foreach {$bn=$_.BaseName; $_} |
Edit-File -Pattern '(file=).*' -Replace "`${1}$bn"
In theory I shouldn't need the Foreach stage but it seems I've found a limitation in how -PipelineVariable does not work with parameters that aren't pipeline bound. Hmm, add that to the Pscx backlog.
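For anyone without PSCX installed, a plain -replace sketch along the same lines (assuming each file contains a line beginning with file=) could look like this:
Get-ChildItem *.txt | ForEach-Object {
    $base = $_.BaseName
    # '^file=.*' anchors at the start of a line, so only the file= line is rewritten.
    (Get-Content $_.FullName) -replace '^file=.*', "file=$base" |
        Set-Content $_.FullName
}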
I was wondering if anybody knows a way to achieve this without breaking/messing with the data itself?
I have a CSV file delimited by '|' which was created by retrieving data from SharePoint using an SPQuery and exporting it with Out-File (Export-Csv is not an option, since I would have to store the data in a variable and that would eat at the server's RAM; querying remotely unfortunately will not work either, so I have to do this on the server itself). Nevertheless, I have the data I need, but I want to perform some manipulations, move and auto-calculate certain data within an Excel file, and export said Excel file.
The problem I have right now is that I sort of need a header to the file. I have tried using the following code:
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$file = Import-Csv inputfilename.csv -Header $header | Export-Csv D:\outputfilename.csv
The issue here is that when I perform the second Export-Csv in PowerShell, it splits on anything that contains a comma and thus strips data out; I sort of need the data to remain intact.
I have tried playing with the -Delimiter '|' setting on both the import and the export side, but no matter what I do it seems to be cutting off the data. Is there a better way to simply add a row at the top (a header) without messing with the already existing file structure?
I have found that using a delimiter such as -Delimiter '°' or any other special-case character removes my problem entirely, but I can never be sure that such a character won't show up in the dataset, and thus (as stated already) I am looking for a more "elegant" solution.
Thanks
One option you have is to create the original CSV with the headers first. Then when you are exporting the SharePoint data, use the switch -Append in the Out-File command to append the SP data to the CSV.
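A minimal sketch of that idea; $line below is just a placeholder for whatever string the existing SPQuery export currently writes per row:
# Write the header line first (utf8 keeps both commands' encodings consistent)...
$header = "Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$header | Out-File -FilePath 'D:\outputfilename.csv' -Encoding utf8

# ...then, inside the existing export loop, append instead of overwriting:
$line | Out-File -FilePath 'D:\outputfilename.csv' -Append -Encoding utf8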
I wouldn't even bother messing with it in csv format.
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$in_file = '.\inputfilename.csv'
$out_file = '.\outputfilename.csv'
$x = Get-Content $in_file
Set-Content $out_file -Value $header,$x
There's probably a more elegant/refined two-liner for some of this, but this should get you what you need.
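One such shorter form, for what it's worth:
$header = "Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
Set-Content '.\outputfilename.csv' -Value $header, (Get-Content '.\inputfilename.csv')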