I was wondering if anybody knows a way to achieve this without breaking/messing with the data itself?
I have a CSV file delimited by '|', created by retrieving data from SharePoint via an SPQuery and exporting with Out-File (Export-Csv is not an option, since I would have to store the data in a variable first, and that would eat at the server's RAM; querying remotely unfortunately will not work either, so I have to do this on the server itself). Nevertheless, I have the data I need, but I want to perform some manipulations, move and auto-calculate certain data within an Excel file, and export said Excel file.
The problem I have right now is that I sort of need a header on the file. I have tried the following PowerShell code:
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$file = Import-Csv inputfilename.csv -Header $header | Export-Csv D:\outputfilename.csv
The issue is that when I perform the second Export-Csv, it delimits at anything that has a comma and thus removes data; I need the data to remain intact.
I have tried playing with the -Delimiter '|' parameter on both the import and the export side, but no matter what I do it seems to be cutting off the data. Is there a better way to simply add a row at the top (a header) without messing with the already existing file structure?
I have found that using a delimiter such as -Delimiter '°' or any other special character removes my problem entirely, but I can never be sure that such a character won't show up in the dataset, and thus (as stated already) I am looking for a more "elegant" solution.
Thanks
One option you have is to create the original CSV with the headers first. Then, when you export the SharePoint data, use the -Append switch on Out-File to append the SP data to the CSV.
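A minimal sketch of that idea (the header and output path are taken from the question; $rowText is a hypothetical variable standing in for however each SharePoint row is rendered):
# Write the header line first.
$header = "Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$header | Out-File 'D:\outputfilename.csv'
# Then, inside the SPQuery loop, append rows as they are produced,
# without holding the whole dataset in memory:
$rowText | Out-File 'D:\outputfilename.csv' -Append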
I wouldn't even bother messing with it in csv format.
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$in_file = '.\inputfilename.csv'
$out_file = '.\outputfilename.csv'
$x = Get-Content $in_file
Set-Content $out_file -Value $header,$x
There's probably a more elegant/refined two-liner for some of this, but this should get you what you need.
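If the file is too large to comfortably read into memory (the RAM concern from the question), a streaming variant of the same idea, under the same assumed file names:
# Write the header, then stream the original file through in chunks
# so the whole file is never held in memory at once.
Set-Content $out_file -Value $header
Get-Content $in_file -ReadCount 1000 | Add-Content $out_file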
I have an XML file that I'm trying to read with PowerShell. However, when I read it, the output of some of the XML objects contains the following characters: ​
I simply downloaded an XML file I needed from a third-party, which opens in Excel. Then I grab the columns I need and paste them into a new Excel Workbook. Then I map the fields with an XML Schema and then export it as an XML file, which I then use for scripting.
In the Excel spreadsheet my data looks clean, but then when I export it and run the PS script, these strange characters appear in the output. The characters even appear in the actual XML file after exporting. What am I doing wrong?
I tried using -Encoding UTF8, but I'm relatively new to PowerShell and am not sure how to appropriately apply it to my script. Appreciate any help!
PowerShell
$xmlpath = 'Path\To\The\File.xml'
[xml]$xmldata = (Get-Content $xmlpath)
$xmldata.applications.application.name
Example of Output
​ABC_DEF_GHI​.com​​
​JKL_MNO_PQRS​.com​
TUV_WXY_Z.com
AB_CD_EF_GH​.com
This is a prime example of why you shouldn't use the idiom [xml]$xmldata = (Get-Content $xmlpath), as convenient as it is.[1] The problem is indeed one of character encoding: your file is UTF-8-encoded, but in the absence of a BOM, Windows PowerShell's Get-Content cmdlet interprets it as ANSI-encoded; this answer explains the encoding part in detail (thanks, choroba).
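To see where those particular characters come from: the UTF-8 encoding of U+200B (ZERO WIDTH SPACE) is the byte sequence 0xE2 0x80 0x8B, which decodes as â€‹ under Windows-1252. A quick sketch to verify this in any Windows PowerShell session:
# UTF-8 bytes of ZERO WIDTH SPACE (U+200B)...
$bytes = [System.Text.Encoding]::UTF8.GetBytes([string][char]0x200B)
# ...misread as Windows-1252 (a typical "ANSI" code page) yield "â€‹".
[System.Text.Encoding]::GetEncoding(1252).GetString($bytes)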
Instead, to ensure that the XML file's character encoding is interpreted correctly, use the following:
# Note: If you know that $xmlPath contains a *full*, native path,
# you don't need the Convert-Path call.
($xmlData = [xml]::new()).Load((Convert-Path -LiteralPath $xmlPath))
This delegates interpretation of the character encoding to the System.Xml.XmlDocument.Load .NET API method, which not only assumes the proper default for XML (UTF-8), but also respects any explicit encoding specification in the XML declaration, if present (e.g., <?xml version="1.0" encoding="iso-8859-1"?>).
See also:
the bottom section of this answer for background information.
GitHub proposal #14505, which proposes introducing a New-Xml cmdlet that robustly parses XML files.
[1] If you happen to know the encoding of the input file ahead of time, you can get away with using Get-Content's -Encoding parameter in your original approach ([xml]$xmldata = (Get-Content -Encoding utf8 $xmlpath)), but the .Load()-based approach is much more robust.
I receive an automatic weekly export from a system in .csv format. It contains a lot of usernames with the initials of the users (e.g. "fl", "nk"). A few of them have their first and last names, separated by a dot (e.g. firstname.lastname). These are the ones that have to be deleted from the .csv file.
My goal here is to write a PowerShell script which deletes all rows containing the character "." (dot) and then saves the same .csv file by overwriting it.
Since I'm very new to PowerShell, I'd highly appreciate a more detailed answer, including the potential code. I tried various examples from similar issues I found here, but none of them worked and/or I got error messages, mostly because my syntax wasn't correct.
Additional info. Here is a part of the table.
I tried this code:
Get-Content "D:\file.csv" | Where-Object {$_ -notmatch '\.'} | Set-Content "D:\File.csv"-Force -NoTypeInformation
As Mathias says, it is helpful to see what you have tried so we can help you come to a working result. It is easy to give you something like this:
$csv = Import-Csv -Path C:\Temp\temp.csv -Delimiter ";"
$newCSV = @()
foreach($row in $csv){
    if(!$row.username -or $row.username -notlike "*.*"){
        $newCSV += $row
    }
}
$newCSV | Export-Csv -Path C:\Temp\temp.csv -Delimiter ";" -NoTypeInformation
The above code eliminates rows that have a dot in the username field. It leaves rows with an empty username intact via the if(!$row.username part. But I have no idea whether this is helpful, since there is no example CSV file and no way of knowing what you have tried so far ;)
Note that I always prefer ";" as the delimiter, because the file will then already be correctly separated into columns when opened in Excel. If the current file uses ',' as its delimiter, you will need to change that when importing the CSV.
You were very close! For this you don't need a loop, you just need to do it using the correct cmdlets:
(Import-Csv -Path 'D:\file.csv' -Delimiter ';') |
Where-Object { $_.Initials -notmatch '\.' } |
Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation
Get-Content simply reads a text file and returns the lines as a string array, whereas Import-Csv parses the structure and creates objects with properties taken from the header line.
The parentheses around the Import-Csv call are needed to ensure that the importing/parsing of the file is completely done before the results are piped on. Without them, the resulting file may end up completely empty, because you cannot read and overwrite the same file at the same time.
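To make the difference concrete, a quick sketch against the same assumed file (the Initials column name comes from the answer above):
# Get-Content returns raw text lines:
Get-Content 'D:\file.csv' | Select-Object -First 1        # e.g. "Initials;Name"
# Import-Csv returns objects whose properties come from the header line:
(Import-Csv 'D:\file.csv' -Delimiter ';')[0].Initials     # e.g. "fl"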
I am developing a PowerShell script to import an Excel file and output the data to a flat file. The code that I have below works fine except that it fails to preserve leading zeros; when the CSV file is opened in a text editor, the leading zeros are not present. (Leading zeros are necessary for certain ID numbers, and the ID numbers are stored in Excel using a custom format.) Does anyone have any thoughts on how to get the ImportExcel module to preserve the leading zeros, or, perhaps another way of getting to the same goal? I would like to do this without using the COM object and without having to install Excel on the server; that's why I've been trying to make the ImportExcel module work.
$dataIn = 'filename.xlsx' ; $dataOut = 'filename.csv'
Import-Excel -Path $dataIn | Export-Csv -Path $dataOut
I presume you're using the ImportExcel module?
I just did this and it worked. I created a spreadsheet like:
Name     ID1        ID2
Steven   00012345   00012346
I gave them a custom number format of 00000000 then ran:
Import-Excel .\Book1.xlsx | Export-Csv .\book1.csv
When looking at the csv file I have both ID numbers as quoted strings:
"Name","ID1","ID2"
"Steven","00012345","00012346"
Is there anything else I need to do to reproduce this? Can you give the specifics of the custom number format?
Also, notwithstanding your answer to the above, you can modify the properties of each incoming object by converting them to strings. Assuming there's a fixed number of digits, you can use a format string with the .ToString() method, like:
(12345).ToString( "00000000" )
This will return "00012345"...
So redoing my test with regular numbers (no custom format):
# Use a regular variable name rather than the automatic $Input variable.
$data = @(Import-Excel \\nynastech1\adm_only\ExAdm\Temp\Book1.xlsx)
$data |
    ForEach-Object {
        $_.ID1 = $_.ID1.ToString( "00000000" )
        $_.ID2 = $_.ID2.ToString( "00000000" )
    }
This will convert ID1 & ID2 into "00012345" & "00012346" respectively.
You can also use Select-Object, but you might need to rename the properties; a sketch of that approach follows.
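A hedged sketch of the Select-Object approach (same assumed workbook and column names as above), using calculated properties so the formatted values keep their original property names:
Import-Excel .\Book1.xlsx |
    Select-Object Name,
        @{ Name = 'ID1'; Expression = { $_.ID1.ToString('00000000') } },
        @{ Name = 'ID2'; Expression = { $_.ID2.ToString('00000000') } } |
    Export-Csv .\book1.csv -NoTypeInformation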
Note: the @() wrapping in my example is because I only have the 1 object, and is partly force of habit.
Let me know how it goes.
I'm exporting an AD report via PowerShell using the code below.
Get-ADUser -Filter 'enabled -eq $true' -SearchBase "OU=Staff,OU=Users,OU=OSA,DC=domain,DC=org" -Properties mail, employeeID | Select-Object employeeID, mail, ObjectGUID | Export-Csv "C:\Reports\ADExports\Students.csv" -NoTypeInformation
It outputs the CSV file and everything looks fine, except that the 'Data type' of all columns is set to 'Short Text'.
I require the Employee ID column to be of type 'Number'. Is it possible to export a CSV with a custom field type?
I hope this make sense.
Thanks in advance.
CSVs are plain text and do not contain type information. However, you can use the ImportExcel module, which provides an Export-Excel cmdlet. This cmdlet takes various Excel parameters, including -NumberFormat.
$x | Export-Excel -NumberFormat 'Number' -Path 'test.xlsx' # This worked for me.
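A slightly fuller sketch using the paths from the question (treat this as an assumption about your setup, not a tested recipe):
# Re-export the AD CSV as .xlsx so the numeric columns carry a number format.
Import-Csv 'C:\Reports\ADExports\Students.csv' |
    Export-Excel -Path 'C:\Reports\ADExports\Students.xlsx' -NumberFormat 'Number'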
You will probably have to play around with it a little depending on your exact use case. Good luck!
OK guys, I'm going to assume that you cannot export a CSV via PowerShell with custom data types.
However, I found a way around my issue: I converted the data type when importing the CSV into Access, which solved the problem.
If anyone's interested you can find the exact issue and solution here - Joining Two Tables with Different Data types MS ACCESS - 'type mismatch in expression' error
Thank you for bringing the 'ImportExcel' module to my attention. Now I know there's a module available for this, and that you can do quite a bit of Excel manipulation with it.
Thank you all for your comments/answers.
Thanks.
Problem:
Update a specific string within numerous configuration files that are found within the subfolders of a partial path using PowerShell.
Expanded Details:
I have multiple configuration files that need a specific string to be updated; however, I do not know the name of these files and must begin my search from a partial path. I must scan each file for the specific string. Then I must replace the old string with the new string, but I must make sure it saves the file with its original name and in the same location it was found. I must also be able to display the results of the script (number of files affected and their names/path). Lastly, this must all be done in PowerShell.
So far I have come up with the following on my own:
$old = "string1"
$new = "string2"
$configs = Get-ChildItem -Path C:\*\foldername\*.config -Recurse
$configs | %{ (Get-Content $_.FullName) -Replace $old, $new | Set-Content $_.FullName }
When I run this, something seems to happen: if the files are open, they report that they were modified by another program. However, nothing actually seems to have changed.
I have attempted various modifications of the code below as well. To my dismay, it only seems to be opening and saving each file rather than actually making the change I want.
$configFiles = GCI -Path C:\*\Somefolder\*.config -Recurse
foreach ($config in $configFiles) {
    (GC $config.PSPath) | ForEach-Object {
        $_ -Replace "oldString", "newString"
    } | Set-Content $config.PSPath
}
To further exacerbate the issue, all of my attempts to perform a simple search for the specified string seem to be failing as well.
Discussing this with several others, and based on what I have learned via SO, the following code SHOULD return results:
GCI -Path C:\*\Somefolder\*.config -Recurse |
Select-String -Pattern "string" |
Select Name
However, nothing seems to happen. I do not know if I am missing something or if the code itself is wrong...
Some similar questions I have researched and tried can be found at the links below:
Replacing a text at specified line number of a file using powershell
Find and replacing strings in multiple files
PowerShell Script to Find and Replace for all Files with a Specific Extension
Powershell to replace text in multiple files stored in many folders
UPDATE:
It is possible that I am being thwarted by special characters such as + and /. For example, my string might be: "s+r/ng". I have applied the escape character that PowerShell says to use, but it seems this is not helping either.
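Since + is a regex metacharacter and the first operand of -replace is treated as a regular expression, escaping the whole literal with [regex]::Escape is the usual fix. A hedged sketch combining that with the reporting requirement (paths and strings are the question's placeholders):
$old = 'string1'                    # literal text to find (placeholder)
$new = 'string2'                    # replacement text (placeholder)
$pattern = [regex]::Escape($old)    # neutralizes metacharacters such as +
# Rewrite only the files that actually contain the string, collecting their paths.
$changed = Get-ChildItem -Path C:\*\foldername\*.config -Recurse |
    Where-Object { Select-String -Path $_.FullName -Pattern $pattern -Quiet } |
    ForEach-Object {
        (Get-Content $_.FullName) -replace $pattern, $new | Set-Content $_.FullName
        $_.FullName                 # emit the path so it can be reported
    }
"{0} file(s) updated:" -f @($changed).Count
$changed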
I will continue my research and keep making modifications. I'll be sure to note anything that gets me to my goal, or even a step closer. Thank you all in advance.