I'm exporting data from multiple paths on our network drives. If a document is older than a certain date, I export it to a CSV file. But when I open the file in Excel, there's no formatting: everything is jammed into column A. I would like "Name" to be in column A, "LastWriteTime" in column B, etc.
Here is my code:
foreach ($path in $SharedFolder)
{
    Get-ChildItem -Path $path -Recurse |
        Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $DateLimit } |
        Select-Object Name, LastWriteTime, LastAccessTime, Length, DirectoryName |
        Export-Csv -Path $HOME\Desktop\ExcelDoc.csv -NoTypeInformation
}
Any help would be appreciated.
Setting the delimiter to the system's default list separator should take care of this.
You can do this by adding -Delimiter to your Export-Csv command with the separator character specified.
For example: -Delimiter ';'
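For instance, the export inside your loop could look something like this (just a sketch; -UseCulture instead of an explicit -Delimiter would pick up the system's list separator automatically, and -Append keeps each $path iteration from overwriting the previous one):
Get-ChildItem -Path $path -Recurse |
    Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $DateLimit } |
    Select-Object Name, LastWriteTime, LastAccessTime, Length, DirectoryName |
    Export-Csv -Path $HOME\Desktop\ExcelDoc.csv -NoTypeInformation -Delimiter ';' -Append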
Let's say I have a bunch of text files named after people, which all have this as their content:
number
I want to replace "number" with a value from a CSV or text file, sequentially, and based on the file name. CSV has two columns, name and number:
Joe 5551011000
Gary 5551011001
Clark 5551011002
So I want to find the text file named Joe, and replace the "number" with "5551011000", and the text file named Gary, and replace "number" with "5551011001".
Thank you!
I didn't get too far:
Get-ChildItem "C:\test\*.txt" -Recurse | ForEach-Object -Process {
(Get-Content $_) -Replace 'changeme', 'MyValue' | Set-Content $_
}
This gets me partly there, but I don't know how to find a specific file, then replace "number" in that file with the correct value that matches the name.
I also tried a different approach, with manual entry, and it works, but I need it to just be automated:
Get-ChildItem c:\Marriott -Recurse -Include *.txt |
    Select-Object -ExpandProperty FullName |
    ForEach-Object {
        $new = Read-Host "What is the new value you want for $_"
        (Get-Content $_) -replace 'number', $new |
            Set-Content $_
    }
I would convert your CSV to a hashtable, then this gets pretty simple.
$ReplaceHT = @{}
Import-Csv c:\path\to\file.csv -Delimiter ' ' -Header 'FileName','Number' |
    ForEach-Object { $ReplaceHT.Add($_.FileName, $_.Number) }
# The hashtable keys have no extension, so match on BaseName rather than Name
Get-ChildItem c:\Marriott -Recurse -Include *.txt -PipelineVariable 'File' |
    Where-Object { $_.BaseName -in $ReplaceHT.Keys } |
    ForEach-Object {
        (Get-Content $File.FullName) -replace 'number', $ReplaceHT[$File.BaseName] | Set-Content $File.FullName
    }
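A quick sanity check of the lookup table before running the replace (this assumes the space-separated sample shown in the question was imported):
$ReplaceHT['Joe']    # 5551011000
$ReplaceHT.Keys      # Joe, Gary, Clark - no .txt extension, hence the match on BaseName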
I am looking for a PowerShell script that reads a date contained inside a text file and, if that date is greater than today + 15 days, prints the file name.
Ideally the script would also check the content for another string, and print the file name only if both conditions match.
The command below gives me the output for files that contain a matching string such as 'Hello' and were created 30 days back. But now I want to fulfill the above two conditions no matter when the file was created.
Get-ChildItem -Path C:\Users\vpaul\Downloads\functional-script\*.txt -Recurse |
    Select-String -Pattern 'Hello', 'Hell' |
    Where CreationTime -lt (Get-Date).AddDays(-6) |
    Export-Csv C:\Users\vpaul\Downloads\functional-script\File_Name.csv -NoTypeInformation
The output from Select-String doesn't have a CreationTime property, which is why your filtering fails: CreationTime doesn't resolve to anything, so it's always "less than" any value you provide.
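A quick way to see this for yourself (assuming strict mode is off, so a missing property silently resolves to $null):
$m = 'Hello world' | Select-String -Pattern 'Hell'
$m.CreationTime                    # resolves to $null - MatchInfo has no such property
$null -lt (Get-Date).AddDays(-6)   # True, so the filter lets every match through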
Either do the filtering on CreationTime before piping to Select-String:
Get-ChildItem ... |Where-Object CreationTime -lt (Get-Date).AddDays(-6) |Select-String 'Hell' | ...
Or use the Path property on the output from Select-String to look up the file's attributes again:
Get-ChildItem ... |Select-String 'Hell' |Where-Object {(Get-ItemPropertyValue -LiteralPath $_.Path -Name CreationTime) -lt (Get-Date).AddDays(-6)} |...
Since it looks like you're trying to get and compare a date from a matched text string inside the file, as well as the CreationTime file attribute... +15 days and -6 days respectively...
Example Text file Content:
Hello 4/1/2021
You could try something similar to this:
$ALL_RECURSED_TXTs = Get-ChildItem -Path '[Folder to Recurse]\*.txt' -Recurse |
    Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-6) }

foreach ($File in $ALL_RECURSED_TXTs) {
    Get-Content -Path $File.FullName | Select-String -Pattern 'Hello', 'Hell' |
        ForEach-Object {
            # Find a RegEx match for the date string that is in the file,
            # and only compare when the match actually succeeded
            if ($_ -match 'Hello\s(\d+\/\d+\/\d{4}).*' -and
                (Get-Date $matches[1]) -gt (Get-Date).AddDays(15)) {
                "$($File.FullName)" | Out-File -FilePath '[Path to Output]\MyPrintedFileNames.txt' -Append
            }
        }
}
If you want to see your matched lines in your outfile...
"$_ : $($File.FullName)" | Out-File -FilePath '[Path to Output]\MyPrintedFileNames.txt' -Append;
"but now I want to fulfill the above two conditions no matter when the file was created."
Scrap the Where-Object filter on Get-ChildItem if you want all txt files.
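That is, the first line of the example above would simply become (keeping the placeholder path from the snippet):
$ALL_RECURSED_TXTs = Get-ChildItem -Path '[Folder to Recurse]\*.txt' -Recurse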
Edit: Getting confused again. Lol. If your txt file's date string is not on the same line as your "Hello|Hell" match, it'll get more complex. Good luck!
I'm trying to write a script where I can choose a folder and PowerShell shows me the Name, Size, etc. of all the files in that folder. After that, PowerShell should export the information to an Excel table.
But I'm stuck and don't know what to do :C
Here is the code I tried to build:
Function Get-Folder($initialDirectory)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    $foldername = New-Object System.Windows.Forms.FolderBrowserDialog
    $foldername.Description = "Select a folder"
    $foldername.rootfolder = "MyComputer"
    if ($foldername.ShowDialog() -eq "OK")
    {
        $folder += $foldername.SelectedPath
    }
    return $folder
}
$a = Get-Folder
$folder = $a
Get-ChildItem -Path $folder | SELECT Name, @{Name="Size In KB";Expression={$_.Length / 1Kb}}, Attributes, LastaccessTime, @{n='Owner';e={(get-acl $_.Fullname).Owner}} | Format-Table -AutoSize
Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";" -Append
As commented, using Format-Table -AutoSize simply outputs the info in a table format to the console. It returns nothing usable for export, so there is nothing to write to the CSV file.
Doing it like this will create the CSV file and write the info in there:
Get-ChildItem -Path $folder |
    Select-Object Name,
        @{Name="Size In KB";Expression={$_.Length / 1Kb}},
        Attributes, LastAccessTime,
        @{n='Owner';e={(Get-Acl $_.FullName).Owner}} |
    Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";"
This will not get you the info on screen. If you also want that, capture the result in a variable first:
$result = Get-ChildItem -Path $folder |
    Select-Object Name,
        @{Name="Size In KB";Expression={$_.Length / 1Kb}},
        Attributes, LastAccessTime,
        @{n='Owner';e={(Get-Acl $_.FullName).Owner}}
#output on screen
$result | Format-Table -AutoSize
# write the CSV file:
$result | Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";"
P.S. Judging by the title of this question, I think you only want info about files, not directories.
If that is the case, add the -File switch to the Get-ChildItem cmdlet (for PS 3 and up). For PS versions below 3, use:
Get-ChildItem -Path $folder | Where-Object { !$_.PSIsContainer }
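Plugged into the export above, that could look something like this (a sketch for older PowerShell versions):
Get-ChildItem -Path $folder |
    Where-Object { !$_.PSIsContainer } |
    Select-Object Name,
        @{Name="Size In KB";Expression={$_.Length / 1Kb}},
        Attributes, LastAccessTime,
        @{n='Owner';e={(Get-Acl $_.FullName).Owner}} |
    Export-Csv "C:\Users\DZimmermann\Desktop\Test.csv" -Delimiter ";" -NoTypeInformation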
I have a folder of spreadsheets whose names all start with "WEEK COMM" followed by a date, for example "WEEK COMM 24-12-2018". What I am looking to do is write a script that finds the latest file, makes a copy of it in the SAME folder, and changes the date in the filename by adding 7 days.
So far I have this, which successfully locates the most up-to-date file and copies it as "WEEK COMM TEST", but I am unable to find anything about adding a date to the filename.
Get-ChildItem -Path "\\DESKTOP-88SIUP6\Users\User\Desktop\Shared\STOCK ORDERS\2018" |
Sort-Object -Property CreationTime -Descending |
Select-Object -First 1 |
Copy-Item -Destination "\\DESKTOP-88SIUP6\Users\User\Desktop\Shared\STOCK ORDERS\2018\WEEK COMM TEST.xlsx" -Force
Could anyone please help me?
this will capture the string date from the file .BaseName property, convert it to a [datetime] object, add 7 days, convert that to a date string in your [backwards [grin]] format, and then replace the old date string with the new date string in the .FullName of the file.
that should make the rename process pretty direct & easy. [grin]
if you can, you would likely have a rather easier time if you switched to the more logical, properly sortable yyyy-MM-dd format. you may not be able to do so, but it's worth trying ...
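for instance, parsing the sample date & re-emitting it in that sortable form is a one-liner [a sketch, reusing the date from your file name] ...
[datetime]::ParseExact('24-12-2018', 'dd-MM-yyyy', $Null).ToString('yyyy-MM-dd')
# output = 2018-12-24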
$FileName = [System.IO.FileInfo]'WEEK COMM 24-12-2018.xlsx'
$FN_StringDate = $FileName.BaseName.Split(' ')[2]
$FN_Date = [datetime]::ParseExact($FN_StringDate, 'dd-MM-yyyy', $Null)
$NewFN_StringDate = $FN_Date.AddDays(7).ToString('dd-MM-yyyy')
$NewFileName = $FileName.FullName -replace $FN_StringDate, $NewFN_StringDate
$FileName.FullName
$NewFileName
output [old, then new] ...
D:\Data\Scripts\WEEK COMM 24-12-2018.xlsx
D:\Data\Scripts\WEEK COMM 31-12-2018.xlsx
edit to add a very specific example. it's untested since i haven't any such location ... that is why the -WhatIf is there. [grin]
$SourceDir = '\\DESKTOP-88SIUP6\Users\User\Desktop\Shared\STOCK ORDERS\2018'
Get-ChildItem -Path $SourceDir |
Sort-Object -Property CreationTime -Descending |
Select-Object -First 1 |
ForEach-Object {
$FN_StringDate = $_.BaseName.Split(' ')[2]
$FN_Date = [datetime]::ParseExact($FN_StringDate, 'dd-MM-yyyy', $Null)
$NewFN_StringDate = $FN_Date.AddDays(7).ToString('dd-MM-yyyy')
$NewFileName = $_.FullName -replace $FN_StringDate, $NewFN_StringDate
# remove the "-WhatIf" when you are ready to do this for real
Copy-Item -LiteralPath $_.FullName -Destination $NewFileName -WhatIf
}
In Excel, I'm creating a sub-folder in a directory and saving multiple CSV files from an Excel workbook there.
My problem is that I need to do this on a system where the list separator is a ','. The CSV files are then read on a system where the default list separator is a ';'. I cannot change this.
So I need to change the ',' in the CSV files into a ';'. My idea is to achieve this using PowerShell.
My first attempt was to change the delimiter of the CSV immediately after creating it in Excel by passing the file name to a script. I managed to change the delimiter for a given file, but I struggle to pass the path name to the script (no error, but also no change in the file):
Script Code:
param([string]$path)
$content = [IO.File]::ReadAllText($path) #readParameter
Import-CSV -Path $content -Delimiter ','|Export-CSV -Path C:\Users\Desktop\temp.csv -Delimiter ';' -NoTypeInformation #Export a CSV-File with ;
(Get-Content C:\Users\Desktop\temp.csv) | % {$_ -replace '"', ""} | out-file -FilePath C:\Users\Desktop\temp.csv -Force -Encoding ascii #remove " from file
Remove-Item -Path $content #remove old CSV-file
Rename-Item -Path C:\Users\Desktop\temp.csv -NewName $content #change file name
Excel Call:
Call Shell("powershell.exe -ExecutionPolicy Unrestricted -File C:\Users\Desktop\delimiterChange.ps1 -path """ & location & """", 1)
Thank You
If you want to use PowerShell, this is the quick and dirty way. Works like a charm.
$csv = Import-csv "C:\initial.csv"
$csv | Export-Csv "C:\converted.csv" -NoClobber -NoTypeInformation -Delimiter ";"
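If the machine running the conversion already uses the semicolon as its list separator, -UseCulture is an alternative to hard-coding -Delimiter (a sketch, assuming that regional setting):
$csv | Export-Csv "C:\converted.csv" -NoClobber -NoTypeInformation -UseCulture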
param([string]$path)
$content = [IO.File]::ReadAllText($path) #readParameter
Import-CSV -Path $content -Delimiter ','|Export-CSV -Path C:\Users\Desktop\temp.csv -Delimiter ';' -NoTypeInformation
Your code reads the content of the CSV (assuming that the path to the CSV is passed via the parameter -Path) and tries to pass that as the path to Import-Csv. Change the above to this:
param([string]$path)
Import-CSV -Path $path |
Export-CSV -Path C:\Users\Desktop\temp.csv -Delimiter ';' -NoType
You can even replace the content of the file if you run Import-Csv in an expression:
(Import-Csv -Path $path) |
Export-Csv -Path $path -Delimiter ';' -NoType
I'd recommend keeping the double quotes, but if you must remove them you can do that in the pipeline like this:
(Import-Csv -Path $path) |
ConvertTo-Csv -Delimiter ';' -NoType |
% { $_ -replace '"', '' } |
Set-Content -Path $path
In Control Panel > Regional Settings > Additional Settings, set the List Separator to the semicolon.
Then, in Excel, Save As CSV.
I would take a different approach.
In a Macro enabled Excel workbook:
1) Create a routine which will import a semi-colon-delimited file. It should take a filename as a parameter and return a workbook.
2) Create a routine which will export a workbook as a CSV. It should take a workbook as a parameter and a file name as a parameter, then export and close the workbook.
3) Create a routine which reads a file list from the directory and then runs 1) and 2) on each file.
Additionally, I would not name the semi-colon delimited files CSV if you have any control over the original file names. By definition, CSV means Comma Separated Values. Name them something else. Then your routine only has to find the semi-colon files and can skip the CSVs because those have already been converted to comma separated.