Create a series of PowerShell variables from Excel to execute in Robocopy

Using the PowerShell module ImportExcel, I have created the variables and the job runs, but with a few caveats. The data from the source is dumped into the root of the destination (which is expected given the variables I've created, and is one of the areas where I'm asking for help).
$dataJob01 = Import-Excel "C:\temp\test.xlsx"
ForEach ($project in $dataJob01){
$source = $($dataJob01.Source)
$destination = $($dataJob01.Destination)
robocopy $source $destination /e /copy:DATU /dcopy:DAT /log:C:\temp\log.txt
}
Is there a way to append the "Folder Name" to the destination variable, making $destination = \\server2\share\PROJECTS\304401?
Until the issue in problem 1 is solved, problem 2 is irrelevant, but ideally I want the job to process hundreds of lines like the Excel snippet above. Unfortunately, that is failing with the code I've written: it's as if the sources and destinations are being bunched together instead of processed line by line. I thought ForEach was appropriate, but I'm a noob.
When I run this against multiple lines, I receive no output and the job fails.
Hopefully I'm not going about this the wrong way and someone has some pointers for me.

Is there a way to append the "Folder Name" to the destination variable? Making the $destination = \\server2\share\PROJECTS\304401
Here you can use Join-Path, which works perfectly for what you need:
$destination = Join-Path -Path $project.Destination -ChildPath $project.'Folder Name'
I've also noticed you're using $dataJob01, the array you're iterating over, inside your foreach loop instead of the loop variable. Try this instead (I've renamed the variables to make it clearer; $line is exactly that: one line of the spreadsheet).
$xlsx = Import-Excel "C:\temp\test.xlsx"
ForEach ($line in $xlsx)
{
    $source = $line.Source
    $destination = Join-Path -Path $line.Destination -ChildPath $line.'Folder Name'
    robocopy $source $destination /e /copy:DATU /dcopy:DAT /log:C:\temp\log.txt
}
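One more hedged note on the "no output" symptom when you run this against hundreds of rows: /log: makes robocopy overwrite the log file on every call, so only the last row's results survive. A minimal sketch of the same loop, assuming the same Source, Destination and Folder Name columns, that appends to the log and echoes each pair as it goes:
$xlsx = Import-Excel "C:\temp\test.xlsx"
ForEach ($line in $xlsx)
{
    $source = $line.Source
    $destination = Join-Path -Path $line.Destination -ChildPath $line.'Folder Name'
    Write-Host "Copying $source -> $destination"
    # /log+: appends instead of overwriting; /tee mirrors robocopy output to the console as well
    robocopy $source $destination /e /copy:DATU /dcopy:DAT /log+:C:\temp\log.txt /tee
}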

Related

New-item "Illegal Characters in path" when I use a Variable that contains a here string

foreach ($Target in $TargetUSBs)
{
$LogPath= @"
$SourceUSB\$(((Get-CimInstance -ClassName Win32_volume)|where {$_.DriveType -eq "2" -and $_.DriveLetter -eq $Target}).SerialNumber)_
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Year)$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Month)
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Day)_$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Hour)
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Minute)$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Second).txt
"#
$LogPath = $LogPath.Replace("`n","").Trim()
New-item -Path "$LogPath"
}
The irony is that when I copy and paste the contents of my variable and manually run New-Item -Path with the pasted contents, it works, but when I use the variable it does not...
Brief summary of my goal: I am taking a USB drive labelled ORIGINAL, obtaining the S/N of every USB drive plugged in at the time, and creating a separate log file for each, with a title of the form SERIALNUMBER_DATE_TIME.txt; these files are created on the ORIGINAL USB.
$LogPath contains for example the following: E:\Mattel\1949721369_2018912_93427.txt
Yet when I use the Variable in New-item it indicates "Illegal characters in Path"
FYI $LogPath is a System.String not an object
$TargetUSBs is filled with all USB drives plugged into the system
This method of using a variable for a path usually works fine for me; the only difference is the here-string I used this time around. Does that cause my problem? I hope not, because I really don't want to fill that variable all on one line. New-Item's help shows <String[]> for the -Path parameter; does this mean I have to use a string array, and if so, how do I convert mine to make this work?
Your problem is that Windows uses CRLF line endings (Unix uses only LF), so you still have CR characters in your path.
To fix this just use:
.Replace("`r`n","")
However you can easily simplify your code so you do not require the messy here-string or replace/trim...
By using a single Get-Date call you can format it to your desired output. This means you can just build the Path as a simple string and involves much less code:
foreach ($Target in $TargetUSBs)
{
    $SerialNumber = Get-CimInstance -ClassName Win32_volume | where {$_.DriveType -eq "2" -and $_.DriveLetter -eq $Target} | Select-Object -ExpandProperty SerialNumber
    $DateTime = Get-Date -Format "yyyyMd_Hms"
    # ${} delimits the variable names so the underscore is not swallowed into a variable called $SerialNumber_
    New-Item -Path "$SourceUSB\${SerialNumber}_${DateTime}.txt"
}
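For reference, that format string reproduces the un-padded pattern from the question; a quick check against an arbitrary date:
Get-Date -Date '2018-09-12 09:34:27' -Format "yyyyMd_Hms"   # returns 2018912_93427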

Rename multiple files with string from .txt file using PowerShell

I'm currently working on a program that takes a .xml file, reads it into an Oracle database and afterwards exports a new .xml file. The problem is that the new file has to have the exact same name as the original file.
I saved the original filenames into a .txt file and I'm now trying to search for a keyword inside the lines to rename the right files with the correct names from the .txt file. Here is an example:
My 4 files (exported from the Database):
PM.Data_information.xml
PM.Data_location.xml
PM.Cover_quality.xml
PM.Cover_adress.xml
Content of Namefile.txt (original names):
PM.Data_information_provide_SE-R-SO_V0220_657400509_3_210.xml
PM.Data_location_provide_SE-R-SO_V0220_9191200509_3_209.xml
PM.Cover_quality_provide_SE-R-SO_V0220_354123509_3_211.xml
PM.Cover_adress_provide_SE-R-SO_V0220_521400509_3_212.xml
I have only worked out how to get a line by selecting its line number:
$content = Get-Content C:\Namefile.txt
$informationname = $content[0]
Rename-Item PM.Data_information.xml -NewName $informationname
Isn't there a way to select that line by searching for the keyword inside the string?
$content = Get-Content C:\temp\ps\NewFile.txt
$files = Get-ChildItem c:\temp\ps\
$content |
ForEach-Object {
    $currentLine = $_
    $file = $files | Where-Object { $currentLine.StartsWith($_.Name.Replace(".xml", "")) }
    Rename-Item $file.FullName $currentLine
}
This code should do the trick. Note you will need to have all of your files that need renaming in one folder. Set the folder path to the $files variable (currently set to c:\temp\ps). Set the path where your NewFile.txt is to the $content path.
The code works by looping over each line in NewFile.txt and finding any file whose name matches the start of the line (if there are any files that do not follow this pattern you will obviously need to update the code, but hopefully this gives you a good starting point).
other solution ;)
gci -Path "c:\temp" -File -Filter "*.xml" | % { rni $_.fullname (sls "C:\temp\Namefile.txt" -Pattern ([System.IO.Path]::GetFileNameWithoutExtension($_.fullname))).Line }
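For readability, here is the same one-liner with the aliases expanded (gci = Get-ChildItem, % = ForEach-Object, sls = Select-String, rni = Rename-Item), as a sketch using the same paths as above:
Get-ChildItem -Path "c:\temp" -File -Filter "*.xml" | ForEach-Object {
    $newName = (Select-String -Path "C:\temp\Namefile.txt" -Pattern ([System.IO.Path]::GetFileNameWithoutExtension($_.FullName))).Line
    Rename-Item -Path $_.FullName -NewName $newName
}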

Creating Powershell script to loop through excel file and create folders

I'm new to PowerShell and looking at writing a solution to loop through an Excel file and create a folder for each row in the file.
At the moment I have the following:
$objExcel = new-object -comobject excel.application
$objExcel.Visible = $True
$ExcelFilesLocation = "D:\Users\"
$UserWorkBook = $objExcel.Workbooks.Open($ExcelFilesLocation + "ProjectCodes.xlsx")
$UserWorksheet = $UserWorkBook.Worksheets.Item(1)
$intRow = 2
Do {
$Projectcode = $UserWorksheet.Cells.Item($intRow, 1).Value()
$pos = $userLogOnEmail.IndexOf("#")
$intRow++
} While ($UserWorksheet.Cells.Item($intRow,1).Value() -ne $null)
$objExcel.Quit()
$a = Release-Ref($UserWorksheet)
$a = Release-Ref($UserWorkBook)
$a = Release-Ref($objExcel)
The idea is to loop through the project code column for each row. Then create a folder that is named for the project code.
Having spent painful hours wrangling Excel COM objects with PowerShell, my advice is to give up! Office COM objects are confusing, unwieldy and slow.
Replace the technology you use to access Excel in PowerShell with something better. For example, this module: https://github.com/dfinke/ImportExcel. You can use Install-Module ImportExcel -Scope CurrentUser if you're using PS 5.
Apart from being easier to use and much faster it doesn't require Excel to be installed, making your final script more portable.
Admittedly, you don't get full Excel functionality with this module, but since you seem to be doing no more than reading cells, you should be fine.
Alternatively, save your Excel file as CSV and use Import-Csv instead.
To create a new directory, use New-Item
For example, assuming $Projectcode is a string containing a valid path:
New-Item -Path $Projectcode -Type Directory
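Putting it together with ImportExcel, a minimal sketch; it assumes the column header in ProjectCodes.xlsx is ProjectCode and that the folders should be created under a root such as D:\Projects (both are placeholders to adjust):
$rows = Import-Excel "D:\Users\ProjectCodes.xlsx"
foreach ($row in $rows)
{
    # build the folder path from the project code and skip it if it already exists
    $folder = Join-Path "D:\Projects" $row.ProjectCode
    if (-not (Test-Path $folder))
    {
        New-Item -Path $folder -ItemType Directory | Out-Null
    }
}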

Powershell - Pulling string from txt, splitting it, then concatenating it for archive

I have an application where I get a list of new/modified files from git status, take the incomplete paths from that file, concatenate them with the root directory path, and then move those files to an archive. I have it half working, but the way I am running PowerShell does not produce error reports, and the process is obviously erroring out. Here is the code I am trying to use (it has gone through several iterations, please excuse the commented-out portions).
Basically I am trying to Get-Content from the txt file, replace / with \ (for some reason the process that creates the txt loves forward slashes...), then split that string at the spaces. The only part of the string I am interested in is the last part, which I am trying to concatenate with the known working root directory, then move those files to an archive location. Before you ask, this is something we are not willing to track in git due to the nature of the files (they are test outputs that are time stamped; we want to save them on a per-test-run basis, not in git). I am still fairly new to PowerShell and have been banging my head against this rock for far too long.
Get-Content $outfile | Foreach-Object
{
#$_.Replace("/","\")
#$lineSplit = $_.Split(' ')
$_.Split(" ")
$filePath = "$repo_dir\$_[-1]"
$filePath.Replace('/','\')
"File Path Created: $filePath"
$untrackedLegacyTestFiles += $filePath
}
Get-Content $untrackedLegacyTestFiles | Foreach-Object
{
Copy-Item $_ $target_root -force
"Copying File: $_ to $target_root"
}
}
The $outfile is a text file where each line has a partial file path leading to a txt file generated by a test application we use. This info is provided by git, so it looks like this in the $outfile txt file:
!! Some/File/Path/Doc.txt
The "!!" mean git sees it as a new file, however it could be several characters from a " M" to "??". Which is why I am trying to split it on the spaces and take only the last element.
My desired output would be to take the the last element of the split string from the $outfile (Some/File/Path/Doc.txt) and concatenate it with the $repo_dir to form a complete file path, then move the Doc.txt to an archive location ($target_root).
To combine a path in PowerShell, you should use the Join-Path cmdlet. To extract the path from your string, you can use a regex:
$extractedPath = [regex]::Match('!! Some/File/Path/Doc.txt', '.*\s(.+)$').Groups[1].Value
$filePath = Join-Path $repo_dir $extractedPath
The Join-Path cmdlet will also convert all forward slashes to backslashes, so no need to replace them :-).
Your whole script could look like this:
Get-Content $outfile | Foreach-Object {
$path = Join-Path $repo_dir ([regex]::Match($_, '.*\s(.+)$').Groups[1].Value)
Copy-Item $path $target_root -force
}
If you don't like to use regex in your code, you can also extract the path using:
$extractedPath = '!! Some/File/Path/Doc.txt' -split ' ' | select -Last 1
or
$extractedPath = ('!! Some/File/Path/Doc.txt' -split ' ')[-1]
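If $outfile was produced with git status --porcelain (or -s), every line is the two-character status code plus a space followed by the path, so another hedged option (a sketch under that assumption) is simply to drop the first three characters of each line:
Get-Content $outfile | ForEach-Object {
    # lines look like "XY <path>", so the path starts at index 3
    $path = Join-Path $repo_dir $_.Substring(3)
    Copy-Item $path $target_root -Force
}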

Checking file names in a directory with entries in an excel spreadsheet; What am I doing wrong?

I'm attempting to write a PowerShell script (my first ever, so be gentle) to go through all the file names in a directory and check if they exist in an excel spreadsheet that I have. If a file name does exist in both, I want to move/copy that file to a new directory.
Right now it runs with no errors, but nothing actually happens.
So far I have:
#open excel sheet
$objexcel=new-object -com excel.application
$workbook=$objexcel.workbooks.open("<spreadsheet location>")
#use Sheet2
$worksheet = $workbook.sheets.Item(2)
#outer loop: loop through each file in directory
foreach ($_file in (get-childitem -path "<directory to search>"))
{
$filename = [system.IO.path]::GetFileNameWithoutExtension($_)
#inner loop: check with every entry in excel sheet (if is equal)
$intRowCount = ($worksheet.UsedRange.Rows).count
for ($intRow = 2 ; $intRow -le $intRowCount ; $intRow++)
{
$excelname = $worksheet.cells.item($intRow,1).value2
if ($excelname -eq $filename)
{ #move to separate folder
Copy-Item -path $_file -Destination "<directory for files to be copied to>"
}
#else do nothing
}
}
#close excel sheet
$workbook.close()
$objexcel.quit()
You're trying to define $filename based on the current object ($_), but that variable isn't populated in a foreach loop:
$filename = [system.IO.path]::GetFileNameWithoutExtension($_)
Because of that $filename is always $null and therefore never equal to $excelname.
Replace the foreach loop with a ForEach-Object loop if you want to use $_. I'd also recommend reading the Excel cell values into an array outside that loop. That improves performance and allows you to use the array in a -contains filter, which removes the need for an inner loop in the first place.
$intRowCount = ($worksheet.UsedRange.Rows).count
$excelnames = for ($intRow = 2; $intRow -le $intRowCount; $intRow++) {
$worksheet.cells.item($intRow,1).value2
}
Get-ChildItem -Path "<directory to search>" |
Where-Object { $excelnames -contains $_.BaseName } |
Copy-Item -Destination "<directory for files to be copied to>"
On a more general note: you shouldn't use variable names starting with an underscore. They're too easily confused with properties of the current object variable ($_name vs. $_.name).
