PowerShell: Search Outlook Email By Title and Extract Most Recent Excel (.xls) File

I am an intern at a testing organisation.
I am trying to automate a very long task, and I'm using PowerShell to do most of the work.
Task:
We have a corporate email and we receive a LOT of emails during the day. Of course, we have rules set up to make our lives a little more bearable.
Every day there are specific emails being sent to a folder "XYZ" and I want to search for the most recent email using the following criteria:
- Email Title
- Latest Email which contains the search string
Every one of those emails contains an Excel file. If the subject matches the search criteria, I want to download the latest attachment, unless there is a way to open and parse the file without downloading it.
I'm super new to PowerShell, but I have a programming background, so don't feel you have to oversimplify.
Best regards,
Alex

You'll need to do most of this yourself, but this is code from a similar script I have. I've broken it down to make it a bit more readable; it should hopefully get you started.
#Params
$Account = "Mailbox.Searchme@contoso.com"
$Folder = "Inbox"
$SubjMatch = "Reports"
#Create Outlook COM object to search folders
$Outlook = New-Object -ComObject Outlook.Application
$OutlookNS = $Outlook.GetNamespace("MAPI")
#Get all emails from specific account and folder
$AllEmails = $OutlookNS.Folders.Item($Account).Folders.Item($Folder).Items
#Filter to emails with attachments and a specific subject line (-match uses RegEx)
$ReportsEmails = $AllEmails | ? { ($_.Subject -match $SubjMatch) -and ($_.Attachments.Count -gt 0) }
#Grab the most recently received email
$LatestReportEmail = $ReportsEmails | Sort ReceivedTime | Select -Last 1
#Get the Excel file(s) and save them (.xls or .xlsx)
$LatestReportEmail.Attachments | ? {$_.FileName -match "\.xlsx?$"} | % {
    $_.SaveAsFile("C:\path\to\$($_.FileName)")
}
#Quit Outlook COM object
$Outlook.Quit()
You should have Outlook closed before you try and run this. Also, this can be EXTREMELY slow on big folders (mostly the filter part, for some reason). Good luck.
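If that in-PowerShell filter turns out to be the slow part, one option is to push the subject and attachment tests down to Outlook itself with the Items.Restrict method instead of touching every item. This is only a rough, untested sketch against the standard Outlook object model, reusing the $OutlookNS/$Account/$Folder/$SubjMatch setup from the snippet above; the DASL filter string and the save path are assumptions to adapt.
#Restrict the folder to matching items before filtering anything in PowerShell
$Items = $OutlookNS.Folders.Item($Account).Folders.Item($Folder).Items
$dasl = "@SQL=""urn:schemas:httpmail:subject"" LIKE '%$SubjMatch%' AND ""urn:schemas:httpmail:hasattachment"" = 1"
$Filtered = $Items.Restrict($dasl)
#Sort newest first and take the first item
$Filtered.Sort("[ReceivedTime]", $true)
$Latest = $Filtered.GetFirst()
if ($Latest) {
    $Latest.Attachments | ? {$_.FileName -match "\.xlsx?$"} | % {
        $_.SaveAsFile("C:\path\to\$($_.FileName)")
    }
}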

Related

PowerShell Dynamic Creation of ADUsers from imported Excelsheet

First of all, I want to say that I'm very, very new to PowerShell. These are the first PS scripts I've written.
I'm currently working on a PS script for AD administration. The scripts for adding/deleting SMB shares, adding or removing users from groups, and so on are already done.
I already had a working script for creating the users in AD, but it wasn't dynamic: all the variables were hard-coded and had to be fed into a New-ADUser command. Since the code will be used for more than one specific set of parameters, it has to be dynamic.
I'm working with Import-Excel and found a great function here, but I'm having two problems with this function.
$sb = {
    param($propertyNames, $record)
    $propertyNames | ForEach-Object -Begin { $h = @{} } -Process {
        if ($null -ne $record.$_) { $h[$_] = $record.$_ }
    } -End { New-ADUser @h -Verbose }
}
Use-ExcelData -Path $Path -HeaderRow 1 -ScriptBlock $sb
The dynamic part of this is that the table headers are used as the parameter names for New-ADUser. The only thing one needs to change, if the number of parameters changes, is to add or delete a column in the Excel sheet. The column header always needs the same name as the New-ADUser parameter.
Screenshot of excel table
My problem now is the "Type" header I've got in column A. It is needed to specify the type of user, so the user can be added to specific AD groups. But because the function above uses all headers as parameters, this doesn't work.
Has anyone an idea how to change the function $sb so that it starts with the second column? I've tried around with skip 1 and a lot of other workarounds, but with my lack of experience nothing seemed to come close to what I need.
SOLVED PROBLEM BELOW: added -DataOnly to Use-ExcelData and now it works.
The second problem is that the function does not stop trying to create users once there are no more values for the parameters. To experiment, I deleted the column "Type". When creating the two users testuser and testuser2, PowerShell creates the users with no problems but then asks for a name for another New-ADUser.
VERBOSE: Performing the operation "New" on target "CN=Test User,CN=Users,DC=****,DC=**".
VERBOSE: Performing the operation "New" on target "CN=Test2 User2,CN=Users,DC=****,DC=**".
cmdlet New-ADUser at command pipeline position 1
Supply values for the following parameters:
Name:
Thank you in advance, sorry for my english and please tell me if I did something wrong forumwise.
I would save the Excel sheet as a CSV file and then import it. It's faster and easier to consume. The headers become your parameter names and the import behaves like any other object.
$csvData = Import-csv -path <path to csv file>
From here, iterate the rows and access the values as properties of the row. No need to import the data into a hashtable, it's already accessible with property names defined by the header row.
foreach ($row in $csvData) {
    Write-Host $row.Name
    Write-Host $row.Path
}
Once the loop reaches the end of the file, it stops trying to create users.
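To tie that together with the splatting idea from the question, here is a minimal sketch, assuming the sheet is exported as users.csv; the file path, the group names, and every column name other than "Type" are placeholders. Each non-empty header except "Type" is passed straight to New-ADUser.
$csvData = Import-Csv -Path "C:\path\to\users.csv"
foreach ($row in $csvData) {
    # Build a splatting hashtable from the row, skipping the Type column and empty cells
    $params = @{}
    foreach ($prop in $row.PSObject.Properties) {
        if ($prop.Name -ne "Type" -and -not [string]::IsNullOrWhiteSpace($prop.Value)) {
            $params[$prop.Name] = $prop.Value
        }
    }
    New-ADUser @params -Verbose
    # Use the Type column separately, e.g. to choose group membership (group name is made up here)
    if ($row.Type -eq "Intern") {
        Add-ADGroupMember -Identity "Interns" -Members $params["SamAccountName"]
    }
}
The loop ends when the CSV rows run out, so no extra New-ADUser prompt appears.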
FYI, the use of single-letter variables is going to make your code very difficult to maintain. My eyes hurt just looking at it.

Update links in Excel files

I am trying to modify some Excel files that link to other Excel files.
The files that are linked to are in sub-directories. I am going to move all the files to a root directory and then run a script to change the links.
I am able to find the links within each file, but I am unable to modify them (see below).
Any ideas?
Thanks
P
#create the Excel COM object
$excel = New-Object -ComObject Excel.Application
#get all the excel files in the directory
Get-ChildItem $sourceDir -Filter *.xl* |
Foreach-Object {
    write-host -ForegroundColor Yellow $_.FullName
    $workbook = $excel.Workbooks.Open($_.FullName)
    foreach ($link in $workbook.LinkSources(1))
    {
        write-host $link.Address
        #this gives me .... C:\temp\files\childfile1.xlsx etc
        # $link.Address seems to be read only?
        #$link.Address = $newLink
        #this doesn't seem to work either ...
        #$workbook.ChangeLink($link,$newlink,1)
    }
    #$workbook.Save()
    $workbook.Close()
}
If you are modifying XLSX files, you can update the link without using Excel. Ultimately these are zip archives with a different extension. If you create a copy of the file with a ZIP extension, you can use Expand-Archive to access the various files and update them accordingly, then Compress-Archive to generate a new Excel file.
In the archive, look for xl/workbook.xml, which identifies the sheets by name. xl/_rels/workbook.xml.rels can be used to translate from a sheetId to the worksheet (Target), a file under xl/worksheets (e.g. "worksheets/sheet3.xml"), and from that you can infer the relationship file, which will be under xl/worksheets/_rels (e.g. xl/worksheets/_rels/sheet3.xml.rels).
Using the worksheet you can find the associated hyperlink based on the cell reference, using the ref attribute, which gives you the r:id attribute. You can use this value to look up the appropriate Relationship by Id. You would then need to update the Target appropriately.
Of course, if you know your original link, and it is unique (or you are altering them all the same way), you could do a search and replace across the .rels files.
Once you've saved your changes, you just need to create the new file, which you can do using Compress-Archive. You'll need to do this to a file with a .zip extension, then rename it.
Here is an example based on an XLSX with a link to Yahoo on the first sheet (note: the XML file names normally match the sheet names for the first few sheets, until sheets are renamed or reordered; don't count on that for production).
copy-item c:\temp\links.xlsx c:\temp\links.zip
expand-archive c:\temp\links.zip c:\temp\links_zip
$content = get-content c:\temp\links_zip\xl\worksheets\_rels\sheet1.xml.rels -raw # allow file to close
$content | %{ $_ -replace 'http://www.yahoo.com','http://www.google.com'} | set-content c:\temp\links_zip\xl\worksheets\_rels\sheet1.xml.rels
compress-archive c:\temp\links_zip\* c:\temp\links_alt.zip
remove-item c:\temp\links_zip -Recurse
rename-item c:\temp\links_alt.zip c:\temp\links_alt.xlsx
# the modified workbook is now c:\temp\links_alt.xlsx
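If staying inside Excel is preferable, the COM route from the question can also work: Workbook.LinkSources(1) returns the link addresses as plain strings (which is why there is no settable .Address property), and Workbook.ChangeLink swaps one address for another. A rough, untested sketch, assuming $sourceDir from the question and a $rootDir variable for the new location:
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $false
Get-ChildItem $sourceDir -Filter *.xl* | ForEach-Object {
    $workbook = $excel.Workbooks.Open($_.FullName)
    $links = $workbook.LinkSources(1)   # 1 = xlExcelLinks; returns strings, or $null if none
    if ($links) {
        foreach ($link in $links) {
            # Re-point each link at the same file name in the root directory
            $newLink = Join-Path $rootDir (Split-Path $link -Leaf)
            $workbook.ChangeLink($link, $newLink, 1)   # 1 = xlLinkTypeExcelLinks
        }
    }
    $workbook.Save()
    $workbook.Close()
}
$excel.Quit()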

Issue with Powershell Excel attempting to cast .csv Values

At this point, I believe it may be a file I/O issue.
While using a PowerShell script that invokes Excel methods to go through a .csv file from a website, PowerShell is attempting to cast the placeholder shown for data that is too wide for a cell ("#######") instead of the date and time actually contained within the 'cell' (search engines may need 'pound sign' or 'hashtag' to reach this result).
Below is the offending portion of the script.
[DateTime]$S = $sheet.Cells.Item($rowS+$i,$colS).text
[DateTime]$G = $sheet.Cells.Item($rowG+$i,$colG).text
[DateTime]$A = $sheet.Cells.Item($rowA+$i,$colSWScan).text
The data should exist as MM/DD/YYYY HH:MM, but is being read by PowerShell/the PSExcel module as #######, which is also what is displayed in the Excel GUI when opening the file.
This is only a portion of what the entire script does. Any suggestions on how to resolve the error while continuing to use the PSExcel module would be most helpful.
Stack Overflow seems to have an issue with me posting the verbose error message, and this is my first post. Let me know if that would be helpful for troubleshooting.
Edit for comment #1:
# Create an instance of Excel.Application and Open Excel file
$objExcel = New-Object -ComObject Excel.Application
# Open the file
$workbook = $objExcel.Workbooks.Open($file)
# Activate the first worksheet
$sheet = $workbook.Worksheets.Item($sheetName)
$objExcel.Visible=$false
After getting my head out of 'Excelland', I realized it may be easier to rewrite the script to work with the .csv layout directly (the file the script originally imported was a .xlsx), but I am admittedly unfamiliar with .csv scripting. However, the original question still stands while I rewrite the code, as I may need to switch back to .xlsx documents. Thank you for the suggestion, J E Carter II.
Answer:
$objExcel.Cells.EntireColumn.AutoFit()
Credit to J E Carter II
When you open an Excel file as a COM object under Windows, you're launching Excel.
You might be able to add the following commands to your excel object handle to get the data to represent correctly.
$objExcel.Cells.Select()
$objExcel.Cells.EntireColumn.AutoFit()
Do this before getting values from cells. If that doesn't work, let me know and I can find some csv handling examples.
Those might work better on the $workbook object. I can't remember which is implicit when recording a macro (which is how I got those).
It's also possible you may need to precede those two lines with something like
$workbook.Sheets("sheetname").Select()
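If the rewrite against the .csv goes ahead, here is a minimal sketch of that route, assuming the file has a header row; the file path and the column names (ScanTime, Hostname) are placeholders:
# Import the CSV directly; no Excel rendering is involved, so the ####### placeholder never appears
$rows = Import-Csv -Path "C:\path\to\data.csv"
foreach ($row in $rows) {
    # Parse the MM/DD/YYYY HH:MM text into a real [datetime]
    $scanTime = [datetime]::ParseExact($row.ScanTime, "MM/dd/yyyy HH:mm", $null)
    Write-Host "$($row.Hostname) scanned at $scanTime"
}
If the script has to stay on .xlsx, reading Cells.Item(...).Value2 instead of .Text should also sidestep the problem: .Text returns whatever Excel displays (including #######), while Value2 returns the underlying value ([datetime]::FromOADate() converts the serial number it returns for dates).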

Why is perl saving two copies of Excel spreadsheet?

This is similar to A copy of Excel Addin is created in My Documents after saving, except that I'm working with Perl instead of VBA, and xls files instead of xlsm, and the negative impact of the behavior is different.
I've inherited a Perl script (Perl 5.8.8) that is running on Windows 2003 Server as SYSTEM. After copying an Excel 2003 template file to a unique, fully defined path location, it opens the unique file in Excel using OLE, edits the file, saves the file, and closes the file. What results is the edited file being saved both in the correct, fully-defined path location, and also in the Default User profile's Documents folder.
This causes thousands of these files to accumulate on the C: drive, as every new admin to be hired gets a copy in his Documents folder.
Adding the code that sets the value of $OUT:
if (!$db->Sql("EXEC GetDetails 'name'"))
{
    while ($db->FetchRow()>0)
    {
        @DataIn = $db->Data();
        $name = $DataIn[0];
        $IN   = $DataIn[1];
        $OUT  = $DataIn[2];
        opendir(DIR,"$OUT") || die "$OUT directory does not exist $!\n";
        # ... loop of proprietary code
        # ...
        @Completed = $db1->Data();
        # ...
        &formatExcelReport;   # The code that I previously posted
        # ...
        # more proprietary code
        # end of loop
    } # end of while
} # end of if
The code I originally posted:
# Initialize Excel object
eval {Win32::OLE->new('Excel.Application', 'Quit')};
eval {$Excel = Win32::OLE->GetActiveObject('Excel.Application')};
unless (defined $Excel)
{
$Excel = Win32::OLE->GetActiveObject('Excel.Application')
|| Win32::OLE->new('Excel.Application', 'Quit');
}
$infiles = "Report_Template.xls";
$infiles = $OUT."/".$infiles;
$db6->Sql("EXEC FormatResults '".$Completed[0]."','".$Completed[1]."'");
$row = 2;
$fileName = $Completed[0]."_".$Completed[1];
$uniquefile = $fileName.$printdate.".xls";
# $OUT is a fully defined path on the E: drive
$reportfile = "$OUT"."\\".$uniquefile;
copy($infiles,$reportfile);
$Book = $Excel->Workbooks->Open("$reportfile");
$sheetnum = 1;
my $Sheet = $Book->Worksheets($sheetnum);
# Set Headers
$Header = $Sheet->PageSetup->{'CenterHeader'};
$Header = $Header." Results Test Code: ".$Completed[0]." Worksheet: ".$Completed[1]." Date: ".$headerdate;
$Sheet->PageSetup->{'CenterHeader'}= $Header;
# More file editing
# ...
$Book->Save();
$Book->Close(0);
Win32::OLE->new('Excel.Application', 'Quit');
Is the root of this problem the Save() command? Should I be using SaveAs() instead?
Any other feedback about how Excel is being used welcome, as well.
Thanks!
I don't see what causes this behavior, but here are a few things to try.
The template and the file it is copied to have names
$infiles = $OUT."/".$infiles;
$reportfile = "$OUT"."\\".$uniquefile;
Use the same separator.
Try to suppress a possible setting dictating that another copy be made. Perhaps
$Excel->Application->{CreateBackup} = 0;
However, this may not be the correct property -- search the VB or Excel documentation for properties that may result in Excel saving an extra copy. (It needn't be "backup".)
Try creating a new file and using SaveAs, as a test to see whether you still get two files. The template copying may be triggering the save of an extra copy (even though I don't see how). I'd say it's either that, or some general setting that needs to be turned off.
The rest is the original post, about using SaveAs, where I thought a new file was being created.
You would use SaveAs to write a new file. See saveas in MSDN library
Saves changes to the workbook in a different file.
Using the Save method may result in saving two files for some reason, as noted in the answer by Borodin. This page also advises using SaveAs for a new file:
The first time you save a workbook, use the SaveAs method to specify a name for the file.
Once you change to using SaveAs there should be a confirmation dialog to deal with. If you want to suppress that you can set a property, with one (or either?) of
$Excel->Application->{DisplayAlerts} = 0;
# or
$Excel->{DisplayAlerts} = 0;
For a number of options, including backups for example, see the Chapter on OLE automation in PERL in a Nutshell.
A note on some other resources. There is a cookbook of sorts in this post on perlmonks. A listing of various operations is given in this SO post.
Finally, I don't know how deep the reasons for using OLE are but if it is only about writing some Excel files there are other modules. For example the very well regarded Spreadsheet::WriteExcel and Excel-Writer-XLSX.
That's very strange Perl code. eval without checking $@ afterwards is just wrong -- you need to know whether a step of your code has failed for the following steps to make sense.
It looks like the problem is in your call to copy($infiles, $reportfile). That will save one copy of the file, while $Book->Save and $Book->Close will save another

Perl: Read email and use message content as stdin for an excel update

I am looking for some help with starting a Perl script. I'm relatively inexperienced with Perl, so help would be appreciated :)
I want to start a project to write a script that helps me keep up to date with the hours I have been working. Basically, the script would e-mail me (automated using cron) each day reminding me to send my hours, then I would reply with a message like
"03/02/14 7.30 18.30"
The script would then read the data and update an Excel spreadsheet keeping a log of hours.
I know how to do everything except having the script read an e-mail. I have been researching MIME::* and Mail::*, but I'm not entirely sure which package would be best or how to go about it.
As @mpapec suggested, you could read email using IMAP or a local mailbox on a Linux box.
On Windows you could use OLE and read emails in Outlook: Perl: Win32::OLE and Microsoft Outlook - Iterating through email attachments efficiently
You could read emails on exchange in this way: http://metacpan.org/pod/Email::Folder::Exchange
If I were you I would use IMAP to access emails. It is platform independent and not too hard to use (I used it in the past and it was reliable).
http://metacpan.org/pod/Net::IMAP::Client
my $imap = Net::IMAP::Client->new(
    server => 'mail.you.com',
    user   => 'USERID',
    pass   => 'PASSWORD');
# select folder
$imap->select('INBOX');
# newest first
my $messages = $imap->search({
    FROM    => 'you',
    SUBJECT => 'your email subject',
}, [ '^DATE' ]);
# fetch full message (newest)
my $data = $imap->get_rfc822_body($messages->[0]);
# process
store_data_in_excel($data);
# move to archive
$imap->copy([$messages->[0]], 'Archive');
$imap->add_flags(\@msg_ids, '\\Deleted');
$imap->expunge;
