I'm trying to build some automation into Outlook. I receive a series of emails throughout the day, around 75. Each of these emails contains a particular string that I need to look up on an Excel sheet, and I then forward the email based on the retrieved result. I've researched several methods and they all seem clunky or problematic to me, so I was hoping to get feedback on best practices to accomplish this.
So far, I've investigated:
Opening a background Excel workbook hosted on a network drive, performing the VLOOKUP, and closing the document. However, the network can sometimes bog down the opening of the file, and having Outlook intermittently freeze is unacceptable.
Opening a background Excel workbook hosted on a network drive and leaving the file open in the background, to be used when needed. My issue is that I'm having a hard time keeping the reference to the document, as Outlook likes to "forget" it was there.
Converting the file to an Access database (I have very limited experience here) in the hope that Access has better tools for quickly querying the data.
Again, I'm looking for advice on best practices.
Maybe you can copy the Excel file to your computer every day and then do the lookups on that.
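Something like this untested sketch (Outlook VBA) could do the copy and the lookup; the paths, sheet name and column layout are placeholders, and pulling the key string out of the message body is left to you:

' Untested sketch (Outlook VBA): refresh a local copy of the lookup workbook
' at most once a day, then look up the key and forward the mail.
' NET_PATH, LOCAL_PATH, the sheet name and the column layout are placeholders.
Sub ForwardBasedOnLookup(mail As Outlook.MailItem, keyText As String)
    Const NET_PATH As String = "\\server\share\Lookup.xlsx"   ' placeholder
    Const LOCAL_PATH As String = "C:\Temp\Lookup.xlsx"        ' placeholder

    Dim fso As Object, xl As Object, wb As Object
    Dim needCopy As Boolean
    Dim result As Variant

    Set fso = CreateObject("Scripting.FileSystemObject")

    ' Refresh the local copy if it is missing or more than a day old
    needCopy = Not fso.FileExists(LOCAL_PATH)
    If Not needCopy Then
        needCopy = DateDiff("d", fso.GetFile(LOCAL_PATH).DateLastModified, Now) >= 1
    End If
    If needCopy Then fso.CopyFile NET_PATH, LOCAL_PATH, True

    Set xl = CreateObject("Excel.Application")                ' late binding, runs hidden
    Set wb = xl.Workbooks.Open(LOCAL_PATH, ReadOnly:=True)

    ' Keys in column A, forwarding address in column B (assumed layout).
    ' Application.VLookup returns an error value instead of raising when not found.
    result = xl.VLookup(keyText, wb.Worksheets("Sheet1").Range("A:B"), 2, False)

    If Not IsError(result) Then
        With mail.Forward
            .Recipients.Add CStr(result)
            .Display                                          ' or .Send once you trust it
        End With
    End If

    wb.Close SaveChanges:=False
    xl.Quit
End Sub

Because the lookup runs against the local copy, a slow network only ever delays the once-a-day copy, not every incoming email.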
The question here is: am I on the right path (this is the first time I'm trying this), and if not, what would be smarter to try? If this is the right path, can you offer suggestions on how to do it best? If this works, I am going to use it often for a lot of different tasks in this app.
I'm running a PowerApps Canvas app. As part of its program, I want it to be able to reference (read-only) a collection of data. That data is in ServiceNow, and my group is not permitted to access ServiceNow using the API.
During testing of the app, I just had it reference a SharePoint list (which I had filled with some dummy data), but I can re-code those lines as needed to pull from some other data source.
Because I am touching a few different systems here, I am not sure if this is the right way to go and I'm afraid I'll spend too long trying and find out that it would never have worked because of x. Thus my question.
This is what I think will work. Am I headed in the right direction?
Set up the scheduled report in ServiceNow. (Done!)
Program ServiceNow to email the Excel file output. Make sure it is always the same title. (Done!)
Build a Power Automate flow to capture that email and save the attached file to a location (OneDrive?) that can be accessed by the app. If there is a file there already, delete it first.
Add the Excel file as a data source to the app, and start referencing it as needed.
8-12 hours later, ServiceNow pushes out another scheduled data drop, and the whole thing updates again.
In my perfect world, this system would work completely unattended.
Offhand, a glitch I can see is that ServiceNow generates an Excel file, but the data in it is not formatted as a table, and I think PowerApps can only ingest an Excel file as a data source if the data is in a table. But (shrug) I might be wrong.
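If a fix-up step run from a desktop is acceptable, something like the following untested VBA sketch could wrap the export in a table; the sheet index and the table name "ServiceNowData" are just placeholders I'd have to adjust. For a fully unattended flow, an Office Script called from the Power Automate flow could do the same conversion instead.

' Untested sketch (Excel VBA): wrap the ServiceNow export in a table so that
' PowerApps can pick it up. Sheet index and table name are placeholders.
Sub WrapExportInTable()
    Dim ws As Worksheet
    Dim lo As ListObject

    Set ws = ThisWorkbook.Worksheets(1)          ' assume the export lands on the first sheet

    If ws.ListObjects.Count > 0 Then Exit Sub    ' already a table, nothing to do

    Set lo = ws.ListObjects.Add(SourceType:=xlSrcRange, _
                                Source:=ws.UsedRange, _
                                XlListObjectHasHeaders:=xlYes)
    lo.Name = "ServiceNowData"                   ' placeholder name
End Sub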
Am I thinking of this correctly? Is this the best avenue to follow?
I have 500 Excel documents. I want users to keep working as if they were still in Excel (I'll provide an app for that), yet cross-reference data between those documents. What database can meet such needs?
So, if I understand correctly, you need to get data from ~500 Excel files while people may access and change them in real time! I can think of 4 approaches:
1) Live links from all the files to one workbook... it hurts to even think about the maintenance and setup... but it will be "live".
2) PowerQuery: group them all into one data table using PowerQuery or PowerBI or similar, then load it into a workbook OR save it as CSV... one-button refresh, relatively quick, no actual coding needed.
3) Use VBA: access all the files (or only the changed ones...) and get what you want, when you want it. If implemented expertly it will only take a few seconds for a full scan on a modern PC, but it needs someone good at coding VBA (a rough sketch follows below).
4) Set up option 1) using VBA instead of manually, then use VBA to check for errors, etc. The result will be "live" but again requires serious VBA coding...
I believe that 2) is the easiest choice, with good maintenance features, easy setup, and good speed... (start in Excel: Data / New Query / From File / From Folder).
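And if you do go down the VBA road (option 3), the skeleton is not complicated. An untested sketch; the folder path, source range and summary sheet name are only placeholders you would change for your setup:

' Rough sketch of option 3 (Excel VBA): pull a fixed range from every workbook
' in a folder onto one summary sheet. Folder path, source range and summary
' sheet name are placeholders.
Sub ScanWorkbooks()
    Const FOLDER_PATH As String = "\\server\share\Workbooks\"   ' placeholder
    Dim fname As String
    Dim wb As Workbook
    Dim dest As Worksheet
    Dim r As Long

    Application.ScreenUpdating = False
    Set dest = ThisWorkbook.Worksheets("Summary")               ' placeholder
    dest.Cells.Clear
    r = 1

    fname = Dir(FOLDER_PATH & "*.xlsx")
    Do While Len(fname) > 0
        Set wb = Workbooks.Open(FOLDER_PATH & fname, ReadOnly:=True, UpdateLinks:=0)
        dest.Cells(r, 1).Value = fname                          ' file name in column A
        wb.Worksheets(1).Range("A2:D100").Copy dest.Cells(r, 2) ' assumed layout
        wb.Close SaveChanges:=False
        r = r + 100                                             ' 99 rows of data plus one blank separator
        fname = Dir
    Loop
    Application.ScreenUpdating = True
End Sub

Because the files are opened ReadOnly:=True, anyone who has one of them open for editing is not disturbed by the scan.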
I have recently observed an issue regarding my data in a column that I use to perform data validation on my spreadsheet.
There is nothing wrong with the formula, nor is there anything wrong with the use of data validation.
It is supposed to look for duplicate entries, which works quite fine.
The issue is that it no longer recognizes input made from a smartphone using the Excel app.
So what I did was retype the cell text from my PC, and it worked perfectly.
Is there a way that I can continue using this technique (Data validation) without having to re-enter data from a PC in order for it to process?
Certainly! Yes, that is possible.
But... with all the possibilities in today's world, is your current strategy the one that is the best for you?
That is something I cannot answer for you.
That is something I cannot enumerate for you.
But... There is something that I can introduce to you.
PowerQuery
PowerQuery was a free add-in for Excel 2010 and 2013, and it has been baked directly into Excel for more than half a decade. So, if you're using the mobile app, then you probably have a modern version of Excel with PowerQuery right at your fingertips.
Your first step is to determine how you want to make your data available for Excel to get. Go to the Data tab on the ribbon and review your options in the "Get External Data" group.
It doesn't matter if free data is your Creed and your most intimate moments are publicly available through your raw data feed. Or if paranoia is the reason why you constantly drive around the block scraping SSIDs before squirreling them away to SQL server for detailed analysis. Or if you're using a USB cable to transfer photos to your PC because your mom walked in on you without knocking and was so disgusted by what she saw on your desktop that you're banned from the family LAN... For life. None of that matters because Excel can connect to your data in so many ways that one of them will be perfect for you.
There is a sense of familiarity when importing your data into PowerQuery. It's not unlike following those timeless MS wizards; but nothing prepares you for the uncanny sensation of being dropped into the PowerQuery editor. It is simultaneously the same as Excel and different from Excel, and it may be the closest you ever come to visiting a parallel universe. Many of the same tools are available, but they behave just slightly differently. And in some cases, like the Text To Columns tool, it is light years ahead of Excel and you will find yourself cursing at MS for not using it as a replacement for the old tool.
When you're done transforming your data, you'll have a tight, clean table. But the real prize is that you have a fully automated pipeline from source to product.
I figured that the phone user included extra spaces when inputting the data.
So I used the TRIM() function, which takes care of the extra spaces between, before, or after each word, and that did the job.
Therefore the major error was that there were additional spaces in the tested data that were not being recognized.
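If you would rather clean the existing entries in place than add a column of TRIM() formulas, a short macro can do the same thing; the sheet name and range below are only placeholders:

' Untested sketch (Excel VBA): trim stray spaces out of the validated column
' in place. Sheet name and range are placeholders.
Sub TrimValidationColumn()
    Dim c As Range
    For Each c In ThisWorkbook.Worksheets("Sheet1").Range("A2:A500")
        If Not IsEmpty(c.Value) And Not IsError(c.Value) Then
            ' WorksheetFunction.Trim behaves like the TRIM() worksheet function:
            ' it also collapses repeated spaces between words.
            c.Value = Application.WorksheetFunction.Trim(c.Value)
        End If
    Next c
End Sub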
I am hoping someone might be able to help me.
My problem is essentially this: a colleague enters information into an Excel workbook, which I then have to check and pass on by email. This is fairly time-critical.
What I would like to do is
have the colleague press a button that calls the macros on my computer (my worksheet is running continuously), or
have my colleague email me and I have a macro in Outlook which checks for specific subject lines, or
he saves it on the network, and I check every minute for new files in that folder.
While the last two of these are possible, the Outlook solution is, for several company policy reasons, the very last resort, and I would also like to avoid the ongoing checking for files as I am already having slight performance issues (a large worksheet with lots of external links that are being fed in real time).
I am also open to all other suggestions someone might have. Thanks a lot!!
I'd go with the third option.
Make his workbook save information to a "queue" file or folder. Your workbook can query this queue, which should be on the network somewhere, and notify you when it's changed. You wouldn't even have to open it unless it has changed, if you set it to compare modified dates, and it could be small if it is saved as text or in XML format.
The first option won't work because the VBA framework is pretty locked down; cross-workbook macro activation isn't possible, from what I understand.
The second option is more work than necessary, and VBA/Outlook will warn you every 5 minutes that something is trying to access your mail folder, since that's what malicious software typically does.
Like I said, the third option would be best. His macro could be set to only write, and it could even encrypt the text using simple encryption methods so that others can't easily modify it, if that is a concern.
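The polling side could look roughly like this (untested sketch, Excel VBA); the queue path and the one-minute interval are placeholders:

' Rough sketch (Excel VBA): re-check the queue file's modified date once a
' minute and pop up a note when it changes. Path and interval are placeholders.
Private lastStamp As Date          ' module-level, remembers the last modified date seen

Sub CheckQueue()
    Const QUEUE_PATH As String = "\\server\share\queue.txt"   ' placeholder
    Dim stamp As Date

    On Error Resume Next           ' the file may be briefly locked or missing
    stamp = FileDateTime(QUEUE_PATH)
    On Error GoTo 0

    If stamp > lastStamp Then
        lastStamp = stamp
        MsgBox "The queue file has changed - new data to check.", vbInformation
    End If

    ' Only this date comparison runs each minute, so the big workbook is not touched.
    Application.OnTime Now + TimeValue("00:01:00"), "CheckQueue"
End Sub

Run CheckQueue once (from Workbook_Open, for example) to start the loop.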
We have a SharePoint 2007 deployment which will have a substantially large document library. My client wants the ability to export this library to an Excel spreadsheet, but specifically wants the ability to divide the spreadsheet into several worksheets based on a specific field. Is this possible to accomplish in WSS 3.0, through the object model or otherwise?
There is an out-of-the-box Export to Spreadsheet action, but it does not appear to support automated subdivision of the list items into separate worksheets. I do not know whether the Excel Services that come with MOSS are capable of it, but we do not have MOSS, so we cannot consider it an option for now.
EDIT
It seems that by mentioning "out-of-the-box", I am implying that I'd prefer something quick and simple. Let's dispel that. I do a lot of heavy work in the object model. I only mentioned the Export to Spreadsheet because that's the only available method I know of off-hand, and its options are limited. So I am comfortable with whatever level of work is suggested.
I should also note that keeping the list linked with the spreadsheet is undesired. We want to be able to download the spreadsheet as a reference. Because of the number of people who will be working on the list, it would be absolute chaos to try and synchronize all of the linked files. My client has agreed that it'll be easier to handle obsolete copies than to try some synchronized system.
The solution also needs to be deployable, so approaches that are not tailored to an individual site are best.
You won't be able to do this OOTB. You will have to write some code to iterate through the records of the list either using
The SharePoint OM - Better performance and richer API but has to run on a Web Front End
The web service - Can run on any machine
Then you can build up the Excel spreadsheet either by
Using the Excel object model (a.k.a. Automation) if this is a quick kludge running from a workstation (a rough sketch of this approach follows below) - but Excel wasn't designed to be used from an unattended server or at high volume, so you may also want to look at
A 3rd party component such as SpreadsheetGear to generate the Excel spreadsheet files.
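For the quick-kludge route, once the list rows are sitting on a single sheet (for example via the OOTB Export to Spreadsheet), the splitting itself is only a short macro. An untested VBA sketch, assuming the grouping field is in column A of the first sheet with a header row (and ignoring illegal sheet-name characters):

' Untested sketch (Excel VBA, run from a workstation): split the rows of the
' exported sheet into one worksheet per distinct value of the grouping field.
' Assumes a header row and the grouping field in column A -- both assumptions.
Sub SplitByField()
    Dim src As Worksheet, dest As Worksheet
    Dim lastRow As Long, r As Long
    Dim key As String

    Set src = ThisWorkbook.Worksheets(1)
    lastRow = src.Cells(src.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow                                   ' row 1 is the header
        key = Left$(CStr(src.Cells(r, "A").Value), 31)     ' sheet names max out at 31 chars
        If Len(key) = 0 Then key = "(blank)"

        Set dest = Nothing
        On Error Resume Next
        Set dest = ThisWorkbook.Worksheets(key)            ' reuse the sheet if it exists
        On Error GoTo 0

        If dest Is Nothing Then
            Set dest = ThisWorkbook.Worksheets.Add( _
                After:=ThisWorkbook.Worksheets(ThisWorkbook.Worksheets.Count))
            dest.Name = key
            src.Rows(1).Copy dest.Rows(1)                  ' carry the header over
        End If

        src.Rows(r).Copy dest.Cells(dest.Rows.Count, "A").End(xlUp).Offset(1, 0)
    Next r
End Sub

The same loop structure translates to whatever library ends up generating the file; only the object model calls change.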
A good bet is to quickly create views for your items (using filters as you want) mirroring your desired worksheets and then export those views into Excel. Those views update with the list, and you can manually grab new versions later. Still manual, but OOTB and no Excel hacking needed.
I posted this on SharePoint Overflow. One of the answers I received there, regarding the utility of the Open XML SDK, was very useful. Thank you to those who answered... I looked over your suggestions. My client has decided to go with this one because it does not cost money to implement (as SpreadsheetGear or datapresentation's plugin would).