The software we currently use can only generate a report that shows all fields, with very limited filtering options.
We can download the full report as a CSV file, and we are looking for a solution or script that will let us open a CSV file in Google Sheets, filter the data, and automatically present it in a format like the report from the current system.
Here is what the current report looks like with the header fields:
sample trust report
The fields we want to extract are:
Transaction Date
Reason
Paid To
Debit
Credit
Once those fields are extracted, we apply a filter on the "Paid To" field so that only the entity we select is shown.
That filters the data down to something like this:
filtered sample trust report
Is there a solution that would allow a CSV file to be imported and then automatically run through the steps above to present it as a report within Google Sheets?
Here is a sample of the CSV file so you can see and apply the steps above:
Sample CSV Files
Once the steps above have been applied, we want to print the report on an A4 sheet as a PDF file.
We have tried processing and filtering the CSV file manually, but with many entities it takes a lot of manual steps. We want to streamline this process and are looking for assistance with an automated script or code to run.
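If an Excel step is acceptable, here is a minimal VBA sketch of those steps; the file paths, the entity name, and the column positions (A to E for the five fields) are assumptions:

Sub FilterTrustReport()
    Dim wb As Workbook, ws As Worksheet, lastRow As Long
    Set wb = Workbooks.Open("C:\Reports\trust_report.csv")   ' assumed path
    Set ws = wb.Worksheets(1)
    ' Assumed: Transaction Date, Reason, Paid To, Debit, Credit sit in columns A-E.
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
    ' Filter the "Paid To" column (assumed to be column C) on one entity.
    ws.Range("A1:E" & lastRow).AutoFilter Field:=3, Criteria1:="Some Entity"
    ' Print the filtered report to PDF on A4 paper.
    With ws.PageSetup
        .PaperSize = xlPaperA4
        .Orientation = xlPortrait
    End With
    ws.ExportAsFixedFormat Type:=xlTypePDF, Filename:="C:\Reports\Some Entity.pdf"
    wb.Close SaveChanges:=False
End Sub

The same steps (import, filter on "Paid To", export to an A4 PDF) can also be scripted inside Google Sheets with Google Apps Script, looping the filter over each entity to produce one PDF per entity.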
I would appreciate any help and please feel free to ask any other questions.
Many thanks.
Related
I have an XML file generated by EasyPower (electrical software). If I open the file in Excel it comes up as a series of formatted sheets like the image below. It appears this way without any prompts or dialogs.
I’m creating a Power Query routine that can extract the data from the sheets. Unfortunately, when I use the Power Query wizard to select the XML file as a source, it doesn’t see the data as sheets, but rather as a table with columns of Tables, nested seemingly an infinite number of levels deep. Digging through them, I’m unable to clearly see the data, so this is not a very good approach.
A work-around is for me to manually open the XML file in Excel and save it as XLSX; then it’s easy to work with the data in Power Query. I know a VBA script could be used to do this, but my question is: is there a way for Power Query to open an XML file and interpret the layout the same way Excel does? That would allow my script to also work within Power BI.
Edit: A sample file has been requested. This link will provide a very simple example containing two worksheets when opened in Excel. EasyPower_Test_Schedule.xml
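A minimal sketch of the VBA workaround mentioned above, opening the SpreadsheetML file in Excel and re-saving it as .xlsx so Power Query can read it natively (the paths are assumptions, and this does not answer the Power Query-only part of the question):

Sub ConvertXmlToXlsx()
    ' Open the EasyPower file the way Excel does interactively,
    ' then save it in a format Power Query handles directly.
    Dim wb As Workbook
    Set wb = Workbooks.Open("C:\Data\EasyPower_Test_Schedule.xml")   ' assumed path
    Application.DisplayAlerts = False   ' suppress the file-format prompt on save
    wb.SaveAs "C:\Data\EasyPower_Test_Schedule.xlsx", FileFormat:=xlOpenXMLWorkbook
    Application.DisplayAlerts = True
    wb.Close SaveChanges:=False
End Sub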
I'm currently setting up a new Poll Result for a Webinar Survey in the Webex application.
I am only able to collect the results in a PollResults.txt file (a Notepad file) and need to transform this data into an Excel table for further analysis. As I will have multiple poll results to collect every day, transforming them into a data table would help a lot with further compilation and analysis.
I'm thinking about creating a VBA macro to do the transformation, but I currently have no idea how to start.
I have shared the download links for both my raw data and the expected result file below; I would appreciate it if anyone could guide me.
Link to download files
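Without the actual file layout, any code is a guess; here is a minimal VBA sketch that assumes each line of PollResults.txt is a "label:value" pair, which your real layout may not match:

Sub ImportPollResults()
    ' Assumption: each line looks like "Question:Answer". Adjust the split
    ' character and destination columns to the real PollResults.txt layout.
    Dim f As Integer, sLine As String, parts() As String, r As Long
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets(1)
    f = FreeFile
    Open "C:\Polls\PollResults.txt" For Input As #f   ' assumed path
    r = 1
    Do While Not EOF(f)
        Line Input #f, sLine
        If InStr(sLine, ":") > 0 Then
            parts = Split(sLine, ":", 2)   ' split into at most two pieces
            ws.Cells(r, 1).Value = Trim(parts(0))
            ws.Cells(r, 2).Value = Trim(parts(1))
            r = r + 1
        End If
    Loop
    Close #f
End Sub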
Thanks in advance!
I had no previous experience with Access, VBA coding, or Excel macros prior to teaching myself over the past month via these forums. Thank you, forums and contributors. I have enjoyed my Access learning so far and the challenge it has provided, and I appreciate any help I can get. As such, the code and methods I have used to this point may well be convoluted and confusing. I will do my best to provide relevant details and accurate terminology.
I work in a lab and I am creating an Access Form for semi-automated reporting. Samples are received from clients and are logged into the Excel table R&D Log; the worksheet is InProcess. Samples are sorted based on the site from which they originate and given a one- or two-letter site code (G, D, WH, etc.) and an ID "yy-000" in separate Excel columns (e.g., D 18-096). Samples may be submitted for multiple analyses (Metals, Water, Soil, etc.) and may even have multiple rows of reporting if multiple analytes are identified in the sample. There are several other columns, such as receipt date, reporting date, units, etc. Once samples are reported, I manually copy and paste them into the Archived worksheet and delete the record and blank row from the InProcess worksheet. Since one sample may have multiple analyses and even more potential results, each record is reported on a new Excel row (with the same D 18-096 ID number). Thus, there is no single unique identifier or primary key for each sample in the current format. R&D Log is updated manually by lab technicians, and the worksheet InProcess is a linked table in an Access database.
The Access database uses two combo boxes on a Form frmInProcess to filter a Query qryInProcess of the linked table. The combo boxes filter on the report destination (one client may receive multiple site codes) and the analysis (reports are separated based on the type of analysis). The Query also filters out blank results and blank dates, so only completed samples appear on the filtered Form. I have written VBA code to this point that will export the Form to a .pdf, save the file with a unique filename, and open Outlook to mail out the report. I have also managed to export the filtered Form frmInProcess to an Excel file Access Test (not the linked file).
What I would like to do now is automate the transfer of completed test results from the Excel worksheet R&D Log: InProcess to R&D Log: Archived and delete the records from the InProcess worksheet. I am not sure if I can export the filtered Form into a linked Excel table, or if I must use a separate Excel file (or if it even matters for simplicity of code). I would like to read the exported filtered Form in the Excel file Access Test, look up matching rows in R&D Log based on several criteria (site, ID, Analysis, Analyte, Report Date), and automate the transfer of records between the R&D Log worksheets. The end result is that Access generates reports for completed tests, and the records are removed from InProcess testing and transferred to Archived testing in Excel. I am guessing that I may need to close the Access application and perform this in Excel. I hope this is easy enough to follow.
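As a rough sketch of the Excel side of that transfer, assuming for illustration that completed rows can be identified by a filled Report Date in column G (the real version would match on all five criteria against the exported file):

Sub ArchiveReportedRows()
    ' Move completed rows from InProcess to Archived, then delete them.
    Dim src As Worksheet, dst As Worksheet, r As Long, lastRow As Long
    Set src = ThisWorkbook.Worksheets("InProcess")
    Set dst = ThisWorkbook.Worksheets("Archived")
    lastRow = src.Cells(src.Rows.Count, 1).End(xlUp).Row
    For r = lastRow To 2 Step -1   ' bottom-up so deletions don't shift unvisited rows
        If src.Cells(r, 7).Value <> "" Then   ' assumed: Report Date in column G
            src.Rows(r).Copy dst.Cells(dst.Rows.Count, 1).End(xlUp).Offset(1, 0)
            src.Rows(r).Delete
        End If
    Next r
End Sub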
Thank you.
In my experience, importing an Excel document into a temporary NEW (or totally empty) Access table is usually the easiest way to go. Then you do not have to worry about cell references like you do in Excel VBA. Even if the Excel document has old data in it with just a few new changes each time, importing it into a temporary Access table could be the simplest way to go, because then you can compare the data in this table with the data in another, permanent Access table and update the latter based on the former.
As for the original Excel file: if you need to delete rows there, it might be quicker to export a new Excel file with just the data the old one is supposed to end up with, and then use VBA to delete (or, safer, rename) the old file.
So the development process goes something like this:
1. Save import steps by first importing an Excel file via Access' ribbon options "External Data" (tab) -> "Excel", and when you finish, be sure to check the "Save import steps" box and note the name you give the "saved import", because you will need that in your VBA code.
2. In Access, write a function for deleting the temporary table. The VBA code is:
Const cTable = "MyExcelTempTable"
' TableExists is a user-defined helper; Access has no built-in equivalent (see below).
If TableExists(cTable) Then
    DoCmd.DeleteObject acTable, cTable
End If
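A minimal sketch of that helper, counting matches in the MSysObjects system table (type codes 1, 4 and 6 cover local and linked tables):

Public Function TableExists(strName As String) As Boolean
    TableExists = DCount("*", "MSysObjects", _
        "[Name]='" & strName & "' And [Type] In (1,4,6)") > 0
End Function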
Now you can test your delete function on the data you imported.
3. Write VBA code to import the same spreadsheet to create the same table:
Const cSavedImport = "Import-MyExcelTempTable"
' Import the Excel file
DoCmd.RunSavedImportExport cSavedImport
4. Write more VBA functions to check the imported table for bad data and then to copy it into the permanent table. You might be updating existing records or adding new ones. Either way, you could use Access queries or SQL to do this and run them from VBA.
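For example, the copy step could be an append query run from VBA; the table and column names here (MyPermanentTable, SampleID, Result) are assumptions:

' Append rows from the temp table that are not already in the permanent table.
CurrentDb.Execute _
    "INSERT INTO MyPermanentTable (SampleID, Result) " & _
    "SELECT t.SampleID, t.Result FROM MyExcelTempTable AS t " & _
    "LEFT JOIN MyPermanentTable AS p ON p.SampleID = t.SampleID " & _
    "WHERE p.SampleID Is Null", dbFailOnError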
5. Write a VBA function to rename the old Excel file. (You could use an InputBox if the Excel file name is different each time. I do this for importing Excel files, and I set a default value so I do not have to type as much.)
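VBA's Name statement is enough for the rename; the paths are assumptions:

Dim strOld As String, strNew As String
strOld = "C:\Data\RnD Log.xlsx"   ' assumed path
strNew = "C:\Data\RnD Log " & Format(Now, "yyyymmdd-hhnnss") & ".xlsx"
Name strOld As strNew   ' rename rather than delete, as suggested above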
6. Write a VBA function to export the new version of the Excel file.
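A sketch using TransferSpreadsheet; again, the table name and path are assumptions:

DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
    "MyPermanentTable", "C:\Data\RnD Log.xlsx", True   ' True = include field names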
7. Make yourself a button on a form that, when clicked, runs a VBA function. Inside that function, run Steps 2 through 6, above.
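The button's Click event just chains the steps; the Sub names below are hypothetical stand-ins for the functions written in Steps 2 through 6:

Private Sub cmdRunImport_Click()
    DeleteTempTable                                        ' Step 2
    DoCmd.RunSavedImportExport "Import-MyExcelTempTable"   ' Step 3
    ValidateAndCopyToPermanentTable                        ' Step 4
    RenameOldWorkbook                                      ' Step 5
    ExportNewWorkbook                                      ' Step 6
End Sub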
I am not sure my answer exactly matches what you are trying to do, but hopefully you get enough of a picture of the workflow to figure out the details of what you need.
Part of my job is to pull a report weekly that lists patching information for around 75,000 PCs. I have to filter out some erroneous data based on certain criteria, then summarize this data myself and update it in a separate spreadsheet. I am comfortable with pivot tables and formulas, but it ends up taking a good couple of hours.
Is there a way to import data from a CSV file into a template that already has my formulas, settings, etc. in place, if the data has the same columns but a different number of rows each time?
If you're comfortable with programming, you can use macros. In this case, you would connect to your CSV file, extract the information, and put it in the corresponding places on your spreadsheet. This question covers most of what you need to start off: macro to Import csv file into an excel non active worksheet.
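A minimal sketch of that approach: clear a staging sheet (so the varying row count doesn't matter) and pull the CSV in through a QueryTable; the path and sheet name are assumptions:

Sub ImportCsvToStagingSheet()
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("RawData")   ' assumed staging sheet your formulas point at
    ws.Cells.ClearContents                        ' wipe last week's rows, whatever their count
    With ws.QueryTables.Add(Connection:="TEXT;C:\Reports\patching.csv", _
                            Destination:=ws.Range("A1"))
        .TextFileParseType = xlDelimited
        .TextFileCommaDelimiter = True
        .Refresh BackgroundQuery:=False
        .Delete   ' drop the connection, keep the imported data
    End With
End Sub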
I need to import tabular data into my database. The data is supplied via spreadsheets (mostly Excel files) from multiple parties. The format of each of these files is similar but not the same and various transformations will be necessary to massage the data into the final format suitable for import. Furthermore the input formats are likely to change in the future. I am looking for a tool that can be run and administered by regular users to transform the input files.
Now let me list some of the transformations I am looking to do:
swap columns:
Input is:
|Name|Category|Price|
|data|data |data |
Output is:
|Name|Price|Category|
|data|data |data |
rename columns:
Input is:
|PRODUCTNAME|CAT |PRICE|
|data |data|data |
Output is:
|Name|Category|Price|
|data|data |data |
map column values according to a lookup table; for example:
replace every occurrence of the string "Car" by "automobile" in the column Category
basic maths:
multiply the price column by some factor
basic string manipulations:
Let's say the format of the Price column is "3 x $45"; I would want to split that into two columns, amount and price
filtering of rows by value: exclude all rows containing the word "expensive"
etc.
I have the following requirements:
it can run on any of these platforms: Windows, Mac, Linux
Open Source, Freeware, Shareware or commercial
the transformations need to be editable via a GUI
if the tool requires end user training to use that is not an issue
it can handle on the order of 1000-50000 rows
Basically I am looking for a graphical tool that will help the users normalize the data so it can be imported, without me having to write a bunch of adapters.
What tools do you use to solve this?
The simplest solution IMHO would be to use Excel itself: you'll get all the Excel built-in functions and macros for free. Have your transformation code in a macro that gets called via Excel controls (for the GUI aspect) on a spreadsheet. Find a way to insert that spreadsheet and macro into your clients' Excel files. That way you don't need to worry about platform compatibility (it's their file, so they must be able to open it) and all the rest. The other requirements are met as well. The only training would be to show them how to enable macros.
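A hedged sketch of such a macro, hard-coding three of the transformations from the question (rename headers, map "Car" to "automobile" in Category, multiply Price by a factor); the sheet layout and the factor are assumptions:

Sub NormalizeSheet()
    Dim ws As Worksheet, lastRow As Long, r As Long
    Set ws = ActiveSheet   ' assumed: data starts in A1 with a header row
    ' Rename columns via simple header replacements
    With ws.Rows(1)
        .Replace "PRODUCTNAME", "Name", xlWhole
        .Replace "CAT", "Category", xlWhole
        .Replace "PRICE", "Price", xlWhole
    End With
    Dim catCol As Variant, priceCol As Variant
    catCol = Application.Match("Category", ws.Rows(1), 0)
    priceCol = Application.Match("Price", ws.Rows(1), 0)
    If IsError(catCol) Or IsError(priceCol) Then Exit Sub   ' headers not found
    lastRow = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row
    For r = 2 To lastRow
        ' Map values according to a lookup (one hard-coded pair here)
        If ws.Cells(r, catCol).Value = "Car" Then ws.Cells(r, catCol).Value = "automobile"
        ' Basic maths: multiply the price by some factor
        If IsNumeric(ws.Cells(r, priceCol).Value) Then _
            ws.Cells(r, priceCol).Value = ws.Cells(r, priceCol).Value * 1.1
    Next r
End Sub

Moving the hard-coded pairs into a lookup table on a hidden sheet would keep the mappings editable by end users without touching the code.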
The Mule Data Integrator will do all of this from a CSV file. So you can export your spreadsheet to a CSV file and load the CSV file into the MDI. It can even load the data directly into the database, and the user can specify all of the transformations you requested. The MDI will work fine in non-Mule environments. You can find it here: mulesoft.com (disclaimer: my company developed the transformation technology that this product is based on).
You didn't say which database you're importing into, or what tool you use. If you were using SQL Server, then I'd recommend using SQL Server Integration Services (SSIS) to manipulate the spreadsheets during the import process.
I tend to use MS Access as a pipeline between multiple data sources and destinations - but you're looking for something a little more automated. You can use macros and VB script with Access to help through a lot of the basics.
However, you're always going to have data consistency problems with users misinterpreting how to normalize their information. Good luck!