please don't hurt me for this question... I know it is not the ideal way to handle mass data, but I need to try...
I have a folder with 10 CSV files, each of which stays under the xlsx row limit of 1,048,576 rows.
Now I'm trying to combine those files into one file. Together they exceed 1,048,576 rows, so the import dialog always gives me an error saying it's not possible to load all the data, etc.
I found a way to load the data only into the Power Query data model rather than directly into a sheet, but I cannot find any way to split the data across different sheets.
Ideal split for example:
Sheet 1: File 1-3
Sheet 2: File 4-8
Sheet 3: File 9-10.
Is there a way to get a separate query for each file and then append those queries onto the sheets? I would like to end up with 10 queries, which I can append the way mentioned above.
Thank you for your input!
You can load each CSV file separately as its own query, with each one saved via Close & Load as Connection Only. Then create separate queries that use Table.Combine() to put together the combinations you need [Data > Get Data > Combine Queries > Append...], and load those append queries back onto the sheets as either tables or pivot reports.
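As a minimal M sketch of one of those append queries (the query names File1, File2 and File3 are placeholders for whatever your connection-only queries are actually called):

    // Append query for Sheet 1: paste into a blank query's Advanced Editor.
    // File1, File2 and File3 are assumed names of the connection-only
    // queries created from the individual CSV files.
    let
        Source = Table.Combine({File1, File2, File3})
    in
        Source

Repeat with a different list of queries for each sheet, then Close & Load each append query to its own worksheet.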
Related
So I have 4,000 spreadsheets that contain data arranged with the same set of columns inside. Instead of opening each spreadsheet, copying all the data and combining it into one spreadsheet, is there a faster way to do it? I tried it in wordstat and ASAP Utilities but they don't have that feature.
You can use Power Query to combine the files. That is:
On the Data tab
Get & Transform Data
Get Data
From File
From Folder
Then follow the wizard.
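If you want to see what the wizard builds (or tweak it), a rough M equivalent looks like the sketch below; the folder path, the ".csv" filter and the comma delimiter are assumptions you would adjust to your files:

    // Read every CSV in the folder and append them into one table.
    // "C:\Data" and the comma delimiter are placeholders.
    let
        Source   = Folder.Files("C:\Data"),
        CsvOnly  = Table.SelectRows(Source, each Text.Lower([Extension]) = ".csv"),
        Parsed   = Table.AddColumn(CsvOnly, "Data",
                       each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ","]))),
        Combined = Table.Combine(Parsed[Data])
    in
        Combined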
In order to view customer data in an Excel sheet I have used INDEX/MATCH functions to retrieve the data. However, due to the large number of customers the file has gotten very large; right now it is 13 MB. This file is regularly sent by email, so it is a real headache having to open it every time.
Is there a way to replace INDEX/MATCH with something else in order to reduce the file size? Transforming the source file into an SQL file? Adding a connection to the source file?
Thanks.
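One hedged way to try the "connection to the source file" idea you mention is to pull the customer table in with a query instead of INDEX/MATCH formulas, and then merge it onto your report query (Home > Merge Queries). A minimal sketch, where the path, sheet name and columns are all placeholders:

    // Hypothetical connection query: read the customer table from the
    // source workbook instead of duplicating it with INDEX/MATCH.
    // The path and the sheet name "Customers" are placeholders.
    let
        Source    = Excel.Workbook(File.Contents("C:\Data\customers.xlsx")),
        Customers = Source{[Item = "Customers", Kind = "Sheet"]}[Data],
        Promoted  = Table.PromoteHeaders(Customers)
    in
        Promoted

Loaded as Connection Only and merged into the report query, the full customer table no longer has to live on a sheet in the mailed file.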
I am trying to set up a query that will simply combine data from CSVs into a table as new files get added to a specific folder, where each row contains the data from a separate file. While doing tests with CSVs that I created in Excel, this was very simple: after expanding the Content column, I would see an individual row of data for each file.
In practice, however, where I am trying to use CSVs put out by a proprietary Android app, expanding the Content column leads to a single row, with data from all files placed end to end.
Does this have something to do with there not being an "end of line" character in the CSVs the app is producing? If so, is there an easy way to remedy this without changing the app? If not, is there something simple and direct I can ask the developer to change which would prevent this behavior?
Thanks for any insight!
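If each file lacks a trailing end-of-line character, a combine step that concatenates the raw binaries will indeed run the files together into one line. A hedged workaround that avoids changing the app is to parse each file's Content on its own and only then append the resulting tables; the folder path and delimiter below are placeholders:

    // Parse each file individually so a missing trailing newline in one
    // file cannot run into the next file's data. Placeholders throughout.
    let
        Source   = Folder.Files("C:\Data\AppExports"),
        PerFile  = Table.AddColumn(Source, "Parsed",
                       each Csv.Document([Content], [Delimiter = ","])),
        Combined = Table.Combine(PerFile[Parsed])
    in
        Combined

Failing that, asking the developer to terminate each file with a CRLF line ending should be a small change on their side.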
I'm new to Power Query and running into a strange issue that I haven't been able to resolve.
I'm creating a query to extract data from roughly 300 Excel files. Each file has one sheet with 115 columns and around 100 rows. However, the query is only returning the data from the first two columns and rows, and I'm not sure why it won't return all of the data on the sheet.
Ex:
Header 1 | Header 2
Data     | Data
I converted one file to a .csv file and the query returns all the data from that file. I've scoured Google and haven't been able to find anything that seems to relate to this issue. Is there an Excel file limitation that I'm not aware of?
I'm assisting someone who is not technically savvy, so I would like to avoid VB code and Access if possible. Also, I can't really provide a file I'm working with because the data contains PHI.
Thank you in advance!
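One known cause of this pattern is the sheet's stored "dimension" metadata being wrong in files produced by tools other than Excel, so Power Query only reads the small range the metadata claims exists. Excel.Workbook has an InferSheetDimensions option that tells it to work out the used range itself; a single-file test might look like the sketch below (the file path and sheet name are placeholders, and this may or may not be the cause in your files):

    // Hypothetical single-file test: ignore the stored sheet dimensions
    // and let Power Query detect the used range itself.
    let
        Source = Excel.Workbook(
                     File.Contents("C:\Reports\example.xlsx"),
                     [InferSheetDimensions = true]),
        Sheet  = Source{[Item = "Sheet1", Kind = "Sheet"]}[Data]
    in
        Sheet

If that returns all 115 columns, the same option can be added to the sample-file query your folder combine uses.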
Part of my job is to pull a weekly report that lists patching information for around 75,000 PCs. I have to filter out some erroneous data based on certain criteria, then summarize the data myself and update it in a separate spreadsheet. I am comfortable with pivot tables and formulas, but it ends up taking a good couple of hours.
Is there a way to import data from a CSV file into a template that already has my formulas, settings, etc. in place, if the data has the same columns but a different number of rows each time?
If you're comfortable with programming, then you can use macros. In this case, you would connect to your CSV file, extract the information, and put it in the corresponding places on your spreadsheet. On this question you can find most of what you need to get started: macro to Import csv file into an excel non active worksheet.
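If you'd rather not maintain a macro, a Power Query sketch in the same spirit can also work: point a query at the weekly CSV, apply the filters inside the query, and have the template's formulas and pivots read from the resulting table, so each week is just dropping in the new file and hitting Refresh. The path, delimiter and the example "Status" filter below are all placeholders, not your actual report layout:

    // Hypothetical refreshable import for the weekly report. The path,
    // delimiter and the "Status" filter column are placeholders.
    let
        Raw      = Csv.Document(File.Contents("C:\Reports\patching.csv"),
                       [Delimiter = ",", QuoteStyle = QuoteStyle.Csv]),
        Promoted = Table.PromoteHeaders(Raw),
        Filtered = Table.SelectRows(Promoted, each [Status] <> "Error")
    in
        Filtered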