I have a performance issue similar to the one in "Slow exporting from Access to Excel": exporting data to MS Excel by setting individual cell values in a loop:
objSheet.Cells(table_1_row_no + intcounter, table_1_colA_col_no) = table_1_array(intcounter, 0)
However, my data comes from an array and needs to be processed in Excel after transferring it there, so I need an open sheet to work with after the transfer. As a result, none of the methods noted in the above post appear ideally suited.
I have an MS Access application that pulls data from the database and does a fair bit of processing in VBA before the data is put into Excel. As a result, the data to be transferred is in an array rather than a recordset. The simplest way to get it back into a recordset, so that the .CopyFromRecordset method can be used, is to create a temporary table and recordset, but this seems a fairly lengthy path to go down unless there's nothing better.
The spreadsheet does a lengthy calculation before Access pulls a result back. I'm keen to leave this in Excel, as it gives the engineers using it the flexibility to make adjustments where needed for different situations (the environment is controlled, so unauthorised changes aren't a concern). As a result, options that simply drop out a CSV file, or an Excel file without opening it, aren't suited either.
Any suggestions much appreciated.
I had the same problem; try converting your data source to a recordset.
If you use Excel 2000, 2002, 2003, or 2007, use CopyFromRecordset:
'Copy the recordset to the Excel sheet, starting in cell A1
sheet1.Cells(1, 1).CopyFromRecordset rst
Be careful: CopyFromRecordset will fail if the recordset contains an OLE Object field or array data such as a hierarchical recordset.
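For context, here is a minimal end-to-end sketch of that approach from Access, using late binding so no Excel reference is needed. The query name "qryExport" is a placeholder; substitute whatever feeds your data:

Dim objExcel As Object, objBook As Object, objSheet As Object
Dim rst As DAO.Recordset

Set rst = CurrentDb.OpenRecordset("qryExport", dbOpenSnapshot)

Set objExcel = CreateObject("Excel.Application")
objExcel.Visible = True                      ' leave the sheet open to work with afterwards
Set objBook = objExcel.Workbooks.Add
Set objSheet = objBook.Worksheets(1)

' One call replaces the cell-by-cell loop
objSheet.Cells(1, 1).CopyFromRecordset rst
rst.Close

As an aside, if the data is already sitting in a VBA array, a similar one-shot transfer works without any recordset: assign the array to a range sized to match it, e.g. objSheet.Range(objSheet.Cells(1, 1), objSheet.Cells(UBound(table_1_array, 1) + 1, UBound(table_1_array, 2) + 1)).Value = table_1_array (the +1 assumes a zero-based array, as in the question's code).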
I have a large amount of measurement data in Excel files. I need to read and process this data using MATLAB. The problem I have is that not all the Excel files contain the same amount of data.
I have tried the following:
excel = actxserver('excel.application');
xlswb = excel.Workbooks.Open(myFile);
data = xlswb.Sheets.Item(mySheet).get('Range', 'A1', 'L10000').Value;
This "works" as there will not be more than 10000 rows and 8 columns (in my case). However, this is very dirty code and I have to check for each file where the data actually ends.
Is there a way to read a whole sheet (similar to the xlsread function)?
Thanks!
Sheets("SheetName").UsedRange will get you a collection every used cell in that sheet. However, if cell L10000 had data and it was cleared, it will still make part of that collection.
I am working on a new database in Access to automate a lot of hand entry into Excel. Right now I have come to a point where Excel data can be graphed and distributed more easily than Access data.
I would like to place a button on a form so that, when pressed, it takes data from Access and fills out an existing 'template' Excel sheet that has all the formulas and graphs pulling from a set of cells. The data in Access (which can be made into an SQL query/recordset in VBA) is just some totals, averages, and a YTD calculation, and it needs to be placed into a specific set of cells in an existing Excel sheet and saved as a different one (so as not to overwrite the template). It is only 13 x 5 cells of new data.
Is this possible?
Yes, it can be done. The export will need to be meticulous, though, because you already have a predefined range of cells you're using in Excel.
The first step is to get the data into the format you need: your SQL statement.
Make sure this format will work for your template and is returning the correct data.
Once this is done, you can write to your Excel file with DoCmd.TransferSpreadsheet. This is the format:
DoCmd.TransferSpreadsheet(TransferType, SpreadsheetType, TableName, FileName, HasFieldNames, Range)
Here is an example:
Dim outputFileName As String
outputFileName = "C:\WhereYouWantYourFileToGo "
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12, "YourQueryName", outputFileName, True
MsgBox "Data exported to WhereverYouExportedItTo.xlsx"
Note: when using DoCmd.TransferSpreadsheet, acSpreadsheetTypeExcel12 is used for Excel 2010; the full list of versions is in the AcSpreadSheetType enumeration.
I'm assuming the rest will be handled by Excel.
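If the values have to land in specific cells of the template rather than in a fresh export, one option is to automate Excel from Access instead. A hedged sketch, in which the template path, output path, query name, and anchor cell B2 are all placeholders:

Dim xl As Object, wb As Object
Dim rst As DAO.Recordset

Set rst = CurrentDb.OpenRecordset("YourQueryName", dbOpenSnapshot)
Set xl = CreateObject("Excel.Application")
Set wb = xl.Workbooks.Open("C:\Templates\YourTemplate.xlsx")

' Drop the 13 x 5 block at its anchor; the template's formulas and graphs pick it up.
wb.Worksheets(1).Range("B2").CopyFromRecordset rst
rst.Close

' Save under a new name so the template itself is never overwritten.
wb.SaveAs "C:\Reports\ThisMonth.xlsx", 51            ' 51 = xlOpenXMLWorkbook (.xlsx)
wb.Close
xl.Quit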
I am new to writing Excel add-ins (in C#) and am trying to figure out the right way to save some internal data structures so I can restore the state the next time the file is opened. I can convert the data to XML or Base64 strings if that makes things easier. I don't want to maintain a separate file and would like to embed this information inside the Excel workbook.
Many thanks for your help.
Use a cell in an invisible sheet (you can name it, for example, "internal data sheet") to store the information. Excel sheets have a Visible property which can be set programmatically to xlVeryHidden, which means the sheet can only be made visible again by a program. Here you can find some more information:
http://support.microsoft.com/kb/142530/en-us
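A small VBA sketch of the idea (from C# the same Visible property is set through the interop object model; the sheet name and payload are placeholders):

Dim myStateString As String
myStateString = "<state>...</state>"        ' placeholder: your XML or Base64 payload

Dim ws As Worksheet
Set ws = ThisWorkbook.Worksheets.Add
ws.Name = "internal data sheet"
ws.Range("A1").Value = myStateString
ws.Visible = xlSheetVeryHidden              ' hidden even from the Unhide dialog

' Later, read it back:
' myStateString = ThisWorkbook.Worksheets("internal data sheet").Range("A1").Value

One caveat: a single cell holds at most 32,767 characters, so a long serialized payload may need to be split across several cells.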
We are trying to generate an MS Excel workbook using OOXML and populate the data using SSIS.
We are able to generate the workbook and sheets, and also to create columns and insert data in the header cell. We can also populate data using SSIS.
But the Sheet (DocumentFormat.OpenXml.Spreadsheet.Sheet) and all cells (DocumentFormat.OpenXml.Spreadsheet.Cell) become OpenXmlUnknownElement, so we are not able to read a sheet or cell using the following code:
Sheet sheet = workbookPart.Workbook.Descendants<Sheet>().Where(s => s.Name == "Sheet1").SingleOrDefault<Sheet>();
We are able to read the same file if we first open it in MS Excel and save it. Does anyone know how to resolve this?
You probably forgot to give your sheet a name. You can see this by using
Sheet sheet = workbookPart.Workbook.Descendants<Sheet>().FirstOrDefault();
and you'll see that your sheet name is either undefined or garbage text.
If that does not help, create a simple document in code, save it as OOXML, and open it in an XML viewer. Then make a copy, open it in Excel, save it, and compare the two files; that is often a good way to see what Excel adds to the document by default.
Excel is very tolerant of mistakes made when creating the document in code, and will magically fix them when you open the document again.
A bad hack would be to use interop to open the document and save it again in code, which would fix everything for you:
Workbook wrkbk = app.Workbooks.Open(@"c:\del.xls");
wrkbk.Save();
wrkbk.Close();
I wrote a macro that imports a CSV file into my spreadsheet, using a QueryTable. My goal is to import the CSV data and be able to save it in the spreadsheet for future reference.
However, the QueryTable updates with the external CSV file, of course, and I would like to prevent this (because I want to forget about the external file). Is there a QueryTable setting I can use to prevent refreshes, or is this the wrong functionality to use altogether?
Well, if your goal is to import the data once and not keep it up to date (i.e. not refresh it again from the CSV), then at the end of your VBA, after the .Refresh command for the QueryTable, just delete the QueryTable.
e.g.
Sheet1.QueryTables(1).Refresh BackgroundQuery:=False
Sheet1.QueryTables(1).Delete
Be sure to pass False for BackgroundQuery so that Excel does a foreground query, where your code execution holds until the data is in. The cells where you added the QueryTable will then be treated as regular cells.
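Put together, the whole import-and-forget flow might look like this (the file path and destination cell are placeholders):

Dim qt As QueryTable
Set qt = Sheet1.QueryTables.Add( _
    Connection:="TEXT;C:\dir\myfile.csv", _
    Destination:=Sheet1.Range("A1"))

qt.TextFileParseType = xlDelimited
qt.TextFileCommaDelimiter = True
qt.Refresh BackgroundQuery:=False   ' foreground: execution waits for the data
qt.Delete                           ' keeps the imported values, drops the CSV link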
You can use VBA to just open the CSV as a workbook, something like this:
Set myCSVbook = Workbooks.Open(Filename:="C:\dir\myfile.csv", Format:=2)   ' 2 is CSV
Once you've done that, you can copy individual cells, or you can copy the whole sheet into another workbook in one line; there is a Copy method on the Worksheet object for that.
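For instance, a sketch (the path is a placeholder; Worksheet.Copy takes optional Before/After arguments):

Dim myCSVbook As Workbook
Set myCSVbook = Workbooks.Open(Filename:="C:\dir\myfile.csv", Format:=2)   ' 2 = CSV

' Copy the CSV's single sheet to the end of this workbook, then close the CSV.
myCSVbook.Worksheets(1).Copy _
    After:=ThisWorkbook.Worksheets(ThisWorkbook.Worksheets.Count)
myCSVbook.Close SaveChanges:=False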