Wow, my first stack question despite using the answers for years. Very exciting.
I'm fairly new to VBA and Excel and entirely new to Access, full disclosure. I'm trying to create a core database of lab reports. I have a form for entering the information about a new report, which adds that info to a master table of all reports and assigns the report a unique label. After entering the info, a button lets the user select the Excel .csv file accompanying the report and imports it into the DB as a new table, returning a success or error message. And it works! (And the code came from somewhere on here.)
The problem is that I'd then like to add a field to the new table that holds the label assigned to the new report in every record, so queries can reference the table through that label. I'd also like to add an index field to the new table if possible, since importing the .csv as a table doesn't seem to create one. I figure I'll make another sub that gets passed the new report name, to use as the name of the new field (and as its value in every record), plus the table to append it to.
How do I pass this sub the newly imported table if I've just imported it? I need this all to work from the button, as it will mostly be my manager using this form/button to import new files, and they won't be able to go into the tables manually as they're created and add fields (yes, I know that's the obvious solution, but trust me... this must be a button).
Here's the code I'm using (yes, I know lots of it could be done differently, but it works!):
Public Function ImportDocument() As String
On Error GoTo ErrProc

    Const msoFileDialogFilePicker As Long = 3
    Dim fd As Object
    Set fd = Application.FileDialog(msoFileDialogFilePicker)

    With fd
        .InitialFileName = "Data Folder"
        .Title = "Enthalpy EDD Import"
        With .Filters
            .Clear
            .Add "Excel documents", "*.xlsx; *.csv", 1
        End With
        .ButtonName = " Import Selected "
        .AllowMultiSelect = False 'Manual naming currently requires one file at a time be imported
        'If aborted, the Function will return the default value of Aborted
        If .Show = 0 Then GoTo Leave 'fd.Show returns 0 if 'Cancel' is pressed
    End With

    Dim selectedItem As Variant
    Dim NewTableName As String
    NewTableName = InputBox(Prompt:="Enter the Report Name", _
                            Title:="Report Name")

    For Each selectedItem In fd.SelectedItems 'could later be adapted for multiple imports
        DoCmd.TransferText acImportDelim, , NewTableName, selectedItem, True 'imports the selected csv file; True = 'has headers'
    Next selectedItem

    'Return Success
    ImportDocument = "Success"

    'Append report label and index
    AppendReportLabelField(NewTableName, #What to put here as the table to append to?)

'error handling
Leave:
    Set fd = Nothing
    On Error GoTo 0
    Exit Function

ErrProc:
    MsgBox Err.Description, vbCritical
    ImportDocument = "Failure" 'Return Failure if error
    Resume Leave
End Function
The AppendReportLabelField would get passed the name (and value) of the field and the name of the (newly imported) table. How do I pass it the table? NewTableName is just a string currently. If I can pass the new sub the table I'm sure the rest will be simple.
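In case it helps to show what I'm picturing, here's roughly what I imagine AppendReportLabelField looking like if the table name string alone is enough to get at the table (the DDL, the field type and the index column are just my guesses):

Public Sub AppendReportLabelField(ReportLabel As String, TableName As String)
    'Add an autonumber index and a label field to the newly imported table,
    'then stamp the report label into every record
    With CurrentDb
        .Execute "ALTER TABLE [" & TableName & "] ADD COLUMN ID COUNTER", dbFailOnError
        .Execute "ALTER TABLE [" & TableName & "] ADD COLUMN [" & ReportLabel & "] TEXT(255)", dbFailOnError
        .Execute "UPDATE [" & TableName & "] SET [" & ReportLabel & "] = '" & ReportLabel & "'", dbFailOnError
    End With
End Sub

Since the import uses NewTableName as both the report label and the table name, I'm guessing the call would just be AppendReportLabelField NewTableName, NewTableName - but I'm not sure that's the right way to "pass the table".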
Thanks in advance for the help!
Consider storing all user-input data in a single master table with all possible fields, and use a temporary staging table (a replica of the master) to migrate the CSV data into that master table. During staging, you can update the table with the needed fields.
SQL (save as stored queries)
(parameterized update query)
PARAMETERS [ParamReportNameField] TEXT;
UPDATE temptable
SET ReportNameField = [ParamReportNameField]
(explicitly reference all columns)
INSERT INTO mastertable (Col1, Col2, Col3, ...)
SELECT Col1, Col2, Col3
FROM temptable
VBA
...
' PROCESS EACH CSV IN SUBSEQUENT SUBROUTINE
For Each selectedItem In fd.SelectedItems
    Call upload_process(selectedItem, report_name)
Next selectedItem

Sub upload_process(csv_file, report_name)
    ' CLEAN OUT TEMP TABLE
    CurrentDb.Execute "DELETE FROM myTempStagingTable"

    ' IMPORT CSV INTO TEMP TABLE
    DoCmd.TransferText acImportDelim, , "myTempStagingTable", csv_file, True

    ' RUN UPDATES ON TEMP TABLE
    With CurrentDb.QueryDefs("myParameterizedUpdateQuery")
        .Parameters("ParamReportNameField").Value = report_name
        .Execute dbFailOnError
    End With

    ' RUN APPEND QUERY (TEMP -> MASTER)
    CurrentDb.Execute "myAppendQuery"
End Sub
If CSV uploads vary widely in structure, incorporate an Excel cleaning step to standardize all inputs, or force users to work from a standardized template. The staging step can also be used to validate uploads. A database should not be a repository of many dissimilar tables but part of a relational model in a pre-designed setup; letting users create new tables on the fly is an open-ended process that can cause maintenance issues.
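For example, a quick validation pass over the staging table before running the append query might look like this (a minimal sketch; the required column names are hypothetical):

If DCount("*", "myTempStagingTable", "SampleID Is Null Or ResultValue Is Null") > 0 Then
    MsgBox "Some rows in the staging table are missing required values. " & _
           "Fix the CSV and re-import.", vbExclamation
    Exit Sub
End If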
I am currently working on a project that will import data from multiple different sources in a variety of formats and structures - e.g., CSV, fixed-length, other-delimited (tab, pipe, etc.) plain-text, and Excel worksheets/workbooks. For this, I'm attempting to build "generic" readers for these files which will throw the files' contents into a DataTable/DataSet I can use in other methods. The plain-text files are pretty simple, as I've created a large SCHEMA.INI file which contains field definitions for each of the files the system will handle. That SCHEMA.INI resides in a "processing folder" where the files are temporarily stored until their data has been integrated with other systems. A defined text file's data can be easily extracted using this method:
Private Function TextFileToDataTable(ByVal TextFile As IO.FileInfo) As DataTable
    Dim TextFileData As New DataTable("TextFileData")

    Using TapeFileConnect As New OleDb.OleDbConnection("Provider=Microsoft.Jet.OleDb.4.0;Data Source='" + TextFile.DirectoryName + "';Extended Properties='Text';")
        Using TapeAdapter As New OleDb.OleDbDataAdapter(String.Format("SELECT * FROM {0};", TextFile.Name), TapeFileConnect)
            Try
                TapeAdapter.Fill(TextFileData)
            Catch ex As Exception
                TextFileData = Nothing
            End Try
        End Using
    End Using

    Return TextFileData
End Function
This works well because a plain-text file isn't terribly complex in its data structure. A single file generally (at least for my requirements) contains, at most, one single table's worth of data - unless, of course, it's some sort of complex XML or JSON structure file, which can/should be handled completely differently anyway - so there's no need to go iterating through different elements beyond this.
NOTE: The code above is dependent on the SCHEMA.INI file being present in the same directory as the plain-text file being read and there being a section within that SCHEMA.INI defined with the same name as that plain-text file.
EXAMPLE:
[EXAMPLE_TEXT_FILE.TXT]
CharacterSet=ANSI
Format=FixedLength
ColNameHeader=FALSE
DateTimeFormat="YYYYMMDD"
COL1=CUSTOMER_NUMBER TEXT WIDTH 20
COL2=CUSTOMER_FIRSTNAME TEXT WIDTH 30
COL3=CUSTOMER_LASTNAME TEXT WIDTH 40
COL4=CUSTOMER_ADDR1 TEXT WIDTH 40
COL5=CUSTOMER_ADDR2 TEXT WIDTH 40
COL6=CUSTOMER_ADDR3 TEXT WIDTH 40
...
Excel workbooks, however, can be a bit trickier. Several of the workbooks I have to process contain multiple worksheets' worth of data that I want to consolidate into a single DataSet with a DataTable for each worksheet. The basic functionality is, again, fairly straightforward, and I've come up with the following method to read any and all sheets into a DataSet:
Private Function ExcelFileToDataSet(ByVal ExcelFile As IO.FileInfo, ByVal HasHeaderRow As Boolean) As DataSet
    Dim ExcelFileData As New DataSet("ExcelFileData")
    Dim ExcelConnectionString As String = String.Empty
    Dim UseHeaders As String = "NO"

    Select Case ExcelFile.Extension.ToUpper.Trim
        Case ".XLS"
            ExcelConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties='Excel 8.0;HDR={1}'"
        Case ".XLSX"
            ExcelConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties='Excel 12.0 Xml;HDR={1}'"
    End Select

    If HasHeaderRow Then
        UseHeaders = "YES"
    End If

    ExcelConnectionString = String.Format(ExcelConnectionString, ExcelFile.FullName, UseHeaders)

    Try
        Using ExcelConnection As New OleDb.OleDbConnection(ExcelConnectionString)
            Dim ExcelSchema As New DataTable
            ExcelConnection.Open()
            ExcelSchema = ExcelConnection.GetOleDbSchemaTable(OleDb.OleDbSchemaGuid.Tables, Nothing)
            For Each ExcelSheet As DataRow In ExcelSchema.Rows
                Dim SheetTable As New DataTable
                Using ExcelAdapter As New OleDb.OleDbDataAdapter
                    Dim SheetName As String = ExcelSheet("TABLE_NAME").ToString
                    Dim ExcelCommand As New OleDb.OleDbCommand
                    ' Drop the trailing "$" from the sheet name for the DataTable name
                    SheetTable.TableName = SheetName.Substring(0, SheetName.Length - 1)
                    ExcelCommand.Connection = ExcelConnection
                    ExcelCommand.CommandText = String.Format("SELECT * FROM [{0}]", SheetName)
                    ExcelAdapter.SelectCommand = ExcelCommand
                    ExcelAdapter.Fill(SheetTable)
                End Using
                ExcelFileData.Tables.Add(SheetTable)
            Next ExcelSheet
        End Using
    Catch ex As Exception
        ExcelFileData = Nothing
    End Try

    Return ExcelFileData
End Function
The above code will work in a majority of the cases I deal with, but my "difficulty" is that there may be some worksheets that have header rows and some that don't within the same workbook. Also, for those worksheets that do not have a header row, I'd like to be able to define the field names and data types similar to how I can with the plain-text SCHEMA.INI. The only thing I have going for me in these cases is that the "client" provides me with a data map to help me identify what data elements are in each field.
What I'd like to know is whether there is a way, similar to the text file's SCHEMA.INI, to define the structure of an Excel workbook and the worksheet(s) it contains - including column data types, to keep the OleDb driver from "misinterpreting" a column's data - ahead of time. I imagine this could be any sort of structured file such as INI, XML, or whatever, but it would need to be capable of identifying whether or not a particular sheet contains a header row or, in lieu of such a row, the (expected) column definitions. Does any such "standard definition" file exist for Excel workbooks?
One thing to note: As you may have noticed in the code for the ExcelFileToDataSet() method, I may be dealing with the older .XLS (97-03) format or the .XLSX (07+) format, so I can't necessarily rely on the workbook being Open XML compliant. I suppose I could try breaking the methods out to one for each extension, but I'd rather find something that I can use regardless of which file format the Excel file is using.
Excel crashes, and VBA raises error 3218 "Could not update" (a record-locking error), when multiple users try to update the same table in a shared MS Access database using DAO.
I have a particular configuration: an MS Access database located in a shared network folder, with multiple users updating that database through VBA DAO code built into an Excel file. The VBA code in each Excel file is the same. The problem happens when two users click the update button at the same time: a user's Excel file hangs, or shows error 3218 "Could not update".
Sub ExportToAccess()
    Dim oSelect As Range, i As Long, j As Integer, sPath As String
    'tblSuppliers.Active
    Set oSelect = Application.InputBox("Range", , Range("A1").CurrentRegion.Address, , , , , 8)

    Dim oDAO As DAO.DBEngine, oDB As DAO.Database, oRS As DAO.Recordset
    sPath = "\\sharedfolder\Database.accdb"
    Set oDAO = New DAO.DBEngine
    Set oDB = oDAO.OpenDatabase(sPath)
    Set oRS = oDB.OpenRecordset("tblSuppliers")

    For i = 2 To oSelect.Rows.Count 'skip label row
        oRS.AddNew
        For j = 1 To oSelect.Columns.Count 'Field(0) is Auto#
            oRS.Fields(j) = oSelect.Cells(i, j)
        Next j
        oRS.Update
    Next i

    oRS.Close
    oDB.Close
    MsgBox ("Updated Done!")
End Sub
I know my configuration is not good for a database application, but I have to stick with it for a while. Could you please advise any solutions to avoid errors when multiple users update the Access database in this setup? Is there a way to detect that the database is being updated by someone else and have the script wait until that process finishes? Any technical solution for this issue is welcome!
Thank you!
You need some type of flag to tell if anyone is updating the table or not. Examples of what this flag can be:
- An Excel file cell (that is probably the easiest in your case; if multiple Excel files are used, just link to the one cell)
- A field in an Access table (even a table with a single field and a single record dedicated just to that)
- A (text) file on your shared drive (the flag can be the content of the file, or even whether the file exists or not)
Then your update process would be:
- Check the flag, if set, loop until flag is cleared
- Set the flag
- Update the table
- Clear the flag
You will probably also need some way for the users (or just you) to clear the flag manually, in case something goes wrong during the update and the flag gets stuck raised.
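As a rough sketch of that process (assuming the flag lives in a one-record Access table called tblLock with a Yes/No field InUse - both names hypothetical - and using oDB, the DAO database already opened in the question's code), something like this could run before the AddNew loop:

Dim rsLock As DAO.Recordset
Dim attempts As Long

' Check the flag; if set, loop until it is cleared (or give up)
Do
    Set rsLock = oDB.OpenRecordset("SELECT InUse FROM tblLock")
    If rsLock!InUse = False Then Exit Do
    rsLock.Close
    attempts = attempts + 1
    If attempts > 60 Then
        MsgBox "The table is locked by another user. Try again later.", vbExclamation
        Exit Sub
    End If
    Application.Wait Now + TimeSerial(0, 0, 1) 'wait a second before re-checking
Loop
rsLock.Close

' Set the flag
oDB.Execute "UPDATE tblLock SET InUse = True", dbFailOnError

' ... run the AddNew/Update loop from the question here ...

' Clear the flag
oDB.Execute "UPDATE tblLock SET InUse = False", dbFailOnError

Note that the check-and-set here is not atomic, so a small race window remains; for a short-term workaround that is usually acceptable.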
Well, this is probably not the most elegant solution, but you could create a flag field in a control table and check it before working with the supplier table. Something like this:
Dim LockedStatus As DAO.Recordset
Set LockedStatus = oDB.OpenRecordset("mycontroltable")

If LockedStatus("lockedSuppiers") = False Then
    oDB.Execute "UPDATE mycontroltable SET lockedSuppiers = True"

    Set oRS = oDB.OpenRecordset("tblSuppliers")
    For i = 2 To oSelect.Rows.Count 'skip label row
        oRS.AddNew
        For j = 1 To oSelect.Columns.Count 'Field(0) is Auto#
            oRS.Fields(j) = oSelect.Cells(i, j)
        Next j
        oRS.Update
    Next i
    oRS.Close

    oDB.Execute "UPDATE mycontroltable SET lockedSuppiers = False"
End If
LockedStatus.Close
Is there any way to read all the data from Excel and put it in a DataTable or any other container so that I can filter the data based on the required conditions? As shown in the attached image, I want to get the CuValue of a part number whose status is Success, and I want the latest record based on the calculation date. In the example below I want the CuValue 11292, as it is the latest record with status Success.
Thanks in advance
Your question seems very broad, but you're right to ask because there are many different possibilities and pitfalls.
As you don't provide any sample code, I assume you are looking for a strategy, so here it is.
In short: create a database, a table and a stored procedure. Copy the data you need into this table, then query the table to get the result.
You may use ADO for this task. If it is not available on your machine, you can download and install the MDAC redistributable from the Microsoft website.
The advantage over OLE Automation is that you don't need Excel installed on the target machine where the import runs, so you can also execute the import server-side.
With ADO installed, you will need to create two Connection objects, a Recordset object to read the data from the Excel file and a Command object to execute a stored procedure which will do the INSERT or the UPDATE of the subset of the source fields in the destination table.
Following is a guideline which you should expand and adjust, if you find it useful for your task:
Option Explicit

Dim PartNo As String, CuValue As Long, Status As String, CalcDate As Date

' objects you need:
Dim srcConn As New ADODB.Connection
Dim cmd As New ADODB.Command
Dim rs As New ADODB.Recordset
Dim dstConn As New ADODB.Connection

' Example connection with your destination database
dstConn.Open *your connection string*

' Example connection with Excel - HDR is discussed below
srcConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
             "Data Source=C:\Scripts\Test.xls;" & _
             "Extended Properties=""Excel 8.0; HDR=NO;"";"

rs.Open "SELECT * FROM [Sheet1$]", _
        srcConn, adOpenForwardOnly, adLockReadOnly, adCmdText

' Import
Do Until rs.EOF
    PartNo = rs.Fields.Item(0)
    CuValue = rs.Fields.Item(1)
    CalcDate = rs.Fields.Item(6)
    Status = rs.Fields.Item(7)

    If Status = "Success" Then
        'NumSuccess = NumSuccess + 1
        ' copy data to your database
        ' using a stored procedure
        cmd.CommandText = "InsertWithDateCheck"
        cmd.CommandType = adCmdStoredProc
        cmd(1) = PartNo
        cmd(2) = CuValue
        cmd(3) = CalcDate
        cmd.ActiveConnection = dstConn
        cmd.Execute
    Else
        'NumFail = NumFail + 1
    End If

    rs.MoveNext
Loop

rs.Close
Set rs = Nothing
srcConn.Close
Set srcConn = Nothing
dstConn.Close
Set dstConn = Nothing
By using a stored procedure to check the data and execute the insert or update in your new table, you will be able to read from Excel in fast forward-only mode and write a copy of the data with minimal time lost, delegating half the work to the database engine.
As you can see, the stored procedure receives three values. Inside the stored procedure you should insert or update these values. The primary key of the table should be PartNo: check the calculation date and, if it is more recent, update CuValue.
By googling on the net you will find enough samples to write such a stored procedure.
After your table is populated, just use another recordset to get the data and whatever tool you need to display the values.
Pitfalls reading from Excel:
Whoever provides your Excel file should agree to remove the first two or three extra rows; otherwise you will have some more work creating a fictitious recordset, because the automatic datatype recognition for Excel may fail.
As you know, Excel cells are not constrained to the same data type per column, as they are in almost all databases.
If you keep the row of field names, use HDR=YES; if you remove all of the first rows, use HDR=NO.
Always keep a log of the "Success" and "Fail" number of records read in your program, then compare these values with the original overall number of rows in Excel.
Feel free to ask for more details; anyway, I think this should be enough for you to start.
There are lots of ways you can do this.
1. You can create an Access DB table and import your sheet into it (saving the sheet as a CSV file first), then write queries against that table.
2. You can create a SQL database and a table, and write some code to import the sheet into that table.
3. You can write some code in VBA and accomplish the task that way if your data is not very big (see the sketch below).
4. You can write C# code to access the sheet using Excel.Application and Office objects, create a DataTable and query that DataTable.
It depends on what skills you want to employ to accomplish your task.
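For option 3, a minimal VBA sketch might look like the following; the file path, sheet name and column headers (PartNumber, CuValue, Status, CalculationDate) are assumptions based on the screenshot, so adjust them to your actual workbook:

Sub LatestSuccessCuValue()
    ' Query the sheet directly with ADO and pick the latest "Success" row
    ' for one part number (all names here are assumed, not taken from the workbook).
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Parts.xlsx;" & _
            "Extended Properties='Excel 12.0 Xml;HDR=YES'"

    Set rs = cn.Execute( _
        "SELECT TOP 1 CuValue FROM [Sheet1$] " & _
        "WHERE PartNumber = 'ABC123' AND Status = 'Success' " & _
        "ORDER BY CalculationDate DESC")

    If Not rs.EOF Then MsgBox "Latest CuValue: " & rs.Fields(0).Value

    rs.Close
    cn.Close
End Sub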
I want to write a set of 100 SELECT queries in DB2 10.1 to return all rows in each table in the database, and have the results exported to an Excel spreadsheet with a new tab for each result set.
Is this possible, and if so, how can I do it?
At the moment it looks like the only way to do this is to export each result set separately and then manually build the multi-tabbed spreadsheet by copying each tab across.
Thanks
You can use the EasyXLS API for Excel with scripting languages like VBScript.
The VBS code should be similar to this one:
' The class that exports result sets to an Excel file
Set xls = CreateObject("EasyXLS.ExcelDocument")

' The class used to format the cells
Dim xlsAutoFormat
Set xlsAutoFormat = CreateObject("EasyXLS.ExcelAutoFormat")
xlsAutoFormat.InitAs(AUTOFORMAT_EASYXLS1)

For query = 1 To 100
    ' Add a new sheet
    xls.easy_addWorksheet_2("Sheet" & query)
    Set xlsSheet = xls.easy_getSheetAt(query - 1)

    ' Create the record set object
    Dim objResultSet
    Set objResultSet = CreateObject("ADODB.Recordset")
    objResultSet.Open queryString, objDBConnection

    ' Create the list that will store the values of the result set
    Dim lstRows
    Set lstRows = CreateObject("EasyXLS.Util.List")

    ' Add the header to the list
    Dim lstHeaderRow
    Set lstHeaderRow = CreateObject("EasyXLS.Util.List")
    lstHeaderRow.addElement("Column 1")
    lstHeaderRow.addElement("Column 2")
    lstHeaderRow.addElement("Column 3")
    lstRows.addElement(lstHeaderRow)

    ' Add the values from the database to the list
    Do Until objResultSet.EOF = True
        Set RowList = CreateObject("EasyXLS.Util.List")
        RowList.addElement("" & objResultSet("Column 1"))
        RowList.addElement("" & objResultSet("Column 2"))
        RowList.addElement("" & objResultSet("Column 3"))
        lstRows.addElement(RowList)

        ' Move to the next record
        objResultSet.MoveNext
    Loop

    xlsSheet.easy_insertList_2 lstRows, xlsAutoFormat
Next

' Export result sets to Excel file
xls.easy_WriteXLSFile("c:\Result sets.xls")
Check also this link about exporting lists of data to Excel.
If you choose a non-scripting language, the API has methods that insert data directly from the result set, and the lists can be skipped. Check another sample here.
I have an Excel Spreadsheet that calculates a risk (of perioperative mortality after aneurysm repair) based on various test results.
The user inputs the test results into the spreadsheet (into cells) and then out comes a set of figures (about 6 results) for the various models that predict mortality. The spreadsheet acts as a complex function to produce the results one patient at a time.
I also have a (separate) access database holding data on multiple patients - including all the data on test results that go into the spreadsheet. At the moment I have to manually input this data into the spreadsheet, get the results out and then manually enter them onto the database.
Is there a way of doing this automatically? I.e., can I export data1, data2, data3... from Access into the spreadsheet cells where the data needs to be input, and then get the results (result1, result2, result3...) from the cells where the results are displayed ported back into Access?
Ideally this could be done live.
I suppose I could try to program the functionality of the spreadsheet into a complex function in Access, but if I'm honest, I am not really sure how the algorithm in the spreadsheet works. It was designed by anaesthetists who are much cleverer than me...
Hope this makes sense. Any help much appreciated.
Chris Hammond
It's possible to automate Excel from Access.
Const cstrFile As String = "C:\SomeFolder\foo.xls"

Dim xlApp As Object
Dim xlWrkBk As Object
Dim xlWrkSt As Object

Set xlApp = CreateObject("Excel.Application")
xlApp.Workbooks.Open cstrFile, ReadOnly:=True
Set xlWrkBk = xlApp.Workbooks(1)
Set xlWrkSt = xlWrkBk.Worksheets(1)

With xlWrkSt
    .Range("A1") = 2
    .Range("A2") = 19
    Debug.Print .Range("A3")
End With

xlWrkBk.Close SaveChanges:=False
xlApp.Quit              'close the hidden Excel instance
Set xlWrkSt = Nothing
Set xlWrkBk = Nothing
Set xlApp = Nothing
However, that seems like it would be cumbersome to repeat for each row of an Access table and I'm uncertain whether doing that live is reasonable.
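If you did go the automation route, a rough sketch of that per-row loop might look like the following, run between opening the workbook and closing it in the snippet above; the table tblPatients, its fields and the input/output cells (A1:A2 in, A3 out) are all hypothetical:

Dim db As DAO.Database
Dim rs As DAO.Recordset

Set db = CurrentDb
Set rs = db.OpenRecordset("tblPatients", dbOpenDynaset)

Do Until rs.EOF
    ' push this patient's test results into the workbook's input cells
    xlWrkSt.Range("A1") = rs!Test1
    xlWrkSt.Range("A2") = rs!Test2
    xlApp.Calculate                     'force the workbook to recalculate

    ' pull the model output back into the Access table
    rs.Edit
    rs!Result1 = xlWrkSt.Range("A3").Value
    rs.Update

    rs.MoveNext
Loop

rs.Close
Set rs = Nothing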
I would try to adapt the Excel calculations to Access VBA functions and use those custom functions in an Access query, but I don't know how big a task that would be. I suggest you shouldn't be scared off by the anaesthetists' cleverness; that doesn't mean they actually know much more about VBA than you. At least look to see whether you can tackle it.
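As a trivial illustration of that approach (the function name, parameters and formula below are placeholders, not the real model), a public function in an Access module can be called straight from a query:

Public Function MortalityRisk(Age As Double, Creatinine As Double) As Double
    'placeholder formula - the real calculation would be transcribed from the workbook
    MortalityRisk = 0.01 * Age + 0.2 * Creatinine
End Function

A query such as SELECT PatientID, MortalityRisk([Age], [Creatinine]) AS Risk FROM tblPatients could then compute the result for every patient at once (again, the field and table names here are made up).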
To push the data back to Access, you can insert data from within the Excel VBA as follows:
Dim val As Variant
Dim db As DAO.Database

val = ThisWorkbook.Worksheets(1).Range("A1").Value
Set db = OpenDatabase("c:\myAccessDB.accdb")
db.Execute "INSERT INTO patientData (someField) VALUES (" & val & ")", dbFailOnError
db.Close
You'll need to add a reference to the Microsoft Office Access Database Engine Object Library.
I'm not sure I perfectly understand what you want, but if you just want to export the results of a query to a spreadsheet, you could use the following:
Private Sub ExportAccessDataToExcel()
    Dim SqlString As String

    SqlString = "CREATE TABLE testMeasurements (TestName TEXT, Status TEXT)"
    DoCmd.RunSQL (SqlString)

    SqlString = "INSERT INTO testMeasurements VALUES('Average Power','PASS')"
    DoCmd.RunSQL (SqlString)

    SqlString = "INSERT INTO testMeasurements VALUES('Power Vs Time','FAIL')"
    DoCmd.RunSQL (SqlString)

    SqlString = "SELECT testMeasurements.TestName, testMeasurements.Status INTO exportToExcel "
    SqlString = SqlString & "FROM testMeasurements "
    SqlString = SqlString & "WHERE (((testMeasurements.TestName)='Average Power'));"
    DoCmd.RunSQL (SqlString)

    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel7, "exportToExcel", "C:\TestMeasurements.xls", True, "A1:G12"
End Sub
Source: http://www.ehow.com/how_7326712_save-access-query-excel-vba.html
This could be done either directly from the database or from Excel (you would need to open the database with Excel VBA to do so, but most of the Office Suite products interact well with each other).
If you want to push the data of your spreadsheet into an Access database, that's different. You just have to open the database and loop through INSERT queries. Here is a quick example; you just need to add the loop:
Dim db As DAO.Database
Set db = OpenDatabase("myDataBase.mdb")
Call db.Execute("INSERT INTO myTable (Field1, Field2) VALUES('Value1', 'Value2')")
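A minimal sketch of what that loop might look like from the Excel side, assuming the values sit in columns A and B of the first sheet starting at row 2 (a made-up layout):

Dim db As DAO.Database
Dim r As Long

Set db = OpenDatabase("C:\myAccessDB.mdb")   'adjust the path

' walk down column A until the first empty cell
r = 2
Do While ThisWorkbook.Sheets(1).Cells(r, 1).Value <> ""
    db.Execute "INSERT INTO myTable (Field1, Field2) VALUES('" & _
               ThisWorkbook.Sheets(1).Cells(r, 1).Value & "', '" & _
               ThisWorkbook.Sheets(1).Cells(r, 2).Value & "')", dbFailOnError
    r = r + 1
Loop

db.Close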