Coded UI tests - run multiple tests

I have the following problem.
I need to run 10 tests in a row, driven by an Excel workbook. Each row is one test case.
My problem is that when I add the following attribute:
[DataSource("System.Data.Odbc",
"Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};dbq=|DataDirectory|\\RecordedSteps\\Input.xlsx;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=5;readonly=true",
"List1$", DataAccessMethod.Sequential), TestMethod]
it iterates over the entire Excel workbook within a single test case.
What I need is to read one line of the Excel worksheet, fill in the inputs, complete the test case, then move on to the next line, and so on...

The [DataSource(...)] and [TestMethod] attributes apply to the method that immediately follows them. The [DataSource(...)] attribute instructs Coded UI to run the method multiple times: once for each set of data, e.g. for each row in the spreadsheet.
To run 10 different tests requires 10 methods, each having a [TestMethod] attribute. Any of these methods can be data driven, but each will need its own [DataSource(...)] attribute.
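For example, here is a minimal sketch of a test class with two independent test methods, the first one data driven. The connection string and the List1$ worksheet come from the question; the column names UserName and Password and the second scenario are placeholders of my own:

using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class ExcelDrivenTests
{
    // Coded UI injects the current data row into this property on every iteration.
    public TestContext TestContext { get; set; }

    // Data-driven test: Coded UI runs this method once per row of the List1$ worksheet.
    [DataSource("System.Data.Odbc",
        "Dsn=Excel Files;Driver={Microsoft Excel Driver (*.xls)};dbq=|DataDirectory|\\RecordedSteps\\Input.xlsx;defaultdir=.;driverid=790;maxbuffersize=2048;pagetimeout=5;readonly=true",
        "List1$", DataAccessMethod.Sequential)]
    [TestMethod]
    public void FillFormForOneRow()
    {
        // Read this iteration's values from the current spreadsheet row.
        string userName = TestContext.DataRow["UserName"].ToString();
        string password = TestContext.DataRow["Password"].ToString();

        // ...drive the recorded UI actions and assertions with these values...
    }

    // A second, independent test needs its own [TestMethod] attribute and,
    // if it is data driven, its own [DataSource] attribute.
    [TestMethod]
    public void SecondScenario()
    {
        // ...
    }
}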

Related

Multi-variable data table in Excel causing it to hang up

I have 13 long sets of data of different variations of an instrument. For a particular set of parameters (2 parameters - say X collectively), I have a series of formulas that calculate certain outputs (26 outputs - 2 for each data set, say Y collectively) in a table.
Now, I build a new table populated with about 20 sets of all the different Xs and all the resultant Ys, using a method suggested here (linked in Edit 1 below), wherein the INDEX function and the data table feature of Excel are used along with 2 selectors (1 for the parameter set and 1 for the output set) to create the desired multi-variable table for a table full of different parameters.
I have used this method a few times before as well without any problems. This time, however, the data tables are too long and each output calculation needs going through all of them using multiple INDEX functions. This is causing Excel to hang up a lot. For a single set of parameters it works just fine. As soon as I try to populate my multi-parameter table, it starts acting up. So bad that my PC almost freezes.
Any suggestions on how this process can be made faster? I had seen some options for multi-variable tables and found this a convenient one. In terms of computational complexity, is this an inefficient method, and is there a more efficient one?
I tried using Excel Online, hoping for Microsoft to use its own computing power there, but it acts weirdly online - random deletion of cells and stuff. Has anyone had success with Excel Online for such computation issues? (I have an Office 365 subscription.)
My PC specs:
Intel Core i7 5500U
16GB DDR3 RAM
Windows 10 x64
Let me know if any other information is required. Thank you in advance.
Edit 1: A minimal example is best represented on this page - https://www.mathscinotes.com/2016/10/excel-data-table-with-more-than-two-input-variables/
My own sheet uses exactly the same method except with higher number of parameter sets and output sets and larger amount of data to create output from.
Edit 2: I am attaching a screenshot of my Excel sheet as an attempt to explain my own formulas, as suggested by BigBen. My Excel sheet's formulas' screenshot
In the event that the link I posted dies, I am also attaching the image of the example that I had previously mentioned. Mark Biegert of Math Encounters' example

Copy and paste Excel rows between two workbooks based on criteria from exported Access data

I have no previous experience in Access, VBA coding or in Excel macros prior to teaching myself the past month via these forums. Thank you forums and contributors. I have enjoyed my Access learnings so far, the challenge that it has provided and appreciate any help that I can get. As such, the code and methods that I have used to this point may well be convoluted and confusing. I will do my best to provide relevant details and accurate terminology.
I work in a lab and I am creating an Access Form for semi-automated reporting. Samples are received from clients and are logged into the Excel Table R&D Log. The worksheet is InProcess. Samples are sorted based on the site in which they originate and given a one or two letter site code (G, D, WH, etc.) and an ID "yy-000" in separate Excel columns (i.e. D 18-096). Samples may be submitted for multiple analyses (Metals, Water, Soil, etc.) and may even have multiple rows of reporting if multiple analytes are identified in the sample. There are several other columns, such as receipt date, reporting date, units, etc. Once samples are reported, I manually copy and paste them into the Archived worksheet, and delete the record and blank row from the InProcess worksheet. Since one sample may have multiple analyses and even more potential results, each record would be reported on a new Excel row (with the same D 18-096 ID number). Thus, there is not a single unique identifier or primary key for each sample in the current format. R&D Log is updated manually by lab technicians and the worksheet InProcess is a linked table in an Access Database.
The Access Database is using two combo boxes on a Form frmInProcess to filter a Query qryInProcess of the linked table. The combo boxes are filtering the report destination (one client may receive multiple site codes) and the analysis (reports are separated based on type of analysis). The Query is also filtering out blank results and blank dates, so only completed samples will appear on the filtered Form. I have generated VBA code to this point that will export the Form to a .pdf, save the file with unique filename, and open outlook to mail out the report. I have also managed to export the filtered Form frmInProcess to an Excel file Access Test (not the linked file).
What I would like to do now is to automate the transfer of completed test results from the Excel worksheet R&D Log: InProcess to R&D Log: Archived and delete the record from the InProcess worksheet. I am not sure if I can export the filtered Form into a linked Excel table, or if I must use a separate Excel file (or if it even matters for simplicity of code?). I would now like to read the exported filtered Form in Excel Access Test, lookup matching rows in R&D Log based on several criteria (site, ID, Analysis, Analyte, Report Date) and automate the transfer of records between R&D Log worksheets. End result being that Access generates reports for completed tests, and the records are removed from InProcess testing and transferred to Archived testing in Excel. I am guessing that I may need to close the Access application and perform this in Excel. Hope this is easy enough to follow.
Thank you.
In my experience, importing an Excel document into a temporary NEW (or totally empty) Access table is usually the easiest way to go. Then you do not have to worry about cell references like you do in Excel VBA. Even if the Excel document has old data in it with just a few new changes each time, importing it into a temporary Access table could be the simplest way to go, because then you can compare the data in this table with the data in another, permanent Access table and update the latter based on the former.
As for the original Excel file: if you need to delete rows there, it might be quicker to export a new Excel file with just the data the old one is supposed to end up with, and then use VBA to delete (or - safer! - rename) the old file.
So the development process goes something like this:
1. Save the import steps: import the Excel file once via Access' ribbon ("External Data" tab -> "Excel"), and when you finish, be sure to check the "Save import steps" box and note the name you give the saved import, because you will need it in your VBA code.
2. In Access, write a function for deleting the temporary table. The VBA code is:
   Const cTable = "MyExcelTempTable"
   ' TableExists is a small helper you write yourself,
   ' e.g. by looping CurrentDb.TableDefs or checking MSysObjects.
   If TableExists(cTable) Then
       DoCmd.DeleteObject acTable, cTable
   End If
   Now you can test your delete function on the data you imported.
3. Write VBA code to import the same spreadsheet, re-creating the same table:
   Const cSavedImport = "Import-MyExcelTempTable"
   ' Re-run the saved import to rebuild the temporary table
   DoCmd.RunSavedImportExport cSavedImport
4. Write more VBA function(s) to check the imported table for bad data and then to copy it into the permanent table. You might be updating existing records or adding new ones. Either way, you could use Access queries or SQL to do this and run them from VBA.
5. Write a VBA function to rename the old Excel file. (You could use an InputBox if the Excel file name is different each time. I do this for importing Excel files, and I set a default value so I do not have to type as much.)
6. Write a VBA function to export the new version of the Excel file.
7. Make yourself a button on a form that, when clicked, runs a VBA function. Inside that function, run Steps 2 through 6, above.
I am not sure my answer exactly matches what you are trying to do, but hopefully you get enough of a picture of the workflow to figure out the details of what you need.

To print PASS/FAIL status in Excel after Coded UI tests

I am using Coded UI for testing.
I have a set of 5 fields to be validated in one page.
I have added assert and record methods and all that is working fine.
Is it possible to import an Excel file with field-level validations and mark a Pass/Fail status in that Excel sheet through Coded UI?
That is, the test result should be written in the last column of the Excel sheet against each of the fields.
Is that possible?
Yes it is possible.
Here's how to read from Excel:
http://msdn.microsoft.com/en-us/library/ee624082.aspx
And here are some links to help with writing to Excel:
Writing to an existing Excel File using c#
http://msdn.microsoft.com/en-us/library/dd264733.aspx
http://www.codeproject.com/Tips/613782/Read-and-Write-Excel-File-Dynamically
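As a concrete illustration, here is a minimal sketch of writing a PASS/FAIL verdict back into the spreadsheet using Excel interop (it assumes a project reference to Microsoft.Office.Interop.Excel; the first-worksheet index and the choice of result column are placeholders of my own, not anything from the question):

using System.Runtime.InteropServices;
using Excel = Microsoft.Office.Interop.Excel;

public static class ExcelResultWriter
{
    // Writes "PASS" or "FAIL" into the given cell of the first worksheet.
    public static void WriteVerdict(string workbookPath, int row, int resultColumn, bool passed)
    {
        var app = new Excel.Application();
        Excel.Workbook wb = null;
        try
        {
            wb = app.Workbooks.Open(workbookPath);
            var ws = (Excel.Worksheet)wb.Worksheets[1];

            // Put the verdict in the result column of the row under test.
            ((Excel.Range)ws.Cells[row, resultColumn]).Value2 = passed ? "PASS" : "FAIL";

            wb.Save();
        }
        finally
        {
            if (wb != null) wb.Close(false);
            app.Quit();
            Marshal.ReleaseComObject(app);
        }
    }
}

In the test method you could wrap the five field checks in a try/catch and call WriteVerdict after each one, keeping an ID or row-number column in the sheet so each result lands on the correct row. Note that the workbook cannot be written to while another process still has it open.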

Excel MATCH function not working with automatic data update

I have an excel function that looks like this:
=MATCH($A3,Table[Column],FALSE)+1
What this function does is return the row number of the matching row where
Table[Column] == $A3
and this works fine in a static file. However, as soon as I set the data in Table to update automatically every minute, it stopped working (I get an #N/A error as the function's return value).
After imports/updates to tables and pivot tables, you often need to use the Refresh All button on the Data ribbon. That should clear this issue right up! I would even create a VBA macro to call the refresh automatically after every update.
I've also found it's better to use named ranges or dynamic named ranges in situations such as this. Check this out: http://support.microsoft.com/kb/830287

Is there a code generator to create a DataTable definition block from an Excel worksheet?

Basically, what I want to achieve is to have a DataTable that I can use in my unit tests. When I run my unit tests, I do not want to read any Excel file into a DataTable - or make any call to the DB.
So, I would like to have a method that returns a DataTable with the values that I can use in my test.
Is there already a tool to read an Excel sheet and generate code that defines an ADO.NET DataTable?
Thanks,
burak ozdogan
The Microsoft ADO.NET driver for Excel should do the trick: How To Use ADOX with Excel Data from Visual Basic or VBA
I think QuickUnit does: http://www.quickunit.com/
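If no ready-made tool turns up, a rough sketch of a one-off generator is below: it reads the worksheet once through the ACE OLE DB provider into a DataTable, then prints C# statements that rebuild the same table in code, which you can paste into the test project. The file path, the Sheet1$ name and the provider string are assumptions of mine; adjust them to your workbook.

using System;
using System.Data;
using System.Data.OleDb;
using System.Text;

class DataTableCodeGenerator
{
    static void Main()
    {
        // Connection string for an .xlsx file with a header row (path and provider assumed).
        const string connStr =
            @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\TestData.xlsx;" +
            "Extended Properties='Excel 12.0 Xml;HDR=YES'";

        var table = new DataTable("TestData");
        using (var conn = new OleDbConnection(connStr))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn))
        {
            adapter.Fill(table);   // Fill opens and closes the connection itself
        }

        // Emit a factory method that recreates the table without touching Excel.
        var code = new StringBuilder();
        code.AppendLine("public static DataTable CreateTestData()");
        code.AppendLine("{");
        code.AppendLine("    var dt = new DataTable(\"TestData\");");
        foreach (DataColumn col in table.Columns)
            code.AppendLine($"    dt.Columns.Add(\"{col.ColumnName}\", typeof({col.DataType.Name}));");
        foreach (DataRow row in table.Rows)
        {
            // Naive: every value is emitted as a verbatim string and relies on the
            // DataTable converting it back to the column type; tweak for dates/decimals.
            var values = string.Join(", ", Array.ConvertAll(row.ItemArray,
                v => "@\"" + Convert.ToString(v).Replace("\"", "\"\"") + "\""));
            code.AppendLine($"    dt.Rows.Add({values});");
        }
        code.AppendLine("    return dt;");
        code.AppendLine("}");

        Console.WriteLine(code);
    }
}

The generated method can then live inside the test project, so the tests never open the spreadsheet or call the database at run time, which is what you describe.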

Resources