Need the table name in ALM where the test case status is stored

I need to run the test cases given in ALM and update the status of each test case. Can anyone tell me which table in the ALM database stores the test case status?

There are different kinds of status in QC (and different tables where they are stored):
In the TEST table there is a column TS_STATUS (the test's execution status).
In the TESTCYCL table there is a column TC_STATUS (the test instance's execution status).
In the RUN table there is a column RUN_STATUS (the run status).
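As a rough sketch, a query against these tables might look like the following. The join key names (TS_TEST_ID, TC_TEST_ID) are assumptions based on the usual ALM column prefixes and should be verified against your schema version:

```sql
-- Illustrative only: verify table and column names against your ALM schema.
-- Lists each test instance together with its execution status.
SELECT t.TS_NAME,
       tc.TC_STATUS              -- test instance execution status
FROM   TESTCYCL tc
JOIN   TEST t
  ON   t.TS_TEST_ID = tc.TC_TEST_ID   -- assumed join key
WHERE  tc.TC_STATUS = 'Failed';
```

Note that ALM's supported route for updating statuses is the OTA API rather than direct table writes, so treat queries like this as read-only reporting.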

How to terminate pipelines in Azure Synapse when a query returns no rows

I have a pipeline A that is invoked by a main pipeline D, which also invokes two other pipelines, B and C. When pipeline A is invoked, an extraction query is executed that can return rows or nothing.
If it returns no rows, I would like pipeline A to terminate without raising an error.
It should also terminate the main pipeline D; in other words, pipelines B and C shouldn't be invoked.
How can I achieve such a terminating activity in Azure Synapse? I would like to avoid using a Fail activity, as that would be a false negative.
Since your child pipeline has the lookup output count, and there is no direct way to pass that count to the master pipeline, you can consider changing the pipeline configuration.
Instead of using a lookup to get the count, you can use a copy data activity directly and write the record count to a new table.
You can then read this table with a lookup in the master pipeline and perform the check (whether the count is 0 or not).
Look at the following demonstration. I have a table with no records in my Azure SQL database. In pipeline A, I used the following query as the source of the copy data activity and auto-created a table in the sink.
-- in source. Querying the required table for count
select count(*) as count from demo
Now, in the master pipeline, use an additional lookup to read the records from the count_val table created above. The output will be as follows:
Now you can use this count in an If Condition activity with the following dynamic content:
@equals(activity('Lookup1').output.value[0].count, 0)
The condition will be true when the count is 0 (as shown below), and hence the flow stops (there are no activities in the true case). If there are records, the next pipelines will be executed (the false case).
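For reference, the lookup in the master pipeline only needs to read the single count value back. Assuming the copy activity auto-created the table as dbo.count_val (the table name from the demonstration; the schema is an assumption), its query could be as simple as:

```sql
-- Assumes the sink table auto-created by the copy data activity is dbo.count_val
SELECT TOP 1 [count] FROM dbo.count_val;
```

Keep in mind the copy activity appends on each run unless you truncate the table first via a pre-copy script.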

How to create an audit table in Azure Data Factory that holds the status of a pipeline run

I have a requirement where an Azure Data Factory pipeline is running, and inside it we have a data flow that loads different tables from ADLS into an Azure SQL database. I want to store the status of the pipeline (success or failure) in an audit table, together with the primary key column ID from the Azure SQL database table, so that I can later filter jobs by that primary key (e.g. find out for which ID the job succeeded). I managed to store the status in a table using a stored procedure, but I am unable to add a column like ID. Below is a screenshot of the pipeline.
The Report_id column comes from the table loaded by the Dataload pipeline. How can I add it to the audit table so that every time the pipeline runs, Report_id is captured and stored there?
Audit table where I want to add Report_id:
Any help will be appreciated. Thanks!
The Data Flow must have a sink. So, after the Data Flow completes, you need to use a Lookup activity to get the value of that Report_Id from the sink. Then, you can set that to a variable and pass that into your Stored Procedure. (You could also just pass it directly to the Stored Procedure from the Lookup using the same expression you would use to set the variable.)
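As a sketch of what the stored-procedure side could look like (the table, procedure, and activity names here are illustrative, not taken from the question):

```sql
-- Hypothetical audit table; adjust column types to your data.
CREATE TABLE dbo.PipelineAudit (
    AuditId      INT IDENTITY(1,1) PRIMARY KEY,
    PipelineName NVARCHAR(200),
    RunStatus    NVARCHAR(20),     -- 'Success' / 'Failure'
    Report_Id    INT,              -- captured from the Data Flow sink via the Lookup
    LoggedAt     DATETIME2 DEFAULT SYSUTCDATETIME()
);
GO

CREATE PROCEDURE dbo.usp_LogPipelineRun
    @PipelineName NVARCHAR(200),
    @RunStatus    NVARCHAR(20),
    @Report_Id    INT
AS
BEGIN
    INSERT INTO dbo.PipelineAudit (PipelineName, RunStatus, Report_Id)
    VALUES (@PipelineName, @RunStatus, @Report_Id);
END;
```

In the Stored Procedure activity, the @Report_Id parameter could then be set with dynamic content along the lines of @activity('Lookup1').output.firstRow.Report_Id (the activity name is an assumption).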

How to create a report subscription in SSRS which passes today's date to parameters?

I created a report with a StartDate and EndDate parameter. If I want to see the information for a single day, I use the same date in both parameters. I now want to create a subscription for this report so that it runs every day. How can I get the current date and pass it to these parameters when the report runs? Thanks!
Step 1: Create a defaultDates dataset as shown below. It can be a plain query, or you can wrap it in a stored procedure.
Select TodaysDate = cast(getdate() as date)
Step 2: Then, under the default values for both parameters, choose to get the value from a dataset and point it to this dataset, defaultDates.
Step 3: Test it locally. Make sure to delete the .DATA file from your working directory to force fresh data.
Step 4: Build and deploy to your test location.
EDIT: This will only work with Enterprise edition.
First, write a query that gets the current date and formats it to match the VALUE in your parameter (for example, is it DD-MM-YYYY, YYYY-MM-DD?). Make sure to name your column something meaningful like "CurrentDate".
select cast(current_timestamp as date) as CurrentDate
Then create a new subscription for your report. Instead of a standard subscription, choose a data-driven subscription. Select your SQL data source and paste in your query. Press Validate to make sure it runs fine, then hit OK.
Now you can go down to your subscription parameters at the very bottom of the page. Set Source of Value to be "Get value from dataset" then pick your "CurrentDate" from the drop down.
That's it, data driven subscription with current date.

Transitioning from Excel to Access - challenges in assigning/determining a primary key

I am currently attempting to transition a large database of testing records from Excel to Access to allow better relational analysis between different groups. However, because of how our team completes tests, the same tests are repeated on a cycle, making it difficult to capture a unique ID.
I have considered assigning an ID to each test, but then realized the ID would still repeat when the test is run again. I have considered using the timeframe of the review, but that would also repeat unless I built a separate table for each individual test. I have considered using the issue number assigned to items requiring action, but this would not apply to all rows, so Access would not allow it.
Within our current database we capture test name, type, timeframe reviewed, start and end dates, result types, descriptions and root cause, and issue identifiers if remediation is required.
Does anyone have any suggestions on how I might transition this data into Access without losing the Primary Key feature on certain tables?
I would suggest creating a master table to identify your 'test types' (the things that can be run multiple times). You populate this table with the unique entries in the test name column in your Excel data. This column in your Excel data then becomes a 'foreign key' pointing to your master 'test types' table.
Ideally I would recommend using an autonumber as the primary key in the master table and then replacing the test name column in your Excel data with the numeric values generated when you populate the 'test types' table.
You would then import your Excel data into a table that is recording 'Test Run Instances'. Again I would set an autonumber field as the primary key on this table. Other columns in your Excel data that contain repeating data would make candidates for other 'master' tables (e.g. you talk about 'result types' - is there a set list of these that could be moved out to a master table?).
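In Access DDL terms, the two tables might be sketched like this (all names are illustrative; AUTOINCREMENT is Access's autonumber type):

```sql
-- Master table of test types (one row per distinct test name).
CREATE TABLE TestTypes (
    TestTypeID  AUTOINCREMENT PRIMARY KEY,
    TestName    TEXT(255)
);

-- One row per execution of a test; the foreign key points at TestTypes.
CREATE TABLE TestRuns (
    TestRunID   AUTOINCREMENT PRIMARY KEY,
    TestTypeID  LONG REFERENCES TestTypes (TestTypeID),
    StartDate   DATETIME,
    EndDate     DATETIME,
    ResultType  TEXT(50),
    Description MEMO
);
```

The autonumber primary key on TestRuns gives every execution its own unique ID, while TestTypeID ties repeated runs of the same test back to a single master row.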

Optional Prompt in Cognos 8.3

I'm new to Cognos Report Studio. I have data organized in query items per business date, and the report filters the data by business date. The business date comes from a separate joined table, which is always a one-column, one-row table containing the current business date. The report is scheduled to run every day after the data is populated in the data source. This part works just fine.
However, I want the ability to run this same report on demand, and when I do, it should prompt me for a date instead of picking it from the database table. I know how to add a prompt, but when I add one, my scheduled reports don't work! I can't provide a static default, as the date has to be picked dynamically from the table.
Any pointers would be helpful!
Make the date filter (the one derived from the prompt) optional and set a condition on the other filter to be ignored if the prompt has a value.
This is late, but it could be helpful for others. I suggest this:
- In the metadata, create a prompt macro with a default value of the current business date
- In the report, create a date prompt using the parameter defined by the prompt macro
- In the report schedule, leave the date prompt blank so it refreshes every day
This way your schedule runs with a refreshed date, and you can also pick a date when running the report manually.
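The prompt macro in the metadata could look something like the following sketch. The parameter name (pBusinessDate) and the default expression pointing at the business-date query item are assumptions to be replaced with your own:

```
#prompt('pBusinessDate', 'date', '[Business Layer].[Dates].[Current Business Date]')#
```

When the schedule runs with the prompt left blank, the macro falls back to the third argument (the current business date from the table); a manual run supplies the user's chosen date instead.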