Using ALM OTA, how do we know if an ALM Resource is checked out?

I have a requirement to upload a modified Resource file into ALM Test Resources. Before uploading, I need to verify whether the Resource is checked out, and if not, I need to check out that Resource.
I need to perform all these actions using OTA. I am able to get the particular resource object and am able to check it out and check it in.
But I am not able to get the version control status (checked out / checked in). I found in the ALM OTA API Reference that the IsCheckedOut property can give this result, but I cannot figure out how this property is supposed to be used. Below is my code:
objFilter.Filter("RSC_FOLDER_NAME") = QCResourceFolderPath
Set objResourcesList = objFilter.NewList
For Each Resource In objResourcesList
    If Resource.Name = strFileName Then
        Resource.VC.Checkout ""
        Exit For
    End If
Next
This piece of code performs the checkout operation, but I am not able to use the IsCheckedOut property here.

Got the answer. The line below gives me the checked-out status:
Resource.VersionData.IsCheckedOut
So my code becomes:
objFilter.Filter("RSC_FOLDER_NAME") = QCResourceFolderPath
Set objResourcesList = objFilter.NewList
For Each Resource In objResourcesList
    If Resource.Name = strFileName Then
        ' Check out only if the resource is not already checked out
        If Not (Resource.VersionData.IsCheckedOut) Then
            Resource.VC.Checkout ""
        End If
        Exit For
    End If
Next

Found one more way to get the checkout status:
Resource("RSC_VC_STATUS") gives the status as "Checked_In" or "Checked_Out",
which we can use to implement the same logic.
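For example, a minimal sketch using the RSC_VC_STATUS field inside the same loop as above (same Resource object and variables as in the earlier snippet):

For Each Resource In objResourcesList
    If Resource.Name = strFileName Then
        ' RSC_VC_STATUS holds "Checked_In" or "Checked_Out"
        If Resource("RSC_VC_STATUS") = "Checked_In" Then
            Resource.VC.Checkout ""
        End If
        Exit For
    End If
Next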


Issue with validating Arm template using DeploymentsOperations.StartValidate

I am currently working on a project where I deploy multiple ARM templates, each deploying a VM and performing a few operations on it. I wanted to handle quota issues by calling template validation before triggering the first deployment. So I created a template that contains the logic to create the required VMs, and I am using this template only for validation (to check that the quota will not be exceeded).
Since our code already has the ResourceManagementClient, I tried the following code:
Deployment parameters = new Deployment(
    new DeploymentProperties(DeploymentMode.Incremental)
    {
        Template = templateFile,
        Parameters = parameterFile,
    });
DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);
But when I try to access the Value from the variable dp, I keep getting the following exception:
Generic Exception System.InvalidOperationException: The operation has not completed yet.
   at Azure.Core.ArmOperationHelpers`1.get_Value()
   at Azure.ResourceManager.Resources.DeploymentsValidateOperation.get_Value()
   at DeployTemplate.Program.d__3.MoveNext() in \Program.cs:line 88
I even added a loop after StartValidate to wait until dp.HasCompleted is set to true, but this seems to run indefinitely. I also tried the StartValidateAsync method, which seems to have the same issue.
I wanted to understand whether I am using this method correctly, and whether there is a better way to do the template validations. I could not find any examples of this method's usage; if possible, please share any code snippet where this method is used, for my reference.
Note: since this is not working, I am currently testing the Fluent API way. That seems to be working, but it requires a lot of changes in our code, as it creates ambiguity with many classes in "Azure.ResourceManager.Resources" that are already used for other operations.
I found that even though the deployment operation's HasCompleted field is never set, when I call dp.GetRawResponse() it returns the exact errors expected.
I now use this to validate my templates.
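A minimal sketch of that workaround, reusing the deployments client, groupName, and parameters from the question (the status check and stream handling are my assumptions, not part of the original answer):

DeploymentsValidateOperation dp = deployments.StartValidate(groupName, "validation", parameters);

// Inspect the raw HTTP response instead of waiting on dp.HasCompleted / dp.Value.
Azure.Response raw = dp.GetRawResponse();
if (raw.Status >= 400 && raw.ContentStream != null)
{
    // The response body carries the ARM validation errors (e.g. quota failures).
    using (var reader = new System.IO.StreamReader(raw.ContentStream))
    {
        Console.WriteLine(reader.ReadToEnd());
    }
}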

Terraform databricks labs provider

I have an issue with the Terraform Databricks Labs provider; the code below gives me this error:
status 400: err Response from server {"error_code":"INVALID_PARAMETER_VALUE","message":"Path must be absolute: databricks"}
There is nothing in the documentation about the path parameter. I tried without the "dbfs:" prefix, but it did not like that either.
Any help would be welcome.
resource "databricks_dbfs_file" "log4j_files" {
content = filebase64("${path.module}/log.txt")
path = "dbfs://databricks/spark-log"
overwrite = true
mkdirs = true
validate_remote_file = true
}
Please make sure to use the latest version; the format of the error message suggests you're using something before version 0.2.3.
Please use the source syntax of the dbfs file resource, as shown in this example from the integration tests:
resource "databricks_dbfs_file" "show_variables" {
source = "${path.module}/log.txt"
path = "dbfs:/path/to/log.txt"
}
The path parameter needs to include the filename.
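Applied to the original snippet, a corrected resource might look like this (assuming the file should land at dbfs:/databricks/spark-log/log.txt; the exact target path is a guess):

resource "databricks_dbfs_file" "log4j_files" {
  source = "${path.module}/log.txt"
  path   = "dbfs:/databricks/spark-log/log.txt"
}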
Here's the documentation you'll need as well: https://registry.terraform.io/providers/databrickslabs/databricks/latest/docs/resources/dbfs_file

Associate existing shared steps with a test case programmatically

I am trying to add a shared step to a test case that is being created using
CreateWorkItemAsync()
It is no problem to create test steps and add them to the test case
ITestStep testStep1 = testBase.CreateTestStep();
but I am trying to add an existing shared step to the test case, and I cannot find a way in the Azure DevOps SDK to do so.
There is an ISharedStepReference interface, which is used to call a shared step set from a test case.
You should be able to add a shared step into a specific test case. A code snippet for your reference:
ITestManagementService testService = tfsCollection.GetService<ITestManagementService>();
ITestManagementTeamProject teamProject = testService.GetTeamProject(teamProjectName);

// Find the test case by test case id
ITestCase testcase1 = teamProject.TestCases.Find(192); // 192 is the test case id

// Find the shared steps by shared step id
ISharedStep sharedStep1 = teamProject.SharedSteps.Find(140); // 140 is the shared step id

// Add the shared steps to the specific test case
ISharedStepReference sharedStepReference = testcase1.CreateSharedStepReference();
sharedStepReference.SharedStepId = sharedStep1.Id;
testcase1.Actions.Add(sharedStepReference);
testcase1.Save();
What I ended up doing is creating the test case using CreateWorkItemAsync(), then getting that work item using GetWorkItemAsync(), and then editing the Microsoft.VSTS.TCM.Steps field as XML, inserting my own nodes into the proper places to add shared steps to the test case. You can see the format of the shared steps by getting a test case with an API GET request and looking at the format of Microsoft.VSTS.TCM.Steps.
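A hedged sketch of that approach, assuming a WorkItemTrackingHttpClient named witClient plus testCaseId and sharedStepId variables (all three names are mine); the <compref> element is how the Steps XML references a shared step, so verify the exact shape against a GET of one of your own test cases:

using Microsoft.TeamFoundation.WorkItemTracking.WebApi;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi.Models;
using Microsoft.VisualStudio.Services.WebApi.Patch;
using Microsoft.VisualStudio.Services.WebApi.Patch.Json;

// Sketch only: runs inside an async method.
WorkItem testCase = await witClient.GetWorkItemAsync(testCaseId);
string steps = testCase.Fields.TryGetValue("Microsoft.VSTS.TCM.Steps", out object raw)
    ? (string)raw
    : "<steps id=\"0\" last=\"1\"></steps>";

// Insert a reference to the shared step before the closing tag;
// the id attribute should be unique among the steps in this test case.
string patched = steps.Replace("</steps>",
    $"<compref id=\"2\" ref=\"{sharedStepId}\" /></steps>");

var patch = new JsonPatchDocument
{
    new JsonPatchOperation
    {
        Operation = Operation.Replace,
        Path = "/fields/Microsoft.VSTS.TCM.Steps",
        Value = patched
    }
};
await witClient.UpdateWorkItemAsync(patch, testCaseId);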

Getting "required (...)+ loop did not match anything at input 'Scenario:'" error when using Background section in cucumber

I am writing a Karate DSL test to test a web service endpoint. I have already defined my base URL in the karate-config.js file, but when I try to use it in the Background section, I get the error below. Please help. My feature file is provided below.
Error: "required (...)+ loop did not match anything at input 'Scenario:'"
Feature: Test Data Management service endpoints that perform different operations with EPR
Background:
url dataManagementUrlBase
Scenario: Validate that the contractor's facility requirements are returned from EPR
Given path 'facilities'
And def inputpayload = read('classpath:dataManagementPayLoad.json')
And request inputpayload
When method post
Then status 200
And match $ == read('classpath:dataManagementExpectedJson.json')
You are missing a * before the url:
Background:
* url dataManagementUrlBase

Extending WSS3 task lists to support recurring reminders

WSS 3.0 will let me send an email to a group when a new task is added to a task list. What I would like to do is run a weekly task that sends out reminders of tasks due within certain periods, e.g. 2 days, 7 days, 14 days, etc. I thought the simplest way would be to build a little C# app that sits on the WS2K3 box and queries the WSS database. Any ideas on which tables I should be checking? More generally, is there an overall schema for the WSS3 database system?
If anyone is aware of an existing solution with code, please let me know.
Thx++
Jerry
My suggestions:
Don't create a console app; create a class that inherits from SPJobDefinition.
Set SPJobLockType.Job on this timer; this guarantees that the job is executed only once in the whole farm, even if you are running multiple front-end servers.
In the timer job, open the SPSite and SPWeb objects you need, then find the SPList.
Using SPQuery, filter out only the items you need; I believe you will have to select the ones where Status != Complete.
Loop through the results collection (which will be of type SPListItemCollection), apply your rules by checking the DueDate against DateTime.Now, and send the e-mails.
Since a task is simply an SPListItem, it has a Properties property, which is actually a property bag; you can add whatever properties you need. So, add a property My_LastSentReminderDate and use it to check that you are not sending too much "corporate spam" :-) (see the sketch after this list).
To install your SPJobDefinition in a SharePoint farm, you can use a PowerShell script. I can give you examples, if needed.
Don't forget to set Threading.Thread.CurrentThread.CurrentCulture = Your_SPWeb_Instance.Locale, otherwise date comparisons may not work if the web has a different locale!
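A minimal sketch of that property-bag check, assuming it runs inside the loop of the timer job shown below (the property name My_LastSentReminderDate is arbitrary):

' Sketch only: skip items we already reminded about today
Dim oLastSent As Object = oUndoneTask.Properties("My_LastSentReminderDate")
If oLastSent Is Nothing OrElse CDate(oLastSent) < Now().Date Then
    ' send the reminder here, then remember when we sent it
    oUndoneTask.Properties("My_LastSentReminderDate") = Now().Date
    oUndoneTask.SystemUpdate(False)
End If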
EDIT: This is how a typical reminder looks in my applications:
Public Class TypicalTimer
    Inherits SPJobDefinition

    Public Sub New(ByVal spJobName As String, ByVal opApplication As SPWebApplication)
        'this way we can explicitly specify we need to lock the JOB
        MyBase.New(spJobName, opApplication, Nothing, SPJobLockType.Job)
    End Sub

    Public Overrides Sub Execute(ByVal opGuid As System.Guid)
        'whatever functionality is there in the base class...
        MyBase.Execute(Guid.Empty)
        Try
            Using oSite As SPSite = New SPSite("http://yourserver/sites/yoursite/subsite")
                Using oWeb As SPWeb = oSite.OpenWeb()
                    Threading.Thread.CurrentThread.CurrentCulture = oWeb.Locale
                    'find the task list and read the "suspects"
                    Dim oTasks As SPList = oWeb.Lists("YourTaskListTitle")
                    Dim oQuery As New SPQuery()
                    oQuery.Query = "<Where><Neq><FieldRef Name='Status'/>" & _
                                   "<Value Type='Choice'>Complete</Value></Neq></Where>"
                    Dim oUndoneTasks As SPListItemCollection = oTasks.GetItems(oQuery)
                    'extra filtering of the suspects.
                    'this can also be done in the query, but I don't know your rules
                    For Each oUndoneTask As SPListItem In oUndoneTasks
                        If oUndoneTask(SPBuiltInFieldId.TaskDueDate) IsNot Nothing AndAlso _
                           CDate(oUndoneTask(SPBuiltInFieldId.TaskDueDate)) < Now().Date Then
                            ' this is where you send the mail
                        End If
                    Next
                End Using
            End Using
        Catch ex As Exception
            MyErrorHelper.LogMessage(ex)
        End Try
    End Sub
End Class
To register a timer job, I typically use this kind of script:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Administration")
[System.Reflection.Assembly]::LoadWithPartialName("Your.Assembly.Name.Here")

$spsite = [Microsoft.SharePoint.SPSite]("http://yourserver/sites/yoursite/subsite")
# Constructor arguments: the job name shown in Central Admin, and the web application
$params = @("This text shows up in your timer job list (in Central Admin)", $spsite.WebApplication)
$newTaskLoggerJob = new-object -type Your.Namespace.TypicalTimer -argumentList $params

$schedule = new-object Microsoft.SharePoint.SPDailySchedule
$schedule.BeginHour = 8
$schedule.BeginMinute = 0
$schedule.BeginSecond = 0
$schedule.EndHour = 8
$schedule.EndMinute = 59
$schedule.EndSecond = 59
$newTaskLoggerJob.Schedule = $schedule
$newTaskLoggerJob.Update()
Any time you need something in SharePoint that is executed periodically, 99 times out of 100 you'll need to build a timer job. These are scheduled tasks that run inside SharePoint; you can create your own, then use a feature + feature receiver to actually "install" the timer job (definition) and assign it a schedule.
For more info: see Andrew Connell's article on TimerJobs.
P.S. Never query/update the databases related to SharePoint directly! This will make you "unsupported", i.e. if anything happens, Microsoft will charge (a lot of) money to come and fix it, instead of your being able to ask for regular support. (If you are, say, an MSDN subscriber, you get up to 4 free support calls a year.)
Don't bother trying to go directly to the database. You will have a very hard time because it's undocumented, unsupported, and not recommended. SharePoint does in fact have a full featured object model though.
If you reference Microsoft.SharePoint.dll (located in the Global Assembly Cache of a machine with SharePoint installed on it), you can access the data that way. The objects you'll want to start with are SPSite, SPWeb, SPList, SPQuery, and SPListItem, all of which you can find very easily with a search on http://msdn.microsoft.com.
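For instance, a minimal C# sketch of reading task items through that object model (the server URL, list title, and Status value are placeholders to adapt):

using System;
using Microsoft.SharePoint;

// Sketch: list incomplete tasks via the SharePoint object model
using (SPSite site = new SPSite("http://yourserver/sites/yoursite"))
using (SPWeb web = site.OpenWeb())
{
    SPList tasks = web.Lists["Tasks"];
    SPQuery query = new SPQuery();
    query.Query = "<Where><Neq><FieldRef Name='Status'/>" +
                  "<Value Type='Choice'>Completed</Value></Neq></Where>";
    foreach (SPListItem item in tasks.GetItems(query))
    {
        Console.WriteLine("{0} due {1}", item.Title, item["DueDate"]);
    }
}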
Another less flexible but code-free possibility you could try is creating several different views that include upcoming tasks, then via the GUI setting up an alert for when items are added to those views.
