Disable Azure Automation Runbook Schedule using .NET SDK

I am trying to disable a runbook schedule using the .NET SDK.
I retrieved the JobSchedule I want to disable and tried setting the associated runbook and schedule to null and "".
var schedulenm = new ScheduleAssociationProperty();
schedulenm.Name = "";
var runbooknm = new RunbookAssociationProperty();
runbooknm.Name = "";
jobsched.Properties.Schedule = schedulenm;
jobsched.Properties.Runbook = runbooknm;
I also tried directly querying the main schedule and setting its IsEnabled property to false.
However, that doesn't have any impact either.
What is the correct way to disable the schedule associated with a runbook?
(I just want it disabled, not deleted.)

According to your description, if you want to disable the schedule associated with a runbook, you could use the AutomationManagementClient.JobSchedules.Delete method.
A JobSchedule represents the relationship between a runbook and a schedule.
After calling this method, the runbook will no longer be associated with the schedule, but the schedule itself will not be deleted.
For more details, you could refer to the code sample below:
var r2 = automationManagementClient.JobSchedules.List("groupname", "accountname").JobSchedules.First();
automationManagementClient.JobSchedules.Delete("groupname", "accountname", r2.Properties.Id);
Result: you can see that the schedule still exists; only its association with the runbook is removed.
Would that be the exact equivalent of setting the 'Enabled' property to No in the UI?
No. If you want to disable the schedule itself, you should use the AutomationManagementClient.Schedules.Patch method.
For more details, you could refer to this code:
AutomationManagementClient automationManagementClient = new AutomationManagementClient(aadTokenCredentials, resourceManagerUri);
SchedulePatchParameters p1 = new SchedulePatchParameters("yourSchedulename");
SchedulePatchProperties p2 = new SchedulePatchProperties();
p2.IsEnabled = false;
p1.Properties = p2;
var result = automationManagementClient.Schedules.Patch("rgname", "accountname", p1).StatusCode;
Result: the schedule is disabled, which corresponds to setting Enabled to No in the portal.
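To re-enable the schedule later, the same Patch call can be reused with IsEnabled set back to true. A minimal sketch reusing only the types shown above (resource group and account names are placeholders):
// flip the flag back to true to re-enable the schedule
var enableProps = new SchedulePatchProperties { IsEnabled = true };
var enableParams = new SchedulePatchParameters("yourSchedulename") { Properties = enableProps };
var enableResult = automationManagementClient.Schedules.Patch("rgname", "accountname", enableParams).StatusCode;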

Related

Pulumi: Missing PrimaryAccessKey for EventGrid Topic

I'm using Pulumi 1.16 with dotnet/C# and the Azure Native stack. I'm trying to create an EventGridTopic. To access the created resource's properties later, I pull some output values.
Example code:
var topic = new Topic("eventgrid-topic-status", new TopicArgs
{
TopicName = "egt-status-dev",
ResourceGroupName = "rg-testapp-dev",
Location = "westeurope"
});
var endPointOutput = topic.Endpoint;
var endPointAccessKey = ""; // missing output property
The resource is created successfully, but I found no way to get the access key properties:
PrimaryAccessKey
SecondaryAccessKey
In the former (older) Azure stack the properties exist, but in the Azure Native stack they don't. Is that on purpose, still a work in progress, an oversight, or is there some other way to retrieve these properties on this object?
I doubt that this happens accidentally and would like to understand what to do.
The Azure API (and therefore Azure Native resources) does not return sensitive information in its outputs automatically, to minimize security risks. You have to make an explicit call to retrieve those values.
In this case, you likely need to invoke the function listTopicSharedAccessKeys.
You will want to call the function from within an Apply to make sure that it's triggered only after the topic is created (e.g., not during preview):
var keys = topic.Name.Apply(topicName => ListTopicSharedAccessKeys.InvokeAsync(
new ListTopicSharedAccessKeysArgs
{
ResourceGroupName = "rg-testapp-dev",
TopicName = topicName
}));
If you don't want to hardcode the resource group name:
let keys = pulumi.all([rg.name, topic.name]).apply(arr =>
azn.eventgrid.listTopicSharedAccessKeys(
{
resourceGroupName: arr[0],
topicName: arr[1]
}
)
);
keys.apply(x => pulumi.log.info(x.key1 ?? ""));
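Since the question is about dotnet/C#, here is a hedged C# equivalent of the TypeScript sketch above. It assumes a ResourceGroup resource named rg in the same stack and uses Output.Tuple to wait for both outputs:
// combine the resource group and topic names, then list the keys once both are resolved
var keys = Output.Tuple(rg.Name, topic.Name).Apply(t =>
    ListTopicSharedAccessKeys.InvokeAsync(new ListTopicSharedAccessKeysArgs
    {
        ResourceGroupName = t.Item1,
        TopicName = t.Item2
    }));
var primaryKey = keys.Apply(k => k.Key1);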

How to create an image import job using KTA SDK?

I am trying to create a job using the SDK. A simple job with a send-email activity works like a charm!
But when I try to create a job with a variable input folder to import a few images, it doesn't work at all. Am I missing some trivial setting?
My process has a classification activity & extraction activities.
Variables: DefaultImportFolder
FYI: My process works fine if I set import settings -> import sources. That tells me there is no issue with the process itself. But when I try to run it through a console app with dynamic variables, it doesn't work.
Following is my sample code. Any help?
ProcessIdentity processIdentity = new ProcessIdentity
{
Name = "SDK TestProcess"
};
var jobService = new TotalAgility.Sdk.JobService();
JobInitialization jobInitialization = new JobInitialization();
InputVariableCollection variablesCollections = new InputVariableCollection();
InputVariable inputVariable = new InputVariable
{
Id = "DefaultImportFolder",
Value = @"\\FolderPath",
};
variablesCollections.Add(inputVariable);
inputVariable = new InputVariable
{
Id = "ExportSuccess",
Value = "true"
};
variablesCollections.Add(inputVariable);
var createJobAndProgress = jobService.CreateJob(sessionId, processIdentity, jobInitialization);
Console.WriteLine($"Job ID {createJobAndProgress.Id}");
As suggested by Steve, I tried the WithDocuments method. Still no luck...
JobWithDocumentsInitialization jobWithDocsInitialization = new JobWithDocumentsInitialization();
Agility.Sdk.Model.Capture.RuntimeDocumentCollection documentsCollection = new Agility.Sdk.Model.Capture.RuntimeDocumentCollection();
Agility.Sdk.Model.Capture.RuntimeDocument runtimeDoc = new Agility.Sdk.Model.Capture.RuntimeDocument
{
FilePath = @"FolderPath\abc.tif",
};
documentsCollection.Add(runtimeDoc);
jobWithDocsInitialization.Documents = documentsCollection;
var jobIdentity = jobService.CreateJobWithDocuments(sessionId, processIdentity, jobWithDocsInitialization);
Console.WriteLine($"Job ID {jobIdentity.Id}");
A folder variable represents a reference to a folder that already exists in the KTA database, so you can't just set a file path to the variable. When you create a job via an import source, it is creating the folder and documents as part of creating the job.
To do the same in your code, you would use one of the "WithDocuments" APIs such as CreateJobWithDocuments, which has parameters specific to importing documents into the process, including by file path.
As discussed in this other answer (Kofax TotalAgility Send a PDF Document to Jobs Queue (KTA)), you may want to look at the sample code that is included with the product (that most people don't realize is available), and also look at other API functions for more context on the parameters needed for the "WithDocuments" APIs mentioned above.
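For illustration only, here is a rough sketch of the WithDocuments approach that also passes the non-folder variable along. It assumes JobWithDocumentsInitialization exposes the same InputVariables collection as JobInitialization; verify the property name against your KTA SDK version and the product sample code:
JobWithDocumentsInitialization jobWithDocsInit = new JobWithDocumentsInitialization();
// Documents to import by file path; the import creates the folder/documents for the job
var docs = new Agility.Sdk.Model.Capture.RuntimeDocumentCollection();
docs.Add(new Agility.Sdk.Model.Capture.RuntimeDocument { FilePath = @"FolderPath\abc.tif" });
jobWithDocsInit.Documents = docs;
// Ordinary (non-folder) variables can still be supplied as input variables
// (assumption: InputVariables exists on JobWithDocumentsInitialization)
var variables = new InputVariableCollection();
variables.Add(new InputVariable { Id = "ExportSuccess", Value = "true" });
jobWithDocsInit.InputVariables = variables;
var jobWithDocsIdentity = jobService.CreateJobWithDocuments(sessionId, processIdentity, jobWithDocsInit);
Console.WriteLine($"Job ID {jobWithDocsIdentity.Id}");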

Data tracking in DocumentDB

I am trying to keep a history of the data (at least one step back) in DocumentDB.
For example, say I have a property called Name in a document with the value "Pieter". If I now change it to "Sam", I have to maintain the history that it was "Pieter" previously.
As of now I am thinking of a pre-trigger. Any other solutions?
Cosmos DB (formerly DocumentDB) now offers change tracking via Change Feed. With Change Feed, you can listen for changes on a particular collection, ordered by modification within a partition.
Change feed is accessible via:
Azure Functions
DocumentDB (SQL) SDK
Change Feed Processor Library
For example, here's a snippet from the Change Feed documentation, on reading from the Change Feed, for a given partition (full code example in the doc here):
IDocumentQuery<Document> query = client.CreateDocumentChangeFeedQuery(
collectionUri,
new ChangeFeedOptions
{
PartitionKeyRangeId = pkRange.Id,
StartFromBeginning = true,
RequestContinuation = continuation,
MaxItemCount = -1,
// Set reading time: only show change feed results modified since StartTime
StartTime = DateTime.Now - TimeSpan.FromSeconds(30)
});
while (query.HasMoreResults)
{
FeedResponse<dynamic> readChangesResponse = query.ExecuteNextAsync<dynamic>().Result;
foreach (dynamic changedDocument in readChangesResponse)
{
Console.WriteLine("document: {0}", changedDocument);
}
checkpoints[pkRange.Id] = readChangesResponse.ResponseContinuation;
}
If you're trying to make an audit log, I'd suggest looking into Event Sourcing. Building your domain from events ensures a correct log. See https://msdn.microsoft.com/en-us/library/dn589792.aspx and http://www.martinfowler.com/eaaDev/EventSourcing.html
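As a rough illustration of the event-sourcing idea (not tied to any particular library; the type, id, and collection names below are hypothetical), each change is stored as an append-only event, so the previous value of Name is always recoverable:
// Illustrative only: an append-only change event stored alongside (or instead of) the current state
public class PropertyChangedEvent
{
    public string DocumentId { get; set; }
    public string PropertyName { get; set; }   // e.g. "Name"
    public string OldValue { get; set; }       // e.g. "Pieter"
    public string NewValue { get; set; }       // e.g. "Sam"
    public DateTime ChangedAtUtc { get; set; }
}
// When Name changes from "Pieter" to "Sam", append an event rather than overwriting history
var change = new PropertyChangedEvent
{
    DocumentId = "customer-123",    // hypothetical id
    PropertyName = "Name",
    OldValue = "Pieter",
    NewValue = "Sam",
    ChangedAtUtc = DateTime.UtcNow
};
// Persist 'change' to an events collection; the current state can be rebuilt by replaying events.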

SharePoint 2013 Activity Event in Newsfeed

I need to add custom notifications to the personal Newsfeed on people's MySites. I found several tutorials and code examples for SharePoint 2010 on the net and tried to do the same with SharePoint 2013. They're all about creating ActivityEvents with the ActivityManager.
Here's the code I tried:
var targetSite = new SPSite("URL to MySite webapp");
SPServiceContext context = SPServiceContext.GetContext(targetSite);
var userProfileManager = new UserProfileManager(context);
var ownerProfile = userProfileManager.GetUserProfile("domain\\user1");
var publisherProfile = userProfileManager.GetUserProfile("domain\\user2");
var activityManager = new ActivityManager(ownerProfile, context);
Entity publisher = new MinimalPerson(publisherProfile).CreateEntity(activityManager);
Entity owner = new MinimalPerson(ownerProfile).CreateEntity(activityManager);
ActivityEvent activityEvent = ActivityEvent.CreateActivityEvent(activityManager, 17, owner, publisher);
activityEvent.Name = "StatusMessage";
activityEvent.ItemPrivacy = (int)Privacy.Public;
activityEvent.Owner = owner;
activityEvent.Publisher = publisher;
activityEvent.Value = "HELLOOOO";
activityEvent.Commit();
ActivityFeedGatherer.BatchWriteActivityEvents(new List<ActivityEvent> { activityEvent }, 0, 1);
The Id 17 in the CreateActivityEvent function is for the StatusMessage activity type, which is laid out like {Publisher} says: {Value} in the resource files, so I provide the Value property of my ActivityEvent.
The code runs without any exception and in the User Profile Service Application_ProfileDB database I can see the right entries appear in the ActivityEventsConsolidated table.
But the activity is not visible in the activity feed, neither on the Owner's one, nor on the Publisher's one, even though these people follow each other. I ran the Activity Feed Job in the CA manually to update the activity feed.
Also, I tried to do the same with custom ActivityTypes with their own resource files, with the same result: the entry appears in the ActivityEventsConsolidated table (or ActivityEventsPublished if Owner = Publisher), but no entries show up on the MySite.
Can anyone help?
I found the solution for this problem myself.
In Central Administration, under Setup My Sites, you have to enable the "Enable SharePoint 2010 activity migration" setting in the Newsfeed section in order to support SP2010-style legacy activities in SP2013.

Unable to change the account reference inside the contact using the SDK in CRM 2011

I am unable to change the client by updating the contact using the CRM 2011 SDK. Here is the code I am using to do that:
Entity contact = new Entity();
contact.LogicalName = "contact";
contact.Attributes = new AttributeCollection();
EntityReference clientLookup = new EntityReference();
clientLookup.Id = NewClientBId;
clientLookup.LogicalName = "account";
contact.Attributes.Add("parentcustomerid", clientLookup);
contact.Attributes.Add("contactid", workItem.Id);
SynchronousUtility.UpdateDynamicEntity(CrmConnector.Service, contact);
The code runs fine without any error, but when I go to the web portal and check the record, it still points to the old account, even though the modification timestamp was updated. I also checked the SQL Profiler query, which shows up as below:
exec sp_executesql N'update [ContactBase] set
[ModifiedOn]=@ModifiedOn0, [ModifiedBy]=@ModifiedBy0,
[ModifiedOnBehalfBy]=NULL where ([ContactId] = @ContactId0)',
N'@ModifiedOn0 datetime,@ModifiedBy0 uniqueidentifier,@ContactId0 uniqueidentifier',
@ModifiedOn0='2013-07-04 09:21:02',
@ModifiedBy0='2F8D969F-34AB-E111-9598-005056947387',
@ContactId0='D80ACC4E-A185-E211-AB64-002324040068'
As can be seen above, the column I updated is not even in the SET clause of the update query. Can anyone help me with this?
I tested your code and it works:
Entity contact = new Entity();
contact.LogicalName = "contact";
contact.Attributes = new AttributeCollection();
EntityReference clientLookup = new EntityReference();
clientLookup.Id = new Guid("3522bae7-5ae5-e211-9d27-b4b52f566dbc");
clientLookup.LogicalName = "account";
contact.Attributes.Add("parentcustomerid", clientLookup);
contact.Attributes.Add("contactid", new Guid("16dc4143-5ae5-e211-9d27-b4b52f566dbc"));
As you can see, I used IDs that exist in my environment, and to perform the update I used
service.Update(contact);
Reasons why your code is not working:
NewClientBId is not the right account Guid
workItem.Id is not the right contact Guid
the function SynchronousUtility.UpdateDynamicEntity has errors
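To rule out the first two causes, a quick check (standard CRM 2011 SDK calls; service is an IOrganizationService and workItem.Id comes from your code) is to update through the service directly and then retrieve the contact again to see whether the lookup actually changed:
// Update through the organization service directly, bypassing the helper
service.Update(contact);
// Retrieve the contact again and confirm the lookup was written
Entity check = service.Retrieve("contact", workItem.Id, new ColumnSet("parentcustomerid"));
EntityReference parent = check.GetAttributeValue<EntityReference>("parentcustomerid");
Console.WriteLine("Parent customer: {0}", parent != null ? parent.Id.ToString() : "<null>");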
