Can I determine which tasks in a collection have subtasks without additional API queries? - asana-api

I'm trying to deep-copy a task with nested subtasks using the API. Currently, I'm querying the subtasks endpoint for each task in my tree.
Is there any way to know if a task is going to have subtasks without having to hit the subtasks endpoint for each task?

Unfortunately, you can't. We might make it possible to query subtasks on the task list in the future, but that would require adding second-level pagination, which we'd need to design, and it isn't currently on our roadmap.
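So for now the deep copy has to recurse by calling the subtasks endpoint once per task it visits. A minimal sketch of that traversal in Java, assuming a personal access token in ASANA_TOKEN, Jackson for JSON parsing, and a hypothetical copyTask step that actually creates the duplicate:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AsanaDeepCopy {

    private static final String TOKEN = System.getenv("ASANA_TOKEN");   // personal access token
    private static final HttpClient HTTP = HttpClient.newHttpClient();
    private static final ObjectMapper JSON = new ObjectMapper();

    // One subtasks request per task is unavoidable: the task payload itself
    // does not say whether subtasks exist.
    static void deepCopy(String taskGid, String targetParentGid) throws Exception {
        String newTaskGid = copyTask(taskGid, targetParentGid);   // hypothetical copy step

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://app.asana.com/api/1.0/tasks/" + taskGid + "/subtasks"))
                .header("Authorization", "Bearer " + TOKEN)
                .GET()
                .build();
        HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());

        // The response body looks like {"data": [{"gid": "...", "name": "..."}, ...]}.
        for (JsonNode subtask : JSON.readTree(response.body()).path("data")) {
            deepCopy(subtask.path("gid").asText(), newTaskGid);
        }
    }

    // Placeholder: create the copied task under targetParentGid and return its gid.
    static String copyTask(String taskGid, String targetParentGid) {
        throw new UnsupportedOperationException("task creation omitted from this sketch");
    }
}
```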

Related

Azure Data Factory - Dashboard Log Query - Filter failed pipelines that are successfully rerun

I've been tasked with reducing the monitoring overhead of a data lake (~80 TiB) with multiple ADF pipelines running (~2k daily). Currently we log failed pipeline runs by querying ADFPipelineRun. I do not own these pipelines, nor do I know the inner workings of existing and future pipelines, so I cannot make assumptions about how to filter them with custom logic in my queries. Currently the team is experiencing alert fatigue, as most failed pipeline runs are resolved by their reruns.
How can I filter out these failures so they don't show up when a rerun succeeds?
The logs expose a few IDs that initially look interesting, such as Id, PipelineRunId, CorrelationId, and RunId, but none of them links a failed run to a successful one.
The logs do, however, show an interesting column, UserProperties, which apparently can be populated dynamically during a pipeline run. There may be a solution to be found there, but it would take time and cause friction to reconfigure all existing factories.
Are there any obvious solutions I have overlooked here? Preferably Azure-native ones. I can see that reruns and failures are linked inside ADF Studio, but I cannot see a way to query that link externally.
After a discussion with the owner of the ADF pipelines, we realized that the pipelines' naming convention would allow me to filter out the noisy failing pipelines that later succeed. It's not a universal solution, but it will work for us, as the naming convention is enforced across the business unit I am supporting.
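For anyone in a similar spot, here is a rough sketch of what that filter can look like when run from code via the Azure Monitor Query SDK for Java. The ADFPipelineRun table and the Status/PipelineName/RunId columns come from the diagnostic logs discussed above; the "Transient_" prefix is purely a hypothetical stand-in for whatever naming convention your pipelines follow, and the workspace ID is assumed to be supplied via an environment variable:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.QueryTimeInterval;

public class FailedPipelineReport {

    public static void main(String[] args) {
        LogsQueryClient client = new LogsQueryClientBuilder()
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // Failed runs, excluding pipelines whose name marks them as "retried until they succeed".
        String query =
                "ADFPipelineRun\n"
              + "| where Status == 'Failed'\n"
              + "| where PipelineName !startswith 'Transient_'\n"   // hypothetical naming convention
              + "| project TimeGenerated, PipelineName, RunId";

        LogsQueryResult result = client.queryWorkspace(
                System.getenv("LOG_ANALYTICS_WORKSPACE_ID"),   // assumed to be configured
                query,
                QueryTimeInterval.LAST_DAY);

        System.out.println("Alert-worthy failures in the last day: "
                + result.getTable().getRows().size());
    }
}
```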

How to execute an Azure Function Once?

Reviewing the current options for Cosmos DB (How to populate a CosmosDB collection by command line?) to prepopulate some collections, the only alternative I see is to create an Azure Function and disable it afterwards.
Is it possible to run some logic once using a TimerTrigger? Or is it better to use an HttpTrigger?
What is your opinion?
Juan Antonio
The two are indeed very similar. The main difference is the way in which they are triggered: an HTTP request versus a timer.
If you only want to execute an Azure function once, you should use an HttpTrigger function.
TimerTrigger functions are designed to let you configure your application to run on a recurring schedule, which is not your use case.
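To make the HttpTrigger route concrete, here is a minimal Java sketch (the function name, auth level, and the seeding call are illustrative assumptions, not a reference implementation): call it once, verify the collections, then disable or delete the function.

```java
import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.HttpMethod;
import com.microsoft.azure.functions.HttpRequestMessage;
import com.microsoft.azure.functions.HttpResponseMessage;
import com.microsoft.azure.functions.HttpStatus;
import com.microsoft.azure.functions.annotation.AuthorizationLevel;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.HttpTrigger;

import java.util.Optional;

public class SeedCosmosFunction {

    // Invoke once (e.g. with curl and the function key), then disable or delete the function.
    @FunctionName("SeedCosmos")
    public HttpResponseMessage run(
            @HttpTrigger(name = "req",
                         methods = {HttpMethod.POST},
                         authLevel = AuthorizationLevel.FUNCTION)
            HttpRequestMessage<Optional<String>> request,
            ExecutionContext context) {

        context.getLogger().info("Prepopulating Cosmos DB collections...");
        // seedCollections();  // hypothetical: write the initial documents here

        return request.createResponseBuilder(HttpStatus.OK)
                      .body("Seeding finished")
                      .build();
    }
}
```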

Efficient way to do a batch job in Spring

The requirement is: query the database every day at, say, 10 pm. Based on the result set from the DB, call several third-party services, perform some business operations, and then complete the job.
What is the best way to achieve this in Spring? Would Spring Batch or Spring Batch Integration be a good fit?
Given your steps, it would be good to take a look at Spring Integration too, and decide for yourself which fits best.
Spring Integration provides a JDBC Inbound Channel Adapter, which can poll the DB using a cron trigger. The result of the DB query can then be sent to any other service, e.g. an <int-ws:outbound-gateway> or just a generic <service-activator>.
You can even process several records from the DB in parallel.
I'm not sure what you mean by "and then complete the job", but the work finishes automatically after the last record is processed.
You can come up with something similar using just Spring Batch, because it has enough useful components, such as readers for the DB, and you can implement your own writers to call the third-party services.
Plus, you can manage jobs via the job repository.
To understand the differences and scope, read the reference manuals of both projects and decide for yourself how to proceed.
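To make the Spring Integration option concrete, here is a minimal sketch using the Java DSL. The SQL, table, and ThirdPartyService are hypothetical; the real pieces are the JDBC inbound channel adapter polled by a cron trigger at 10 pm and a handler playing the role of the service activator:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.jdbc.dsl.Jdbc;

import javax.sql.DataSource;
import java.util.Map;

// Hypothetical facade over the third-party services mentioned in the question.
interface ThirdPartyService {
    void process(Map<?, ?> row);
}

@Configuration
public class NightlyJobFlow {

    @Bean
    public IntegrationFlow nightlyFlow(DataSource dataSource, ThirdPartyService service) {
        return IntegrationFlows
                // JDBC inbound channel adapter, polled once a day at 22:00 by a cron trigger.
                .from(Jdbc.inboundAdapter(dataSource,
                              "SELECT * FROM orders WHERE processed = 0"),   // hypothetical query
                      e -> e.poller(Pollers.cron("0 0 22 * * *")))
                // The adapter emits a List<Map<String, Object>>; split it into individual rows.
                .split()
                // Service-activator step: hand each row to the third-party service.
                .handle(Map.class, (row, headers) -> {
                    service.process(row);
                    return null;   // no reply; the flow ends after the last row is handled
                })
                .get();
    }
}
```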

Dynamics CRM: Scheduling workflows

In my CRM I have applications which should be checked and processed by a workflow once a minute.
I was wondering if there is any way to automate this using some sort of cron task or scheduler. I'm relatively new to CRM.
What should I do to accomplish the above using standard CRM tools or third-party plugins?
Sultan.
CRM doesn't have a good way of handling this. Here are the options generally available inside CRM:
Create a workflow that runs, checks what you need it to do, waits for a period of time, and calls itself recursively. If the interval you needed to check at were longer than a minute, this might work; however, CRM has loop detection built into workflows, and running them once a minute will definitely trigger it.
Create an entity that represents one of your processes. Create a workflow that kicks off on create of this entity, waits one minute, and then creates a new record of your entity. This way the workflow isn't calling itself recursively, so it shouldn't trigger CRM's loop detection. However, you're creating a lot of dummy records and workflow instances that you'll need to clean up in this scenario.
Both of these are kind of hacky. If you need to check something once every minute, I'd put it outside CRM in a Windows Service or a Scheduled Task; CRM just doesn't have this capability built in.
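The CRM SDK itself is .NET, so the usual vehicle is a C# Windows Service or a console application run by the Task Scheduler, but the shape of that external poller is the same in any language. A minimal sketch, with checkAndProcessApplications() as a hypothetical placeholder for the actual query-and-process call against the CRM web services:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ApplicationPoller {

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Run once a minute, outside CRM, so no workflow loop detection is involved.
        scheduler.scheduleAtFixedRate(() -> {
            try {
                checkAndProcessApplications();
            } catch (Exception e) {
                // Log and keep going; an unhandled exception would cancel the schedule.
                e.printStackTrace();
            }
        }, 0, 1, TimeUnit.MINUTES);
    }

    // Hypothetical placeholder: query CRM for applications needing work and process them.
    static void checkAndProcessApplications() {
        // e.g. call the CRM organization service / web API and update the matching records
    }
}
```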

Unit/Automated Testing in a workflow system

Do you do automated testing on a complex workflow system like K2?
We are building a system with extensive integration between SharePoint 2007 and K2. I can't even imagine where to start with automated testing, as the workflow involves multiple users interacting with SharePoint, K2 workflows, and custom web pages.
Has anyone done automated testing on a workflow server like K2? Is it more effort than it's worth?
I'm having a similar problem testing a workflow-heavy MOSS-based application. Workflows in our case are based on WF.
My idea is to mock pretty much everything you can't control from unit tests - document storage, authentication, user rights and actions, and the SharePoint-specific parts of the workflows (these mocks should be thoroughly tested to mirror the behavior of the real components).
You use inversion of control so the code chooses which component to use at runtime - real or mock.
Then you can write system-wide tests of workflow behavior - setting up your own environment and checking how the workflow engine reacts. These tests are too big to call unit tests, but they are still automated testing.
This approach seems to work for trivial cases, but I still have to prove it is worth using on real-world workflows.
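The SharePoint and WF specifics are .NET, but the inversion-of-control part of this approach is language-agnostic. A minimal sketch of the pattern, with DocumentStore as a hypothetical dependency that the workflow logic receives instead of constructing itself:

```java
// The workflow logic depends on this interface, never on the real SharePoint store directly.
interface DocumentStore {
    String read(String path);
    void write(String path, String content);
}

// Production wiring injects the real (SharePoint-backed) implementation;
// tests inject an in-memory fake that mirrors the real component's behavior.
class InMemoryDocumentStore implements DocumentStore {
    private final java.util.Map<String, String> docs = new java.util.HashMap<>();
    public String read(String path) { return docs.get(path); }
    public void write(String path, String content) { docs.put(path, content); }
}

class ApprovalStep {
    private final DocumentStore store;

    ApprovalStep(DocumentStore store) {   // dependency injected, not newed up inside
        this.store = store;
    }

    void approve(String path) {
        store.write(path + ".status", "approved");
    }
}

class ApprovalStepTest {
    @org.junit.jupiter.api.Test
    void approveWritesStatus() {
        DocumentStore store = new InMemoryDocumentStore();
        new ApprovalStep(store).approve("doc1");
        org.junit.jupiter.api.Assertions.assertEquals("approved", store.read("doc1.status"));
    }
}
```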
Here's the solution I use. It is a simple wrapper around the runtime that allows executing a single activity, simplifies passing parameters, blocks the invoking thread until the workflow or activity is done, and translates/rethrows exceptions if any occur. Since my workflow only sends or waits for messages through a custom workflow service, I can mock out the service to expect certain messages from the workflow and post certain messages to it, and with that I have real unit tests for my WF! The credit for the technique goes to Michael Kennedy.
If you are going to do unit testing, Typemock Isolator is the only tool that can currently mock SharePoint objects.
And by the way, Richard Fennell is working on a workflow mocking solution here.
We've just today written an application that monitors our K2 worklist, picks up certain tasks from it, fills in some data, and submits the tasks for completion. This allows us to perform automated testing, find regressions, and run through as many different paths of the workflow in a fraction of the time it would take people to do it manually. I'd imagine a similar program could be written to pretend to be SharePoint.
As for unit testing the workflow items themselves, we have a DLL referenced from K2 which contains all of our line rule and processing logic. We don't have any code in the K2 workflows themselves; it is all referenced from these DLLs. This allows us to easily write unit tests covering all of the individual line rules.
I've done automated integration testing on K2 workflows using the K2ROM API (probably SourceCode.Workflow.Client if you're using K2 blackpearl).
Basically you start a process on a test server with a known folio (I generate a GUID), then use the management API to delete it afterwards. I wrote helper methods like AssertAtClientActivity (basically calls ProvideWorkItem with criteria).
Use the IsSynchronous parameter to StartProcessInstance, WorklistItem.Finish, etc. so that relevant method calls will not return until the process instance has reached a stable state.
Expect tests to be slow and to occasionally fail. These are not unit tests.
If you want to write unit tests against other systems, you'll probably want to wrap the K2 API.
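The K2 client assemblies are .NET, so none of the calls below are real API names; K2TestClient is a hypothetical wrapper of the kind suggested above, and the sketch only shows the overall shape such a test takes (known folio, synchronous start, assert at a client activity, clean up via the management API):

```java
import org.junit.jupiter.api.Test;

import java.util.UUID;

public class LeaveRequestProcessTest {

    @Test
    void requestReachesManagerApproval() {
        // Hypothetical wrapper over the K2 client and management APIs.
        K2TestClient k2 = new K2TestClient("test-server");
        String folio = UUID.randomUUID().toString();   // known folio so the instance can be found
        try {
            // Start synchronously so the call returns only once the process is at a stable state.
            k2.startProcessInstance("LeaveRequest", folio, /* synchronous */ true);

            // Equivalent of AssertAtClientActivity: find the work item by folio and activity name.
            k2.assertAtClientActivity(folio, "Manager Approval");
        } finally {
            // Clean up via the management API so test instances don't accumulate.
            k2.deleteProcessInstance(folio);
        }
    }
}

// Hypothetical wrapper; a real implementation would delegate to the K2 .NET APIs
// (e.g. through a thin service layer), as suggested in the answer above.
class K2TestClient {
    K2TestClient(String server) { }
    void startProcessInstance(String process, String folio, boolean synchronous) { }
    void assertAtClientActivity(String folio, String activityName) { }
    void deleteProcessInstance(String folio) { }
}
```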
Consider looking at Windows Workflow 4 and the new workflow features in SharePoint 2010. You may not need K2.
