Original Estimates in Azure DevOps - Max value?

Our evolution of using DevOps is continuing (slowly but surely). One thing we've noticed is that some people are trying to put excessive estimates in for their time, but what we really want to be encouraging is for people to break work down into multiple tasks.
Is there a way that we can set our DevOps work items to only accept a maximum value? I've had a look at the 'rules' and there doesn't seem to be anything there to let us do this, and because it's an out of the box field I don't think we can put a value limit against it.
I suppose what I want to understand is whether it would be possible to do this in some way. Could I do something with the existing 'Original Estimate' field, or would I have to create a new custom field to have any chance of preventing people from putting in 100 hours for something that's actually more like 2?

If you are also using Boards, you could highlight work items where the original estimate is higher than a certain value. This would not prevent setting these values, but rather encourage the users to put in lower values.
https://learn.microsoft.com/en-us/azure/devops/boards/boards/customize-cards?view=azure-devops
Beware that this might not really help with the underlying issue: people must be convinced of the benefits of splitting up tasks, otherwise they will just work around the tooling, for example by always putting in the maximum value or by not recording their actual work hours.

Is there a way that we can set our DevOps work items to only accept a maximum value?
I am afraid that setting a value limit for the Original Estimate field is currently not supported.
As a workaround, you could create a custom field of type Picklist and specify the available values in the picklist.
You could add a request for this feature on the UserVoice site, which is our main forum for product suggestions. After the suggestion is raised, you can vote and add comments on it. The product team provides updates as they review the feedback.


Getting started at tradingview and referencing balances

I may be new to TradingView, but their Pine Script programming language seems to be the best I've ever seen for automated trading. They seem to really want me to succeed, but I cannot find where the documentation explains how to access account balances. I am trying to write a script where I do not reinvest the extra I make, so I have to be able to reference the available amount. I have not quite finished the manual yet, but I do not see which variable or function allows me to do that, or at least not where I would expect it.
Have a look at strategy.equity. There are quite a few built-in variables for strategy values; you can inspect them in the reference manual by searching on "strategy".
You can also calculate your own metrics using a technique like this one if you don't find what you need in the built-ins.
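If it helps, here is a minimal Pine v5 sketch along those lines (strategy.equity and strategy.initial_capital are real built-ins; the profit calculation is just one way to isolate the amount you don't want to reinvest):

    //@version=5
    strategy("Equity sketch", overlay=false)
    // strategy.equity = initial capital + closed-trade profit + open-trade profit,
    // so the part above strategy.initial_capital is the profit to set aside.
    profitSoFar = strategy.equity - strategy.initial_capital
    plot(profitSoFar, title="Profit beyond initial capital")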
And welcome to Pine! This is the best place to start your journey:
https://www.tradingview.com/?solution=43000561836

VariesAcrossGroups lost when ReInsert_ing doc.ParameterBindings?

Our plugin maintains some instance parameter values across many elements, including those in groups.
Occasionally the end users will introduce data that activates an unused Category, so we have to update the document parameter bindings to include those categories. However, when we call
doc.ParameterBindings.ReInsert()
our existing parameter values inside groups are lost, because the VariesAcrossGroups flag is toggled back to false.
How did Revit intend this to work - are we supposed to use this in a different way, to not trigger this problem?
ReInsert() expects a base Definition argument and would usually get an ExternalDefinition supplied.
To learn, I instead tried to scan through the definition-keys of existing bindings and match those.
This way, I got the document's InternalDefinition and tried calling ReInsert with that instead (my hope was that, since the existing InternalDefinition DID include VariesAcrossGroups=true, this would help). Alas, ReInsert doesn't seem to care.
The problem, as you might guess, is that after VariesAcrossGroups=False, a lot of my instance parameters have collapsed into each other, so they all hold identical values. Given that they are IDs, this is less than ideal.
My current (intended) solution is to instead grab a backup of all existing parameter values BEFORE I update the bindings; then, after the binding update and after setting VariesAcrossGroups back to true, inspect all values and re-assign the parameter values that were broken. But as you may surmise, this is less than ideal: it will be horribly slow for the users of our plugin, and frankly it seems like something the Revit API should take care of, not the plugin developer.
Are we using this the wrong way?
One approach I have considered is to bind every possible category I can think of, up front and once only. But I'm not sure that is possible. Categories in themselves are also difficult to work with, as you can only create them indirectly, by using your project document as a factory (i.e. you cannot create a category yourself; you can only indirectly ask the Document to - maybe! - create a category for you). Because of this, I don't think you can bind all categories up front - some categories only become available in the document AFTER you have included a given family/type in your project.
To sum it up: First, I
doc.ParameterBindings.ReInsert()
my binding, with the updated categories. Then, I call
InternalDefinition.SetAllowVaryBetweenGroups()
(after having determined that the InternalDefinition's VariesAcrossGroups has reverted back to false).
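In C#, that sequence looks roughly like the sketch below. Note this is only a sketch: 'def' and 'cats' stand for our ExternalDefinition and the updated CategorySet (set up elsewhere), and PG_DATA is just an example parameter group.

    using (Transaction t = new Transaction(doc, "Rebind parameter"))
    {
        t.Start();

        // Re-insert the binding with the updated category set.
        InstanceBinding binding = doc.Application.Create.NewInstanceBinding(cats);
        doc.ParameterBindings.ReInsert(def, binding, BuiltInParameterGroup.PG_DATA);

        // ReInsert has reset VariesAcrossGroups, so switch it back on
        // via the document's InternalDefinition.
        SharedParameterElement spe = SharedParameterElement.Lookup(doc, def.GUID);
        InternalDefinition idef = spe.GetDefinition();
        if (!idef.VariesAcrossGroups)
            idef.SetAllowVaryBetweenGroups(doc, true);

        t.Commit();
    }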
I am interested to hear the best way to do this, without destroying the client's existing data.
Thank you very much in advance.
(I'm not sure I will accept my own answer).
My answer is just that you can survive/circumvent this problem by scanning the entire Revit database for your existing parameter values before you update the document bindings.
Afterwards, you reset VariesAcrossGroups back to its lost value.
Then, you iterate through your collected parameters, and verify which ones have lost their original value, and reset them back to their intended value.
One trick that speeds this up a bit is that you can check Element.GroupId <> -1, that is, only look at elements that are group members.
You only need to track elements which are group members, as it's precisely those that are affected by this Revit bug.
A further tip is that you should not only watch out for parameter values that have lost their original value; you must also watch out for parameter values that have accidentally GAINED a value but should be left un-set.
I just use FilteredElementCollector with WhereElementIsNotElementType().
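Put together, the workaround looks something like this rough sketch (assuming the usual Autodesk.Revit.DB context and a text-valued shared parameter identified by paramGuid; adapt the AsString/Set calls to your parameter's storage type):

    // 1. Before rebinding: record current values, but only for group members,
    //    since only those are affected.
    var backup = new Dictionary<ElementId, string>();
    foreach (Element e in new FilteredElementCollector(doc).WhereElementIsNotElementType())
    {
        if (e.GroupId == ElementId.InvalidElementId)
            continue; // not a group member
        Parameter p = e.get_Parameter(paramGuid);
        if (p != null)
            backup[e.Id] = p.AsString(); // null means "un-set"
    }

    // 2. ...ReInsert the binding and restore VariesAcrossGroups here...

    // 3. Afterwards: re-assign values that collapsed, and clear values
    //    that accidentally gained one.
    foreach (var kv in backup)
    {
        Parameter p = doc.GetElement(kv.Key).get_Parameter(paramGuid);
        if (p != null && p.AsString() != kv.Value)
            p.Set(kv.Value ?? string.Empty);
    }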
Performance-wise, it is of course horrible to do all this, but given how Revit behaves, I see no other solution if you have to ship to your clients.

Is It Possible To Reference TFS Work Item Fields More Than Once Within The Same Work Item?

We are currently in the process of upgrading from TFS 2008 to TFS 2012. When TFS 2008 was set up, the people involved didn't understand a lot of what the work item fields were for, and we ended up with very heavily customised templates and in fact lost a lot of default fields. As part of the upgrade to 2012 we are trying to return to the out of the box templates as much as possible to ensure we get to use as many of the features as possible, however there are a small number of custom fields that we need to include for reporting purposes.
Our product development process involves a roadmap for upcoming releases which includes new work as well as bug fixes. When a bug is assigned to be worked on by the developers we would like to be able to choose which release we're targeting the fix for - as far as I can see, Iteration is best suited for this. At the point the bug is closed though, we would also like to track what release it was actually fixed in, since things often get bumped from one release to the next if higher priority bugs or change requests come in, but this is where we come unstuck since I can't seem to assign Iteration to both fields such that the two show different values.
If possible we would prefer not to have global lists that have to be constantly updated with release numbers across our product range (we have around 8 different products constantly in development, each with its own release numbers), and leaving one of them as a text field leaves open the possibility of inconsistencies in what people enter, e.g. 1.01 versus 1.1, which would show up in reporting as two different releases. As the fields are just looking up a set of values in the background, is there no way that the iteration list can be used twice? Or does someone have an alternative suggestion as to how we can get around this?
What I think I'd suggest in this case is using a COPY rule on a state change event, so that when you move your work item into the Closed state, it would populate your custom field with the value currently in your Iteration field.
This would give you a snapshot of the value at the right point in time which then wouldn't be altered if the iteration was later changed, along with a history entry if it was opened & closed multiple times over its lifetime.
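In the work item type definition XML, such a rule could look roughly like this (a sketch only: the custom field reference name MyCompany.ReleaseFixedIn, the source state and the reason value are assumptions):

    <TRANSITION from="Active" to="Closed">
      <REASONS>
        <DEFAULTREASON value="Fixed" />
      </REASONS>
      <FIELDS>
        <!-- Snapshot the iteration into the custom field when the bug is closed. -->
        <FIELD refname="MyCompany.ReleaseFixedIn">
          <COPY from="field" field="System.IterationPath" />
        </FIELD>
      </FIELDS>
    </TRANSITION>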
As iteration is time-limited and release is perpetual, there is an inherent mismatch of purpose in using iteration here. Iteration is for planning.
You would be better off creating a release list with the versions that you release.
If you are sprinting, for example, you may not know up front which release you will end up in before you start. If you are not sprinting, then you are just kidding yourself that you know.

Dynamics CRM 2011 Import Data Duplication Rules

I have a requirement in which I need to import data from Excel (CSV) into Dynamics CRM regularly.
Instead of using simple Data Duplication Rules, I need to implement a point system to determine whether a record is considered a duplicate or not.
Let me give an example. For example these are the particular rules for Import:
First Name, exact match, 10 pts
Last Name, exact match, 15 pts
Email, exact match, 20 pts
Mobile Phone, exact match, 5 pts
And then the threshold value: 19 pts.
Now, if a record has First Name and Last Name matching an old record in the entity, the points will be 25 pts, which is higher than the threshold (19 pts), therefore the record is considered a duplicate.
If, for example, a particular record only has the same First Name and Mobile Phone, the points will be 15 pts, which is lower than the threshold, and it is thus considered a non-duplicate.
What is the best approach to achieve this requirement? Is it possible to utilize the default Import Data functionality in MS CRM? Is there any 3rd-party add-on that answers the requirement above?
Thank you for all the help.
Updated
Hi Konrad, thank you for your suggestions, let me elaborate here:
Excel. You could filter out the data using Excel and then, once you've obtained a unique list, import it.
Nice one, but I don't think it is really workable in my case; the data will be coming in regularly from the client in moderate numbers (hundreds to thousands), and typically the client won't check the data for duplicates.
Workflow. Run a process removing any instance calculated as a duplicate.
A workflow is a good idea; however, since it is processed asynchronously, my concern is that in some cases the user may already have made updates/changes to the inserted data before the workflow finishes, thereby creating data inconsistency or at the very least a confusing user experience.
Plugin. On every creation of a new record, you'd check if it's to be regarded as duplicate-ish and cancel its creation (or mark it for removal).
I like this approach. So I just import as usual (for example, into the contact entity), but I already have a plugin in place that gets triggered every time a record is created; the plugin checks whether the record is duplicate-ish or not and takes the necessary action.
I haven't been fiddling a lot with duplicate detection, but looking at your criteria you might be able to make rules that match those - pretty much three rules to cover your cases: full name match, last name plus mobile phone match, and email match.
If you want to do the points system I haven't seen any out of the box components that solve this, however CRM Extensions have a product called Import Manager that might have that kind of duplicate detection. They claim to have customized duplicate checking. Might be worth asking them about this.
Otherwise it's custom coding that will solve this problem.
I can think of the following approaches to the task (depending on the number of records, the repetitiveness of the import, automation requirements, etc.); they may all be good in some way. Would you care to elaborate on the current conditions?
Excel. You could filter out the data using Excel and then, once you've obtained a unique list, import it.
Plugin. On every creation of a new record, you'd check if it's to be regarded as duplicate-ish and cancel its creation (or mark it for removal).
Workflow. Run a process removing any instance calculated as a duplicate.
You also need to consider the implications of such elimination of data. There's a mathematical issue: suppose that the uniqueness radius (i.e. the threshold in this 1D case) is 3, so any number within 3 of an already-kept number is discarded as a duplicate. Consider the following set of numbers (listed twice, just in a different order):
1 3 5 7 -> 1 _ 5 _
3 1 5 7 -> _ 3 _ 7
Are you sure that's the intended result? The surviving set depends purely on the processing order, and under some circumstances you can even end up with result sets of different sizes. I'm a bit curious about why and how this setup came up.
Personally, I'd go with the plugin, if the above is OK by you. If you need to make sure that some of the unique-ish elements never get omitted, you'd probably be best off applying a test algorithm to a backup of the data. However, that may defeat its purpose.
In fact, it sounds so interesting that I might create the solution for you (just to show it can be done) and blog about it. What's the deadline?
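For what it's worth, a hedged sketch of what such a plugin could look like, registered on pre-operation Create of contact. The attribute names and the 19-point threshold come from the example above; everything else (class name, helper methods) is an assumption:

    using System;
    using Microsoft.Xrm.Sdk;
    using Microsoft.Xrm.Sdk.Query;

    public class DuplicateScorePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
            var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            var service = factory.CreateOrganizationService(context.UserId);
            var target = (Entity)context.InputParameters["Target"];

            // Pull candidates that share at least one scored attribute.
            var query = new QueryExpression("contact")
            {
                ColumnSet = new ColumnSet("firstname", "lastname", "emailaddress1", "mobilephone"),
                Criteria = { FilterOperator = LogicalOperator.Or }
            };
            AddCondition(query, target, "firstname");
            AddCondition(query, target, "lastname");
            AddCondition(query, target, "emailaddress1");
            AddCondition(query, target, "mobilephone");
            if (query.Criteria.Conditions.Count == 0)
                return; // nothing to score against

            foreach (var existing in service.RetrieveMultiple(query).Entities)
            {
                int points = 0;
                points += Matches(target, existing, "firstname") ? 10 : 0;
                points += Matches(target, existing, "lastname") ? 15 : 0;
                points += Matches(target, existing, "emailaddress1") ? 20 : 0;
                points += Matches(target, existing, "mobilephone") ? 5 : 0;

                if (points > 19) // higher than the 19-pt threshold = duplicate
                    throw new InvalidPluginExecutionException(
                        "Duplicate detected (" + points + " pts); record not created.");
            }
        }

        private static void AddCondition(QueryExpression query, Entity target, string attr)
        {
            if (target.Contains(attr))
                query.Criteria.AddCondition(attr, ConditionOperator.Equal, target[attr]);
        }

        private static bool Matches(Entity target, Entity existing, string attr)
        {
            return target.Contains(attr) && existing.Contains(attr)
                && string.Equals((string)target[attr], (string)existing[attr],
                                 StringComparison.OrdinalIgnoreCase);
        }
    }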

MonoTouch.Dialog and limiting the number of rows

I have a MonoTouch.Dialog with UIViewElements that contain a UIWebView. This is a messaging-type app and messages are constantly being added. I can easily get over 1000 UIViewElements, and there is no limit. Obviously memory is going to be an issue.
What is the best way to only keep the last 100 rows?
I need a first-in-first-out type of setup. I can manually remove the Elements, but I do not know if there is a better way.
As far as I know, you do have to do this manually. I used TweetStation as an example.
Though it is not the most recent example (2010), it does cover a lot of the basics you need to understand.
Here is more information on building custom cells, if you do not know already.
The core part of managing your list of elements is to override the CreateSizingSource method on your DialogViewController subclass and create your own SizingSource, as shown in this part of TweetStation.
I hope this will get you started.
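As a starting point, here is a minimal sketch of the manual FIFO trim (assuming your rows live in a single Section; MaxRows reflects the 100-row cap from your question):

    const int MaxRows = 100;

    void AddMessage(DialogViewController dvc, Section messages, Element newRow)
    {
        messages.Add(newRow);

        // Drop the oldest rows so the section never exceeds MaxRows.
        if (messages.Count > MaxRows)
            messages.RemoveRange(0, messages.Count - MaxRows);

        dvc.ReloadData(); // refresh the table after the changes
    }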
