I was wondering if anyone knows of a JIRA plugin that would provide a consistent daily/weekly/monthly view for people/resource tracking and forecasting of resource availability, especially when a resource may potentially be split across multiple "projects". It would need to cover:
Current Tasks assigned with duration and period of effort
Future Tasks assigned with duration and period of effort
Availability for assignment of tasks based on current and future items assigned already
Conflicts, overallocation, prioritization - the ability to quickly see over-/under-subscription
Ability for individuals to update progress on tasks
Ability for managers to generate reports
etc.
This is for a scenario where you have ~100+ people spread across 5 locations globally, with different people managers and multiple books of work, multiple projects, etc.
We develop and support a JIRA add-on for resource planning and issue scheduling: ActivityTimeline.
ActivityTimeline provides a weekly grid-based dashboard where rows are people and columns are days of the week. Drag and drop a JIRA task to schedule it on a particular day, or resize and move issues to change their planned start/end dates. Everything is visible on one screen, and you can move backward and forward across weeks.
Details at http://activitytimeline.com
Sorry if this sounds too commercial but hopefully it will be of use to some.
Tempo Timesheets
Feywa provides resource allocation and resource utilization features, but it supports only JIRA 4.3.4.
If you ever upgrade your JIRA in the future (which you obviously will), Feywa support will be lost.
I am looking for a solution for resource allocation and resource utilization on JIRA 4.4.3. Any help is greatly appreciated.
I have a similar need, and I came across a recent announcement from Tempo (http://www.tempoplugin.com/) of two new modules that I am hoping will address these needs, Tempo Planner and Tempo Books: http://blog.tempoplugin.com/2013/tm-software-introduces-a-project-and-portfolio-management-solution-for-atlassians-jira/
You can use the eazyBI reporting application with JIRA integration to create ad-hoc JIRA issue reports based on any standard dimension (projects, components, reporters, assignees, issue types, priorities, statuses, time, etc.) as well as any custom fields.
If you're still looking, this add-on might help: forecast.geertjan.it/ It allows you to assign future tasks to one or more periods and to teams. It is multi-project and also shows team (over)allocation. Contact me if you have questions.
Related
We are using Azure DevOps as our ALM system. When a user story or bug fix is resolved, it shows up in a public query - like a stack - from which our QA team members independently pull tickets for verification. Because this is part of a pull request review, a PR cannot be merged until QA finishes testing, so we aim for fast response times and parallelized testing to minimize the potential for merge conflicts. Often we find that multiple work items are self-assigned to the same people while other team members have no work items assigned, which increases response times for our devs (unless people change assignments) and leads to sequential rather than parallel verification of work items.
So we are looking for a way in Azure DevOps to ensure that members of a certain user group can be assigned only one work item of a certain work item type and state at a time. We looked into custom rules in detail but failed to get anything like this out of them. I'm thankful for any ideas and hints on how this can be accomplished (extensions also welcome).
There is no such rule or policy in Azure DevOps.
And it won't prevent someone from working on it anyway, to be honest... I assume testing multiple changes in a single go isn't an option? It would simplify things tremendously...
For my work I've been assigned multiple Azure DevOps projects that I have been asked to create dashboards for. My boss wants a dashboard for each individual project (at that project level) and one master dashboard that consolidates the information across projects. I've looked at the main Microsoft support link for this (https://learn.microsoft.com/en-us/azure/devops/project/work-across-projects-faqs?view=azure-devops), as well as numerous other resources, and I am still having several issues. I'm trying to work within the following constraints:
We don't want to create another project and use the "Query Across Projects" feature, as we are trying to utilize the hierarchical structure of Azure DevOps projects.
We'd like to avoid OData Queries / PowerBI, as I've run into numerous issues with them. Specifically, the data I get from the query is in a very ugly format (almost entirely hashed strings and other unclean data).
I'm open to any and all suggestions and would really appreciate pointers to other resources - I've had a hard time finding people who are looking to do similar tasks or who are having similar problems, so any help is much obliged.
I did some work on this; as far as I can tell, a dashboard is very much coupled to a single project.
A program-level dashboard is really project management territory, but there is the concept of portfolio management; check whether that helps: https://learn.microsoft.com/en-us/azure/devops/boards/plans/portfolio-management?view=azure-devops
How can I create a dashboard across multiple Azure DevOps Projects?
Starting November 12, 2020, Microsoft announced a public preview of Delivery Plans 2.0, which provides a first-class roadmap and timeline solution natively in Azure Boards. The initial preview will include these features:
Bringing Delivery Plans into the core product, rather than requiring an extension to be installed.
Enabling work items to span iteration boundaries.
Enabling drag-and-drop borders to show when a work item starts and ends.
Enabling stakeholders to view plans.
Use Delivery Plans to ensure your teams are aligned with your organizational goals. You can view multiple backlogs and multiple teams across your whole account, and you can interact with the plan with simple drag-and-drop operations to update or modify the schedule, open cards, expand and collapse teams, and more.
You could check the document Review team Delivery Plans for some more details.
We're transitioning to a managed service provider for our IT service desk and deskside support, and we're working out the details of their SLAs. Many of the SLAs are based on ticket status. An example is the following:
"Measures the amount of time it takes to assess, schedule, test, and package application packages before they are available for User Acceptance Testing."
My first thought was to try using SLAs to measure this, as they neatly tie together calendars and priorities, but I'm having a really hard time finding any information about how I could do this.
Now I'm looking into using the TKSTATUS.STATUSTRACKING attribute on tickets, but I believe this just tracks straight 24/7 time rather than taking any calendars into consideration.
Has anyone tried this before? Any suggestions?
We are in a similar process, but we are measuring vendors' time on site for our work orders. Instead of using SLAs, we have different statuses that mark different events. Then, when we want to know how long it took for an event to finish, we look at the time between status changes in the WOSTATUS table.
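For illustration, here is a minimal sketch of that approach against the WOSTATUS table (assuming WONUM, STATUS, and CHANGEDATE columns and a database with window functions; DATEDIFF is SQL Server syntax, so adjust for Oracle/DB2). Note this measures raw elapsed time and applies no calendar or shift hours:

-- Time spent in each status, derived from consecutive status-change rows.
-- LEAD() finds the next change for the same work order; the gap between
-- the two CHANGEDATE values is the time spent in that status.
SELECT wonum,
       status,
       changedate,
       DATEDIFF(MINUTE, changedate,
                LEAD(changedate) OVER (PARTITION BY wonum
                                       ORDER BY changedate)) AS minutes_in_status
FROM wostatus
ORDER BY wonum, changedate;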
I am using MVC3, EF5, LINQ, .NET 4.5, and Azure SQL Database.
Microsoft has just brought out the new service tiers for SQL Database, i.e. Basic, Standard, and Premium.
Originally I was using the "Web" SQL database, since my DB was small, i.e. about 30 MB. However, on my test web site instance I have been using the Basic web site and "Basic" SQL Database setups to save money.
I have a "slower" running query which suddenly took 9secs when my Live DB was restored as a "Basic" new style DB on the test instance. It tool about 2.5 secs on live. When I scaled up this test DB instance to "Standard" SO, 20 DTUs, it took 3.9 secs. When I then scaled this DB back to the "retired" "Web" format, it then took 1.9 secs which really surprised me. It is as if one needs to scale the DB to S1 to get comparable performance to the old "Web" style DB, but I suspect this will then cost more than the old "Web" format DB.
I'd appreciate any comments on the above, especially if others have found the new DB tiers can be slower.
At the end of the day, what setup in the new DB style is the old "Web" style equivalent to?
Thanks.
EDIT (THIS IS REALLY REALLY WORRYING)
I have discovered a very useful document on this, and my worst fears are confirmed:
see Web/Business comparison with new SQL Database service tiers. This is very, very worrying, as it seems that Web-edition database performance can only be matched by the "Premium" P1 edition, and we would not be able to afford to use this. So for the time being we will continue to use the "Web" edition.
EDIT: I seem to have touched a raw nerve... There are many folks worried about this...
see: Forum chat with worried users
FEEDBACK FROM .NET USER GROUP
I have also been speaking with a number of my Azure-using .NET peers at a recent user group meeting, and they were also worried, to the extent that they believed developers would simply leave Azure. I think one of Microsoft's key mistakes here is setting the performance of Basic well below that of Web (most of the time), and even S1 and S2 below Web. It is only when you get onto P1 and P2 that you reach parity, and we dare not use those in test due to the impact on charges. In our experience, Web has performed at this high level 90% of the time. I am guessing the other 10% is there, since you say it is, but none of our clients have complained about it. However, to retain our current level of performance we would need to upgrade to S2 or P1, which would have an extraordinary impact on our monthly charges. Jim Rand's feedback is appreciated and backs up our concerns.
I am the author of the blog post mentioned above. A more up-to-date version of that post is available:
http://cbailiss.wordpress.com/2014/09/16/performance-in-new-azure-sql-database-performance-tiers/
The tests I conducted were primarily around the physical I/O capabilities of the new service tiers. From those tests I believe that P1 offers roughly the same I/O on average as Web/Business.
So, the specific answer to your question:
"At the end of the day, what setup in the new DB style is the old "Web" style equivalent to?"
If you were running toward the physical I/O limits of Web/Business (roughly speaking, 200 MB+ read and 50 MB+ write per minute), then I would say a minimum of P1 is needed to offer equivalent I/O performance in the newer service tiers.
If on average your I/O is generally much less than the figures above, then the database may perform OK on one of the Standard Tiers.
My tests didn't quantify/compare CPU or memory differences between Web/Business and the new tiers, but those also scale by service tier in the new world. The sys.resource_stats DMV in the master database might offer some insight into your workload; see the newer blog post above for more details.
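For example, a minimal sketch of querying that DMV (run it in the master database; 'MyDatabase' is a placeholder name, and the exact column names have varied across service versions, so treat these as illustrative):

-- Recent resource consumption history for one database,
-- expressed as a percentage of the tier's limits.
SELECT start_time,
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent
FROM sys.resource_stats
WHERE database_name = 'MyDatabase'
ORDER BY start_time DESC;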
For completeness, it is worth mentioning that the newer service tiers do offer some other advantages: likely support for more concurrent connections, new availability features, new backup features, etc.
Hope that helps...
EDIT, Jan 2015: A new Standard S3 performance level is currently in preview as part of Azure SQL Database v12. This looks like it will offer price-performance at a point much closer to Business Edition than has been available until now. In addition, every service tier and performance level looks to be gaining higher performance in v12. See my blog post for details:
https://cbailiss.wordpress.com/2014/12/17/azure-sql-database-v12-performance-tests-show-significant-performance-increase/
Chris
System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. I hit this last Thursday while converting data from an old system to SQL Azure. I had chosen the new Standard (S2) tier instead of the 5 GB Web (retired) database.
The SQL:
-- Backfill each invoice's SalesOrderID from its matching sales order
UPDATE Invoice
SET SalesOrderID = O.SalesOrderID
FROM Invoice
INNER JOIN SalesOrder AS O ON Invoice.InvoiceID = O.InvoiceID
196,043 rows. I re-ran it and it took over 4 minutes. I exported the database and reloaded it into the Web edition; the same query took 19 seconds. Total database size is about 750 megabytes.
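One workaround that sometimes avoids such timeouts on the throttled tiers is to break the update into batches so no single transaction exceeds the tier's I/O allowance. This is only a sketch: the 10,000-row batch size is an arbitrary assumption, and it assumes SalesOrder.SalesOrderID is never NULL (e.g. it is the key):

-- Repeat the update in small batches until no rows remain to change.
-- The WHERE clause limits each pass to rows not yet updated, so the
-- loop terminates once every invoice matches its sales order.
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) I
    SET SalesOrderID = O.SalesOrderID
    FROM Invoice AS I
    INNER JOIN SalesOrder AS O ON I.InvoiceID = O.InvoiceID
    WHERE I.SalesOrderID IS NULL OR I.SalesOrderID <> O.SalesOrderID;
    SET @rows = @@ROWCOUNT;
END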
Bottom line, this is more than "all a little worrying". Unless Microsoft gets the performance of the new Basic/Standard/Premium tiers up to where it is now in the Web edition, they can pretty much kiss Azure goodbye. It is totally unreasonable that you can't run a query on only 196,043 rows unless the data is in the cache. So much for analytics with a relational database.
I'll be advising my client this week of this matter. Undoubtedly, he will be contacting upper management at Microsoft.
Jim, I'd be happy to help. We know that changing business models is a hard thing to do. In the Web/Business case, you pay based on the size of the DB and you get whatever performance we have at the time. Sometimes this is great, other times it is OK, and sometimes performance is very poor. Customers have given us feedback that this unpredictable performance is very difficult to deal with.
Using this feedback as a key input, the business model for Basic/Standard/Premium is $/performance. Understanding what resources you're consuming is a great first step before moving to B/S/P. We have several pieces of new guidance that should help you do this:
http://azure.microsoft.com/en-us/documentation/articles/sql-database-upgrade-new-service-tiers/
Your mileage may vary here. Many customers see a cost decrease because of this business model change, others see no impact, and some will see an increase if their DBs are very small but consume a lot of resources. I and the team would be happy to help customers move into the new business model. Having great conversations will need some customer specifics that aren't best shared in a public forum. guyhay#microsoft is my email if you'd like to have that conversation.
I'm looking at designing some core information systems at a new company I'm working at (I described one of my ideas here: Workflow system).
I've thought about it a bit more, and I am strongly considering using SharePoint for a lot of the heavy lifting, seeing as it comes with so much out of the box.
However, I'm not sure how it will handle the high volume of data we'll be throwing at it. I read the MS whitepaper (http://go.microsoft.com/fwlink/?LinkId=95450&clcid=0x409), which says about 2,000 items in a list is the limit using traditional design methods.
But first, a bit of info on my plan and data structures:
We have multiple clients. Each client has multiple applications. Each application will have multiple, ongoing jobs (or process runs).
Each application will store significant correspondence and documentation. Each job represents the processing of a data file on a single run, and stores information about the job such as the postscript file, postal manifests, etc.
Job volume will be about 50-100 a day. Each job will have a workflow, triggered by external programs. Then, on a "job scheduler" page, production staff can schedule the jobs and perform custom actions on them (written as plugins).
I was thinking the jobs would sit outside SharePoint and be accessed via the BDC, but I would still like them represented in SharePoint lists, to add in SharePoint functionality and reporting, and so they'd be accessible in multiple places, e.g.:
Application portal - see jobs for application
Production scheduler - see lists of upcoming jobs, assign to resources, trigger other functionality (e.g. copy print file to printer, produce mailing machine file)
Invoicing view - view completed but uninvoiced jobs, export to accounting package
Client view - client portal displays jobs, invoices, stock levels (from external warehouse system), documentation, change register / helpdesk
So basic info about each job would sit in the BDC, but SharePoint would capture additional metadata about it. Also, down the line we might put in more advanced workflows using WF or something like K2 blackpoint/blackpearl.
Is this feasible? Any resources you'd recommend to read to get up to speed?
To use SharePoint, you should concentrate on what SharePoint is good at and what it is designed for.
SharePoint is a great collaboration portal; it is not so good as a simple high-volume database. So...
You can set up a small site for each client and subsites for each job. The goal of the "job site" is to display (using a web part, perhaps) the relevant upcoming jobs, a list of job errors/exceptions, and relevant team documentation on each job.
Separate sites can be created to give a particular "view" of the jobs. E.g. an "Invoicing" site can be created to give a view, again from BDC web parts, of what requires invoicing.
https://iwsolve.partners.extranet.microsoft.com/SDPS/ may provide some help.
Don't try to store huge amounts of information in a SharePoint list just because it may be possible to "tag" it with metadata. A database table is perfectly able to include columns supplying additional information if required.
Think about it this way: if you are creating 50-100 jobs a day, putting that data into a list presupposes that your site's users are going to want to enter metadata on those jobs manually. I thought not, so create the systems you need to get the metadata stored correctly at source, or store metadata about the "types" of jobs within a SharePoint list and allow SharePoint to match the job type with jobs in the BDC.
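As a sketch of what storing that metadata at source might look like (the table and column names here are purely illustrative, not from the original design):

-- A hypothetical jobs table holding per-job metadata directly in the
-- database, so SharePoint only needs to surface it via the BDC.
CREATE TABLE Job (
    JobID          INT IDENTITY PRIMARY KEY,
    ApplicationID  INT NOT NULL,           -- parent application
    RunDate        DATETIME NOT NULL,      -- when the data file was processed
    PostscriptFile NVARCHAR(260) NULL,     -- path to the print file
    PostalManifest NVARCHAR(260) NULL,
    Status         NVARCHAR(30) NOT NULL,  -- e.g. Scheduled, Completed, Invoiced
    InvoicedDate   DATETIME NULL           -- NULL = not yet invoiced
);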
SharePoint will help you integrate all your systems' information together, but unfortunately it looks like you have a lot of work to do just planning what information should go where and how each type of user will view it.
Please take a look at this blog post I wrote on managing large SharePoint lists for better performance - it might offer a bit of an explanation for the 2,000-item issue, which is not actually a hard limit on the number of items in a list; SharePoint will support up to 5 million items per list. One way around this is to create and maintain different views that filter by an indexed field to show different items, up to 2,000 at a time. Hope that helps.
Dina Ayoub
Program Manager
Windows SharePoint Services
SharePoint is probably quite a good fit for the UI side of things, though you'll need to think carefully about which parts are stored and modified in SharePoint lists and which parts are stored elsewhere. That's not so much a SharePoint issue as something you always have to deal with when you have multiple data sources.
I'd probably use a SharePoint list as the primary store for jobs, to avoid any sync issues and make editing easier. The volume of data shouldn't be an issue - just make sure you aren't trying to display 2,000 items at once; it's the view, not the list itself, that runs into performance issues with large numbers of items.
Tough question Dane... I would like to know a little more about your design / vision before giving an opinion.
Based on what I read in your question I would not use SharePoint 2007 as a development platform for this application.
1) Development experience in SharePoint 2007 can be painful and unproductive at times.
Hard to debug
Steep learning curve
2) It's easy to get into trouble with performance
The data layer is complex and can require expert SQL / SharePoint admin skills to make the platform scale.
Content databases should not exceed 100 GB.
3) Deployment can be extremely difficult depending on what you are doing.
4) A new version will be released in the next 12 months.
Just my .02.