Is the AMP lab done next February? - apache-spark

First, I'm not sure where to post this, so feel free to close if it doesn't belong here.
Is there a way to track whether Berkeley's AMP lab will indeed shut down next year?
From their about site:
The AMPLab is a five-year collaborative effort at UC Berkeley
and it was started in February 2011.
So I was curious whether this is a hard date, or whether it will be extended (or has already been extended).

Relaying an answer from AMP director Mike Franklin:
One year into the lab we got a 5-year Expeditions in Computing Award as part of the White House Big Data initiative in 2012, so we extended the lab for a year. We intend to start winding it down at the end of 2016, while supporting existing projects and students who will be finishing up. The AMPLab faculty are starting discussions this summer about what research challenges we'd like to tackle next, and how best to organize to do so.
An interesting thing to note is that the Spark project started at about this point in the AMPLab's predecessor project (RADLab), so we have a track record of being able to make these transitions.

Related

Should developers be allowed to participate in backlog planning processes? [closed]

I recently interviewed with a company that has started introducing Scrum for their development cycles. I asked one of the developers how their experience has been, and it sounds like they are completely shut out of the planning process. He wasn't allowed any input as to what went into a given Sprint, and didn't participate in any planning or grooming activities.
Basically, at the start of the last Sprint (or two) he was handed a to-do list. He had to break items down into their respective tasks (so they could be worked on over the Sprint), but wasn't involved in any planning activities; I'm skeptical he was allowed much input into how much effort an item might take -- I suspect the architects decided this for the team.
Is this how Scrum should be handled? My current team fully participates in all planning activities, continually adding our input as to how features may be addressed and how much effort they might take. I'm a bit skeptical (and nervous) about a company which simply hands developers a to-do list without asking for their input.
Note: I understand that once a Sprint starts, the list really is a prioritized to-do list. My concern is not having input into the planning process from the start.
If those who are doing the work don't get to give input on how much work can fit into a sprint, while the business decides what's most important and schedules it to fit, it's not going to work. Run away. They are using trendy new agile words but doing the same old things.
(...) He wasn't allowed any input as to what went into a given Sprint, and didn't participate in any planning or grooming activities.
Obviously, they're still doing command and control and micro-management (the team is not empowered and self-organizing) and they are still using push-based scheduling (they didn't enable pull-scheduling).
Scrum has other characteristics, but the points above are more than enough to say that they aren't doing Scrum. Regardless of what they call it, they haven't really shifted from the outdated waterfall approach (they just put lipstick on the pig).
This is a big hint that they're still totally clueless about what Scrum is about; they didn't get it at all. And this is not going to change without some inspection and adaptation, if they even want to change. If you don't have the power to make this happen, run away.
Is this how Scrum should be handled?
No.
I worked at a place that called themselves agile. They had 6-8 month release cycles. Some things came from a backlog, but during the "Requirements Gathering" phase the managers would basically spend a week or two meeting with various people in the company and write up a feature list. On the first day of each 4-week "iteration", the dev team would all get together and break everything down in a series of meetings. The last day of the iteration was deployment day, where there would be an interim deployment that nobody outside of the dev team ever saw.
During the 8 month release cycle, the managers would touch base with the stakeholders maybe once or twice in the last two months of the release, at which point the only issues raised in those meetings that had a chance in hell of getting done before release were issues that were bad enough to make the whole effort useless if they were not implemented.
This is not agile; this is a variant of waterfall with a poor selection of ideas cherry-picked from other methodologies. At the end of the day, it still has all the same problems that waterfall does.
The lesson I took from my employment there is that development methodologies include things for a reason. If you cherry-pick from a methodology without fully understanding it (and by fully understanding, I mean having actually worked with it), there is a high chance that you will leave out something that is actually vitally important to the whole thing. For example, in XP, Kent Beck advocates relying on refactoring later as a way to cut down on up-front design. However, the only reason this actually works is that he also advocates TDD and pair programming. If you have a comprehensive test suite and an extra set of eyes there for the whole thing, refactoring is fairly safe. If you just cherry-pick the first part and leave those two out, you are essentially cowboy coding.
I am extremely skeptical of shops that roll their own methodologies for this reason. There is a shocking number of crimes being committed in the name of agile.
Is this how Scrum should be handled?
Definitely not. Scrum strives to increase transparency. By blocking developers from planning activities, they are doing the opposite of what scrum suggests.
You talked about 2 points here:
1. Sprint Planning - the Scrum Team members are definitely required here.
2. Backlog Grooming - they may or may not be required here. You have to use your resources wisely and with common sense; one team member with a strong development background would be okay here, I think.
There is one more type in Scrum:
Release Planning - some might say developers are not needed here, but as per the Scrum Guide, "Release planning requires estimating and prioritizing the Product Backlog for the Release". Prioritization can be done by the POs and suggested by the stakeholders, but estimating is most accurate when it is done by someone who is actually going to do the work, so it is a good idea to involve developers here. Again, resources should be used wisely; if it makes sense not to involve all developers and to have people take turns estimating, that is not a bad idea.
I suggest following this structure:
Sprint Planning - part 1: estimation and pulling backlog items into the Sprint from the product backlog (PO, SM and Team are pigs here)
Sprint Planning - part 2: tasking, breaking stories down and estimating task hours (SM and Team are pigs; the PO is a chicken here unless the PO is taking tasks as well)
It is up to the team to figure out, during the sprint planning meeting, how it will turn the selected product backlog into shippable product functionality. If they are not part of this process, they will not be able to commit.
The answer to your title question is: Developers (team) must participate in planning meetings. Planning meetings are for developers (team).
A good approach is to have two planning meetings at the beginning of each sprint: Planning meeting 1 and Planning meeting 2. In Planning meeting 1 the Product Owner gives the prioritized (and size-estimated - size estimation is not done in the planning meeting) product backlog to the team, and the team starts discussing the highest-priority user stories. For each discussed user story the team should be able to collect the following (a structured sketch of such a record appears after this list):
Detailed requirements (for example which fields the input form has to have ...)
Constraints (for example how fast the functionality has to be)
Acceptance tests (verification of results)
UI sketches (for example, what the UI flow should look like)
Acceptance criteria (validation from the end user - acceptance criteria don't have to be real tests; they can be something like "easy to use", etc.)
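A lightweight way to keep that capture consistent is to record each story in a structured form. Here is a minimal Python sketch; the field names simply mirror the list above, and the example story is invented rather than part of any Scrum standard:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UserStory:
    """What Planning meeting 1 should produce for each discussed story."""
    title: str
    detailed_requirements: List[str] = field(default_factory=list)  # e.g. which input fields the form needs
    constraints: List[str] = field(default_factory=list)            # e.g. performance targets
    acceptance_tests: List[str] = field(default_factory=list)       # verification of results
    ui_sketches: List[str] = field(default_factory=list)            # links to flow mock-ups
    acceptance_criteria: List[str] = field(default_factory=list)    # end-user validation notes

# Invented example, just to show the shape of the record.
story = UserStory(
    title="User can reset a forgotten password",
    detailed_requirements=["Email field on the reset form", "Reset link expires after 24 hours"],
    constraints=["Reset email is sent within 30 seconds"],
    acceptance_tests=["An expired link shows an error page"],
    ui_sketches=["wiki/reset-password-flow"],
    acceptance_criteria=["A non-technical user can complete the flow unaided"],
)
```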
There should be a time boundary for Planning meeting 1. The number of user stories you were able to discuss can correspond to the number of user stories you will be able to complete in the upcoming sprint. At the end of Planning meeting 1 the team must make a commitment - say how many of the discussed user stories will be done in the upcoming sprint. Sprint planning meeting 2 is only for the team, because the team further discusses the user stories and breaks them into tasks.
Generally, of course they should. Obviously, it's never realistically possible to the degree that developers would like. However, if sprints are usually "Hair On Fire" type affairs, where the developers get no serious input at all... then at the very LEAST there should be regularly-scheduled "entropy reduction" sprints, where all tasks are selected exclusively by the developers for the purpose of cleaning crap up.
At least some developers need to be there so work can be properly estimated and pipelined.
But not all developers need to be there. All of them can be there if it makes more sense.
On the other hand, developers need to understand that the business priorities are the priorities, no matter what they think should come next. Everyone has to work together to make it work.
I'm not so much worried about my input, but about my insight. I recently was involved in a project where I had no knowledge of the project before the plans were handed to me supposedly complete. The nightmare started when I discovered that the process was not completely thought out and the data definitions were not complete. I wound up having to go through the whole process again to get the answers that I required.
The Team can be involved in the planning process without a formal process or meeting. The planning process is really very fluid. At the start, the goal should be to get to starting sprints ASAP. Spending too much time in planning before the first sprint feels very waterfall and is a waste of everyone's time. I, as a team member, would feel relieved not to be a part of that, except for the fact that it indicates a dysfunctional nature to the organization. The Team should always be free to voice ideas on an ongoing basis (since that's when the real planning happens). But two things you mentioned concern me most.
First, the Team should be the only ones to determine how many backlog items they can do this sprint, and they should certainly be involved in estimating the effort. That they weren't is a big problem.
Second, it does not sound like the Team has access to the product owner (maybe there isn't even one here). Even if the team has not been involved in the "planning" thus far, surely if I were talking to the product owner in the planning meeting, or had access to them at other times, I would voice suggestions over time.

Sprint to the finish: how to keep all team-members busy in the final days of a Scrum sprint? [closed]

Given that the tasks in a specific sprint will not divide perfectly into the team, and all finish on the same date, what do you do to keep everyone working as the sprint moves into its final stages?
Inevitably it seems like there will be one or two people freed-up. If all the other tasks are done-done, and the remaining tasks are already underway, then what?
Do those team-members pick up items from the top of the product backlog, as they are likely to be needed in the next sprint anyways to get a head start?
What do you or your teams do?
My teams have always picked items up from the backlog, starting with the highest-priority items that can fit in the remaining time.
If nothing quite fits those criteria (say, when there's only half a day left and/or no small stories to pick up), consider paying down some technical debt.
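As a rough illustration of that "highest priority that still fits" rule, here is a hedged Python sketch; the stories, estimates, and the idea of measuring capacity in ideal hours are assumptions for the example, not anything Scrum prescribes:

```python
# Greedily pick the highest-priority backlog items that still fit the time left.
# Stories, estimates (ideal hours) and the capacity figure are illustrative only.
backlog = [
    {"story": "Export report as CSV", "priority": 1, "estimate_hours": 6},
    {"story": "Add audit logging",    "priority": 2, "estimate_hours": 10},
    {"story": "Tune slow search",     "priority": 3, "estimate_hours": 3},
]

def pick_items(items, remaining_hours):
    """Take items in priority order, skipping any that no longer fit."""
    picked = []
    for item in sorted(items, key=lambda i: i["priority"]):
        if item["estimate_hours"] <= remaining_hours:
            picked.append(item["story"])
            remaining_hours -= item["estimate_hours"]
    return picked

print(pick_items(backlog, remaining_hours=10))
# -> ['Export report as CSV', 'Tune slow search']
```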
Scrum is done by teams.
If some people are done, they can help other members of their team.
They can also help their team by getting a head start on the next sprint.
They can also do some exploration of new technology, if that would help the team.
Or they could brush up their own skills, if that would help the team.
They could create training materials to help other members of the team improve their skills.
That's a team decision.
Pay down Technical Debt
Do anything that the team thinks should be done but that doesn't belong on a card because there's no visible business value. Some people call these tasks "technical stories". They tend to be things you should have done before Sprint 0, but didn't. Examples include adding any of these that you don't already have to the build (see the sketch after this list):
a Continuous Integration server
a test coverage tool
static analysis tools
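For teams that have none of these yet, even a small script wired into the build is a start. Below is a minimal sketch, assuming a Python project that uses pytest with the pytest-cov plugin and flake8; the package name and the coverage threshold are placeholders, so swap in whatever your stack actually uses.

```python
#!/usr/bin/env python3
"""Tiny pre-merge check: run the test suite with coverage, then a lint pass.

Assumes pytest, pytest-cov and flake8 are installed; 'myproject' is a placeholder.
"""
import subprocess
import sys

CHECKS = [
    # Run tests and fail if coverage drops below an (illustrative) 80% threshold.
    ["pytest", "--cov=myproject", "--cov-fail-under=80"],
    # Basic static analysis / style check.
    ["flake8", "myproject"],
]

def main() -> int:
    for cmd in CHECKS:
        print("running:", " ".join(cmd))
        if subprocess.run(cmd).returncode != 0:
            return 1  # fail the build on the first broken check
    return 0

if __name__ == "__main__":
    sys.exit(main())
```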
One thing I recommend is looking at future tasks and doing some detailed planning for estimates. This is non-trivial and will take some time. Another is to scope out a new large-scale project that can be broken into tasks and entered into the product backlog.
Refactoring, writing unit tests, improving skills.
(...) what do you do to keep everyone working as the sprint moves into its final stages?
Nothing; I expect a self-organized team to figure this out by themselves. And there are many options (in order of importance):
Help other members of the team finish their stories (achieving the goal of the sprint is what matters most; the whole team succeeds or fails at this, not individuals).
Prepare a kick-ass demo.
Pick up a story from the backlog that can be done-done before the end of the sprint (i.e. not necessarily the next highest-priority item, but something that fits in terms of size).
Repay technical debt if you have some.
Document things if that makes sense.
Explore new things (tools, frameworks, testing techniques, etc) that may be useful for the team.
While it may seem obvious for team members to move on to the next highest items in the product backlog, I would advise against starting with this.
First and foremost, the team's obligation is to achieve the sprint goal, so anything they can do to work towards that must come first (e.g. helping out with testing, chipping in where possible, etc.).
Next, the team should look at expanding their definition of "done". Perhaps it currently doesn't include testing, or doesn't include some form of code review. Most teams starting with Scrum don't begin with a definition of done that truly yields a product increment ready to ship, so now would be the time to move towards that.
As others have mentioned, what tools do you need to set up in order to get closer to a shippable state? Continuous integration? Automated acceptance tests? Now is the time to add these things.
Likely, you also have areas of the code that existed before you moved to Scrum and thus don't have very good test coverage or have accumulated technical debt. Now's the time to pay that off.
Also, as Mike Cohn suggests in his book Succeeding with Agile, teams may want to reserve roughly 10% of their time for some look-ahead planning. This may involve meeting with the Product Owner to discuss upcoming stories for future sprints, breaking larger stories down into smaller ones, or, for designers, doing some wireframes or mock-ups for upcoming stories.
Once you've gotten to this state, only then should you consider continuing on with the product backlog.
When team members have completed their tasks early and find themselves unoccupied, there are a few things that can be done.
1. Make sure that estimation, and hence planning, can be improved. In doing this, bear in mind that estimation is very subjective. (In my view, however, underestimation is a situation we do not want to be in.)
2. The scrum master has to instil an ethos of "forward thinking" in the team: improvements in oneself, in team productivity, and in the product or business the team is working on.
2.1. Try to help with other team members' tasks where possible to get stories to Done (per the Definition of Done) within the sprint.
This could be pair work (pair programming)
As a programmer, fixing other people's bugs
etc etc
2.2. Try to help the scrum master with other stories in the backlog. Check whether any small story can be completed within the capacity of the sprint, being mindful of its impact on the sprint.
2.3. Work on research where a story in the backlog is unclear. A new story can be created here with the emphasis on delivering research results; this story should be 0 points. Programming, prototyping, etc. can be done on the developer's local PC without being checked in.
2.4. Develop one's own skills, either in one's functional area (programming, testing, etc.) or in the domain area.
The idea is a team that is performing. Each team member is dedicated to the goals of the team. So if you find yourself free, forward-think: how can I help the goals of the team?

Is our agile plan standard? [closed]

We have been trying Scrum for a while now and are trying to formalize it as our own version of agile application development. Here's how our current process works. There are two main drawbacks to it as it stands right now. I wanted to get input on whether you have a similar approach, and whether the community has any practical tips for the impediments we currently have.
Scrum team = 4 Developers, 2 QA, 1 Tech Writer, 1 PO(PM), 1 Scrum Master (Engg Dir)
Release = 3 Sprints
Sprint = 2 weeks
PO and customers create product backlog of user stories and related acceptance criteria.
1 week Sprint planning at the start of each iteration
Day 1: estimate the Sprint backlog and agree on priority
Days 2-5: the Scrum team discusses stories and works on the details of each story in the Sprint backlog (get the details of the story, process flows if any, identify UE guidelines to apply, details for UI items/fields/widgets and their behavior if anything specific is required, understand acceptance criteria and create tests)
2 week Sprint with 15 min daily scrum
Repeat 3 week cycle
The two major drawbacks we're having in this are:
The details that are discussed in the sprint planning week are not captured effectively and only get noted on a wiki. Since there is no standard format for capturing such details in Scrum, time is often wasted in the daily scrum, or subsequent meetings are required to further understand story details. What's the best way of capturing story details for a functionally fairly complex product in sprint planning? Most of the issues seem to be around the UI and the developers' inability to decide how screens/fields should be laid out without detailed mocks.
How do you anticipate critical showstopper bugs that come back from customers when the team is in a sprint cycle? Currently the dev folks have to be pulled away to support these "Red Account" issues as they crop up, which disrupts the sprint.
Any inputs on how we can improve this?
There is no "standard" Agile plan. Plans aren't important.. planning is. What i mean by that is adapt your plan regularly to reflect ground realities. Formulating a plan, having it blessed by the powers to be and then strapping on developers hasn't worked traditionally.
Sprint planning shouldn't go over a day if I'm not mistaken. One of the key ideas of scrum is that you don't spend too much time planning. If they do, stop and reconvene when you have better clarity.. dont trudge on.
Get a prioritized set of stories from the customer: ~3 hrs
Developers huddle to estimate: ~3 hrs
Show estimates and let customers change their bucket to reflect business needs (within the sprint quota): ~remaining time
Documenting decisions:
Get a good scribe - someone who can type as fast as four people talk. Get the core statements/decisions into a high-visibility area like a chart or a wiki; whatever works for you.
UX Study:
Try to pipeline UX work. Make sure the UX people have already worked out UI details, mock screens, workflows, etc. by the time the devs get to it: UX works on stories for iteration n+1 while the devs are working on iteration n. A bit difficult, but it can be done if UX is causing a lot of "thrashing" for you.
Bug-Duty:
One approach is to make all bugs regular backlog items for the next iteration. Get customer buy-in on which ones need to go in during sprint planning.
If that is not feasible, track bug inflows and fix rates and plan for them. Keep x days set aside for fix-on-demand devs dedicated to these requests.
Scope for improvement:
You need a dedicated person in the "customer" role (or coach/BA who can front for the customer) that the developers can get in touch with on a real-time basis. Daily scrum meetings should be timeboxed to 30 mins and shouldn't include story "clarifications". Stick to the 3 questions - What did you do yesterday? What are you doing today? Any obstacles you need help with?
The dev or sub-team in charge of a specific story should work with the customer (or customer proxy) when doubts arise while they are working on the specific task. They are responsible for extracting the details as part of the development effort. They can ask for help from other devs who have worked in related areas too, if that helps. Work together with the customer to stay on the right track.
HTH
Yep. I noted that nowhere in your process were the developers listening / talking to the actual end users. This is a recipe for failure. You cannot expect your "PO" to catch all the nuances that the actual users will express.
Developers must talk to the end users. The PO should be there as well, to document what was discovered. This is the biggest problem I see in most development projects, separation of developers from users.
Why are your sprint planning meetings a week long? The goal of sprint planning is to get just enough detail to feel comfortable, as a team, with the features you can get done and to commit to doing them. This usually takes less than a day (~4 hours). The actual implementation details are discovered just in time by the devs during the sprint. That is why it is so important that they have access to the PO and the users. If you are asking where to capture the details, then you are designing too much in the planning meeting. The details should go directly into code during the sprint as they are worked out.
What would be a showstopper? The PO sees the progress at the end of each sprint (2 weeks) and then decides if the business value is enough to warrant a release. If there were any critical issues then the PO would probably not release that sprint. Hopefully you can get your PO and maybe users to look at the product on a daily basis as features are completed and thus reduce the probability of issues at the end of a sprint.

How do you structure a development sprint? [closed]

So I have a backlog of features and we are about to get started on a sizable project. I am working on defining the structure of our sprints and I'm interested in the community's feedback.
What I'm thinking is:
One day sprint planning
Fill the backlog and figure out what each dev will go after this sprint
Three weeks of development
GO! GO! GO!
Daily stand up meeting
Check to see if anyone needs help or feels off track
Two days of sprint review
code reviews happen here, stakeholder presentations
One day sprint retrospective
what did we get done in the last sprint? how can we do better next time?
Sprints should always end on a Tuesday (to avoid too much weekend stress).
Anything else? There is obviously more to agile than this. I want to provide the team with a simple outline of how we are going to operate as we get this project started.
I'd consider experimenting with sprints that are shorter than one month.
Personally I find one- to two-week iterations more effective at getting useful feedback quickly. It also prevents issues that are causing problems at the iteration level from building up to a point where they become harder to manage.
Even for a 30-day sprint, two days sounds about a day too long for the sprint review... and one day sounds about half a day too long for the retrospective. I've found that if you need much more than that, there have been communication problems while the iteration was going on - so you might want to treat needing long reviews as a possible red flag.
Of course, that's just been my experience - mostly developing web apps with smallish (4-12 person) teams. Your experience may vary.
That said - I'd definitely give shorter sprints a try. Like integration builds - a lot of things get easier if you do them more often.
Turn off email, cell phone and instant messaging apps for core code time. 10am to 1pm, 2pm to 5pm might be good blocks for this.
Order food, drinks for team when they are in "the zone".
Cancel all other meetings for the days of, before and after the planning session and the review days.
Make sure the "stand-up" remains a STAND-up. It is very easy to slide into longer and longer meetings.
One day of sprint planning and three days at the end may be too much. Only schedule as much time as you need.
+1 to the idea of shorter iterations. Personally, four one-week iterations within a sprint have worked well. People are great at estimating near-term tasks; past that it becomes more and more guesswork.
Looks like a good approach. I second what adrianh and jedidja said about possibly shorter iterations. I like 1 weekers myself. As well as better estimation, it also keeps the idea of "working software" on a much shorter cycle.
A few questions:
Why are code reviews left until the end? Either pair program, or do your reviews as you go.
Does 3 weeks of development mean "dev, test, documentation, installers, etc" ? I.e. everything you need to be truly done?
We structure our sprints very similarly to your outline, except our sprint reviews are on the last day of the sprint and generally last about an hour. The sprint review is the time when you exhibit your work to the customers and any other interested parties, not the time to do code reviews. Code reviews, if you choose to do them, should be done periodically throughout the sprint. We used to have a one-hour block each week where we'd go over developer-nominated code, meaning we didn't waste time reviewing every LOC written.
We also end our sprints on a Tuesday and begin on a Thursday leaving Wednesday to wrap up loose ends and tackle technical debt created during the sprint.
I don't recommend postponing code reviews until after the sprint, they should be an integral part of the development process. In other words, a task is not done unless the code has been reviewed (and tested, and documented, and ...).
It's important to stay away from managing for the sake of managing. Scrum only requires one meeting a day, and that's a short one. Additionally, during each sprint, the only other meetings are the sprint retrospective and the sprint planning. This allows us to implement ROWE, a Results-Oriented Work Environment. Let your developers decide how, where, and when they will do their development. Use your daily stand-ups to track that they are doing their work. Other than that, stand back and be amazed at their productivity.
Ideas like "turn off cell phones, turn off IM apps, etc. during coding" are all bad ideas. When you hire your team, you hire them with confidence that they know how to do their job correctly. If you hired them with that understanding, why would you want to constrain their ability to get their job done the best way they know how? If you're using Scrum, each developer will have chosen the work they feel they're able to do; your job as a Scrum Master is to remove obstacles, not create them.
Code reviews: absolutely necessary. Peer reviews of code are a great teaching tool, both for junior developers attending the meetings and for the folks having their code reviewed.
Design documents: I personally feel that detailed design documents covering what the developer intends to do are very important, and they are an important part of the development process. Now, this is not specifically in line with agile development, but I regularly refer back to design documents created years ago to see what the original developer was thinking when they coded their modules.

How do you use FogBugz with an Agile methodology? [closed]

"Evidence-based scheduling" in FogBugz is interesting, but how do I use it w/ an Agile methodology?
As eed3si9n said, if you are consistent in your estimates for EBS, FogBugz will take care of this for you.
As to the more general question of how FogBugz fits with an Agile methodology, your best bet is to treat sprints as mini-releases. Create a sprint and add the cases you want to achieve for that sprint to that release (or milestone). Give it an end date - say, a week away if you do week-long sprints. Then EBS can track it and tell you if you are on schedule.
The graphs in the Reports section will also show you a burndown chart. The terminology is a bit different because FogBugz isn't Agile-only but the info is there.
You want to see whether the expected finish date of your sprint is staying steady or slipping. If it is steady, you are on track and your burndown rate is on target. If it is creeping up, you are losing ground and your sprint is getting delayed - time to move things to the next sprint, or figure out why you messed up your estimates :)
Essentially I suppose this is a burn-up chart instead of a burndown chart, but it gives you the same answers to the same questions: am I going to finish on time? What do I have left to do?
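For intuition about how an evidence-based forecast answers "am I going to finish on time?", here is a rough Monte Carlo sketch in Python. It only mimics the general idea (scale the remaining estimates by historically observed actual-vs-estimate ratios and simulate many outcomes); the numbers are invented and this is not FogBugz's actual algorithm.

```python
import random

# Ratio of actual time to estimated time for past cases (invented history).
history = [0.8, 1.0, 1.1, 1.3, 1.6, 0.9, 1.2]

# Remaining estimates for the sprint, in hours (also invented).
remaining_estimates = [4, 8, 6, 3, 5]

def simulate_finish_hours(trials=10_000):
    """Sample a historical ratio per case and sum the scaled estimates."""
    outcomes = []
    for _ in range(trials):
        total = sum(est * random.choice(history) for est in remaining_estimates)
        outcomes.append(total)
    return sorted(outcomes)

outcomes = simulate_finish_hours()
# Median and a pessimistic 90th-percentile finish time.
print("50% confidence:", round(outcomes[len(outcomes) // 2], 1), "hours")
print("90% confidence:", round(outcomes[int(len(outcomes) * 0.9)], 1), "hours")
```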
Atalasoft's Lou Franco wrote an excellent post on this as well. Patrick Altman also has an article.
Update: fixed link to Altman's article
I asked the FogBugz guys the same thing because in XP for example you'd provide the estimate in IET (ideal engineering time). Their answer was to be consistent in the way you provide the estimate.
We started using FogBugz for pretty much everything within our technical team: Documentation, bug reporting, managing tasks. We have progressively got more Agile as time has gone on.
What I have done is created a release which is called the Product Backlog, and this is given an arbitrary release date in the future. I changed the FogBugz field "Version" to "Priority" so we can sort by priority. To manage the product backlog I heavily use Areas to categorise the user stories. Areas could be Themes or Epics. Each Iteration is a Release in FogBugz.
Now, one thing we have recently started using is Story Points as opposed to Ideal Task Days for estimating our Product Backlog. FogBugz doesn't understand Story Points as a unit of measurement, so, rather confusingly, 1 SP in our Product Backlog is reported as 1 Day in FogBugz. This could be dangerous if there is any confusion, but our team is small. I don't use the built-in reporting tools in FogBugz, but it would be great if I could.
So all my Story Point and Velocity calculations are done outside of FogBugz, in Excel. This seems to be fine for now. We're tracking tasks using index cards for user stories and post-it notes for tasks on our boards in the office. Have a look at "Scrum and XP from the Trenches" by Kniberg, which influenced my decision. Actually, having a big board with everything on it that we stare at in our morning Scrums really helps.
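Since the Story Point and velocity sums live outside the tool anyway, the arithmetic itself is simple. A small Python sketch of the kind of calculation involved (the sprint numbers are invented):

```python
import math

# Story points completed in the last few sprints (invented numbers).
completed_per_sprint = [18, 22, 20, 19]

# Velocity: average points completed per sprint.
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

# Rough forecast of how many sprints the remaining backlog will take.
remaining_points = 130
sprints_left = math.ceil(remaining_points / velocity)

print(f"velocity ~ {velocity:.1f} points/sprint, about {sprints_left} sprints of backlog left")
```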
I do think the estimation history and reporting in FogBugz are excellent. Does this work with the planning poker world? I suppose, at least from a team's estimation history, it does.
As user stories in the Product Backlog often evolve through iterative planning sessions (agile planning), it would be great if there were wiki-style editing of cases as opposed to a thread of descriptions.
There is talk that the next major version will be more supportive of Agile processes, so I am very much looking forward to seeing what it offers.
Edit:
FogBugz 7 is now out with much better management of Product "Project" Backlogs. Take a look!
http://www.fogcreek.com/FogBugz/blog/post/Scrum-Friendly-Features.aspx
Here are some suggestions for including Story Points in your planning:
When you enter your Story into FB7 you can do it as a Case and include the number of Story Points from Planning Poker in a new custom field that you create called "Story Points" (how to do this is described below). Then, when you get around to working on that Story, you can break it down further into Sub-Cases if necessary, and also enter the estimated time to complete each Sub-Case. (The estimated times will add up in the Story (top) Case's "Estimate" field, as well as feed Evidence Based Scheduling / Burndown Charts.)
Here are two things to consider modifying in your FogBugz installation to reflect your Agile nomenclature.
(1) Out of the box, the FB Category "Feature" is most like your "Story." But you can change your Category names, and add new ones at Admin > Workflow > Customize Categories. Here's additional information on this:
http://www.fogcreek.com/FogBugz/docs/70/topics/plugins/CustomWorkflow.html?isl=174457
(2) To capture Story Points, you'll probably want to create a Custom Field in the Case dialogue. This is accomplished with the included Custom Fields Plugin. Additional information on this is available at isl=174461
Note that with Custom Fields, you can also add a text edit box for the Story which will always appear in the Case dialogue header (no matter how lengthy the case activity history below it gets.)
