
Just-in-time Requirements Analysis (JITRA)

20 Apr

Introduction

Just-in-time requirements analysis (JITRA) is a BA approach based on lean/agile/kanban practices.

The 2 principles underlying JITRA are that requirements:

  • Should ONLY be identified when they are needed; and
  • Should ONLY be defined at the level of detail required

The first principle aims to optimize the timing of requirements; requirements should be delivered based on need rather than convenience (e.g. the development team needs this next – versus – this feature can be specified easily).

The second principle aims to optimize the cost of requirements; superfluous detail is a waste of effort – and ultimately money.

Time and cost are finite resources on any project. JITRA aims to optimize both from a requirements perspective.

Implementation

The most widely accepted framework for implementing JITRA principles breaks analysis down into 4 major activities:

  • Initial Analysis
  • Feature Set Analysis
  • Story Analysis
  • After-action Analysis

For a detailed description of each stage – please refer to: http://cf.agilealliance.org/articles/system/article/file/1007/file.pdf
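The two principles above can be sketched in code. This is purely an illustrative model – the class and method names are my own, not part of the JITRA framework – showing a pull-based backlog in which a story is identified early (principle 1) but only elaborated at the moment the development team pulls it (principle 2):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Story:
    title: str                    # high-level feature name, known early
    detail: Optional[str] = None  # detailed analysis, deferred until pulled

class RequirementsQueue:
    """Hypothetical pull-based backlog: analysis happens on demand."""

    def __init__(self):
        self._backlog = []

    def add(self, title):
        # Principle 1: identify the requirement, but do no detailed work yet.
        self._backlog.append(Story(title))

    def pull(self, analyse):
        # Principle 2: elaborate only when the team actually needs the story.
        story = self._backlog.pop(0)
        story.detail = analyse(story.title)
        return story

queue = RequirementsQueue()
queue.add("Export report as PDF")
queue.add("Bulk-edit user roles")

# Detail is produced just in time, at the moment of the pull.
next_story = queue.pull(lambda t: f"Acceptance criteria for: {t}")
print(next_story.title, "->", next_story.detail)
```

Note that the second story in the queue still carries no detail – under this model, any analysis done on it before a pull would be potential waste.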

Challenges

Teams that implement JITRA often face perceived challenges from a range of stakeholders:

  • BA: “Without a buffer – requirements might not be ready on time. I don’t know how long it will take to analyse the requirements until I begin!”
  • PM: “Without detailed analysis at the start of the project – how can I estimate the delivery date?”
  • Developers: “Isn’t this just the product team leaving things until the last minute – and giving developers incomplete requirements?”

Although these perceived challenges could stop a team from experimenting with JITRA principles – there are strong advantages to the approach.

Advantages

  • Agile: JITRA reiterates that the Business Analyst should work on features that can go straight into the backlog. This should provide the development team with a continuous flow of requirements and avoid a “BA bottleneck”. Additionally – as requirements don’t need to be fully specified upfront – JITRA enables requirement details to emerge during iterations.
  • BA perspective: the further in advance of development requirements are defined – the more likely they are to go out of date, which in turn leads to rework and ultimately more analysis effort. JITRA addresses all three issues: stale requirements, rework and wasted effort.
  • Quality: the closer to delivery requirements can be left – the more information a BA has on which to build. This will lead to more valuable requirements.
  • Developers: Product requirements typically exist at a high level – the BA provides the detail. One problem with this approach is that if a BA provides details far in advance – one or two specific (and probably minor/low-value) detailed requirements could cause considerable development challenges. If the BA instead provides initial high-level requirements – the development team can present back a set of options for the detailed requirements, and can also quantify those options in terms of effort, risk and technical elegance.
  • PM: This approach requires a smaller upfront investment from the BAs. It also reduces waste from the requirements stage – as there is less BA rework and less redundant effort.

Brian the Business Analyst

11 Apr


Quantifying Agile

14 Mar

There have been a number of attempts to quantify Agile adoption – i.e. to measure the “Agility” of a team. The appeal is obvious. If an Agile maturity model could be defined in a quantifiable manner, then it would be possible to grade individual teams:

Team 1

70% Agile – could do better.

Team 2

85% Agile – almost there!!

Team 3

No metrics.

The visibility this affords would provide Team 1 & Team 2 with tangible feedback and aspirational goals. To take this example further – if you were able to quantify Agile adoption, then you would also be able to make informed strategic decisions and justify the existing development approach:

Let’s say you are the CIO. You pay £10,000 to bring in Agile trainers. One year later you want to measure the value of your investment. You ask Team 3 – who attended the training – and they say: “Yes we adopted it. We’re Agile”. You look at the team’s deliverables and see no improvement to their output. The quality, cost and timing of delivery have remained unchanged. Do you attribute this to Team 3 failing to implement Agile (i.e. a low level of adoption) or was Agile the incorrect approach for this team (i.e. due to the nature of the product, another approach would have had greater efficacy)?

It’s certainly an interesting question. If you would like to quantify Agile adoption – you could look at the following resources:

i) Thoughtworks Agile Self Assessment: http://www.agileassessments.com/online-assessments/agile-self-evaluation

ii) Nokia Test (a.k.a. The ScrumButt Test): http://jeffsutherland.com/nokiatest.pdf

iii) Agile Maturity Model: http://www.drdobbs.com/architecture-and-design/the-agile-maturity-model-amm/224201005

iv) Agile Artefact Audit: http://msdn.microsoft.com/en-us/library/dd997580.aspx

v) Agile Health Scorecard: http://illustratedagile.com/files/agile-health-dashboard-template.xlsx
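Most of the assessments above boil down to the same mechanic: score a team against a weighted checklist of practices. A minimal sketch of that mechanic – the practice names and weights here are invented for illustration, not taken from any of the assessments listed – might look like:

```python
# Hypothetical adoption scorecard. Real assessments (e.g. the Nokia Test)
# define their own practices and scoring rules.
PRACTICES = {
    "iterations_under_4_weeks": 0.3,
    "working_software_each_iteration": 0.3,
    "backlog_owned_by_product_owner": 0.2,
    "team_retrospectives": 0.2,
}

def agility_score(answers):
    """Return a 0-100 'Agility' percentage from yes/no practice answers."""
    return round(100 * sum(weight for practice, weight in PRACTICES.items()
                           if answers.get(practice)))

team_1 = {
    "iterations_under_4_weeks": True,
    "working_software_each_iteration": True,
    "backlog_owned_by_product_owner": False,
    "team_retrospectives": True,
}
print(f"Team 1: {agility_score(team_1)}% Agile")  # Team 1: 80% Agile
```

The sketch also makes the limitation discussed below concrete: the number it produces is only as meaningful as the practices and weights someone chose to put in the checklist.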

Although the above techniques are valuable – there is a restriction when attempting to quantify Agile adoption. “Agile” is a concept; it is based on an aspirational manifesto: http://agilemanifesto.org/. The manifesto is descriptive (e.g. you should value customer collaboration) – rather than prescriptive (e.g. you should produce a customer collaboration tracker which should be reviewed biweekly with the PMO team). The reach of “Agile” as a concept/archetype means that measuring Agile adoption will always be an imprecise science.