
Is It Done? - Yes. Well, Maybe. Actually, No.

June 3, 2022
6 min read
By Ayman Ali

For Scrum teams on Agile projects, the delivery process needs to be impeccable. To ensure that your deliverables are ready for client review, it is essential to respect the SDLC throughout the development process.

Definition of Done (DOD)

The Definition of Done (DoD) in Agile, known as Exit Criteria in other SDLC approaches, gives you the strong foundation you need to keep delivering value early and often. It builds on the three pillars of Scrum: transparency, inspection, and adaptation. It helps Scrum teams collaborate more effectively, increases transparency, and ultimately results in consistently higher-quality software.

In general, the Scrum team owns the Definition of Done; it is shared between the development team and the product owner. However, only the development team is in a position to define it, because it asserts the quality of the work the team must perform. A clearly defined DoD is evidence of the five values that all Scrum teams share, as described in the Scrum Guide: commitment, courage, focus, openness, and respect.

Quality Assurance in Agile

Quality Assurance plays a vital role in Agile projects. Testers must understand the values and principles that underpin Agile-driven projects. Moreover, testers are an integral part of a whole-team approach, together with developers and business representatives. Members of an Agile project communicate with each other early and frequently, which helps remove defects early; this accelerates development activities and results in a quality product. Quality is everyone’s responsibility.

From a Quality Assurance perspective, the Definition of Done is one of the constituent practices of Scrum in fast-paced development built on continuous integration and continuous delivery/deployment (CI/CD) pipelines. To make sure there is a potentially releasable product at the end of each sprint, the Scrum team discusses and defines appropriate criteria for sprint completion. The discussion deepens the team’s understanding of the backlog items and the product requirements.

Definition of Done & Test Levels

Each test level has its own definition of done. The following list gives examples that may be relevant at each level.

Unit testing

  • Achieving 100% decision coverage where possible, with careful reviews of any infeasible paths.
  • Performing static analysis on the entire code.
  • Resolving major defects (ranked by priority and severity).
  • Leaving no unacceptable technical debt in the design and the code.
  • Reviewing all code, unit tests, and unit test results.
  • Automating all unit tests.
  • Keeping important characteristics within agreed limits (e.g., performance).

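To make the first two criteria concrete, here is a minimal sketch of decision coverage in a unit test. The function and test names are hypothetical, invented for illustration: a function with a single `if` decision needs at least one test per branch to reach 100% decision coverage.

```python
# Hypothetical example: a small function with one decision point.
def apply_discount(price: float, is_member: bool) -> float:
    """Return the price after a 10% member discount."""
    if is_member:           # decision: True branch
        return round(price * 0.9, 2)
    return price            # decision: False branch

# Two unit tests are enough for 100% decision coverage here:
# one exercises the True branch, one the False branch.
def test_member_gets_discount():
    assert apply_discount(100.0, is_member=True) == 90.0

def test_non_member_pays_full_price():
    assert apply_discount(100.0, is_member=False) == 100.0
```

A coverage tool (e.g., `coverage.py` with branch coverage enabled) can verify that both branches were actually executed by the automated suite.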
Integration testing

  • Testing all functional requirements, including positive and negative tests, with the number of tests based on size, complexity, and risk.
  • Testing all interfaces between units.
  • Covering all quality risks according to the agreed extent of testing.
  • Resolving major defects (prioritized according to risk and importance).
  • Reporting all discovered defects.
  • Automating all regression tests where possible, and storing the automated tests in a common repository.
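The "interfaces between units" criterion can be sketched as follows. The classes are hypothetical stand-ins: the test exercises the boundary where one unit (`Cart`) calls into another (`PriceService`), with both a positive and a negative case.

```python
# Hypothetical example: an integration test across the interface of two units.
class PriceService:
    def unit_price(self, sku: str) -> float:
        return {"apple": 0.50, "bread": 2.25}[sku]

class Cart:
    def __init__(self, prices: PriceService):
        self.prices = prices
        self.items: list[str] = []

    def add(self, sku: str) -> None:
        self.items.append(sku)

    def total(self) -> float:
        # Interface under test: Cart depends on PriceService.unit_price.
        return round(sum(self.prices.unit_price(s) for s in self.items), 2)

def test_cart_totals_via_price_service():
    cart = Cart(PriceService())
    cart.add("apple")
    cart.add("bread")
    assert cart.total() == 2.75        # positive test: known SKUs
    cart.add("unknown")
    try:
        cart.total()                   # negative test: unknown SKU must fail
        assert False, "expected KeyError for unknown SKU"
    except KeyError:
        pass
```

Kept in a shared repository and run in CI, such tests double as the automated regression suite the criteria call for.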

System testing

  • Testing user stories end-to-end, with follow-up on features and functions.
  • Covering all user personas.
  • Covering the most important quality characteristics of the system (e.g., performance, robustness, reliability).
  • Performing tests in a production-like environment, including the hardware and software of all supported configurations, to the extent possible.
  • Covering all quality risks according to the agreed extent of testing.
  • Automating all regression tests where possible, and storing the automated tests in a common repository.
  • Reporting and, where possible, fixing all discovered defects.
  • Resolving major defects (prioritized according to risk and importance).

User Story

The definition of done for user stories has the following criteria:

  • The user stories selected for the iteration are complete, understood by the team, and have detailed, testable acceptance criteria.
  • All elements of the user story are specified and reviewed, including the user story acceptance tests.
  • The tasks necessary to implement the selected user stories have been identified and estimated.
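A "testable acceptance criterion" is one that can be turned directly into a pass/fail check. The sketch below is hypothetical, with the story, function, and messages invented for illustration; the Given/When/Then structure appears as comments in the tests.

```python
# Hypothetical user story: "As a user, I can reset my password with a valid token."
# Acceptance criterion: a valid token resets the password; an expired one is rejected.

def reset_password(token_valid: bool, new_password: str) -> str:
    # Minimal stand-in for the real implementation.
    if not token_valid:
        return "error: token expired"
    return "password updated"

def test_valid_token_resets_password():
    # Given a valid reset token, When the user submits a new password,
    # Then the password is updated.
    assert reset_password(True, "s3cret!") == "password updated"

def test_expired_token_is_rejected():
    # Given an expired token, Then the reset is refused.
    assert reset_password(False, "s3cret!") == "error: token expired"
```

If a criterion cannot be expressed this way, even in outline, that is a signal the story is not yet detailed enough to pull into the iteration.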

Feature

The definition of done for features, which may span multiple user stories or epics, may include:

  • All constituent user stories, with their acceptance criteria, are defined and approved.
  • The design is complete, with no technical debt.
  • The code is complete, with no technical debt or unfinished refactoring.
  • Unit tests have been performed and have achieved the defined level of coverage.
  • Integration tests and system tests for the feature have been performed according to the coverage criteria.
  • Any major defects have been reviewed and fixed.
  • Feature documentation is complete, which may include release notes, user manuals, and online help functions.

Iteration

The definition of done for the iteration may include the following:

  • All features for the iteration are ready and individually tested according to the feature level criteria.
  • Any non-critical defects that cannot be fixed within the constraints of the iteration have been added to the product backlog and prioritized.
  • Testing and finalizing the integration of all features for the iteration.
  • Documentation written, reviewed, and approved.

At this point, with the iteration complete, the software is potentially releasable. However, not every iteration results in a release.

Release

The definition of done for a release, which may span multiple iterations, usually includes the following areas:

  • Coverage: All relevant test basis elements for all contents of the release are covered by testing. The size and complexity of the changes, together with the associated risk of failure, determine the adequacy of the coverage.
  • Quality: The defect intensity (e.g., defects arising per day or per transaction) and the defect density (e.g., the number of defects compared to the number of user stories, effort, and/or quality attributes) must be within acceptable limits. The estimated number of remaining defects must also be acceptable, with a clear understanding of the severity and priority of unresolved defects; the residual level of risk associated with each identified quality risk must be understood and accepted.
  • Time: The release should contain the predetermined features and functionality. Once the delivery date has been reached, the business considerations for releasing or delaying the release need to be weighed.
  • Cost: The estimated lifecycle cost should serve to calculate the return on investment for the delivered system (i.e., the calculated development and maintenance cost should be considerably lower than the expected total sales of the product). After release, most of the lifecycle cost typically comes from maintenance, due to defects that escape to production.

Is it done? - Yes, sure, it’s done.

“When you have a clearly defined Definition of Done, you have control of the process, you have structured your objectives, and you have placed the right steps on the road to success.”

References:

  1. The Scrum Guide by Ken Schwaber and Jeff Sutherland
  2. The Agile Manifesto
  3. The Scrum Values
  4. ISTQB-AT
  5. ISO/IEC 25010:2011
  6. ISO/IEC/IEEE 26515:2018
