Leading the Transition to Effective Testing in Your Agile Team Fran O’Hara – Inspire Quality Services

[email protected] www.inspireqs.ie

Flipping the Iron Triangle

[Diagram: two iron triangles]
• Plan driven: scope/requirements FIXED; resources and schedule ESTIMATED
• Value driven: resources and schedule FIXED; scope/requirements ESTIMATED
• Quality appears in both triangles – where does it fit?

Technical Debt

Symptoms of technical debt:
• Bugs found in production
• Incomprehensible, unmaintainable code
• Insufficient or unmaintainable automated tests
• Lack of CI
• Poor internal quality
• Etc.

Quality & Test

• Quality is not equal to test. "Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other."
• "Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved."

– from 'How Google Tests Software', James Whittaker et al.

Release planning level – a testing perspective

Add value in release (re-)planning by:
• Supporting the Product Owner in writing User Stories/Epics and making sure they are testable
• Participating in the high-level risk analysis of those User Stories/Epics
• Ensuring estimation includes the testing perspective
• Planning the testing for the release level, i.e. creating a test strategy/approach (resources, tools, test levels, static testing, test environments, test automation targets) based on the scope and risks identified for that release
• Playing a key role in defining the acceptance criteria (definition of done) of the release, and later of the iteration/Sprint

Adapted from ISTQB Agile Tester Extension Syllabus

Lessons Learnt / Challenges
• Test Strategy & Risk
• Requirements (e.g. story size, non-functional)
• Definition of Done
• Test Competency
• Test Automation
• Line Management
• Techniques (e.g. exploratory), Planning for Quality, Documentation, …

EuroSTAR webinar poll: 84% had experienced 3 or more of these challenges (58% 4 or more)

Basic Testing within a Sprint (Test Strategy & Risk)
• Automated Acceptance/Story-based Tests – represent executable requirements
• Automated Unit Tests – represent executable design specifications
• Manual Exploratory Tests – provide supplementary feedback

© 2014 Inspire Quality Services
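The distinction on this slide can be made concrete in code: a unit test pins down a design decision inside one component, while a story-level acceptance test states a requirement end to end. A minimal Python sketch – the functions `normalize_room_code` and `book_room` are hypothetical, invented only for illustration:

```python
# Sketch: unit test as executable design spec vs. acceptance test as
# executable requirement. Both functions are hypothetical examples.

def normalize_room_code(raw: str) -> str:
    """Design decision under test: room codes are upper-case, no padding."""
    return raw.strip().upper()

def book_room(code: str, rooms: dict) -> str:
    """Story-level behaviour: booking a free room marks it as booked."""
    key = normalize_room_code(code)
    if rooms.get(key) == "free":
        rooms[key] = "booked"
        return "confirmed"
    return "unavailable"

# Unit test: specifies the design of one component.
assert normalize_room_code("  a101 ") == "A101"

# Acceptance/story test: specifies the requirement end to end.
rooms = {"A101": "free"}
assert book_room(" a101 ", rooms) == "confirmed"
assert rooms["A101"] == "booked"
assert book_room("a101", rooms) == "unavailable"
```

Both suites run inside the sprint; the exploratory testing on the slide then probes what neither suite anticipated.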

Agile Testing Quadrants – Risk! (Test Strategy & Risk)

[Figure: agile testing quadrants diagram – not extracted]

Definition of ‘Done’

• An agreement between the PO and the Team
  – Evolving over time to increase quality & ‘done-ness’
• Used to guide the team in estimating and doing
• Used by the PO to increase predictability and to accept Done PBIs
• ‘Done’ may apply to a PBI and to an Increment
• A single DoD may apply across an organisation, or a product
  – Multiple teams on a product share the DoD

DoD example

Story level:
• Unit tests passed
• Unit tests achieve 80% decision coverage
• Integration tests passed
• Acceptance tests passed, with traceability to story acceptance criteria
• Code and unit tests reviewed
• Static analysis shows no important warnings
• Coding-standard compliant
• Published to Dev server

Sprint level:
• Reviewed and accepted by PO
• End-to-end functional and feature tests passed
• All regression tests passing
• Exploratory testing completed
• Performance profiling complete
• Bugs committed in sprint resolved
• Deployment/release docs updated and reviewed
• User manual updated

Release level:
• Released to Stage server
• Deployment tests passed
• Deployment/release docs delivered
• Large-scale integration performance/stress testing passed
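A DoD like the one above is, in effect, a gate: an increment is Done only when every item at the relevant level holds. A tiny hypothetical Python sketch of that idea (the checklist items are abbreviated from the story-level column; nothing here is prescribed by the deck):

```python
# Hypothetical sketch: a Definition of Done treated as an executable gate.
# An item is Done only when every checklist entry at its level is satisfied.
story_dod = {
    "unit tests passed": True,
    "80% decision coverage": True,
    "acceptance tests passed": True,
    "code and unit tests reviewed": False,
}

def is_done(checklist: dict) -> bool:
    """True only when every DoD item is satisfied."""
    return all(checklist.values())

assert is_done(story_dod) is False      # one item outstanding: not Done
story_dod["code and unit tests reviewed"] = True
assert is_done(story_dod) is True       # all items hold: Done
```

Teams often automate exactly this gate in CI, so that 'Done' cannot quietly erode sprint by sprint.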

The Automation Pyramid (based on Mike Cohn)

• Manual Tests (e.g. exploratory) – at the top, outside the automated layers
• GUI layer (e.g. Selenium) – automate at feature/workflow level
• API/Service layer – Acceptance Tests (e.g. FitNesse, Cucumber) – automate at story level
• Unit/Component layer – Developer Tests (e.g. JUnit) – automate at design level
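The pyramid's middle layer is the one teams most often miss: automating at story level through a service/API facade rather than the GUI. A hypothetical Python sketch of that idea – `ReservationService` is invented for illustration; in practice this is the seam where FitNesse or Cucumber fixtures would plug in:

```python
# Sketch of the pyramid's API/service layer: drive a story-level acceptance
# test through a service facade, not the GUI. ReservationService is
# hypothetical and exists only to illustrate the layering.

class ReservationService:
    """Thin service facade that a GUI would also call."""
    def __init__(self):
        self._reservations = {}
        self._next_id = 1

    def reserve(self, guest: str, night: str) -> int:
        rid = self._next_id
        self._next_id += 1
        self._reservations[rid] = (guest, night)
        return rid

    def cancel(self, rid: int) -> bool:
        return self._reservations.pop(rid, None) is not None

# Story-level acceptance test at the API layer: fast and GUI-independent.
svc = ReservationService()
rid = svc.reserve("Ana", "2014-06-01")
assert svc.cancel(rid) is True    # cancelling an existing reservation works
assert svc.cancel(rid) is False   # a reservation cannot be cancelled twice
```

Tests here stay fast and stable, which is why the pyramid puts far more of them below the GUI layer than in it.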

Test Competency

Agile Development Team (analysts, programmers, testers, architect, DBA, UI/UX, etc.) create each increment of ‘Done’ product – no specialised sub-teams.

[Figure: the contrasting structure being questioned – an Architect and a Team Lead, with a QA Lead over Tester1/Tester2, a BA Lead over BA1/BA2, and Developer1/Developer2 in their own silo]

Is testing fully integrated? (Requirements – e.g. story size, non-functional)

[Diagram: three patterns of scheduling coding and testing across Sprint 1 and Sprint 2, fed from the Sprint Backlog]
• A: Code in one sprint; Test, and Code & Bug Fix, trail into the next sprint
• B: Code, then a separate Test phase within each sprint, with Code & Bug Fix following
• C: Code & Bug Fix and Test fully interleaved within every sprint

User Story Example – Hotel Reservation

Reservation Cancellation: As a user, I want to cancel a reservation so that I avoid being charged the full rate.

Confirmation:
• Verify a premium member can cancel the same day without a fee
• Verify a non-premium member is charged 10% for same-day cancellation but is otherwise not charged
• Verify an email confirmation is sent to the user with appropriate information
• Verify that the hotel is notified within 10 minutes of a cancellation

Conversation:
• What if I am a premium member – do I have charges?
• When is a non-premium member charged, and how much?
• How do these vary depending on when cancellation occurs?
• Do we need to send the user confirmation by email?
• When does the hotel need to be notified?
• What if the user has paid a deposit?

Consider also specification by example/BDD

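The 'Confirmation' bullets above are already executable examples in waiting. A minimal Python sketch – `cancellation_fee` and its signature are hypothetical, invented to illustrate the fee rules as stated on the slide (same day: premium free, non-premium 10%; otherwise no charge):

```python
# Hypothetical sketch: the story's confirmation criteria turned into
# executable examples. Function name and signature are illustrative only.

def cancellation_fee(rate: float, premium: bool, same_day: bool) -> float:
    """Fee rules from the slide: 10% for a non-premium same-day
    cancellation; free in every other case."""
    if same_day and not premium:
        return round(rate * 0.10, 2)
    return 0.0

# Verify a premium member can cancel the same day without a fee.
assert cancellation_fee(200.0, premium=True, same_day=True) == 0.0
# Verify a non-premium member is charged 10% for same-day cancellation...
assert cancellation_fee(200.0, premium=False, same_day=True) == 20.0
# ...but otherwise is not charged.
assert cancellation_fee(200.0, premium=False, same_day=False) == 0.0
```

In a BDD tool such as Cucumber, the same examples would become Given/When/Then scenarios tied to this story.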

Context: Acceptance Test Driven Development

[Figure: ATDD context diagram – not extracted]

[Figure: calendar of Scrum events across WEEK 1 and WEEK 2, with attendees marked per event – All; All, CU; PO, TM, SM (All); TM]

PO: Product Owner – SM: ScrumMaster – TM: Development Team – CU: Customer
Each event is timeboxed. Times provided are maximum times from the Scrum Guide at scrum.org, based on a 1-month sprint. Each event is an opportunity to inspect and adapt.

Examples of how to evolve quality/test practices…
• See Google’s ‘Test Certified’ levels
• Paddy Power’s review of team practices – using a scale of 0–5 for items such as:
  – Code Reviews, Pair Programming, Code Analysis, Unit Tests, Continuous Integration, Automated Acceptance Tests, Data Generation, Performance Analysis, TDD, etc.
  (from Graham Abel, Softtest Test Automation Conference 2013)
• Communities of Practice, Tribes, Mentors, etc.
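A 0–5 practice review like Paddy Power's is straightforward to operationalise: score each practice, then focus improvement on the weakest areas. A hypothetical Python sketch (the scores are invented for illustration; the practice names are from the slide):

```python
# Hypothetical sketch of a 0-5 team practice review: score each practice
# and surface the weakest areas to focus improvement on. Scores invented.
scores = {
    "Code Reviews": 4,
    "Pair Programming": 1,
    "Code Analysis": 3,
    "Unit Tests": 3,
    "Continuous Integration": 5,
    "Automated Acceptance Tests": 2,
    "TDD": 1,
}

def focus_areas(scores: dict, threshold: int = 2) -> list:
    """Practices scoring at or below the threshold, weakest first."""
    return sorted((p for p, s in scores.items() if s <= threshold),
                  key=lambda p: scores[p])

assert focus_areas(scores) == [
    "Pair Programming", "TDD", "Automated Acceptance Tests"
]
```

Re-running the review each quarter (as a retrospective input) makes the team's progress visible in the same way Google's Test Certified levels do.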

Conclusions to leading the way
• Educate yourself on what agile really means and how quality and test fit in your context
• Avoid the common pitfalls; use retrospectives to address current issues
• Be proactive in release-level planning regarding test strategy/approach and ‘big picture’ thinking
  – Get help from outside the team if needed
• Organise a test CoP!
• Collaborate, Communicate and Question constantly – Prevention!

Thank You!

Fran O’Hara
InspireQS
www.inspireqs.ie
[email protected]
