Continuous Improvement - Part 1 Tutorial

1 Continuous Improvement 1

Hello and welcome to ‘Domain VII—Continuous Improvement’ of the PMI-ACP Certification course offered by Simplilearn. This is the first part of this domain.

2 Objectives

After completing this lesson, you will be able to: • Explain the concepts of Kaizen • Explain retrospectives and various techniques of conducting retrospectives • Identify the steps in process analysis • Describe Agile process tailoring • Explain how Agile embeds quality throughout the project lifecycle • Identify the best practices of continuous integration

3 Kaizen - Introduction

Kaizen is a Japanese word, which stands for continuous improvement. ‘Kai’ means change and ‘Zen’ means good; put together, it means ‘change for the better.’ This technique is used by organizations across industries to come up with a competitive strategy. Kaizen advocates the involvement of people at all levels; everyone is encouraged to come up with small improvements on a continuous basis. The underlying premise of Kaizen is that big results come from small changes, accumulated over time. These small changes improve productivity, effectiveness, and innovation, while reducing waste. To support the concept of continuous improvement, organizations must invest in training and study materials, and provide constant supervision. Click each colored area in the chart to know more.

4 Kaizen - Introduction (contd.)

Agile projects provide the perfect environment to implement Kaizen. As the entire project is broken down into small iterations and releases, the learning from one can be applied in successive iterations. The focus on continuous improvement—of both the team and the product—through ‘inspect and adapt’ is supported in the Scrum framework by three key meetings: • Daily Stand-Up • Sprint Review • Sprint Retrospective. The timely resolution of impediments in the impediment log, one of the key artifacts maintained by the ScrumMaster, facilitates the implementation of Kaizen.

5 Kaizen - Key Aspects

Key aspects of Kaizen are: • Future Thinking, • Waste Reduction, • High Quality, • Low Costs, • Empowerment, • Flexible Practices, • Just-In-Time, • Customer Focus, and • Teamwork

6 Retrospectives

Recall the Agile principle, “At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.” Retrospectives, integral to Agile projects, are sessions where the team reflects on what worked and what can be improved. Every sprint typically closes with a Retrospective Meeting, where the entire Scrum team, that is, the Development Team, ScrumMaster, and Product Owner, discusses the delivery performance of the current sprint. It is timeboxed to three hours for a one-month sprint, and to correspondingly shorter periods for shorter sprints. The Agile team must understand that the retrospective is not meant to be a dissection, and must ensure it is meaningful and fruitful. Instead of being used only after a catastrophic failure, retrospectives must be conducted at regular intervals throughout the life of a project—the purpose being to learn from the experience, not to apportion blame.

7 Retrospectives-Agenda

In retrospectives, teams discuss: “What went well?”, “What did not go well?”, “What needs to change?”, and “What still puzzles us?” The retrospective is not an exercise in fault-finding; instead, it focuses on identifying best practices and getting the team’s buy-in for future implementation. Frequent retrospection enables an Agile team to converge on the right solution and meet business needs.

8 Importance of Retrospectives

Retrospectives are necessary in an Agile project environment as they help teams reflect on and learn from their experiences. They also allow teams to determine their future course of action based on the lessons learned. Retrospectives also improve communication and foster free and frank dialogue among team members, thus establishing trust. Finally, retrospectives empower teams to steer their own course and assume responsibility for both product development and their own development and growth.

9 Conducting a Retrospective-Factors

Retrospectives, attended by the team members and a facilitator, can be conducted at the iteration, release, or project level. By the end of the retrospective, the team must establish a goal to work toward in the next iteration. While conducting a retrospective, the facilitator: • ensures everybody on the team actively participates; • is experienced, neutral, and perceived by the team as non-threatening and helpful; • sets the duration, expectations, and goals for the retrospective, as well as the ground rules for the meeting.

10 Conducting a Retrospective - Steps

The retrospective meeting goes through five steps. Click each step in the “Inspect and Adapt” part of the continuous improvement cycle to know more. • In the first step, Set the Stage, the ScrumMaster clearly defines the ground rules with the aim of creating an atmosphere where people can comfortably discuss the impediments impacting the project. While an open discussion on impediments is encouraged, the participants also acknowledge that the retrospective is not the place to make personal criticisms or complaints. • In the next step, the team gathers data relevant to the problems faced during the sprint. You have already learned about problem detection and resolution techniques in the previous domain, including the fishbone diagram, the Five Whys technique, and control limits. • The third step is to generate insights based on the collated data. The team analyzes the data to infer the root causes of the identified issues. • In the fourth step, the team decides on the improvements to implement in upcoming sprints to prevent such issues from recurring. • In the last step, Close the Retrospective, the ScrumMaster thanks the team members for their contribution; team members also show appreciation for each other’s help in resolving technical issues during the sprint.

11 Techniques to Conduct Retrospectives

While there are multiple ways to conduct a retrospective, it is important to ensure this meeting does not turn confrontational. The team can adhere to Norm Kerth’s Prime Directive: “Regardless of what we discover today, we understand and truly believe that everyone did the best job they could, given what they knew at the time, their skills and abilities, the resources available, and the situation at hand.” Brainstorming and mute mapping are two common techniques used in retrospectives. Brainstorming is a collaborative technique where the team reflects on lessons learned and generates the best ideas for improvements in future sprints. Some ways of conducting brainstorming sessions are focus groups and facilitated workshops. After the brainstorming session, the team groups ideas that are similar or identical, without any discussion. This ensures that affinities between ideas are quickly identified and are not lost when the conversation is dominated by one participant. This method of silently grouping ideas is called mute mapping.

12 Brainstorming Techniques

To conduct a brainstorming session, the facilitator can use one of the following methods: Round Robin—everyone in the team is given a chance to provide their views in a round robin fashion. Team members can either choose to expand on an earlier issue or add a new perspective. Free-for-All—members can participate without any restriction and provide inputs. The only disadvantage is that the quieter members of the team may not get an opportunity to voice their views. Quiet Writing—team members write their ideas and pass them to the facilitator. This method limits the influence members may have on each other, as the ideas are generated in isolation.

13 Process Analysis Technique

Process analysis is an important technique followed by an architect, Product Owner, business analyst, or anyone who works on understanding a system, defines or refines requirements, and provides a business or process-related solution. The steps involved in process analysis are: One, identify the user of the system. Two, define the goals of the main user. Three, define the usage patterns of the system. Four, prepare a functional solution to meet user goals and usage patterns. Five, define the main navigation paths in the system. Six, create User Interface or UI mockups. Seven, polish the UI elements with the help of user input.

14 Agile Process Tailoring

Process tailoring involves customizing Agile processes to a given situation. It can include roles, processes, or procedures. The foundation of Agile is flexibility. Beyond the Agile Manifesto and Agile principles, there is very little that is immutable or considered sacred. The team must be able to modify what is not working for them. Some examples of project-specific tailoring are: • Determining how to add or remove work products and tasks • Changing milestones and the work products completed at each milestone, along with the expected extent of project or product completion at specific times • Assigning responsibilities for review and approval of work artifacts to specific people using the RACI model; RACI stands for Responsible, Accountable, Consulted, and Informed • Establishing detailed procedures to report progress, perform measurements, manage requirements or change requests, or anything else the team intends to govern differently

15 Agile Process Tailoring - ShuHaRi

The term ShuHaRi comes from Japanese martial arts, where it describes the following three stages of gaining knowledge. Click each stage to know more. SHU means “Follow the rules.” Teams newly implementing Agile must follow the guidelines provided by the methodology without tailoring any process. This is because there is specific reasoning behind every Agile process; for instance, lightweight user stories are augmented by the Sprint Planning and Daily Stand-Up meetings to identify and address gaps early in the project life cycle. HA means “Branch out.” In this stage, teams using the Agile methodology guidelines have sufficient understanding to explore new techniques and practices. This can be done either to demonstrate continuous improvement or because the current methodology does not enable value delivery. RI means “Find your own approach.” After gaining sufficient mastery over Agile practices, the team can create its own practices and guidelines to suit the project dynamics. The key learning here is that before tailoring a process, the team must ensure that the Agile framework is correctly implemented and its benefits reaped.

16 Quality in Agile

The textbook definition of quality is “conformance to requirements and fitness for use.” Quality can be broadly classified as customer quality and technical quality. Customer quality is quality perceived by the customer, and is therefore extrinsic. The customer develops a perception of quality only after the team delivers a working system, which is why Agile focuses on delivering working software early and frequently. The litmus test for customer quality is delivering “value” to the customer. Technical quality is quality perceived by the development team, and is intrinsic. It enables continual value delivery. Technical quality indicates whether the product is sound and whether the team is confident in delivering it to the customer. Poor quality can cost the organization, as it makes products unreliable and negatively impacts customer confidence. More significantly, poor quality indicates an unstable foundation and compromises the team’s ability to be responsive to customer needs, as most of the team’s time is spent fixing quality issues.

17 Project and Quality Standards for Agile Projects

An Extreme Programming practice, which can be applied to activities other than coding, is to have a well-defined coding standard and enforce adherence to it, as standards ensure a basic level of quality. All developers agree to adhere to a coding standard with the following characteristics: • It goes beyond formatting to cover indentation, commenting, and similar conventions. • It favors consistency and consensus over perfection. • It defines a minimum set of standards that the entire team follows. Teams must also develop best practices that can be posted in the team room in the form of an information radiator.

18 Quality in Agile - Best Practices

Agile recommends the following best practices to continuously improve the quality, effectiveness, and value of products: • Verification and validation • Exploratory and usability testing • Test-Driven Development • Acceptance Test-Driven Development • Continuous integration • Definition of Done All these practices aim to deliver a product with zero defects, ensure the solution works according to requirements, and improve the value of the product.

19 Quality Best Practices - Frequent Verification and Validation

The objective of an iteration in Agile is to produce code that is of ‘near-releasable’ or ‘potentially shippable’ quality. This requires the code to have passed through the verification and validation steps. Let’s take a closer look at verification and validation. Verification is testing against the stated requirements. It answers the question, “Is this product built to specification?” Verification includes testing, inspections, and peer reviews. Unit tests, system tests, and statistical analysis are used to identify and remove defects from the product. Conversely, validation answers the question, “Is this the product you wanted?” One of the strengths of Agile is that it enforces frequent validation of the end product with the customer. By frequently checking, using, and even performing tests around the working system, customers can validate whether the product being built matches their expectations, and will deliver the intended value. Validation is achieved by holding frequent demos for the users, using prototypes or wireframes, and encouraging the customers to perform “user acceptance tests” for completed components.

20 Quality Best Practices - Frequent Verification and Validation (contd.)

An Agile iteration produces artifacts, including code and documents, within a timeboxed period. Therefore, verification and validation are performed early and continually throughout the life of an Agile project. Peer or customer reviews are used to verify and validate requirements and design. This is important even if the system is developed for internal use. Code is verified and validated through code reviews, unit testing, and system and integration testing, which may include both functional and non-functional testing.

21 Quality Best Practices - Exploratory Testing

Although automated unit and functional tests provide the flexibility to execute tests multiple times and support incremental delivery, not every aspect of the solution can be covered, owing to the practical difficulty of automating all possible tests. Therefore, exploratory testing becomes essential to delivering a quality product. Exploratory testing breaks away from the traditional paradigm of software testing using planned test scripts. It is a manual testing technique that requires both testing and business domain expertise. Using exploratory testing in combination with automated tests helps improve overall test coverage.

22 Quality Best Practices - Exploratory Testing (contd.)

Exploratory testing is not a prescriptive method and requires testers to devise creative ways to test an application. It helps identify critical system and workflow defects that cannot be easily captured in an automated test. Exploratory testers are expected to: • Have extensive application and business knowledge • Find innovative and creative ways of simulating possible user behavior • Ensure thorough evaluation while managing time

23 Quality Best Practices - Usability Testing

The focus of usability testing is to gather end-user feedback on the application being developed. In usability testing, end users evaluate the product and express how comfortable they are using it. Usability testing is a type of black-box testing, and covers the following aspects of an application: • Workflow • Layouts • Navigation • Speed and performance • Ease of use • Ease of learning • Error handling • Customer satisfaction • Attractiveness The effectiveness of usability testing relies entirely on the people performing these tests. Therefore, you must select suitable end users and ensure they cover every aspect of the solution. In line with the iterative and incremental development model used in Agile, usability testing is carried out at relevant points throughout the project lifecycle to ensure the product being developed matches end-user expectations.

24 Quality Best Practices - Test - Driven Development

Test-Driven Development, or TDD, was first introduced in Extreme Programming, which holds to the belief, “If testing is good, test all the time.” TDD is Test-First Development, or TFD, combined with continuous refactoring. TDD is an evolutionary, that is, both iterative and incremental, approach to software development where developers first write a test and then write the code that satisfies the conditions of the test. This ensures there is a rich and robust collection of tests available. By automating these tests and linking them to the integration server, every time a piece of code is compiled, you can gauge its quality from the test results. Developers feel more confident refactoring code and ensuring adherence to coding standards when a larger number of tests is available.

25 Quality Best Practices - Test - Driven Development (contd.)

TDD is a rapid cycle of testing, coding, and refactoring. The tester initially writes some basic acceptance tests for the bare minimum functionality. The developer then writes the code that will pass these tests. The testers then add more scenarios for the developers to code for. At the same time, developers refactor the code already written so that it remains efficient. This cycle repeats until the testers can no longer think of tests that will fail. The flowchart on the screen indicates how TDD works. In Step 1, add a basic test, just enough to cause the test suite to fail. Next, execute the test suite, or a subset of it, to verify that the new test does indeed fail. In the third step, update the code so that the tests pass. Next, run the tests again to verify that they pass. If the tests fail again, repeat Steps 3 and 4. Finally, if all the tests pass, think of more tests that may fail. Repeat the entire cycle, Steps 1 through 6, until you can no longer think of any test that might fail. During this process, developers continue to refactor the code to keep it efficient.
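To make the steps above concrete, here is a minimal sketch of one pass through the cycle, written in Java with JUnit 4. The BowlingGame class and its methods are purely illustrative (they echo the bowling-game scoring exercise discussed later in this lesson) and are not part of the course material; in practice the test is written and run first, and the production class is added afterward, only to make the test pass.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Steps 1 and 2: write a test and run it to confirm it fails,
// because the scoring behavior has not been implemented yet.
public class BowlingGameTest {

    @Test
    public void gutterGameScoresZero() {
        BowlingGame game = new BowlingGame();
        for (int i = 0; i < 20; i++) {
            game.roll(0);              // twenty rolls, no pins knocked down
        }
        assertEquals(0, game.score()); // expected score for a gutter game
    }
}

// Steps 3 and 4: write just enough production code to make the test pass,
// then rerun the tests. (Included here only so the sketch is self-contained;
// in TDD it would be written after the test fails.)
class BowlingGame {
    private int total;

    public void roll(int pins) {
        total += pins;                 // simplest implementation that passes the test
    }

    public int score() {
        return total;
    }
}
```

The next tests a tester might add, such as scoring a spare or a strike, would fail against this naive implementation, driving the design forward one small step at a time while refactoring keeps the existing tests green.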

26 Test - Driven Development - Advantages

There are many advantages of TDD. One is that it enforces the principle of Just-In-Time, or JIT, design: design emerges just before the code is developed, rather than as a detailed design at the beginning of the project. TDD ensures that the team always has automated tests to execute against the code, even before it is written, making testing quick and easy. This way, Agile developers can validate their work by running tests as often and as early as possible. TDD gives developers the confidence to refactor the code to retain the highest quality possible, as the test suite detects whether anything is ‘broken’ as a result of refactoring. Any mistakes in refactoring are caught by an automated test. The developers get instant feedback, helping them fix issues faster. TDD substantially reduces the number of errors in the code because the code is written specifically to make the tests pass rather than fail. TDD improves design and can be extended to test and document external or public interfaces unambiguously and clearly. This guards against mistakes that may inadvertently creep into the system.

27 Quality Best Practices - Acceptance Test - Driven Development

Acceptance Test-Driven Development, or ATDD, differs from TDD in that it focuses on writing tests based on the acceptance criteria defined for user stories. Here are some key points related to ATDD: • While unit tests in TDD focus on “building the code right,” acceptance tests focus on “building the right code.” • Each requirement is expressed in terms of inputs and expected outputs. • ATDD requirements are documented in wiki pages that can be executed once the code is complete. • ATDD involves creating tests before code; the tests represent expectations of the software’s behavior. • While unit tests are white-box tests that assess the internals of a system, acceptance tests are black-box tests that assess the system’s functionality.

28 Acceptance Test - Driven Development Cycle

Let’s look at the ATDD cycle. The diagram given here shows a typical ATDD cycle comprising four stages: Discuss, Distill, Develop, and Demo. Click each stage to know more. In the ‘Discuss’ stage, the team holds a discussion with the business stakeholders who have requested a feature or story, and develops a detailed understanding of the system’s behavior from the end-user point of view. Based on this, the team defines technical interfaces and writes acceptance tests that can be executed automatically or programmatically. In the ‘Distill’ stage, the team shapes and implements the tests so they can be incorporated into the automated testing framework. This ensures that the tests don’t just remain specifications, but actually become “executable.” Examples of test automation frameworks that support defining tests before implementation are FIT, FitNesse, Concordion, and Robot Framework. In the ‘Develop’ stage, the team develops the code following a TDD approach, that is, they first write a test, then write code to pass the test, refactor, and so on. In the ‘Demo’ stage, the team ensures they cannot envision any more tests or scenarios that may cause the system to fail. They then provide a demo of the system to the stakeholders, indicating the tests run and any vulnerabilities identified. Manual exploratory testing is also done to reveal gaps in the acceptance criteria and discover defects that occur when multiple user stories are executed in a scenario.
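As a simple illustration of turning an acceptance criterion into an executable test before the feature is built, here is a hedged sketch in Java with JUnit 4. The user story, the Order class, and the figures are hypothetical and not taken from the course; teams using FIT, FitNesse, Concordion, or Robot Framework would express the same inputs and expected outputs as tables or keywords rather than JUnit assertions.

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical acceptance criterion agreed in the 'Discuss' stage:
// "When a customer adds items priced 100 and 50 to an order, the order total is 150."
public class OrderTotalAcceptanceTest {

    @Test
    public void orderTotalIsTheSumOfItemPrices() {
        // Given: an empty order
        Order order = new Order();

        // When: the customer adds two priced items (the inputs)
        order.addItem("Book", 100);
        order.addItem("Pen", 50);

        // Then: the total matches the expected output
        assertEquals(150, order.total());
    }
}

// Minimal production code, included only so the sketch compiles;
// in ATDD it would be written in the 'Develop' stage, after the test.
class Order {
    private int total;

    public void addItem(String name, int price) {
        total += price;
    }

    public int total() {
        return total;
    }
}
```

Written this way, the acceptance test doubles as a black-box specification of the story: it states what the system must do, not how the code does it.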

29 Effectiveness of TDD - Experiment 1

In this example, let’s discuss how TDD helped an IT company deliver quality products. The organization has been using TDD for five years and for over ten releases of a Java-implemented product. A set of experiments was conducted with two groups of programmers: a control group and a group using TDD. The participants were asked to develop a short program to automate the scoring of a bowling game. The results were: • TDD developers took 16% more time to write the tests and then develop the code. • The TDD group’s code passed 18% more functional black-box test cases compared to the control group. Additionally, the TDD team came up with good-quality automated test cases, whereas the control group was not able to write any worthwhile automated test cases.

30 Quality Best Practices - Continuous Integration

Continuous integration is one of the twelve practices of Extreme Programming. It originates from the belief, “If integration is good, then integrate all the time.” It is an engineering best practice that can be extended to all projects, regardless of methodology. In continuous integration, code is checked in to the mainline as soon as the developer is satisfied with it. The code must be automatically integrated into the build system and become part of the installer and infrastructure of the application. Ideally, a set of automated tests is also integrated with this process so that the code is automatically compiled, linked, built, integrated, and deployed, and the basic sanity of the system is verified. Problems during the execution of these steps are visible instantly, letting the developer know in near real-time that something is wrong. This helps developers fix issues immediately, instead of waiting until the code is picked up and verified by the testers. By attaching lava lamps or another signaling device to the integration server, you can get real-time information on whether the build was successful. Typically, a red lamp indicates the build is broken, and a green one indicates a successful build.
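As a minimal sketch of the “self-testing build” idea, assuming a Java project, a check like the one below could be run by the integration server after every commit; a non-zero exit code marks the build broken (red), and zero marks it successful (green). The class name and the checks are hypothetical and not prescribed by any particular CI tool.

```java
// Hypothetical post-build sanity check run by the integration server on every commit.
// Exit code 0 = build successful; non-zero = build broken.
public class BuildSanityCheck {

    public static void main(String[] args) {
        boolean configLoads = checkConfigurationLoads();
        boolean smokeTestPasses = checkCoreWorkflow();

        if (!configLoads || !smokeTestPasses) {
            System.err.println("Sanity check failed: build marked as broken");
            System.exit(1);   // signals the integration server to flag the build
        }
        System.out.println("Sanity check passed: build is green");
    }

    // Placeholder checks; a real project would load its configuration
    // and exercise one end-to-end workflow here.
    private static boolean checkConfigurationLoads() {
        return true;
    }

    private static boolean checkCoreWorkflow() {
        return true;
    }
}
```

Because the check runs on every commit, a failure is traced to a small, recent change, which is what makes the instant red/green feedback described above actionable.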

31 Best Practices of Continuous Integration

Continuous integration best practices are: • Maintain a single source code repository • Automate the build process, including compilation, linking, packaging, and creating an installer • Make the build self-testing; it should be able to report failures • Make sure everyone commits code to the main branch every day • Build the code as it is checked in, that is, every commit should build the mainline on an integration machine • Keep the build fast • Test the build in a sandbox, that is, a clone of the production environment • Make it easy for everybody to get the latest installer with all the most recent changes • Make the process transparent so everybody can observe it, even if the build or the tests fail • Automate deployment

32 Quality Best Practices - Definition of Done

Every project has its own definition of ‘Done’. It is important for the team to agree on and commit to this definition. Often, it is posted in the team room in the form of an information radiator. The definition sets the criteria the product must meet to be considered ‘Done’. Some of these criteria are: The product has passed all the tests, and gained user acceptance or client approval. It has passed an in-house iteration review. And, the product is of a quality that can be shipped or delivered to the customer.

33 Checklist for Story Completion

Teams must come up with their own checklists for marking a story ‘Done’. Let’s look at a sample checklist for story completion. • The story has gone through all the tests: unit, system, and integration tests. • All the functional code is written. • The design is complete, and refactoring has been done to the team’s satisfaction. • The story is properly integrated across different components of the system, such as the database and UI. • It is integrated into the build system and is part of the build. • It is available through the installer. • The story can be migrated if required, for instance, through a database schema migration. • Customers and other stakeholders have reviewed the story and confirmed that it meets their requirements. • All the identified bugs are fixed. • And, the customers agree that the story is complete.

34 Quiz

Following is the quiz section to check your understanding of the lesson. Select the correct answer and click Submit to see the feedback.

35 Summary

Let’s summarize the topics covered in this lesson: • Retrospectives are regular reviews of the sprints by the team members to discuss what worked and what needs improvement for the next iteration. • Process analysis is an important technique for anyone who works on understanding a system, defines or refines requirements, and provides a business- or process-related solution. • Process tailoring involves customizing Agile processes. • Customer quality delivers value in the short term, while technical quality enables continuous delivery of value over time. • Extreme Programming recommends creating a coding standard and enforcing adherence to it. • The objective of an iteration in Agile must be to produce code that is of ‘near-releasable’ or ‘potentially shippable’ quality. • Quality best practices include verification and validation, usability testing, TDD, ATDD, Definition of Done, and continuous integration.

36 Conclusion

This concludes ‘Continuous Improvement, Part 1.’ The next part of the domain is ‘Continuous Improvement, Part 2.’
