
Sprint Architecture

Backlog Item Breakdown

50% - New releasable feature work

10% - Bug fixes

20% - Other tech or sprint-related work, including:

  • Ad-hoc testing - EN and FR
  • Missed unit or automated tests
  • Tech debt

10% - Release activities, including:

  • Release notes and change log
  • Documentation
  • Make sure Playwright tests are passing

10% - Buffer

Release Process

We will ship once per sprint.

Ship once per sprint - ship all shippable features once per sprint.

Daily Breakdown

Length: 10 business days

Days 1-3

  • Maggie & Jugraj: write the change management ticket for the release in two weeks (not the next release, but the one after)

  • Ashiq: create the sprint branch

  • Ewa & Tarek:
    • QA & dev meeting to identify possible test cases as a group
    • Complete any additional testing of backlog items from the previous sprint (if required)
    • Write test cases for PBIs and bugs in the current sprint
    • Ad-hoc testing in the QA environment (30 min - 1 hr)
    • Test backlog items as needed

  • Dev team:
    • Develop backlog items
    • Write test cases as needed
    • Test each other's backlog items as needed

  • QA & Steve: UAT testing, if needed (when should we do prep? UAT test on day 1?)

Day 4

  • Jugraj, Ashiq and Maggie: final sign-off on backlog items to ship

  • QA, dev and Maggie: ACC smoke test

Day 5

  • Jugraj: release ACC to prod

  • Ewa & Tarek:
    • Write test cases for PBIs and bugs in the current sprint as needed
    • Test backlog items as needed

  • Dev team:
    • Develop backlog items
    • Write test cases as needed
    • Test each other's backlog items as needed

Days 6-8

  • Ewa & Tarek:
    • Write test cases for PBIs and bugs in the current sprint as needed
    • Test backlog items as needed

  • Dev team:
    • Develop backlog items
    • Write test cases as needed
    • Test each other's backlog items as needed

Day 9

  • Jugraj, Ashiq and Maggie: identify what has met the definition of done; those items can be shipped on day 5 of the next sprint. Anything not done will be completed and shipped the following sprint.

  • Tarek & Maggie:
    • Write test cases for PBIs and bugs in the current sprint as needed
    • Test backlog items as needed

  • Dev team:
    • Develop backlog items
    • Write test cases as needed
    • Test each other's backlog items as needed

Day 10

  • QA & dev team: sprint review & sprint planning; finish and wrap up what can be done

Dev - Backlog item checklist

Dev #1:

  • Complete backlog item or bug development

Dev #2:

  • Write test case
  • Create automated test
  • Manually test the feature and surrounding features

Shippable Stories

Required to ship, by type:

  • Tech debt - definition of done is met
  • PBI - definition of done is met
  • Bug - definition of done is met

One-off Sprints

  • Bug fix only sprint

  • Full regression test

Automated Testing

  • We can use Playwright for automated testing - have we solved the issue of authentication?
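The authentication question above is still open; for reference, one common Playwright pattern is to log in once in a setup project, save the session to disk, and reuse it in every test via storageState. Everything below (file paths, login URL, labels, env var names) is a hypothetical sketch, not our actual configuration:

```typescript
// playwright.config.ts (sketch): a "setup" project authenticates once,
// then the "tests" project reuses the saved session state.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  projects: [
    { name: "setup", testMatch: /auth\.setup\.ts/ },
    {
      name: "tests",
      dependencies: ["setup"],
      // Every test starts already logged in with this saved session.
      use: { storageState: "playwright/.auth/user.json" },
    },
  ],
});

// auth.setup.ts (sketch, shown as comments - the login flow is an assumption):
// import { test as setup } from "@playwright/test";
// setup("authenticate", async ({ page }) => {
//   await page.goto("/login");                                  // hypothetical URL
//   await page.getByLabel("Username").fill(process.env.QA_USER!);
//   await page.getByLabel("Password").fill(process.env.QA_PASS!);
//   await page.getByRole("button", { name: "Sign in" }).click();
//   await page.context().storageState({ path: "playwright/.auth/user.json" });
// });
```

If our app uses SSO instead of a form login, the setup step changes, but the storageState reuse stays the same.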

Sprint Branches

  • Ashiq: create the sprint branch
  • Dev team: push sprint code to the sprint branch

Sprint Environments

  • Dev testing & Playwright - Dev
  • Current release - ACC
  • Next release* - QA

*When work in QA is ready for the next release, it will be pushed to ACC.

Sprint decision-makers

  • Decide whether or not a bug or UI change suggestion will be added to the backlog - team member TBD (see Items to Action)

Regression Test

Schedule (tasks and team members TBD):

  • Day 1 - Wednesday
  • Day 2 - Thursday
  • Day 3 - Friday
  • Day 4 - Monday
  • Day 5 - Tuesday
  • Day 6 - Wednesday
  • Day 7 - Thursday
  • Day 8 - Friday
  • Day 9 - Monday
  • Day 10 - Tuesday

Release Process

There are two processes to choose from:

  1. Ship once per sprint - ship all shippable features once per sprint.

  2. Ship multiple times per sprint - ship each story when it is ready; we can ship mid-sprint.

Explore - do we want to start shipping each story? We could ship each story when it is ready instead of on a specific release day. Ashiq will look into it. Meta does this.

Day 1 - discuss what we can ship in the morning (we can tack it onto the scrum) and ship in the afternoon. The PO can inform clients of the shipment.

Sprint plans (tasks TBD):

  • Sprint 1
  • Sprint 2
  • Sprint 3

Individual Sprint Plan

Day 1 (Wednesday) - shipping option #1

  • Complete any additional testing of backlog items from the previous sprint (if required)

Day 2 (Thursday)

  • Complete any additional testing of backlog items from the previous sprint (if required)
  • Ad-hoc testing - Ewa

Day 3 (Friday)

  • Complete any additional testing of backlog items from the previous sprint (if required)

Day 4 (Monday)

  • Sign off on backlog items to ship

Day 5 (Tuesday) - shipping option #2 - Jugraj, Steve

  • Morning: ship stories that are complete
  • Afternoon:

Items to Action

Open questions and actions (decisions and statuses TBD):

  • Who will decide whether or not a bug or UI change suggestion will be added to the backlog?
  • Create a feature to hold items we will never fix, so the team can use it as a reference in the future.
  • In which environment should we test backlog items and Playwright tests - Test, ACC, or both?
  • If we have a single sprint branch, can we separate stories that are not ready to ship from stories that are ready to ship?
  • Do we need an SMGS ticket every time we ship an item (e.g. one every two weeks)?

Testing - Dave

Test cases:

  1. Submit the form with no info added - purpose: to trigger the error/warning message.

  2. Test the happy path - fill out the form with valid info and make sure it is saved correctly on the frontend and retrievable in whatever form applies (e.g. in a table, or on the same page if looked up later). You need to know what type of data you are testing for. One happy test case is enough.

  3. Test each field's limits - e.g. numbers or special characters in a letters-only field, a huge number in a name field - making sure the app does not crash. Include SQL injection: crafted input that can add extra info or delete info (e.g. a DROP TABLE command). Write user stories that cover failures - e.g. enter special characters and get an error message.

  4. If it is a multi-page form, make sure the data appears where needed (in another form, in a report, or in the database).

  5. Test for usability - the language presented is understandable in FR and EN; check for consistency and that fields line up.

  6. Every field and function needs to be tested.

  7. Test what the feature interacts with - e.g. if data needs to be populated in a report, add that to the test.

To start:

  1. Tester would probably start with ad-hoc testing first to get familiar

  2. Then write test cases

Test access to URLs - users cannot access pages they are not supposed to, e.g. by entering the admin page URL directly; test with role-based access.

Developers wrote automated tests based on what was written by QA.

Changes - the automation software generates test code, which can then be updated; save and rerun the test code.

Release process with shippable feature

Day 2, 3 and 4:

  • Dev - test any backlog items from days 9 and 10 of the last sprint that still need testing.

  • Devs cannot test backlog items they developed.

  • Dev, PO, Ewa, Maggie - ad-hoc testing of the app (30 min to 1 hr).

  • Dev - dev and QA meet to review sprint tickets and identify test cases as a group: discuss possible test cases for each ticket and which tickets interact. Test cases are stored in DevOps.

  • {PO? - UAT prep - Word doc and spreadsheet creation.}

  • {PO -

    • {Option 1: Provide a list of backlog items to users. They can test in ACC if they have time; if not, they do not have to test. One hour of testing max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}

    • {Option 2: PO hosts a regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users attend only if they can.}

Day 9 - identify what has met the definition of done. This can be shipped mid-next sprint. Anything else that is not done will be completed and shipped the following sprint.

Day 10 - Sprint review, sprint planning. Finish and wrap up what we can.

Note: we shouldn’t worry too much about carry-over at this point.

Day 1, 2 and 3:

  • Dev - test any backlog items from days 9 and 10 of the last sprint that still need testing. Devs cannot test backlog items they developed.

  • Dev, PO, Maggie - ad-hoc testing of the app (30 min to 1 hr)?

  • Dev - review sprint tickets and identify test cases as a group? (Question: where should test cases be stored?)

  • {PO? - UAT prep - Word doc and spreadsheet creation.}

  • {PO -

    • {Option 1: Provide a list of backlog items to users. They can test in ACC if they have time; if not, they do not have to test. One hour of testing max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}

    • {Option 2: PO hosts a regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users attend only if they can.}

Day 4, 5, 6, 7:

  • Dev work

  • Dev testing

  • Dev user story writing

Day 8:

  • PO walkover (if needed) - 15-30 minutes max.

Day 9:

  • Dev: sprint review planning - add the items to present at sprint review to the agenda

Day 10:

  • Dev: Sprint review dry run - prepare demo and participate in sprint review dry run

  • Dev: Sprint review - participate in sprint review

  • PO / Users: Sprint review - provide users with {test} URL, and they can test if they are interested.

Definition of Done

  • Acceptance criteria are met

  • Unit tests are complete

  • {Automated testing is complete}

  • Documentation for security groups

  • Dev has tested in QA, ACC and prod (if possible)

  • Dev sends a screenshot of new UI work, or other items for which feedback is needed, to the PO during development for sign-off. No screenshot is needed for bug fixes.

  • No other sign-off is necessary, as the PO has seen the screenshots, walk-through and review

  • A story is done when it, and anything around it, has been tested so that it is shippable

Future consideration:

  • Security

  • Code coverage

Testing

Ad-hoc testing:

  • Dev, QA (Ewa), PO - 30min to 1hr per sprint

Basic level - devs only:

  • Writing test cases

  • Reviewing each other's test cases - manually

  • Executing each other's test cases - manually

  • Still valuable - we can start automating in the future

  • Be careful not to add too many testing activities to the sprint for devs

Automated testing:

  • bUnit testing - more like unit testing for the UI, not full UI automated testing

Regression Testing

Release Plan

Release Plan - brainstorming

1 month release schedule - Release Train

Total: 2 sprints

Sprint 1

Tech debt - some tech debt might need to be tested, but it is safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.

Dev:

60% - New development

10% - Bug Fixes

20% - Tech debt

10% - Release activities

QA:

70% - Backlog item testing

20% - User testing - day 10? Either PO or QA?

10% - UAT prep

Sprint 2

Sprint 3

Dev:

QA:

50% - ACC environment testing

50% - UAT testing (with users) - Test environment?

Sprint 4

Dev:

50% - Release activities

?? % - Critical bug fixes?

?? - ??

Dev - Release Architecture Needed

  • Manual to automated - Jugraj is working with Henry on this

  • One-click deployment

  • Patch fixes - how do we implement them?

  • Anything else?

Architecture: With QA resource - brainstorming only

Day 1, 2 and 3:

  • QA - test any backlog items from days 9 and 10 of the last sprint that still need testing.

  • QA - ad-hoc testing of the app (30 min to 1 hr).

  • QA - UAT prep - Word doc and spreadsheet creation.

  • Dev - work on top-priority backlog items.

  • {PO -

    • {Option 1: Provide a list of backlog items to users. They can test in ACC if they have time; if not, they do not have to test.}

    • {Option 2: PO hosts a regular bi-weekly testing session with users to test backlog items completed last sprint. Users attend only if they can.}

Day 4, 5, 6, 7:

Testing - if QA resources - brainstorming only

If QA:

  • First three days: catch up on testing

  • Writing test cases

  • Ad-hoc testing

  • Review test cases with one of the dev team's members

  • Executing test cases

  • Test all tickets within one sprint, if possible

Testing - brainstorming only

Discussion with Grace - April 4

Bugs are addressed during the sprint; leave some space for bug testing during the sprint.

10 business days

First 3 days:

  • Ad-hoc testing

  • UAT prep - documentation regarding the previous sprint

  • Users can test what was completed last sprint - Grace or Steve; we do not need two team members

  • TGD team - QA team tests first, then the PO or a user tests, and the user or PO marks it as done. User or PO testing does not need to be completed during the same sprint.

Issues to think about:

  • How do we track modifications to backlog items? E.g. when testing, if something needs to be added, what should we do?

Discussion with Ashiq

PO and users:

  • test in ACC

  • They can test on the first day of the next sprint - we will give them a list of what we completed, and they can test if they can; if not, that is OK - maybe 1 hr of testing for them

  • Dev - send a screenshot of new UI before the functionality is built to get feedback from the PO via email. We do not need to send screenshots for things such as bugs and FR translations - only things we need feedback on.

  • PO walkover - two days before the end of the sprint - 15-30 min to take a quick look if screenshots are not enough - flexible. The PO may not need it every week.

    • Can show him work still in progress

  • Users can test in the test environment if they wish after sprint review. We can give them a few bullet points of what we worked on - either at the end of the review or on the first day of the sprint.

Test Cases - brainstorming only

Questions:

  • What will the test cases be used for? (e.g. backlog testing, automated testing, future use)

  • How can we minimize the amount of work required to write a test case? E.g. focus on developing a specific feature per sprint, or identify which backlog items will need test cases at the beginning of the sprint, so test-case writing can start on day 1.

How would we work with devs only to write test cases?

Discussion with Ashiq

QA free from manual testing

  • Devs test each other's work / test cases

  • Ad-hoc testing - Grace, Maggie, devs, Ewa, PO - anyone who can help

  • Automated testing - bUnit testing - more like unit testing for the UI, not full UI automated testing

  • Concern - do not add too many testing activities to the dev team

Basic level - devs only:

  • Writing test cases

  • Reviewing each other's test cases - manually

  • Executing each other's test cases - manually

  • Still valuable - we can start automating in the future

If QA:

  • First three days: catch up on testing

  • Writing test cases

  • Ad-hoc testing

  • Review test cases with one of the dev team's members

  • Executing test cases

  • Test all tickets within one sprint, if possible

Currently we have 3 devs and one QA.

We need to choose what to keep: quality or speed.

Other work environments - with a 1:1 QA-to-dev ratio - work at writing the user stories from the beginning of the sprint.

Trade-off: scope, time, quality.

Automated Testing

  • How can we implement?

  • How can we keep QA involved?

  • What is the outcome of implementation? E.g. less QA manual testing

Unit Testing

{Add short write-up, description}

Release Planning Architecture

2 month release schedule - Release Train

Total: 4 sprints

Sprint 1

Tech debt - some tech debt might need to be tested, but it is safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.

Dev:

60% - New development

10% - Bug Fixes

20% - Tech debt

10% - Release activities

QA:

70% - Backlog item testing

20% - User testing - day 10? Either PO or QA?

10% - UAT prep

Sprint 2

Sprint 3

Dev:

QA:

50% - ACC environment testing

50% - UAT testing (with users) - Test environment?

Sprint 4

Dev:

50% - Release activities

?? % - Critical bug fixes?

?? - ??

Dev - Release Architecture Needed

  • Manual to automated - Jugraj is working with Henry on this

  • One-click deployment

  • Patch fixes - how do we implement them?

  • Anything else?
