Sprint Architecture
Backlog Item Breakdown
Percent | Task |
---|---|
50% | |
30% | Other tech or sprint-related work including: |
10% | |
10% | |
Required Backlog Items
Ad-hoc testing
FR translation testing
Daily Breakdown
Length: 10 business days
Day | Weekday | Task | Team Members |
---|---|---|---|
1 | Wednesday | | Maggie / Jugraj |
2 | Thursday | | Ewa |
3 | Friday | | |
4 | Monday | | |
5 | Tuesday | | Ewa |
6 | Wednesday | | |
7 | Thursday | | Ewa |
8 | Friday | | |
9 | Monday | | |
10 | Tuesday | | Ewa |
Dev - Backlog item checklist
# | Team Member | Task |
---|---|---|
1 | Dev #1 | |
2 | Dev #2 | |
3 | | |
One-off Sprints
Bug fix only sprint
Full regression test
Automated Testing
We can use Playwright for automated testing - have we solved the issue of authentication?
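If we do go with Playwright, one common pattern for the authentication question is to sign in once in a setup step and save the browser storage state for other tests to reuse. The sketch below assumes a plain username/password login form; the URL, selectors and environment variable names (`APP_URL`, `TEST_USER`, `TEST_PASSWORD`) are placeholders, not our actual app.

```ts
// auth.setup.ts - sign in once and save the session for reuse (sketch only, not our real selectors)
import { test as setup, expect } from '@playwright/test';

const authFile = '.auth/user.json'; // hypothetical path for the saved session

setup('authenticate', async ({ page }) => {
  // Placeholder URL and form fields - replace with the real login flow.
  await page.goto(`${process.env.APP_URL ?? 'http://localhost:3000'}/login`);
  await page.getByLabel('Username').fill(process.env.TEST_USER ?? '');
  await page.getByLabel('Password').fill(process.env.TEST_PASSWORD ?? '');
  await page.getByRole('button', { name: 'Sign in' }).click();
  await expect(page.getByText('Dashboard')).toBeVisible();

  // Persist cookies/local storage so other tests can start already signed in.
  await page.context().storageState({ path: authFile });
});
```

Other test files would then point at the saved `.auth/user.json` via the `storageState` option in `playwright.config.ts`, so they start already signed in. Whether this works with our actual identity provider (e.g. SSO or MFA) is still the open question above.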
Sprint Branches
Task | Branch | Team Member |
---|---|---|
Create sprint branch {on first day of sprint?} | Sprint branch | |
Push sprint code to sprint branch | Sprint branch | Dev team |
Sprint Environments
Task | Environment | Team Member |
---|---|---|
Dev testing | Dev | Dev |
Manual QA test, automated test | {Test?} | Dev and QA |
| {ACC?} | Dev and QA |
Sprint decision-makers
Task | Team Member |
---|---|
Whether or not a bug or UI change suggestion will be added to the backlog | |
Regression Test
Tasks
Schedule
Day | Weekday | Tasks | Team Member |
---|---|---|---|
1 | Wednesday | ||
2 | Thursday | ||
3 | Friday | ||
4 | Monday | ||
5 | Tuesday | ||
6 | Wednesday | ||
7 | Thursday | ||
8 | Friday | ||
9 | Monday | ||
10 | Tuesday |
Items to Action
Decision | Team Member | Status |
---|---|---|
Who will decide whether or not a bug or UI change suggestion will be added to the backlog | ||
Create a feature to hold items we will never fix, so the team can use it as a reference in the future | ||
In which environment should we test backlog items and Playwright tests - Test or ACC, or both? | |
Release process with shippable feature
We can ship mid-sprint.
Explore: do we want to start shipping each story? We could ship a story as soon as it is ready rather than on a specific release day. Ashiq will look into it. Meta does this.
Day 1 - Discuss what we can ship in the morning (we can tack it onto the scrum), and ship in the afternoon. The PO can inform clients of the shipment.
Day 2, 3 and 4:
Dev - Any backlog items that need to be tested from days 9 and 10 of last sprint.
Dev cannot test backlog items they developed.
Dev, PO, Ewa, Maggie - Ad-hoc testing of app (30min to 1 hr.)
Dev - meet with dev and QA to review sprint tickets and identify test cases as a group - discuss possible test cases for each ticket and which tickets interact. Test cases are stored in DevOps (a possible test-case shape is sketched after this list).
{PO? - UAT prep - Word doc and spreadsheet creation.}
{PO - one of the two options below:}
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test. One hour of testing only max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users only attend if they can. }
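If it helps the dev/QA test-case session above, here is a sketch of a minimal test-case record we could agree on before entering the cases into DevOps. The fields and the example ticket are purely illustrative, not a prescribed format.

```ts
// A hypothetical shape for the test cases we draft as a group before entering them in DevOps.
interface SprintTestCase {
  ticketId: string;          // DevOps work item the case belongs to
  title: string;             // short description of what is being verified
  steps: string[];           // manual steps, written so anyone on the team can execute them
  expectedResult: string;    // what "pass" looks like
  interactsWith?: string[];  // other tickets this case touches, per the group discussion
}

// Illustrative example only - not a real ticket.
const example: SprintTestCase = {
  ticketId: 'AB-123',
  title: 'FR translation appears on the settings page',
  steps: ['Switch the app language to FR', 'Open the settings page'],
  expectedResult: 'All labels on the settings page are displayed in French',
  interactsWith: ['AB-124'],
};
```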
Day 9 - identify what has met the definition of done. This can be shipped mid-next sprint. Anything else that is not done will be completed and shipped the following sprint.
Day 10 - Sprint review, sprint planning. Finish and wrap up what we can.
Note: we shouldn’t worry too much about carry-over at this point.
Day 1, 2 and 3:
Dev - Any backlog items that need to be tested from days 9 and 10 of last sprint. Dev cannot test backlog items they developed.
Dev, PO, Maggie - Ad-hoc testing of app (30min to 1 hr.)?
Dev - Review sprint tickets and identify test cases as a group? (Question: where should test cases be stored?)
{PO? - UAT prep - Word doc and spreadsheet creation.}
{PO - one of the two options below:}
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test. One hour of testing only max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users only attend if they can. }
Day 4, 5, 6, 7:
Dev work
Dev testing
Dev user story writing
Day 8:
PO walkthrough (if needed) - 15-30 minutes max.
Day 9:
Dev: Sprint review planning - add items to present at sprint review to agenda
Day 10:
Dev: Sprint review dry run - prepare demo and participate in sprint review dry run
Dev: Sprint review - participate in sprint review
PO / Users: Sprint review - provide users with {test} URL, and they can test if they are interested.
Definition of Done
Acceptance criteria are complete
Unit test is complete
{Automated testing is complete}
Dev has tested in QA, ACC and prod (if possible)
Dev sends a screenshot of new UI work, or other items for which feedback is needed, to the PO during the development process for sign-off. No screenshot is needed for bug fixes.
No other sign-off is necessary, as the PO has seen the screenshots, walk-through and review
A story is done when it, and anything around it, has been tested so that it is shippable
Future consideration:
Security
Code coverage
Testing
Ad-hoc testing:
Dev, QA (Ewa), PO - 30min to 1hr per sprint
Basic level - Devs only
Writing test cases
Reviewing each other's test cases - manually
Executing each other's test cases - manually
Still valuable - can start automating in the future
Be careful not to add too many testing activities to the sprint for devs
Automated testing:
bUnit testing - more like unit testing for the UI - not full UI automated testing
Regression Testing
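As one option for the regression piece, assuming we end up on Playwright, regression checks could be tagged in the test title and run as a subset on demand. The test below is a placeholder flow, not one of our real pages.

```ts
import { test, expect } from '@playwright/test';

// Tagging regression checks in the title lets us run only that subset, e.g.:
//   npx playwright test --grep @regression
test('login page loads @regression', async ({ page }) => {
  // Placeholder URL and assertion - replace with a real smoke/regression check.
  await page.goto(process.env.APP_URL ?? 'http://localhost:3000');
  await expect(page.getByRole('heading', { name: 'Sign in' })).toBeVisible();
});
```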
Release Plan
Release Plan - brainstorming
1 month release schedule - Release Train
Total: 2 sprints
Sprint 1
Tech debt - some tech debt might need to be tested, but it is generally safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.
Dev:
60% - New development
10% - Bug Fixes
20% - Tech debt
10% - Release activities
QA:
70% - Backlog item testing
20% - User testing - day 10? Either PO or QA?
10% - UAT prep
Sprint 2
Sprint 3
Dev:
QA:
50% - ACC environment testing
50% - UAT testing (with users) - Test environment?
Sprint 4
Dev:
50% - Release activities
?? % - Critical bug fixes?
?? - ??
Dev - Release Architecture Needed
Manual to automated - Jugraj is working with Henry on this
One click deployment
Patch fixes - how do we implement?
Anything else?
Architecture: With QA resource - brainstorming only
Day 1, 2 and 3:
QA - Any backlog items that need to be tested from days 9 and 10 of last sprint.
QA - Ad-hoc testing of app (30min to 1 hr.).
QA - UAT prep - Word doc and spreadsheet creation.
Dev - Work on top priority backlog items
{PO - one of the two options below:}
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint. Users only attend if they can.}
Day 4, 5, 6, 7:
Testing - if QA resources - brainstorming only
If QA:
First three days, can catch up on testing
Writing test cases
Ad-hoc testing
Review test cases with one of the dev team members
Executing test cases
Test all tickets within one sprint - if possible
Testing - brainstorming only
Discussion with Grace - April 4
Bugs addressed during the sprint. Leave some space for bug testing during the sprint
10 business days
First 3 days:
Ad-hoc testing
UAT prep - documentation regarding the previous sprint
Users can test what was completed last sprint - Grace or Steve, do not need two team members
TGD team - QA team tests first, and then PO or user tests, and the user or PO marks it as done. User or PO testing does not need to be complete during the same sprint.
Issues to think about:
How to track modifications to backlog items. E.g. when testing, if something needs to be added, what should we do?
Discussion with Ashiq
PO and users:
test in ACC
They can test on the first day of the next sprint - we will give them a list of what we completed and they can test if they can; if not, that is OK - maybe 1 hr of testing for them
Dev - send a screenshot of new UI before the functionality is built to get feedback from the PO via email. No need to send screenshots for things such as bugs and FR translations - only things we need feedback on.
PO walkthrough - two days before the end of the sprint - 15-30 min to take a quick look if screenshots are not enough - flexible. The PO may not need it every week.
Can show him work still in progress
Users can test in test environment if they wish after sprint review. We can give them a few bullet points of what we worked on. Either at end of review or first day of sprint.
Test Cases - brainstorming only
Questions:
What will the test cases be used for? (e.g. backlog testing, automated testing, future use)
How can we minimize the amount of work required to write a test case? E.g. focus on developing a specific feature per sprint, so test case writing can start on day 1? Perhaps identify which backlog items will need test cases at the beginning of the sprint, so test case writing can start right away?
How would we work with dev’s only to write test cases?
Discussion with Ashiq
QA free from manual testing
Dev’s test each other’s work / test cases
Ad-hoc testing - Grace, Maggie, dev’s, Ewa, PO - anyone who can help
Automated testing - B Unit testing - more like unit testing for the UI - not full UI automated testing
Concern - not to add to many testing activities to dev team
Basic level: - Dev’s only
Writing test cases
Reviewing each other's test cases - manually
Executing each other's test cases - manually
Still valuable - can start automating in the future
If QA:
First three days, can catch up on testing
Writing test cases
Ad-hoc testing
Review test cases with one of the dev team members
Executing test cases
Test all tickets within one sprint - if possible
Currently we have 3 dev’s and one QA
Need to keep quality or speed
Other work environments - 1 to 1 with the dev team - work on writing the user stories from the beginning of the sprint
Scope
Time
Quality
Automated Testing
How can we implement?
How can we keep QA involved?
What is the outcome of implementation? E.g. less QA manual testing
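One possible answer to keeping QA involved, again assuming Playwright: QA continues to own the wording of the manual test cases, and devs mirror those steps with `test.step` so the automated report reads like the manual case. The flow and selectors below are hypothetical.

```ts
import { test, expect } from '@playwright/test';

// Each test.step mirrors a line of the manual test case that QA wrote or reviewed,
// so the HTML report reads the same way the manual case does.
test('user can save a profile change', async ({ page }) => {
  await test.step('Open the profile page', async () => {
    await page.goto(`${process.env.APP_URL ?? 'http://localhost:3000'}/profile`);
  });

  await test.step('Change the display name and save', async () => {
    await page.getByLabel('Display name').fill('Test User');
    await page.getByRole('button', { name: 'Save' }).click();
  });

  await test.step('A confirmation message is shown', async () => {
    await expect(page.getByText('Profile updated')).toBeVisible();
  });
});
```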
Unit Testing
{Add short write-up, description}
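Until the write-up above is added, here is a minimal sketch of what a unit test looks like, using Node's built-in test runner; the `addLineTotals` function is invented purely for illustration.

```ts
import { test } from 'node:test';
import assert from 'node:assert/strict';

// Hypothetical pure function - unit tests target small pieces of logic like this,
// with no browser, database, or environment involved.
function addLineTotals(lines: { qty: number; unitPrice: number }[]): number {
  return lines.reduce((sum, line) => sum + line.qty * line.unitPrice, 0);
}

test('addLineTotals sums quantity times unit price', () => {
  const total = addLineTotals([
    { qty: 2, unitPrice: 10 },
    { qty: 1, unitPrice: 5 },
  ]);
  assert.equal(total, 25);
});

test('addLineTotals returns 0 for an empty list', () => {
  assert.equal(addLineTotals([]), 0);
});
```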
Release Planning Architecture
2 month release schedule - Release Train
Total: 4 sprints
Sprint 1
Tech debt - some tech debt might need to be tested, but it is generally safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.
Dev:
60% - New development
10% - Bug Fixes
20% - Tech debt
10% - Release activities
QA:
70% - Backlog item testing
20% - User testing - day 10? Either PO or QA?
10% - UAT prep
Sprint 2
Sprint 3
Dev:
QA:
50% - ACC environment testing
50% - UAT testing (with users) - Test environment?
Sprint 4
Dev:
50% - Release activities
?? % - Critical bug fixes?
?? - ??
Dev - Release Architecture Needed
Manual to automated - Jugraj is working with Henry on this
One click deployment
Patch fixes - how do we implement?
Anything else?