Table of Contents | ||||
...
Day | Task | Team Members
---|---|---
Day 1-3 | |
Day 4 | |
Day 5 | |
Day 6-8 | |
Day 9 | |
Day 10 | |
One-off Sprints
Bug fix only sprint
Full regression test
Dev - Backlog item checklist
Team Member | Task
---|---
Dev #1 |
Dev #2 |
...
Type | Required to Ship
---|---
Tech Debt |
PBI |
Bug |
Automated Testing
...
UAT
Product Owner or QA:
Provide list of backlog items to users
Users can test in ACC if they have time
If users do not have time, they do not have to test; it is not required
Users should test for one hour maximum
Sprint Branches
Task | Branch | Team Member |
---|---|---|
Create sprint branch | Sprint branch | Ashiq |
Push sprint code to sprint branch | Sprint branch | Dev team |
...
*When work in QA is ready for the next release, it will be pushed to ACC
Sprint decision-makers
Task | Team Member
---|---
Whether or not a bug or UI change suggestion will be added to the backlog |
Regression Test
Tasks
Schedule
Weekday | Tasks | Team Member
---|---|---
Wednesday | |
Thursday | |
Friday | |
Monday | |
Tuesday | |
Wednesday | |
Thursday | |
Friday | |
Monday | |
Tuesday | |
Release Process
There are two processes to choose from:
Process | Task
---|---
Ship once per sprint | Ship all shippable features once per sprint
Ship multiple times per sprint | Ship each story when it is ready
We can ship mid-sprint.
Explore - do we want to start shipping each story? We can ship a story when it is ready instead of on a specific release day. Ashiq will look into it. Meta does this.
Day 1 - Discuss what we can ship in the morning (we can tack it onto the scrum), and ship in the afternoon. The PO can inform clients of the shipment.
Sprint | Plan | Tasks
---|---|---
Sprint 1 | |
Sprint 2 | |
Sprint 3 | |
Individual Sprint Plan
Weekday | Tasks | Team Member
---|---|---
Wednesday | Shipping option #1: Complete any additional testing of backlog items from the previous sprint (if required) |
Thursday | Complete any additional testing of backlog items from the previous sprint (if required); Ad-hoc testing - Ewa |
Friday | Complete any additional testing of backlog items from the previous sprint (if required) |
Monday | Sign off on backlog items to ship |
Tuesday | Shipping option #2: Morning - ship stories that are complete; Afternoon - | Jugraj, Steve
Items to Action
Question | Decision | Status
---|---|---
Who will decide whether or not a bug or UI change suggestion will be added to the backlog? | |
Create a feature to hold items we will never fix, so the team can use it as a reference in the future | |
In which environment should we test backlog items and Playwright tests - Test or ACC, or both? | |
If we have a single sprint branch, can we separate stories that are not ready to ship from stories that are ready to ship? | |
Do we need an SMGS ticket every time we ship an item (e.g. one every two weeks)? | |
Testing - Dave
Test cases:
Submit the form with no info added - purpose: to trigger the error / warning message
Test the happy case / path - fill out the form with proper info and make sure it is saved correctly on the frontend and retrievable in whatever form it appears (e.g. in a table, or on the same page if looked up later). You need to know what type of data you are testing for. One happy test case is enough.
Test each field for its limits (e.g. numbers or special characters in a letters-only field; a huge number in a name field) and make sure the app does not crash. Include SQL injection - we have coded this before - malicious input can add extra info or delete info (e.g. a drop table command). Write user stories that cover failures - e.g. enter special characters and expect an error message. (A sketch of these cases follows this list.)
If it is a multi-page form, make sure the data appears where needed (in another form, in a report, or in the database)
Test for usability - the language presented is understandable in FR and ENG; check for consistency and that fields line up
Every field / function needs to be tested
Test what it interacts with - e.g. if a field needs to be populated in a report, then add the report to the test
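A minimal Playwright sketch of the empty-submit, happy-path, and invalid-input cases above. The URL, field labels, and expected messages are placeholders for illustration, not the real app's:

```ts
import { test, expect } from '@playwright/test';

// Placeholder URL and labels - swap in the real form under test.
const FORM_URL = 'https://test.example.com/contact';

test('empty submit shows an error / warning message', async ({ page }) => {
  await page.goto(FORM_URL);
  await page.getByRole('button', { name: 'Submit' }).click();
  // Expect a validation message instead of a saved record.
  await expect(page.getByText('required', { exact: false })).toBeVisible();
});

test('happy path: proper info is saved and retrievable on the frontend', async ({ page }) => {
  await page.goto(FORM_URL);
  await page.getByLabel('Name').fill('Jane Doe');
  await page.getByLabel('Email').fill('jane.doe@example.com');
  await page.getByRole('button', { name: 'Submit' }).click();
  // The value should show up wherever it is looked up later, e.g. in a table.
  await expect(page.getByRole('cell', { name: 'Jane Doe' })).toBeVisible();
});

test('letters-only field rejects special characters / SQL injection', async ({ page }) => {
  await page.goto(FORM_URL);
  // Malicious input must be rejected with an error message, never executed.
  await page.getByLabel('Name').fill("Robert'); DROP TABLE Users;--");
  await page.getByRole('button', { name: 'Submit' }).click();
  await expect(page.getByText('invalid', { exact: false })).toBeVisible();
});
```

Each test maps to one written test case, so they can be reviewed against what is stored in DevOps.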
To start:
The tester would probably start with ad-hoc testing first to get familiar
Then write test cases
Test access by URL - users cannot access pages they are not supposed to - e.g. enter the admin page URL directly, and test with role-based access (a sketch follows this list)
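A hedged sketch of that URL-access check, assuming a hypothetical /admin route and a previously saved non-admin login state (the storage-state file, URL, and heading name are placeholders):

```ts
import { test, expect } from '@playwright/test';

// Assumes a non-admin session was saved earlier (e.g. in a global setup step);
// the file name and route below are placeholders, not the real app's.
test.use({ storageState: 'playwright/.auth/non-admin.json' });

test('non-admin user cannot reach the admin page by typing its URL', async ({ page }) => {
  await page.goto('https://test.example.com/admin');
  // The admin UI must not render for this role - the app may show an
  // "access denied" message or redirect elsewhere instead.
  await expect(page.getByRole('heading', { name: 'Administration' })).toHaveCount(0);
});
```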
Developers wrote automated tests based on what was written by QA
Changes - the automated software generates test code, which can then be updated; save and rerun the test code
Release process with shippable feature
Day 2, 3 and 4:
Dev - Test any backlog items that need to be tested from days 9 and 10 of the last sprint.
Dev cannot test backlog items they developed.
Dev, PO, Ewa, Maggie - Ad-hoc testing of the app (30 min to 1 hr)
Dev - Meet with dev and QA to review sprint tickets and identify test cases as a group - discuss possible test cases for each ticket and which tickets interact. Test cases are stored in DevOps.
{PO? - UAT prep - Word doc and spreadsheet creation.}
{PO -
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test. One hour of testing only max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users only attend if they can. }
Day 9 - identify what has met the definition of done. This can be shipped mid-next sprint. Anything else that is not done will be completed and shipped the following sprint.
Day 10 - Sprint review, sprint planning. Finish and wrap up what we can.
Note: we shouldn’t worry too much about carry-over at this point.
Day 1, 2 and 3:
Dev - Test any backlog items that need to be tested from days 9 and 10 of the last sprint. Dev cannot test backlog items they developed.
Dev, PO, Maggie - Ad-hoc testing of the app (30 min to 1 hr)?
Dev - Review sprint tickets and identify test cases as a group? (Question: where should test cases be stored?)
{PO? - UAT prep - Word doc and spreadsheet creation.}
{PO -
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test. One hour of testing only max. Perhaps we can send a copy of the weekly agenda, and they can test if they like.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint in ACC. Users only attend if they can. }
Day 4, 5, 6, 7:
Dev work
Dev testing
Dev user story writing
Day 8:
PO walkthrough (if needed) - 15-30 minutes max.
Day 9:
Dev: Sprint review planning - add items to present at sprint review to agenda
Day 10:
Dev: Sprint review dry run - prepare demo and participate in sprint review dry run
Dev: Sprint review - participate in sprint review
PO / Users: Sprint review - provide users with {test} URL, and they can test if they are interested.
Deployment
Current: Manual deployment
Future: One-click deployment (Jugraj is working with Henry)
Definition of Done
Acceptance criteria are complete
Unit test is complete
Test case is written
{Automated testing test is complete}
QA manual test is complete in the correct branch
Documentation for security groups
Dev has tested in QA, ACC and prod (if possible)
As needed: Dev sends a screenshot of new UI work, or other items for which feedback is needed, to the PO during the development process for sign-off. No screenshot is needed for bug fixes.
No other sign-off is necessary, as the PO has seen the screenshots, walkthrough and review
A story is done when it, and anything around it, has been tested so it is shippable
Future consideration
Definition of done future consideration:
Security
Code coverage
Testing
Task | Team Member
---|---
Ad-hoc testing | QA
Automated testing: bUnit testing - more like unit testing for the UI, not full UI automated testing |
Regression Testing
Release Plan
Release Plan - brainstorming
1 month release schedule - Release Train
Total: 2 sprints
Sprint 1
Tech debt - some tech debt might need to be tested, but it is safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.
Dev:
60% - New development
10% - Bug Fixes
20% - Tech debt
10% - Release activities
QA:
70% - Backlog item testing
20% - User testing - day 10? Either PO or QA?
10% - UAT prep
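As a rough worked example of the Dev split above (assuming, for illustration only, 3 developers and a 10-day sprint, i.e. 30 dev-days - not an agreed capacity figure):

```ts
// Illustrative arithmetic only - team size and sprint length are assumptions.
const devDays = 3 * 10; // 3 devs x 10 working days = 30 dev-days
const split = {
  'New development': 0.6,    // 18 dev-days
  'Bug fixes': 0.1,          // 3 dev-days
  'Tech debt': 0.2,          // 6 dev-days
  'Release activities': 0.1, // 3 dev-days
};
for (const [bucket, share] of Object.entries(split)) {
  console.log(`${bucket}: ${devDays * share} dev-days`);
}
```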
Sprint 2
Sprint 3
Dev:
QA:
50% - ACC environment testing
50% - UAT testing (with users) - Test environment?
Sprint 4
Dev:
50% - Release activities
?? % - Critical bug fixes?
?? - ??
Dev - Release Architecture Needed
Manual to automated - Jugraj is working with Henry on this
One click deployment
Patch fixes - how do we implement?
Anything else?
Architecture: With QA resource - brainstorming only
Day 1, 2 and 3:
QA - Any backlog items that need to be tested from days 9 and 10 of last sprint.
QA - Ad-hoc testing of app (30min to 1 hr.).
QA - UAT prep - Word doc and spreadsheet creation.
Dev - Work on top priority backlog items
{PO -
{Option 1: Provide list of backlog items to users. They can test if they have time in ACC. If they do not have time, they do not have to test.}
{Option 2: PO hosts regular bi-weekly testing session with users to test backlog items completed last sprint. Users only attend if they can.}
Day 4, 5, 6, 7:
Testing - if QA resources - brainstorming only
If QA:
First three days, can catch up on testing
Writing test cases
Ad-hoc testing
Review test cases with one of dev team’s members
Executing test cases
Test all tickets within one sprint - if possible
Testing - brainstorming only
Discussion with Grace - April 4
Bugs are addressed during the sprint. Leave some space for bug testing during the sprint
10 business days
First 3 days:
Ad-hoc testing
UAT prep - documentation regarding the previous sprint
Users can test what was completed last sprint - Grace or Steve; we do not need two team members
TGD team - the QA team tests first, then the PO or a user tests, and the user or PO marks it as done. User or PO testing does not need to be completed during the same sprint.
Issues to think about:
How to track modifications to backlog items. E.g. when testing, if something needs to be added, what should we do?
Discussion with Ashiq
PO and users:
test in ACC
They can test on the first day of the next sprint - we will give them a list of what we completed and they can test if they can; if not, that is OK - maybe 1 hr of testing for them
Dev - send a screenshot of new UI before the functionality is built to get feedback from the PO via email. No need to send screenshots for things such as bugs and FR translations - only things we need feedback on.
PO walkthrough - two days before the end of the sprint - 15-30 min to take a quick look if screenshots are not enough - flexible. The PO may not need it every week.
We can show him work still in progress
Users can test in the test environment if they wish after the sprint review. We can give them a few bullet points of what we worked on - either at the end of the review or on the first day of the sprint.
Test Cases - brainstorming only
Questions:
What will the test cases be used for? (e.g. backlog testing, automated testing, future use)
How can we minimize the amount of work required to write a test case? E.g. focus on developing a specific feature per sprint, or identify which backlog items will need test cases at the beginning of the sprint, so test case writing can start on day 1?
How would we write test cases with devs only?
Discussion with Ashiq
QA free from manual testing
Dev’s test each other’s work / test cases
Ad-hoc testing - Grace, Maggie, dev’s, Ewa, PO - anyone who can help
Automated testing - B Unit testing - more like unit testing for the UI - not full UI automated testing
Concern - not to add to many testing activities to dev team
Basic level: - Dev’s only
Writing test cases
Reviewing each other test cases - manually
Executing each other test cases - manually
Still valuable - can start automating in the future
If QA:
First three days, can catch up on testing
Writing test cases
Ad-hoc testing
Review test cases with one of dev team’s members
Executing test cases
Test all tickets within one sprint - if possible
Currently we have 3 devs and one QA
Need to keep quality or speed
Other work environments - 1-to-1 with the dev team - work on writing the user stories from the beginning of the sprint
Scope
Time
Quality
Automated Testing
How can we implement?
How can we keep QA involved?
What is the outcome of implementation? E.g. less QA manual testing
Unit Testing
{Add short write-up, description}
Release Planning Architecture
2 month release schedule - Release Train
Total: 4 sprints
Sprint 1
Tech debt - some tech debt might need to be tested, but it is safe to do tech debt at the end of the sprint. At minimum, a sanity check might need to happen.
Dev:
60% - New development
10% - Bug Fixes
20% - Tech debt
10% - Release activities
QA:
70% - Backlog item testing
20% - User testing - day 10? Either PO or QA?
10% - UAT prep
Sprint 2
Sprint 3
Dev:
QA:
50% - ACC environment testing
50% - UAT testing (with users) - Test environment?
Sprint 4
Dev:
50% - Release activities
?? % - Critical bug fixes?
?? - ??
Dev - Release Architecture Needed
Manual to automated - Jugraj is working with Henry on this
One click deployment
Patch fixes - how do we implement?
Task | Team Member
---|---
Writing test cases; reviewing each other's test cases (manually); executing each other's test cases (manually). Still valuable - can start automating in the future. Careful not to add too many testing activities to the sprint for devs. | QA & Dev
Unit test | Dev
Automated (bUnit test - UI based) | Dev
Manual test | QA & Dev
UAT prep | QA
UAT | QA or PO
Smoke test | QA & Dev
Regression test | QA & Dev