A few weeks ago I was given the opportunity to step up as QA Lead for Super Adventure Pals. Part of this entailed building QA infrastructure from the ground up and creating a schedule to plot out testing for the six weeks leading up to the projected launch.
Roughly 80% of our estimated testing staff were work experience interns from Art Intelligence. Hand-chosen from the best and brightest, these young developers would be given the opportunity to work alongside professional developers and learn about an avenue barely covered in their course. These testers, while great in number, had no prior background in Quality Assurance. As professionalism isn't something that's taught well in an online course, I had to plan a Bug Reporting Pipeline that catered to inexperienced testers, encouraged collaboration, and could be easily browsed and sorted by SAP developers.
My first instinct was to go with JIRA or Bugzilla, but the SAP team had never worked with either tool before, and it might have excluded some of the less technically experienced testers from contributing meaningfully. Trello, an online project management tool, ultimately won out. With a myriad of tutorials and its ease of set-up, it was simply the best fit for our quality assurance team. Even though it wasn't specifically designed for bug reporting, it's flexible enough that, with a spit-shine and some auxiliary documentation, it would suffice!
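Because Trello exposes a REST API, the board setup can even be scripted. Here's a minimal sketch of how a bug-tracking board might be created programmatically; the board name, column names, and credentials are illustrative placeholders, not our actual setup.

```python
import requests

API = "https://api.trello.com/1"
AUTH = {"key": "YOUR_API_KEY", "token": "YOUR_TOKEN"}  # per-user Trello credentials


def create_board(name: str) -> str:
    """Create a board (skipping Trello's default lists) and return its id."""
    resp = requests.post(f"{API}/boards/",
                         params={"name": name, "defaultLists": "false", **AUTH})
    resp.raise_for_status()
    return resp.json()["id"]


def create_list(board_id: str, name: str) -> str:
    """Add a named list (column) to the board and return its id."""
    resp = requests.post(f"{API}/lists",
                         params={"name": name, "idBoard": board_id, **AUTH})
    resp.raise_for_status()
    return resp.json()["id"]


if __name__ == "__main__":
    board_id = create_board("SAP Bug Reports")
    # One column per stage of a bug's life; these names are assumptions,
    # not the columns we actually used.
    for column in ["New Bugs", "Needs Reproduction", "Confirmed",
                   "Fixed - Awaiting Regression", "Closed"]:
        create_list(board_id, column)
```

Once the columns exist, a tester only has to add cards and drag them between lists, which is about as low a barrier to entry as a bug tracker gets.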
I whipped up some supporting paperwork to assist the QA Team.
Up next I blocked out a rough schedule. Prior to my joining the team there hadn't been any structured playtesting. With plans to launch in six weeks, I needed to get an idea of what metrics the team was planning to collect so we could focus our fire on those results. The SAP team expressed that they wanted to focus on the following feedback (a sketch of how a structured report might capture these categories follows the list):
- Controls: Are the controls fluid and intuitive?
- Progression Tracking: Does the player progression keep pace with level difficulty?
- Player Retention: Does the game get stale? When do players tend to take a break from play?
- Bug Reporting: To identify bugs that impair visuals, impede gameplay, or crash the game.
- General Feedback: To collect general feedback and suggestions.
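To make expectations concrete for first-time testers, a report can be boiled down to a handful of required fields. The sketch below is just an illustration of that structure in Python; the field names and the example entry are hypothetical, not lifted from the actual Bug & Feedback Reporting Document.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Category(Enum):
    CONTROLS = "Controls"
    PROGRESSION = "Progression Tracking"
    RETENTION = "Player Retention"
    BUG = "Bug Reporting"
    GENERAL = "General Feedback"


class Severity(Enum):
    VISUAL = "Impairs visuals"
    GAMEPLAY = "Impedes gameplay"
    CRASH = "Crashes the game"


@dataclass
class Report:
    tester: str
    build: str                                # which build the tester was running
    category: Category
    summary: str
    severity: Optional[Severity] = None       # only meaningful for bug reports
    repro_steps: List[str] = field(default_factory=list)


# A hypothetical entry, purely for illustration:
example = Report(
    tester="Intern A",
    build="0.9",
    category=Category.BUG,
    summary="Player clips through a platform after a mid-air attack",
    severity=Severity.GAMEPLAY,
    repro_steps=["Jump at a platform edge", "Attack just before landing"],
)
```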
To get the best results I staggered the testers' start times so we could get fresh eyes on the project as bugs were fixed. This also gave me time to refine the Quality Assurance process throughout, and allotted some flexibility for interns so the work experience didn't impede their other studies. Teams of testers cycle through the following tests (a rough scheduling sketch follows this list):
- First Impression Testing: Play the game from start to finish while recording thoughts on controls, progression, and player retention. This also gives testers the chance to learn all the mechanics of the game and develop a mastery of the controls. Ideally, skill at the game then becomes less of an obstacle to spotting bugs.
- Functionality Testing: Play a segment of the game, as defined by the Team Lead, and compile bug reports using the Bug & Feedback Reporting Document as a guide. During this stage testers also attempt to reproduce bugs catalogued by other testers.
- Regression Testing: Play each new build in search of bugs that were "fixed". Test the vicinity of those fixes to catch any new bugs that may have been introduced.
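For planning purposes, the stagger can be thought of as a simple per-cohort offset. The sketch below uses made-up cohort names, start weeks, and a two-week stagger interval, none of which reflect the actual SAP schedule; it just shows how each group cycles through the three test types over the six-week window.

```python
from typing import Optional

# The three test types, cycled in order by each cohort of testers.
PHASES = ["First Impression", "Functionality", "Regression"]

# Assumed stagger: three cohorts starting two weeks apart (weeks are
# 0-indexed from the start of the six-week testing window).
COHORT_STARTS = {"Cohort A": 0, "Cohort B": 2, "Cohort C": 4}


def phase_for_week(start_week: int, week: int) -> Optional[str]:
    """Return the test type a cohort runs in a given week.

    Cohorts that haven't started yet return None; once a cohort has been
    through all three phases it stays on regression testing for each new build.
    """
    offset = week - start_week
    if offset < 0:
        return None
    return PHASES[min(offset, len(PHASES) - 1)]


for week in range(6):
    assignments = {name: phase_for_week(start, week)
                   for name, start in COHORT_STARTS.items()}
    print(f"Week {week + 1}: {assignments}")
```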