QA/e10s + A11y Test Plan

From MozillaWiki
Revision as of 19:38, 20 December 2016 by Aklotz (talk | contribs) (→‎Test Areas: Filled in coverage fields marked unknown or left empty)

Revision History

This section describes the modifications that have been made to this wiki page. A new row is added each time the content of this document is updated (small corrections for typographical errors do not need to be recorded). The description of each modification lists the differences from the prior version, in terms of which sections were updated and to what extent.

Date | Version | Author | Description
12/13/2016 | 1.0 | Grover Wimberly IV | Created first draft
12/13/2016 | 1.1 | Kanchan Kumari | Added some more info

Overview

Purpose

Test accessibility (a11y) features with e10s (multi-process Firefox) on touch-screen devices running the Windows and Linux operating systems.

Scope

This wiki page details the testing that will be performed for the a11y/e10s Windows touchscreen feature. It defines the overall testing requirements and provides an integrated view of the project test activities. Its purpose is to document:

  • What will be tested
  • How testing will be performed

Ownership

Product contact:
Erin Lancaster (IRC: elan)


User Experience contact:
Not applicable


Engineering contact:
Aaron Klotz (IRC: aklotz) (Windows)
Trevor Saunders (IRC: tbsaunde) (Linux)

QA contact:
Marco Zehe (IRC: MarcoZ)


QA:
PM for QA team - Rares Bologa (IRC: RaresB)
QA Lead - Grover Wimberly IV (IRC: Grover-QA)
QA - Kanchan Kumari (IRC: Kanchan_QA)
QA - Justin Williams (IRC: JW_SoftvisionQA)
QA - Stefan Georgiev (IRC: StefanG_QA)
QA - Abe Masresha (IRC: Abe_LV)

Testing summary

Scope of Testing

In Scope

The scope of our testing is a11y (accessibility) support under e10s and its functionality.

  • Integration: Verify integration with the current browser functionality and UI;
  • Functionality: Basic and advanced functionality to be verified against the existing requirements;
  • Usability: Verify that the feature is intuitive and how users interact with it;

Out of Scope

Requirements for testing

Environments

Testing will be performed on the following operating systems:

  • Windows 10 (x64)
  • Linux - Ubuntu 16.04 (x64)

Test Strategy

Test Objectives

This section details the progression test objectives that will be covered, at a high level. For large projects, a suite of test cases would be created that references directly back to this master plan. This could be documented in bullet form or in a table similar to the one below.

Ref | Function | Test Objective | Evaluation Criteria | Test Type | Owners
1 | Install NVDA | Verify the program is correctly installed | - | Manual | Eng Team
2 | e10s Accessibility and Web Pages | Verify that popular web pages (Twitter, Facebook, Gmail, Google Apps) function correctly with changes to preferences and installed programs | - | Manual | Eng Team
3 | Uninstall/Teardown | Verify any changes are reverted and the browser returns to default | 1. Uninstall NVDA and related add-ons and reset preferences. 2. Verify that the browser UI returned to default. | Manual | Eng Team
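Objective 2 involves toggling multi-process preferences in the test profile. As a rough illustration (a sketch only; the preference names below are the ones commonly used around this Firefox era and should be verified in about:config on the build under test), e10s can be forced on or off for a dedicated test profile via a user.js file:

```javascript
// user.js in the Firefox test profile -- a sketch, assuming the
// e10s preference names current around late 2016 (verify in about:config).

// Force multi-process (e10s) on for this profile:
user_pref("browser.tabs.remote.autostart", true);
user_pref("browser.tabs.remote.force-enable", true);

// For the "Multi-Process Disabled" test runs, set instead:
// user_pref("browser.tabs.remote.autostart", false);

// Accessibility instantiation: 0 = allow (default), 1 = force-disable.
user_pref("accessibility.force_disabled", 0);
```

Whether e10s is actually active can then be confirmed in the "Multiprocess Windows" row of about:support before running the NVDA scenarios.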

Builds

TBD

Test Execution Schedule

The following table identifies the anticipated testing period available for test execution.

Project phase | Start Date | End Date
Start project | December 2016 | -
Study documentation/specs received from developers | TBD | -
QA - Test plan creation | 12/13/2016 | -
QA - Test cases/Env preparation | 12/12/2016 | -
QA - Nightly Testing | - | -
QA - Aurora Testing | December 2016 | -
QA - Beta Testing | - | -
Release Date | - | -

Testing Tools

The following table details the tools to be used for testing:

Process | Tool
Test plan creation | Mozilla wiki
Test case creation | TestRail
Test case execution | TestRail
Bug management | Bugzilla/GitHub (mainly)

Status

Overview

  • Track the dates and build number where the feature was released to Nightly
  • Track the dates and build number where the feature was merged to Aurora
  • Track the dates and build number where the feature was merged to Release/Beta

Risk analysis

Risk areas | Mitigation
Performance issues on popular sites | Testing will focus on the performance and compatibility of e10s and a11y on popular websites.

References

Testcases

Overview

  • Summary of testing scenarios

Test Areas

Test Areas | Covered | Details
Private Window | Yes
Multi-Process Enabled | Yes
Multi-Process Disabled | Yes
Theme (high contrast) | No
UI
Mouse-only operation | Yes
Keyboard-only operation | Yes
Display (HiDPI) | No
Interaction (scroll, zoom) | Yes
Usable with a screen reader | Yes | e.g. with NVDA
Usability and/or discoverability testing | Yes | Is this feature user-friendly?
Help/Support
Help/support interface required | No | Make sure a link to the support/help page exists and is easily reachable.
Support documents planned (written) | Yes | Make sure support documents are written and are correct.
Install/Upgrade
Feature upgrades/downgrades data as expected | No
Does sync work across upgrades | No
Requires install testing | Yes | Requires NVDA installation
Affects first-run or onboarding | No
Does this affect partner builds? Partner build testing | No
Enterprise | | Raise the topic with developers to see whether they expect different behavior on ESR builds.
Enterprise administration | No
Network proxies/autoconfig | No
ESR behavior changes | No
Locked preferences | No
Data Monitoring
Temporary or permanent telemetry monitoring | No | Testing was not conducted by the SV QA Eng team.
Telemetry correctness testing | No | Testing was not conducted by the SV QA Eng team.
Server integration testing | No | Testing was not conducted by the SV QA Eng team.
Offline and server failure testing | No
Load testing | No | Testing was not conducted by the SV QA Eng team.
Add-ons | | If add-ons are available for the feature, or the feature will affect some add-ons, API testing should be done for those add-ons.
Add-on API required? | No
Comprehensive API testing | No
Permissions | No
Testing with existing/popular add-ons | Yes | Ensure no performance/stability regressions.
Security
3rd-party security review | No
Privilege escalation testing | No
Fuzzing | No
Web Compatibility | | Depends on the feature.
Testing against target sites | Yes
Survey of many sites for compatibility | Yes
Interoperability | | Depends on the feature.
Common protocol/data format with other software: specification available. Interop testing with other common clients or servers. | Yes | NVDA should cover most of this. Other common clients are closed-source, expensive, and do not offer trial versions.
Coordinated testing/interop across the Firefoxes: Desktop, Android, iOS | No
Interaction of this feature with other browser features | Yes

Test suite

  • Full Test suite - TBD on TestRail
  • Smoke Test suite - TBD
  • Regression Test suite - TBD

Bug Work

Sign off

Criteria

Check list

  • All test cases should be executed
  • All blocker and critical bugs must be fixed and verified, or have an agreed-upon timeline for being fixed (as determined by Engineering/RelMan/QA)

Results

Aurora testing

  • TBD on TestRail

Merge to Aurora Sign-off
List of OSes that will be covered by testing

  • Link for the tests run - TBD
    • Full Test suite - TBD

Checklist

Exit Criteria | Status | Notes/Details
Testing Prerequisites (specs, use cases) |
Testing Infrastructure setup | No
Test Plan Creation | [DONE]
Test Cases Creation | [IN PROGRESS]
Full Functional Tests Execution |
Smoke Tests Execution |
Automation Coverage |
Performance Testing |
All Defects Logged |
Critical/Blockers Fixed and Verified |
Daily Status Report (email/etherpad statuses/gdoc with results) |
Metrics/Telemetry | N/A
QA Signoff - Nightly Release | | Email to be sent
QA Aurora - Full Testing |
QA Signoff - Aurora Release | | Email to be sent
QA Beta - Full Testing |
QA Signoff - Beta Release | | Email to be sent