This page is under construction. More content coming soon!

What is Regression Testing?

Regression testing ensures that new code changes haven’t broken existing functionality. It’s typically run before releases or after significant code changes.
While smoke tests verify “can we continue testing?”, regression tests verify “are we ready to ship?”

When to Run Regression Tests

Before Release

Comprehensive validation before shipping to production

After Major Changes

Verify large features or refactoring didn’t break anything

Sprint End

Validate all sprint work before demo/release

After Hotfix

Ensure hotfix didn’t introduce new issues

Building a Regression Suite

1. Start with Critical Tests

Include all smoke tests (P0/P1 critical paths)
2. Add Feature Coverage

Include tests for all major features:
priority IN (p0, p1, p2) AND status = active
3. Add Historical Failures

Include tests that have failed before:
failure_count > 0 AND status = active
ORDER BY failure_count DESC
4. Tag Tests

Tag tests with regression for easy filtering:
tags CONTAINS "regression" AND status = active
5. Balance Coverage vs Time

Target a suite that completes in 2-4 hours, maximum
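Taken together, the steps above amount to a selection rule. Here is a minimal sketch in Python, assuming each test case is a dict with hypothetical fields `priority`, `status`, `tags`, and `failure_count` (adjust the names to whatever your test management tool exposes):

```python
# Sketch only: the field names below are assumptions, not a real tool's schema.

def build_regression_suite(test_cases):
    """Select active P0-P2 tests, plus any active test that has failed before."""
    suite = [
        t for t in test_cases
        if t["status"] == "active"
        and (t["priority"] in ("p0", "p1", "p2") or t["failure_count"] > 0)
    ]
    # Historical failures first: tests that broke before tend to break again.
    suite.sort(key=lambda t: (-t["failure_count"], t["priority"]))
    for t in suite:
        t["tags"].add("regression")  # tag for easy filtering later (step 4)
    return suite
```

Step 5 (coverage vs. time) remains a human judgment call: trim the tail of this list until the suite fits your execution-time budget.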

Regression Testing Strategy

Complete Test Suite

Run all regression tests before major releases.
When to use:
  • Major version releases
  • Quarterly releases
  • After significant architecture changes
Characteristics:
  • 100+ tests
  • 2-4 hours execution time
  • All features covered
  • All priority levels (P0-P3)
Full regression is time-consuming. Reserve for major releases only.

Running Regression Tests

1. Select Test Suite

Choose appropriate regression level:
# Full regression
tags CONTAINS "regression" AND status = active
ORDER BY priority ASC

# Selective regression (e.g., auth module)
tags CONTAINS "regression" AND
component = "authentication" AND
status = active
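The same two filters can be expressed programmatically. A hedged sketch, again assuming hypothetical `tags`, `status`, `component`, and `priority` fields on each test-case dict:

```python
# Sketch only: mirrors the full and selective regression filters above.

def select_run_tests(test_cases, component=None):
    """Pick regression-tagged active tests; optionally restrict to one component."""
    selected = [
        t for t in test_cases
        if "regression" in t["tags"]
        and t["status"] == "active"
        and (component is None or t.get("component") == component)
    ]
    # Order by priority ascending, so P0 tests run first.
    selected.sort(key=lambda t: t["priority"])
    return selected
```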
2. Create Test Run

Configure the test run:
  • Name: Regression - v2.1.0 Release
  • Environment: Staging
  • Build: 2.1.0-rc1
  • Assigned to: Distribute among team
3. Execute Tests

Team executes assigned tests in parallel
4. Track Progress

Monitor run dashboard for:
  • Completion percentage
  • Pass/fail rate
  • Blockers
  • Failed tests needing triage
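The dashboard numbers in this step are simple to derive yourself. A sketch, assuming hypothetical per-test result statuses of `passed`, `failed`, `blocked`, and `untested`:

```python
# Sketch only: status names are assumptions, not a specific tool's values.

def run_progress(results):
    """Summarize a test run; results maps test id -> status string."""
    total = len(results)
    done = sum(1 for s in results.values() if s in ("passed", "failed", "blocked"))
    passed = sum(1 for s in results.values() if s == "passed")
    return {
        "completion_pct": round(100 * done / total, 1) if total else 0.0,
        "pass_rate_pct": round(100 * passed / done, 1) if done else 0.0,
        "blockers": sum(1 for s in results.values() if s == "blocked"),
        # Failed tests awaiting triage (see the next step).
        "needs_triage": [tid for tid, s in results.items() if s == "failed"],
    }
```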
5. Triage Failures

For each failure:
  • Is it a real bug? Create defect.
  • Is it a test issue? Fix the test.
  • Is it environmental? Investigate environment.
6. Release Decision

Decide go/no-go based on results:
  • All P0 tests passed? ✅ Can release
  • Any P0 failures? ❌ Block release
  • P1/P2 failures? Assess risk
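The go/no-go rules above are mechanical enough to encode. A minimal sketch, taking the priorities of the failed tests (priority labels are the P0-P3 levels used throughout this page):

```python
# Sketch only: encodes the release-decision rules listed above.

def release_decision(failed_priorities):
    """Return a go/no-go verdict from the priorities of failed tests."""
    if any(p == "p0" for p in failed_priorities):
        return "block"        # any P0 failure blocks the release
    if any(p in ("p1", "p2") for p in failed_priorities):
        return "assess-risk"  # P1/P2 failures need a human risk call
    return "release"          # all critical paths passed
```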

Example Regression Suites

Web Application Suite

Scope: Full website functionality
Categories:
  • Authentication (10 tests)
  • Navigation (15 tests)
  • User Profile (12 tests)
  • Search (8 tests)
  • Content Management (20 tests)
  • Forms & Validation (18 tests)
  • Notifications (7 tests)
  • Settings (10 tests)
Total: ~100 tests
Time: ~3 hours
Tags: regression, web
API Suite

Scope: All API endpoints
Categories:
  • Authentication endpoints (5 tests)
  • User CRUD operations (12 tests)
  • Product endpoints (15 tests)
  • Order processing (10 tests)
  • Search & filtering (8 tests)
  • File uploads (6 tests)
  • Webhooks (5 tests)
Total: ~60 tests
Time: ~45 minutes (automated)
Tags: regression, api, automated
Mobile App Suite

Scope: iOS & Android apps
Categories (per platform):
  • App launch & onboarding (5 tests)
  • Login & authentication (8 tests)
  • Main navigation (10 tests)
  • Core features (25 tests)
  • Settings & preferences (7 tests)
  • Offline functionality (10 tests)
  • Push notifications (5 tests)
Total: ~70 tests per platform
Time: ~2 hours per platform
Tags: regression, mobile, ios/android
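As a quick sanity check on the suite sizes above, the per-category counts should add up to roughly the stated totals. For the web suite they do exactly:

```python
# Category counts copied from the web application suite above.
web_suite = {
    "Authentication": 10, "Navigation": 15, "User Profile": 12, "Search": 8,
    "Content Management": 20, "Forms & Validation": 18,
    "Notifications": 7, "Settings": 10,
}
total = sum(web_suite.values())  # 100, matching the stated ~100 tests
```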

Optimizing Regression Testing

Prioritize by Risk

Test high-risk, frequently-changed areas first

Automate Stable Tests

Move stable, repetitive tests to automation

Parallelize Execution

Distribute tests across team members

Remove Obsolete Tests

Archive tests for deprecated features

Maintain Test Quality

Fix flaky tests immediately

Track Trends

Monitor pass rates and execution times
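Trend tracking can start as simply as comparing pass rates across consecutive runs. A minimal sketch, taking chronological `(passed, total)` pairs per run:

```python
# Sketch only: flags a suite whose pass rate is sliding run over run.

def pass_rate_trend(runs):
    """Return per-run pass rates and whether the trend is declining."""
    rates = [round(100 * passed / total, 1) for passed, total in runs if total]
    # Declining = never improves between runs and ends lower than it started.
    declining = (
        len(rates) >= 2
        and all(b <= a for a, b in zip(rates, rates[1:]))
        and rates[-1] < rates[0]
    )
    return rates, declining
```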

Regression Test Maintenance

When to add tests to regression suite:
  • New feature released
  • Bug found in production
  • Area lacking coverage
  • High-value user workflow
Process:
  1. Create test case
  2. Execute to validate it works
  3. Tag with regression
  4. Add to appropriate folder
When to remove tests from regression suite:
  • Feature deprecated/removed
  • Test is consistently flaky
  • Test is redundant with other tests
  • Feature changed significantly
Process:
  1. Change status to archived (don’t delete)
  2. Remove regression tag
  3. Document why it was removed
Keep tests current with product changes:
  • Review test suite quarterly
  • Update after major feature changes
  • Fix broken tests immediately
  • Refine test steps for clarity
  • Update expected results
Assign test ownership to feature teams for ongoing maintenance

Integration with Release Process

What’s Next?