The AI Assistant can generate complete, detailed test cases in seconds. Just describe what you want to test, and the AI creates a fully-formed test case with steps, expected results, and tags.

Quick Start

1. Open the AI Chat

Press Cmd + K (Mac) or Ctrl + K (Windows/Linux)
2. Describe What to Test

Type your request in plain English:
Create a test for user login with valid credentials
3. Review the Generated Test

The AI will create a complete test case with:
  • Clear title
  • Test steps with expected results
  • Preconditions
  • Tags
  • Priority
4. Save or Refine

  • Click “Save” to add it to your test suite
  • Or ask the AI to modify it: “Add a step for two-factor authentication”

Generation Patterns

Create one specific test:
Create a test for password reset
Generate a test case for adding items to cart
Write a test for user profile update
The AI will generate a single, focused test case.

What Gets Generated

When you ask the AI to create a test, it automatically generates:

Test Metadata

  • Unique identifier (TC-XXXX)
  • Clear, descriptive title
  • Appropriate priority (P0-P4)
  • Relevant tags
  • Test type (manual/automated)

Test Content

  • Preconditions
  • Detailed test steps
  • Expected results for each step
  • Overall pass/fail criteria
  • Test data (when applicable)
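
Taken together, the metadata and content above form one simple record. Here is a minimal Python sketch of that shape; the field and class names are illustrative, not the tool's actual export format:

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    action: str        # what the tester does
    expected: str      # expected result for this step

@dataclass
class TestCase:
    id: str                    # unique identifier, e.g. "TC-0042"
    title: str                 # clear, descriptive title
    priority: str              # "P0" (highest) through "P4" (lowest)
    tags: list[str]
    test_type: str             # "manual" or "automated"
    preconditions: list[str]
    steps: list[TestStep]
    pass_criteria: str         # overall pass/fail criteria
    test_data: dict = field(default_factory=dict)  # when applicable

login_test = TestCase(
    id="TC-0042",
    title="Login with valid credentials",
    priority="P1",
    tags=["auth", "smoke"],
    test_type="manual",
    preconditions=["A registered user account exists"],
    steps=[
        TestStep("Navigate to the login page", "Login form is displayed"),
        TestStep("Enter a valid email and password", "Fields accept the input"),
        TestStep("Click 'Sign in'", "User lands on the dashboard"),
    ],
    pass_criteria="User is authenticated and redirected to the dashboard",
)
```

Reviewing a generated test against a checklist like this makes it easy to spot a missing precondition or an expected result the AI left vague.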

Best Practices

  • Follows naming conventions
  • Includes edge cases
  • Uses clear language
  • Atomic test design

Context Awareness

  • Matches your product type
  • Uses existing tags
  • Follows team patterns
  • Considers related tests

Example Prompts

Generate smoke tests for e-commerce checkout
AI generates:
  • Add item to cart
  • Update cart quantities
  • Apply discount code
  • Select shipping method
  • Enter payment information
  • Complete purchase
  • Receive confirmation email
Create API tests for user CRUD operations
AI generates:
  • Create user (valid data)
  • Create user (invalid email)
  • Create user (duplicate email)
  • Get user by ID
  • Update user information
  • Delete user
  • Verify deletion
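The CRUD prompt above maps naturally onto automated checks. As a rough sketch of that coverage, here is a toy in-memory user store exercised in the same order as the generated cases; `UserStore` and its behavior are hypothetical, not any real API:

```python
import re

class InvalidEmailError(Exception): pass
class DuplicateEmailError(Exception): pass

class UserStore:
    """Toy stand-in for a user service, for illustration only."""
    def __init__(self):
        self._users = {}
        self._next_id = 1

    def create(self, name, email):
        if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
            raise InvalidEmailError(email)
        if any(u["email"] == email for u in self._users.values()):
            raise DuplicateEmailError(email)
        uid, self._next_id = self._next_id, self._next_id + 1
        self._users[uid] = {"id": uid, "name": name, "email": email}
        return uid

    def get(self, uid):
        return self._users.get(uid)

    def update(self, uid, **fields):
        self._users[uid].update(fields)

    def delete(self, uid):
        del self._users[uid]

# Exercise each generated case in order:
store = UserStore()
uid = store.create("Ada", "ada@example.com")       # create user (valid data)
try:
    store.create("Bob", "not-an-email")            # create user (invalid email)
except InvalidEmailError:
    pass
try:
    store.create("Eve", "ada@example.com")         # create user (duplicate email)
except DuplicateEmailError:
    pass
assert store.get(uid)["name"] == "Ada"             # get user by ID
store.update(uid, name="Ada Lovelace")             # update user information
store.delete(uid)                                  # delete user
assert store.get(uid) is None                      # verify deletion
```

Note how the positive path, two negative paths, and the deletion check each correspond to one bullet in the generated list.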
Generate tests for mobile app onboarding
AI generates:
  • First launch experience
  • Permission requests
  • Account creation
  • Profile setup
  • Feature introduction screens
  • Skip onboarding flow
  • Complete onboarding
Create security tests for login
AI generates:
  • SQL injection attempts
  • XSS attack prevention
  • Brute force protection
  • Session fixation prevention
  • CSRF token validation
  • Password complexity enforcement
  • Account lockout after failures
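One of those cases, account lockout after repeated failures, can be made concrete. A minimal Python model of the behavior under test (the threshold of 5 attempts is an assumption, not a product setting):

```python
class LoginGuard:
    """Locks an account after too many consecutive failed attempts."""
    def __init__(self, max_failures=5):
        self.max_failures = max_failures
        self.failures = {}  # username -> consecutive failure count

    def is_locked(self, user):
        return self.failures.get(user, 0) >= self.max_failures

    def attempt(self, user, password, real_password):
        if self.is_locked(user):
            return "locked"
        if password == real_password:
            self.failures[user] = 0   # success resets the counter
            return "ok"
        self.failures[user] = self.failures.get(user, 0) + 1
        return "denied"

guard = LoginGuard()
for _ in range(5):
    guard.attempt("alice", "wrong-password", "s3cret")

assert guard.is_locked("alice")
# Even the correct password is rejected once the account is locked:
assert guard.attempt("alice", "s3cret", "s3cret") == "locked"
```

A generated lockout test would assert exactly this: after N failures, even valid credentials are refused until the account is unlocked.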

Refining Generated Tests

You can iterate with the AI to refine tests:
1. Ask for Modifications

Add a step to verify email notification is sent
Make this test cover mobile view as well
Split this into separate positive and negative tests
2. Request Different Formats

Make this more detailed with specific test data
Simplify this test to just the happy path
Convert this to an API test
3. Add Context

This is for a healthcare app, add HIPAA compliance checks
We use Stripe for payments, update the payment steps
Include accessibility testing requirements

Advanced Techniques

Generate many tests at once:
Create a complete test suite for the shopping cart feature including:
- Adding items
- Updating quantities
- Removing items
- Cart persistence
- Checkout initiation
The AI will create 10-15 related tests covering all aspects.

Quality Assurance

Review AI Output: Always review generated tests to ensure they match your specific requirements and context.
Provide Feedback: Tell the AI what to adjust: “Make step 3 more specific” or “Add assertions for response time.”
Test Data: Be cautious with sensitive data. The AI generates sample data, but you should use appropriate test data for your environment.

Best Practices

Be Specific

Instead of “create a login test,” try “create a test for login with OAuth via Google”

Provide Context

Mention the feature, platform, or user story: “Create a test for the new multi-factor authentication feature on mobile”

Iterate

Refine the test through conversation: “Add error handling scenarios” or “Include performance assertions”

Save Templates

Once you arrive at a test format that works well, save it as a template so future tests stay consistent.

What’s Next?