Overview

Test Suites are structured collections of test cases designed to validate specific application features or workflows. They use Supatest’s AI capabilities to analyze your application and generate effective automated tests.

Test Suites provide the following capabilities:

  • Test Organization: Logical grouping of related test cases
  • Environment Management: Shared environment variables across all tests
  • Dependency Handling: Common prerequisites such as authentication flows
  • Parallel Execution: Concurrent test execution for improved performance
  • Execution History: Complete audit trail of test runs and results

Test Suite Configuration

Basic Setup

To create a new test suite:

  1. Navigate to the Test Suites section
  2. Click Create Test Suite
  3. Configure the following parameters:
    • Title: Descriptive name for the test suite
    • Description: Documentation of testing scope and objectives
    • Starting URL: Base application URL for test execution
    • Screenshots: Visual references for AI analysis (maximum 5MB per image)

Environment Variables

Define suite-level environment variables accessible to all test cases:

  1. Select Add Environment Variable in suite settings
  2. Configure key-value pairs for environment-specific data
  3. Variables provide context to the AI agent for improved test generation
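The behavior described above can be sketched as a simple key-value merge. This is an illustrative model only, not Supatest's actual API: the function name, the override precedence, and the variable names are all assumptions.

```python
# Illustrative sketch: suite-level environment variables apply to every
# test case. The override precedence shown here is an assumption, not
# documented Supatest behavior.

def resolve_variables(suite_vars, test_vars=None):
    """Combine suite-level variables with optional per-test values."""
    resolved = dict(suite_vars)        # start from the suite defaults
    resolved.update(test_vars or {})   # per-test values take precedence
    return resolved

suite_vars = {"BASE_URL": "https://staging.example.com", "LOCALE": "en-US"}
test_vars = {"LOCALE": "de-DE"}

print(resolve_variables(suite_vars, test_vars))
```

Keeping environment-specific data (base URLs, locales, credentials references) in suite-level variables means each test case stays portable across testing environments.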

Snippet Dependencies

Configure reusable test components as prerequisites:

  1. Enable the Run Before Tests toggle
  2. Select the required snippet from the dropdown menu
  3. The selected snippet executes automatically before each test case

Note: Individual test cases can override the default starting snippet during creation.
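The snippet-dependency flow above, including the per-test override, can be modeled in a few lines. This is a hedged sketch: the function signature and field names are hypothetical, not Supatest's API.

```python
# Illustrative model of "run a snippet before each test" with a
# per-test override. All names here are assumptions for illustration.

def run_test_case(test, default_snippet):
    # A test case may override the suite's default starting snippet.
    snippet = test.get("starting_snippet", default_snippet)
    if snippet is not None:
        snippet()          # e.g. a shared login flow
    test["body"]()         # then the test's own steps

log = []
login = lambda: log.append("login")

run_test_case({"body": lambda: log.append("test A")}, login)
run_test_case({"body": lambda: log.append("custom then B"),
               "starting_snippet": lambda: log.append("custom setup")},
              login)
print(log)  # the default snippet runs before the first test,
            # the override runs before the second
```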

Test Case Management

The test suite interface provides:

  • Test Inventory: Complete list of all test cases within the suite
  • Status Indicators: Real-time pass/fail status for each test
  • Health Metrics: Aggregated suite performance statistics

Test Execution

Manual Execution

To execute all tests in a suite:

  1. Click the Run All button
  2. Monitor real-time execution status
  3. Review detailed results in the Runs tab

Tests execute in parallel to optimize runtime performance.
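Conceptually, parallel suite execution looks like the sketch below. Supatest runs tests on its own infrastructure; this is only a local illustration of the fan-out/collect pattern, with made-up test names.

```python
# Minimal sketch of concurrent test execution (illustrative only).
from concurrent.futures import ThreadPoolExecutor

def run_test(name):
    # Stand-in for a real browser test; returns (name, status).
    return (name, "passed")

tests = ["checkout", "login", "search"]

# Fan out all tests at once instead of running them sequentially,
# then collect the per-test results.
with ThreadPoolExecutor() as pool:
    results = dict(pool.map(run_test, tests))

print(results)
```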

Scheduled Execution

Configure automated test execution with the following options:

  • Frequency: Set execution intervals (daily, weekly, custom)
  • Environment: Select target testing environment
  • Notifications: Configure result alerts and reporting

For detailed scheduling configuration, see Test Plans.

Execution History

Access historical test data through:

  • Run Archive: Complete history of previous executions
  • Detailed Logs: Step-by-step execution records
  • Trend Analysis: Performance tracking over time
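As a rough illustration of the trend analysis a run archive enables, the sketch below computes a pass rate per run from historical results. The record structure is an assumption for illustration, not Supatest's export format.

```python
# Hypothetical run records; field names are illustrative assumptions.

def pass_rate(run):
    """Fraction of tests in a run that passed."""
    results = run["results"]
    return sum(1 for r in results if r == "passed") / len(results)

runs = [
    {"date": "2024-05-01", "results": ["passed", "passed", "failed"]},
    {"date": "2024-05-02", "results": ["passed", "passed", "passed"]},
]

# Pass rate over time: the raw material for a trend chart.
trend = [(run["date"], round(pass_rate(run), 2)) for run in runs]
print(trend)
```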

AI-Powered Testing

The Supatest AI agent provides intelligent test automation by:

  1. Application Analysis: Building a comprehensive understanding from suite configuration
  2. Visual Processing: Analyzing uploaded screenshots for UI structure recognition
  3. Structure Mapping: Discovering website navigation and functional relationships
  4. Test Optimization: Generating test cases based on application analysis

Dynamic Updates

The AI system automatically updates when:

  • Starting URL changes: Triggers re-analysis of the target application
  • Snippet modifications: Updates the product brain with new dependency information
  • Configuration updates: Incorporates changes into the application understanding

Best Practices

  • Maintain screenshot file sizes under 5MB for optimal AI processing
  • Use descriptive titles and documentation for better organization
  • Configure environment variables for different testing environments
  • Leverage snippet dependencies for common setup procedures
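The 5MB screenshot limit is easy to check locally before uploading. This helper is a local convenience sketch, not part of Supatest.

```python
# Pre-upload check for the 5MB screenshot limit mentioned above.
MAX_SCREENSHOT_BYTES = 5 * 1024 * 1024  # 5MB

def screenshot_within_limit(size_bytes):
    """Return True if an image of this size can be uploaded for AI analysis."""
    return size_bytes <= MAX_SCREENSHOT_BYTES

print(screenshot_within_limit(3 * 1024 * 1024))  # True: 3MB is fine
print(screenshot_within_limit(6 * 1024 * 1024))  # False: over the limit
```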