Test Suites
Organized collections of test cases for automated testing
Overview
Test Suites are structured collections of test cases that validate specific application features or workflows. They use Supatest’s AI capabilities to analyze your application and generate effective automated tests.
Test Suites provide the following capabilities:
- Test Organization: Logical grouping of related test cases
- Environment Management: Shared environment variables across all tests
- Dependency Handling: Common prerequisites such as authentication flows
- Parallel Execution: Concurrent test execution for improved performance
- Execution History: Complete audit trail of test runs and results
Test Suite Configuration
Basic Setup
To create a new test suite:
- Navigate to the Test Suites section
- Click Create Test Suite
- Configure the following parameters:
  - Title: Descriptive name for the test suite
  - Description: Documentation of testing scope and objectives
  - Starting URL: Base application URL for test execution
  - Screenshots: Visual references for AI analysis (maximum 5MB per image)
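Supatest collects these parameters through the UI, but as a mental model the checks involved can be sketched in a few lines. Everything below is an assumption for illustration: the field names (`title`, `starting_url`, `screenshots`) are hypothetical, not Supatest's actual schema; only the 5MB screenshot limit comes from the documentation above.

```python
import os

MAX_SCREENSHOT_BYTES = 5 * 1024 * 1024  # the 5MB per-image limit noted above

def validate_suite_config(config):
    """Return validation errors for a hypothetical suite configuration dict."""
    errors = []
    # Title and Starting URL are required in the setup form above.
    for field in ("title", "starting_url"):
        if not config.get(field):
            errors.append(f"missing required field: {field}")
    url = config.get("starting_url", "")
    if url and not url.startswith(("http://", "https://")):
        errors.append("starting_url must be an absolute http(s) URL")
    # Screenshots are optional, but each must stay under the size limit.
    for path in config.get("screenshots", []):
        if os.path.getsize(path) > MAX_SCREENSHOT_BYTES:
            errors.append(f"screenshot exceeds 5MB: {path}")
    return errors
```

An empty list means the configuration is acceptable; anything else names the offending field.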
Environment Variables
Define suite-level environment variables accessible to all test cases:
- Select Add Environment Variable in suite settings
- Configure key-value pairs for environment-specific data
- Variables provide context to the AI agent for improved test generation
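Conceptually, suite-level variables behave like a per-environment map that each test case reads from, with per-test overrides layered on top. The sketch below is purely illustrative: every variable name, value, and function here is an assumption, not Supatest configuration syntax.

```python
# Illustrative suite-level variables for two environments (names and
# values are assumptions, not Supatest's actual syntax).
SUITE_VARIABLES = {
    "staging": {
        "BASE_URL": "https://staging.example.com",
        "TEST_USER": "qa@example.com",
    },
    "production": {
        "BASE_URL": "https://app.example.com",
        "TEST_USER": "smoke@example.com",
    },
}

def resolve_variables(environment, overrides=None):
    """Merge suite-level variables with optional per-test overrides."""
    merged = dict(SUITE_VARIABLES[environment])  # suite defaults first
    merged.update(overrides or {})               # test-specific values win
    return merged
```

Keeping environment-specific data in one place like this is what lets the same test cases run unchanged against staging and production.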
Snippet Dependencies
Configure reusable test components as prerequisites:
- Enable the Run Before Tests toggle
- Select the required snippet from the dropdown menu
- The selected snippet executes automatically before each test case
Note: Individual test cases can override the default starting snippet during creation.
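The snippet-before-test behavior, including the per-test override mentioned in the note, can be sketched as follows. This is a minimal mental model, not Supatest's runner: the function names and test structure are illustrative assumptions.

```python
log = []  # records execution order for illustration

def login_snippet():
    """Shared prerequisite, e.g. an authentication flow (illustrative)."""
    log.append("login")

def checkout_test():
    log.append("checkout")
    return "pass"

def run_suite(tests, default_snippet):
    """Run each test case, executing its starting snippet first.

    A test may carry its own "starting_snippet" to override the suite
    default, mirroring the note above.
    """
    results = []
    for test in tests:
        snippet = test.get("starting_snippet", default_snippet)
        snippet()                      # prerequisite runs before the test
        results.append(test["run"]())
    return results
```

The key point is that the snippet runs once per test case, not once per suite, so every test starts from the same known state.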
Test Case Management
The test suite interface provides:
- Test Inventory: Complete list of all test cases within the suite
- Status Indicators: Real-time pass/fail status for each test
- Health Metrics: Aggregated suite performance statistics
Test Execution
Manual Execution
To execute all tests in a suite:
- Click the Run All button
- Monitor real-time execution status
- Review detailed results in the Runs tab
Tests execute in parallel to optimize runtime performance.
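As a rough sketch of what parallel execution means here, independent test cases can be dispatched to a worker pool rather than run one after another. The code below is an assumption about the general technique, not Supatest's implementation; the worker count is arbitrary.

```python
from concurrent.futures import ThreadPoolExecutor

def run_all(tests, max_workers=4):
    """Run independent test cases concurrently and collect their results.

    pool.map preserves input order, so results line up with the test list
    even though execution is interleaved.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda test: test(), tests))
```

This only works because the suite's tests are independent; anything they share (such as login state) belongs in the suite-level snippet or environment variables described above.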
Scheduled Execution
Configure automated test execution with the following options:
- Frequency: Set execution intervals (daily, weekly, custom)
- Environment: Select target testing environment
- Notifications: Configure result alerts and reporting
For detailed scheduling configuration, see Test Plans.
Execution History
Access historical test data through:
- Run Archive: Complete history of previous executions
- Detailed Logs: Step-by-step execution records
- Trend Analysis: Performance tracking over time
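The trend data above boils down to simple aggregates over the run archive. As an illustrative sketch (the record shape with `passed`/`total` counts is an assumption, not Supatest's data model):

```python
def run_pass_rates(history):
    """Per-run pass rates, oldest first: the raw series behind a trend chart."""
    return [run["passed"] / run["total"] for run in history]

def suite_health(history):
    """Aggregate pass rate across the whole run archive."""
    passed = sum(run["passed"] for run in history)
    total = sum(run["total"] for run in history)
    return passed / total if total else 0.0
```

A rising per-run series with a high aggregate rate is the signal a healthy suite should show over time.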
AI-Powered Testing
The Supatest AI agent provides intelligent test automation through:
- Application Analysis: Building a comprehensive understanding from suite configuration
- Visual Processing: Analyzing uploaded screenshots for UI structure recognition
- Structure Mapping: Discovering website navigation and functional relationships
- Test Optimization: Generating test cases based on application analysis
Dynamic Updates
The AI system automatically updates when:
- Starting URL changes: Triggers re-analysis of the target application
- Snippet modifications: Updates the product brain with new dependency information
- Configuration updates: Incorporates changes into the application understanding
Best Practices
- Maintain screenshot file sizes under 5MB for optimal AI processing
- Use descriptive titles and documentation for better organization
- Configure environment variables for different testing environments
- Leverage snippet dependencies for common setup procedures