Overview
AI Test Generation allows you to create complete, ready-to-run test cases by describing your testing goals in natural language. The AI analyzes your description and generates appropriate test steps, including navigation, interactions, and assertions.
How It Works
- Describe your test: Write what you want to test in plain English
- AI analyzes: The system interprets your intent and identifies required steps
- Steps generated: Complete test steps are created with appropriate locators
- Review and refine: Edit generated steps as needed before running
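As an illustration, here is a short description and the kind of steps it might produce (the app URL and values are hypothetical, and the exact output will vary):

```
Description:
  "Go to https://example.com/login, enter valid credentials,
   click Sign In, and verify the dashboard heading is visible."

Generated steps (approximate):
  1. Navigate to https://example.com/login
  2. Fill email field
  3. Fill password field
  4. Click "Sign In"
  5. Verify text "Dashboard" is visible
```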
Getting Started
From the Dashboard
- Click New Test or use the + button
- Select Generate with AI
- Enter your test description
- Click Generate
- Review and save the generated test
From AI Chat
- Open the global AI Chat
- Describe the test you want to create
- The AI generates test steps
- Save as a new test case
Writing Effective Descriptions
Structure Your Description
Include these elements for best results:
- Starting point: Where the test begins
- Actions: What steps to perform
- Verifications: What to check or assert
- Test data: Specific values to use
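A description that covers all four elements might read like this (the URL and credentials are placeholders):

```
Starting at https://example.com/login, enter "qa@example.com" in the
email field and "Password123!" in the password field, then click Sign In.
Verify that the URL changes to /dashboard and the text "Welcome" is visible.
```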
Example Descriptions
Basic Login Test: "Go to the login page, enter a valid email and password, click Sign In, and verify the dashboard loads."
Generated Step Types
AI Test Generation creates various step types based on your description:

| Description Element | Generated Step Type |
|---|---|
| “Go to”, “Navigate to” | Navigate |
| “Click”, “Press”, “Select” | Click |
| “Enter”, “Type”, “Fill” | Fill |
| “Verify”, “Check”, “Assert” | Verify Text/Visibility |
| “Wait for” | Wait for Navigation/Element |
| “Upload” | File Upload |
| “Select option” | Select Option |
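The keyword mapping in the table above can be sketched as a simple lookup. This is illustrative only, not the product's actual parser, and the step names are simplified:

```python
# Illustrative sketch: maps phrases in a plain-English test description
# to simplified step types, one step per sentence.
KEYWORD_TO_STEP = {
    "go to": "Navigate",
    "navigate to": "Navigate",
    "click": "Click",
    "press": "Click",
    "select option": "Select Option",
    "select": "Click",
    "enter": "Fill",
    "type": "Fill",
    "fill": "Fill",
    "verify": "Verify",
    "check": "Verify",
    "assert": "Verify",
    "wait for": "Wait",
    "upload": "File Upload",
}

def step_types(description: str) -> list[str]:
    """Return the step type inferred for each sentence of a description."""
    steps = []
    for sentence in description.lower().split("."):
        sentence = sentence.strip()
        if not sentence:
            continue
        # Check longer keywords first so "select option" wins over "select".
        for keyword in sorted(KEYWORD_TO_STEP, key=len, reverse=True):
            if keyword in sentence:
                steps.append(KEYWORD_TO_STEP[keyword])
                break
    return steps

print(step_types("Go to the login page. Enter the password. Click Sign In. Verify the dashboard."))
# → ['Navigate', 'Fill', 'Click', 'Verify']
```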
Customizing Generated Tests
Adding Environment Variables
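For example, assuming the tool supports `{{VARIABLE}}`-style placeholders (the exact syntax may differ in your setup), a hypothetical before/after swap:

```
Before (hardcoded):
  Navigate to https://staging.example.com/login
  Fill password: Password123!

After (environment variables):
  Navigate to {{BASE_URL}}/login
  Fill password: {{TEST_USER_PASSWORD}}
```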
Generated tests use hardcoded values by default. Replace them with environment variables so URLs and credentials are not baked into the test.
Improving Locators
AI generates locators based on common patterns. Improve them for stability:
- Use data-testid attributes when available
- Prefer role-based selectors
- Add fallback locators for reliability
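For instance (hypothetical selectors; the exact syntax depends on your locator engine):

```
Brittle (auto-generated):  div.container > div:nth-child(3) > button
Better (test id):          [data-testid="submit-button"]
Better (role-based):       role=button[name="Submit"]
```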
Adding Assertions
Enhance generated tests with additional verifications:
- Add visual assertions for complex layouts
- Include URL verification after navigation
- Add wait steps for dynamic content
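For example, such verifications can be requested directly in the description (values hypothetical):

```
... then verify the URL contains /orders, wait for the "Processing"
spinner to disappear, and check that the text "Order confirmed" is visible.
```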
Use Cases
Rapid Prototyping
Quickly create test skeletons for new features.
Coverage Expansion
Generate tests for common scenarios such as sign-up, search, and checkout flows.
Regression Test Creation
Build regression suites from feature descriptions.
Best Practices
- Start with clear goals: Know what you want to verify before describing
- Include specific values: Use actual test data in descriptions
- Describe expected outcomes: Tell AI what success looks like
- Review all locators: Verify selectors match your application
- Add error handling: Include steps for expected error states
- Use environment variables: Replace hardcoded URLs and credentials
AI Credit Usage
| Action | Credits |
|---|---|
| Generate test from description | 1 credit |
Limitations
- Cannot access your live application during generation
- May not know application-specific selectors
- Complex conditional logic requires manual setup
- Multi-tab scenarios need manual configuration
Troubleshooting
Generated Steps Don’t Match App
- Provide more specific descriptions
- Include actual selector hints if known
- Use the recorder to capture accurate locators
Missing Assertions
- Explicitly state what to verify in your description
- Add verification phrases like “verify”, “check”, “assert”
Incorrect Navigation
- Specify full URLs or paths
- Describe the navigation flow step by step
Related
- AI Chat - Conversational test creation
- Test Planner Agent - AI-powered test planning
- Recorder - Record tests from browser actions
- AI Credits - Managing AI credit usage

