The AI Assistant requires an LLM provider to be configured before it can generate tests, answer questions, or perform analysis. OneTest supports multiple providers so you can use your existing AI infrastructure.

[Screenshot: LLM Configuration page showing AWS Bedrock setup]
The AI Assistant will not work until you configure an LLM provider. This is the first step to enabling AI features in OneTest.

Supported Providers

OneTest supports Azure, AWS Bedrock, OpenAI, and Anthropic. For AWS Bedrock, the required fields are:
  • Access Key ID
  • Secret Access Key
  • Region (e.g., us-east-1, eu-central-1)
Recommended models:
  • High-Tier: anthropic.claude-sonnet-4-5-20250929-v1:0
  • Default: anthropic.claude-sonnet-4-5-20250929-v1:0
  • Low-Tier: anthropic.claude-haiku-4-5-20251001-v1:0
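Before typing these values into the form, it can help to see the shape of a complete Bedrock configuration. The sketch below is illustrative only (the field names are assumptions, not OneTest's API): it checks that the required fields are present and that each model identifier follows Bedrock's `provider.model-name` format.

```python
# Illustrative sketch only; OneTest does not expose this API.
# Field names mirror the form fields described above.
import re

REQUIRED_FIELDS = ("access_key_id", "secret_access_key", "region")

# Bedrock model IDs look like "anthropic.claude-sonnet-4-5-20250929-v1:0"
MODEL_ID_PATTERN = re.compile(r"^[a-z0-9-]+\.[A-Za-z0-9.:-]+$")

def validate_bedrock_config(config: dict) -> list[str]:
    """Return a list of problems; an empty list means the config looks plausible."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS if not config.get(f)]
    for tier in ("high", "default", "low"):
        model_id = config.get(f"{tier}_tier_model", "")
        if not MODEL_ID_PATTERN.match(model_id):
            problems.append(f"bad model id for {tier} tier: {model_id!r}")
    return problems

config = {
    "access_key_id": "AKIA...",          # placeholder credentials
    "secret_access_key": "...",
    "region": "us-east-1",
    "high_tier_model": "anthropic.claude-sonnet-4-5-20250929-v1:0",
    "default_tier_model": "anthropic.claude-sonnet-4-5-20250929-v1:0",
    "low_tier_model": "anthropic.claude-haiku-4-5-20251001-v1:0",
}
print(validate_bedrock_config(config))  # []
```

Note that this only checks the shape of the values; the actual credential check happens in the Test Connection step described below.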

Model Tiers

OneTest uses three model tiers for different task complexities:
  • High-Tier (reasoning): complex analysis and test generation from requirements. Examples: generating comprehensive test suites, analyzing coverage gaps.
  • Default (balanced): general-purpose tasks. Examples: answering questions, searching tests, simple generation.
  • Low-Tier (fast): quick, simple operations. Examples: summarization, classification, simple lookups.
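One way to think about the tiers is as a routing table from task type to model identifier. This is a minimal sketch under the assumption that tier selection is keyed by task category; the category names and mapping below are hypothetical, not OneTest internals.

```python
# Hypothetical tier-routing sketch; task categories and mapping are assumptions.
TIER_MODELS = {
    "high": "anthropic.claude-sonnet-4-5-20250929-v1:0",
    "default": "anthropic.claude-sonnet-4-5-20250929-v1:0",
    "low": "anthropic.claude-haiku-4-5-20251001-v1:0",
}

# Map task categories (illustrative names) onto the three tiers.
TASK_TIERS = {
    "generate_test_suite": "high",
    "analyze_coverage": "high",
    "answer_question": "default",
    "search_tests": "default",
    "summarize": "low",
    "classify": "low",
}

def model_for_task(task: str) -> str:
    """Pick a model ID for a task, falling back to the default tier."""
    return TIER_MODELS[TASK_TIERS.get(task, "default")]

print(model_for_task("summarize"))
# anthropic.claude-haiku-4-5-20251001-v1:0
```

Routing unknown tasks to the default tier keeps the fallback cheap to reason about: only explicitly classified tasks pay for (or save on) a non-default model.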

Setup Steps

1. Navigate to LLM Configuration: go to Settings > LLM Configuration from the sidebar.
2. Select Provider: click your preferred provider (Azure, AWS Bedrock, OpenAI, or Anthropic).
3. Enter Credentials: fill in the required credentials for your chosen provider.
4. Configure Model Tiers: enter model identifiers for each tier. Use the recommended models above or your own deployments.
5. Test Connection: click Test Connection to validate your credentials and model access.
6. Save Configuration: click Save Configuration to activate the AI Assistant.
Once configured, you can manage your LLM setup with Revalidate (test again), Edit (change settings), or Delete (remove configuration).
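The lifecycle above (test, save, revalidate, delete) can be sketched as a small class. Everything here is illustrative: OneTest does not expose this API, and the connection check is stubbed out rather than calling a real provider.

```python
# Illustrative sketch of the test-then-save lifecycle; not OneTest's actual API.
class LLMConfigStore:
    def __init__(self, test_connection):
        # test_connection: callable(config) -> bool, e.g. a real provider ping
        self._test_connection = test_connection
        self.active_config = None

    def save(self, config: dict) -> bool:
        """Validate credentials first; only activate the config if the check passes."""
        if not self._test_connection(config):
            return False
        self.active_config = config
        return True

    def revalidate(self) -> bool:
        """Re-run the connection check against the saved configuration."""
        return self.active_config is not None and self._test_connection(self.active_config)

    def delete(self):
        """Remove the configuration, disabling the AI Assistant."""
        self.active_config = None

# Stubbed check: accept any config that names a region.
store = LLMConfigStore(lambda cfg: bool(cfg.get("region")))
print(store.save({"region": "us-east-1"}))  # True
print(store.revalidate())                   # True
```

The key design point the UI enforces, mirrored here, is that a configuration never becomes active without passing a connection test first.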

What’s Next?

AI Assistant

Start using the AI Assistant to generate and search tests

MCP Servers

Extend AI capabilities with external tool integrations