# AI Providers Guide
## Scope
This guide covers **chat/response providers** used by the extension after transcription.
Note: Speech-to-text is configured separately in Assistant Setup (`STT Provider`, `STT Model`, language/task/VAD/beam settings).
## Supported Chat Providers
### OpenAI
- Default models in UI: `gpt-4o`, `gpt-4o-mini`, `gpt-4-turbo`, `gpt-3.5-turbo`
- API key: https://platform.openai.com/account/api-keys
- Good default: `gpt-4o-mini` (speed/cost balance)
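A chat request to OpenAI goes through the standard `POST /v1/chat/completions` endpoint. The sketch below is illustrative TypeScript, not the extension's own code; `apiKey` stands in for the key you saved in Assistant Setup.

```ts
// Minimal sketch of an OpenAI chat call (illustrative, not the extension's code).
async function askOpenAI(apiKey: string, question: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!res.ok) throw new Error(`OpenAI error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```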
### Anthropic
- Default models in UI: `claude-3-5-sonnet-20241022`, `claude-3-5-haiku-20241022`, `claude-3-opus-20240229`
- API key: https://console.anthropic.com/
- Good default: `claude-3-5-sonnet-20241022`
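Anthropic uses its Messages API, which expects an `x-api-key` header, an `anthropic-version` header, and a required `max_tokens` field. A minimal sketch, again illustrative rather than the extension's implementation:

```ts
// Minimal sketch of an Anthropic Messages API call (illustrative only).
async function askClaude(apiKey: string, question: string): Promise<string> {
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey,
      "anthropic-version": "2023-06-01",
    },
    body: JSON.stringify({
      model: "claude-3-5-sonnet-20241022",
      max_tokens: 1024,
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!res.ok) throw new Error(`Anthropic error: ${res.status}`);
  const data = await res.json();
  return data.content[0].text;
}
```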
### Google Gemini
- Default models in UI: `gemini-1.5-pro`, `gemini-1.5-flash`, `gemini-pro`
- API key: https://aistudio.google.com/app/apikey
- Good default: `gemini-1.5-flash`
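Gemini requests use the `generateContent` method of the Generative Language API, with the key passed as a query parameter. A minimal sketch, assuming the public `v1beta` endpoint:

```ts
// Minimal sketch of a Gemini generateContent call (illustrative only).
async function askGemini(apiKey: string, question: string): Promise<string> {
  const url =
    "https://generativelanguage.googleapis.com/v1beta/models/" +
    `gemini-1.5-flash:generateContent?key=${apiKey}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{ role: "user", parts: [{ text: question }] }],
    }),
  });
  if (!res.ok) throw new Error(`Gemini error: ${res.status}`);
  const data = await res.json();
  return data.candidates[0].content.parts[0].text;
}
```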
### DeepSeek
- Default models in UI: `deepseek-chat`, `deepseek-reasoner`
- API key: https://platform.deepseek.com/
- Good default: `deepseek-chat`
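DeepSeek exposes an OpenAI-compatible chat endpoint, so the request shape mirrors the OpenAI sketch above. Illustrative only:

```ts
// Minimal sketch of a DeepSeek chat call; the API is OpenAI-compatible.
async function askDeepSeek(apiKey: string, question: string): Promise<string> {
  const res = await fetch("https://api.deepseek.com/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "deepseek-chat",
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!res.ok) throw new Error(`DeepSeek error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```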
### Ollama (local)
- Default models in UI: `llama3.2`, `llama3.1`, `mistral`, `codellama`, `phi3`
- API key: not required
- Endpoint used by extension: `http://localhost:11434`
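Ollama serves a local chat API on the endpoint above, with no key required. A minimal sketch of a non-streaming request:

```ts
// Minimal sketch of a local Ollama chat call; no API key is needed.
async function askOllama(question: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      messages: [{ role: "user", content: question }],
      stream: false, // return one JSON response instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama error: ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```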
## Model List Behavior
- For cloud providers, if an API key is saved, the extension attempts to fetch live model lists.
- If model fetch fails, the extension falls back to the built-in default model list above.
- For Ollama, the extension reads models from `/api/tags`.
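The fetch-then-fallback behavior looks roughly like the sketch below, shown here for OpenAI's `GET /v1/models` endpoint. The exact default lists and error handling in the extension may differ.

```ts
// Sketch of the fetch-then-fallback pattern for cloud model lists.
const DEFAULT_OPENAI_MODELS = ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo", "gpt-3.5-turbo"];

async function listOpenAIModels(apiKey: string): Promise<string[]> {
  try {
    const res = await fetch("https://api.openai.com/v1/models", {
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    const data = await res.json();
    return data.data.map((m: { id: string }) => m.id);
  } catch {
    // Invalid key, blocked network, or provider outage: use built-in defaults.
    return DEFAULT_OPENAI_MODELS;
  }
}
```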
## Setup Steps
1. Open side panel -> `Assistant Setup`.
2. Choose `AI Provider`.
3. Save provider API key (not needed for Ollama).
4. Select model.
5. Start listening.
## Recommended Defaults
- Fastest general: `gpt-4o-mini` / `gemini-1.5-flash` / `claude-3-5-haiku-20241022`
- Highest quality: `gpt-4o` / `claude-3-5-sonnet-20241022` / `gemini-1.5-pro`
- Local-only privacy: `ollama` + local STT
## Troubleshooting
- `API key not set`: save the provider key in Assistant Setup.
- `Failed to fetch models`: the key may be invalid, the provider API unavailable, or the network blocked. The built-in default model list is used as a fallback.
- `Ollama connection failed`: ensure `ollama serve` is running and the model is pulled (a quick connectivity check is sketched below).
- Slow or expensive responses: switch to a smaller/faster model and enable Speed mode.
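To rule out Ollama connectivity problems, you can hit the same `/api/tags` endpoint the extension uses. A minimal check, assuming the default port:

```ts
// Quick connectivity check against Ollama's /api/tags endpoint.
async function checkOllama(): Promise<void> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    const data = await res.json();
    const names = data.models.map((m: { name: string }) => m.name);
    console.log("Ollama is reachable. Pulled models:", names.join(", ") || "(none)");
  } catch (err) {
    console.error("Ollama connection failed - is `ollama serve` running?", err);
  }
}
```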
## Storage Note
- Provider API keys are stored in the extension's sync storage (`chrome.storage.sync`), which Chrome may sync across signed-in devices.
- Prefer least-privilege keys where possible and rotate them regularly.
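For reference, saving and reading a key through `chrome.storage.sync` looks like the sketch below. The property name `openaiApiKey` is hypothetical; the extension's actual storage schema may use different keys.

```ts
// Sketch of saving and reading a key with chrome.storage.sync (runs in
// extension context, e.g. the side panel or service worker).
await chrome.storage.sync.set({ openaiApiKey: "<your key>" });

const { openaiApiKey } = await chrome.storage.sync.get("openaiApiKey");
console.log("Key saved:", Boolean(openaiApiKey));
```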