Documentation

AI Providers

Configure your preferred AI provider for content analysis

Overview

WPLink needs an AI provider for two things:

  • Embeddings: Converting content into vectors for similarity search
  • Chat: Understanding content and finding link opportunities
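The two roles work together: embeddings turn each piece of content into a vector, and similarity between vectors surfaces candidate pages to link. A minimal sketch of the similarity step (plain Python, no WPLink internals; the tiny vectors here are stand-ins for real embeddings, which have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for illustration only.
post = [0.2, 0.9, 0.1]
candidates = {
    "seo-guide": [0.1, 0.8, 0.2],
    "recipes":   [0.9, 0.1, 0.0],
}

# Rank candidate pages by similarity to the current post.
ranked = sorted(candidates,
                key=lambda k: cosine_similarity(post, candidates[k]),
                reverse=True)
# "seo-guide" ranks first because its vector points in nearly the same direction.
```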

Supported Providers

OpenAI

Recommended

Most reliable option with good accuracy

Embedding Models

text-embedding-3-small, text-embedding-3-large

Chat Models

gpt-4o, gpt-4o-mini

Pros

  • Good accuracy
  • Fast responses
  • Stable API

Considerations

  • Requires API key
  • Usage costs
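WPLink makes these API calls for you, but for reference, an embedding request to OpenAI looks roughly like this (a sketch using only the standard library; the endpoint and body shape follow OpenAI's public embeddings API, and the helper name is just for illustration):

```python
import json

def build_openai_embedding_request(api_key: str, text: str,
                                   model: str = "text-embedding-3-small"):
    """Build the URL, headers, and JSON body for an OpenAI embeddings call."""
    url = "https://api.openai.com/v1/embeddings"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "input": text})
    return url, headers, body

# Actually sending the request (e.g. via urllib.request) is omitted here;
# "sk-..." is a placeholder for your real key.
url, headers, body = build_openai_embedding_request("sk-...", "How to add internal links")
```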

Anthropic

Claude models with good context handling

Embedding Models

voyage-3 (via Voyage AI)

Chat Models

Claude Sonnet, Claude Haiku

Pros

  • Good at understanding context
  • Handles complex content

Considerations

  • Needs two API keys (Anthropic + Voyage AI)
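Because Claude does not provide an embedding model, embeddings for this provider go through Voyage AI's separate API (hence the second key). For reference, a Voyage embedding request looks roughly like this (a sketch based on Voyage AI's public REST API; WPLink handles this once both keys are connected, and the helper name is just for illustration):

```python
import json

def build_voyage_embedding_request(api_key: str, texts: list,
                                   model: str = "voyage-3"):
    """Build the URL, headers, and JSON body for a Voyage AI embeddings call."""
    url = "https://api.voyageai.com/v1/embeddings"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "input": texts})
    return url, headers, body

# "pa-..." is a placeholder for your real Voyage AI key.
url, headers, body = build_voyage_embedding_request("pa-...", ["post text here"])
```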

Google Gemini

Google AI with competitive pricing

Embedding Models

text-embedding-004

Chat Models

gemini-2.0-flash, gemini-1.5-pro

Pros

  • Good price/performance
  • Fast embeddings

Considerations

  • Newer option

Ollama

Run AI models locally on your computer

Embedding Models

nomic-embed-text

Chat Models

llama3.2, mistral

Pros

  • No API costs
  • Data stays local
  • Works offline

Considerations

  • Needs good hardware
  • Slower than cloud
  • Lower accuracy

Setting Up a Provider

  1. Open Settings

     Click the gear icon in the sidebar

  2. Select a provider

     Click one of the provider cards (OpenAI, Anthropic, Gemini, or Ollama)

  3. Enter your API key

     Paste your key and click "Connect". This tests and saves the key in one step.

  4. Choose models (optional)

     After connecting, you can select specific embedding and chat models, then click "Save".

Note: Anthropic requires two keys: one for Claude (chat) and one for Voyage AI (embeddings).

Using Ollama (Local)

Ollama runs on your computer, so no API key is needed. Just:

  1. Install Ollama from ollama.com
  2. Run ollama pull llama3.2 and ollama pull nomic-embed-text
  3. Make sure Ollama is running (localhost:11434)
  4. Select Ollama in WPLink and click Connect
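Since Ollama serves a local HTTP API on port 11434, you can sanity-check step 3 yourself. A sketch of what an embedding request to the local server looks like (it only builds the request rather than assuming the server is running; the endpoint and body shape follow Ollama's public REST API, and the helper name is just for illustration):

```python
import json

def build_ollama_embedding_request(text: str, model: str = "nomic-embed-text"):
    """Build the URL and JSON body for Ollama's local embeddings endpoint.

    No API key is needed because the server runs on your own machine.
    """
    url = "http://localhost:11434/api/embeddings"
    body = json.dumps({"model": model, "prompt": text})
    return url, body

# Sending this (e.g. via urllib.request) returns {"embedding": [...]}
# when Ollama is running and the model has been pulled.
url, body = build_ollama_embedding_request("How to add internal links")
```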