AI Integration (Probe)

This page is the hub for Probe’s AI integrations. Pick the path that matches how you work, then follow the dedicated guide.

Quick mental model

  • Probe supplies grounded code context.
  • Visor orchestrates deterministic workflows.
  • LLMs are optional — you can plug in any provider.
  • Agent mode can bootstrap from repo guidance (AGENTS.md, ARCHITECTURE.md, AgentSkills).

Providers & auth

Set provider keys via environment variables (Anthropic, OpenAI, Google, etc.). Each integration page shows the exact variables and flags to use.
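As a rough sketch, provider keys are usually exported before launching Probe. The variable names below are common conventions, not confirmed by this page — check the integration guide for your provider for the exact names and flags:

```shell
# Typical provider key setup (variable names are assumptions —
# confirm the exact names on the integration page for your provider).
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic
export OPENAI_API_KEY="sk-..."          # OpenAI
export GOOGLE_API_KEY="..."             # Google
```

Keys set this way apply only to the current shell session; add them to your shell profile or a secrets manager for persistent use.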

Run from your repo

Run most commands from the root directory of the repository you want to analyze; that gives Probe the full project context.
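A minimal session might look like the following. The subcommand and query are illustrative assumptions; see the dedicated guide for your integration for the exact invocation:

```shell
# Change into the repo root first so Probe sees the whole project
# (path and query below are hypothetical examples).
cd /path/to/your-repo
probe search "error handling"
```

Running from a subdirectory limits what Probe can index, so results may miss relevant code elsewhere in the project.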