Atla Insights supports a wide range of LLM providers and AI agent frameworks. All instrumentation methods share a common interface for easy integration.

LLM Providers

We currently support the following LLM providers:
| Provider | Instrumentation Function | Notes |
| --- | --- | --- |
| Anthropic | `instrument_anthropic` | Also supports Anthropic's `AnthropicBedrock` client |
| Bedrock | `instrument_bedrock` | |
| Google GenAI | `instrument_google_genai` | E.g., Gemini |
| LiteLLM | `instrument_litellm` | Supports all models available in the LiteLLM framework |
| OpenAI | `instrument_openai` | Includes Azure OpenAI |

Basic Provider Usage

from atla_insights import configure, instrument_openai
from openai import OpenAI

configure(token="<MY_ATLA_INSIGHTS_TOKEN>")
instrument_openai()

client = OpenAI()
# All OpenAI calls are now instrumented

By default, instrumented LLM calls are traced independently of one another. To logically group multiple LLM calls into a single trace, use the @instrument decorator.
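
For example, here is a minimal sketch of grouping two OpenAI calls into a single trace. The trace name passed to @instrument and the model name are illustrative assumptions; consult the SDK reference for the exact decorator signature.

from atla_insights import configure, instrument, instrument_openai
from openai import OpenAI

configure(token="<MY_ATLA_INSIGHTS_TOKEN>")
instrument_openai()

client = OpenAI()

@instrument("answer_with_review")  # the name argument is illustrative; signature may differ
def answer_with_review(question: str) -> str:
    # Both completions below are grouped into the same trace.
    draft = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": question}],
    )
    review = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "user", "content": question},
            {"role": "assistant", "content": draft.choices[0].message.content},
            {"role": "user", "content": "Refine your answer."},
        ],
    )
    return review.choices[0].message.content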

Agent Frameworks

We currently support the following frameworks:
| Framework | Instrumentation Function | Notes |
| --- | --- | --- |
| Agno | `instrument_agno` | Supported with openai, google-genai, litellm, and/or anthropic models |
| BAML | `instrument_baml` | Supported with openai, anthropic, or bedrock models |
| Claude Code SDK | `instrument_claude_code_sdk` | |
| CrewAI | `instrument_crewai` | |
| LangChain | `instrument_langchain` | Also covers LangChain-based frameworks such as LangGraph |
| MCP | `instrument_mcp` | Context propagation only; you will need to instrument the model calling the MCP server separately (see the sketch below) |
| OpenAI Agents | `instrument_openai_agents` | Supported with openai, google-genai, litellm, and/or anthropic models |
| Smolagents | `instrument_smolagents` | Supported with openai, google-genai, litellm, and/or anthropic models |
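
Because instrument_mcp only propagates trace context, an application that uses MCP with, say, an OpenAI model needs both instrumentors enabled. A minimal sketch, assuming instrument_mcp takes no arguments:

from atla_insights import configure, instrument_mcp, instrument_openai

configure(token="<MY_ATLA_INSIGHTS_TOKEN>")

# Propagate trace context across the MCP client/server boundary.
instrument_mcp()  # assumption: no arguments required

# Separately instrument the model that ultimately calls the MCP server.
instrument_openai()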

Framework + Provider Instrumentation

For Agno, BAML, OpenAI Agents, and Smolagents in Python, you will need to instrument both the framework and the underlying LLM provider(s):

from atla_insights import configure, instrument_agno

configure(token="<MY_ATLA_INSIGHTS_TOKEN>")

# If you are using a single LLM provider (e.g., via `OpenAIChat`)
instrument_agno("openai")

# If you are using multiple LLM providers (e.g., `OpenAIChat` and `Claude`)
instrument_agno(["anthropic", "openai"])

Can’t find your framework or LLM provider?

If you are using a framework or LLM provider without native support, you can manually record LLM generations via our lower-level SDK:

from atla_insights.span import start_as_current_span

with start_as_current_span("my-llm-generation") as span:
    # Run my LLM generation via an unsupported framework.
    input_messages = [{"role": "user", "content": "What is the capital of France?"}]
    tools = [
        {
            "type": "function",
            "function": {
                "name": "get_capital",
                "parameters": {"type": "object", "properties": {"country": {"type": "string"}}},
            },
        }
    ]
    result = my_client.chat.completions.create(messages=input_messages, tools=tools)

    # Manually record LLM generation.
    span.record_generation(
        input_messages=input_messages,
        output_messages=[choice.message for choice in result.choices],
        tools=tools,
    )

Feel free to let us know which frameworks and LLM providers you would like to see supported! Schedule a call with the Atla team.