Atla Insights supports a wide range of LLM providers and AI agent frameworks. All instrumentation methods share a common interface for easy integration.

LLM Providers

We currently support the following LLM providers:
| Provider | Instrumentation function | Notes |
| --- | --- | --- |
| OpenAI | `instrumentOpenAI` | Includes Azure OpenAI |

Agent Frameworks

We currently support the following frameworks:
| Framework | Instrumentation function | Notes |
| --- | --- | --- |
| LangChain | `instrumentLangChain` | Also covers LangGraph |
| OpenAI Agents | `instrumentOpenAIAgents` | |
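Because all instrumentation functions share a common interface, wiring up tracing typically amounts to calling them once at startup. A minimal setup sketch, assuming these functions are exported from the same package as the lower-level SDK shown below (the import path and zero-argument signatures are assumptions; check the package documentation for the exact API):

```typescript
// Assumed import path and signatures — verify against the SDK docs.
import { instrumentOpenAI, instrumentLangChain } from "@atla-ai/insights-sdk-js";

// Instrument before constructing clients so all subsequent calls are traced.
instrumentOpenAI();    // OpenAI (including Azure OpenAI)
instrumentLangChain(); // LangChain (including LangGraph)
```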

Can’t find your framework or LLM provider?

If you are using a framework or LLM provider without native support, you can manually record LLM generations via our lower-level SDK.
import { startAsCurrentSpan } from "@atla-ai/insights-sdk-js";

const { span, endSpan } = startAsCurrentSpan("my-llm-generation");
try {
  // Run the LLM generation via an unsupported framework (`myClient` stands in for your own client).
  const inputMessages = [{ role: "user", content: "What is the capital of France?" }];
  const tools = [
    {
      type: "function",
      function: {
        name: "get_capital",
        parameters: { type: "object", properties: { country: { type: "string" } } },
      },
    },
  ];
  const result = await myClient.chat.completions.create({ messages: inputMessages, tools });

  // Manually record LLM generation.
  span.recordGeneration({
    inputMessages,
    outputMessages: result.choices.map(choice => choice.message),
    tools,
  });
} finally {
  endSpan();
}
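If your client returns an OpenAI-style response object, the `outputMessages` mapping above is a plain data transformation. A self-contained sketch of that step (the `ChatResponse` interface here is a simplified assumption modeled on the OpenAI response format, not part of the SDK):

```typescript
// Simplified OpenAI-style response types (assumptions for illustration).
interface ChatMessage {
  role: string;
  content: string | null;
}
interface ChatResponse {
  choices: { message: ChatMessage }[];
}

// Collect every choice's message, in the shape expected by
// span.recordGeneration's outputMessages field.
function toOutputMessages(result: ChatResponse): ChatMessage[] {
  return result.choices.map((choice) => choice.message);
}
```

For clients with a different response shape, adapt this mapping so the messages you record carry at least a `role` and `content`.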
Feel free to let us know which frameworks and LLM providers you would like to see supported! You can schedule a call with the Atla team.