Many AI providers expose OpenAI-compatible APIs (including Nebius, Anyscale, Together AI, Fireworks AI, and others), which means you can instrument them with Atla using our OpenAI instrumentation.

Quick Setup

Simply use the OpenAI instrumentation method:
from atla_insights import configure, instrument_openai

# Configure Atla
configure(token="<YOUR_ATLA_INSIGHTS_TOKEN>")

# Instrument OpenAI (works with compatible providers)
instrument_openai()

Example with Nebius

import os
from atla_insights import configure, instrument_openai
from openai import OpenAI

# Configure Atla Insights
configure(token=os.environ["ATLA_INSIGHTS_TOKEN"])

# Instrument OpenAI
instrument_openai()

# Configure client to use Nebius (or any OpenAI-compatible provider)
client = OpenAI(
    base_url="https://api.studio.nebius.ai/v1/",  # Provider endpoint
    api_key=os.environ["PROVIDER_API_KEY"]
)

# Now all calls will be automatically traced
response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-405B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}]
)

Supported Providers

This approach works with any OpenAI-compatible service:
  • Nebius AI: Studio platform with Llama models
  • Anyscale: OpenAI-compatible endpoints for its hosted models
  • Together AI: OpenAI-compatible API for its hosted models
  • Fireworks AI: OpenAI-compatible interface for its hosted models
  • Any other provider that implements the OpenAI API standard
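
Switching providers only changes the client configuration; the Atla setup stays the same. Below is a minimal sketch of that pattern. The base URLs other than Nebius's are illustrative assumptions, so confirm the exact endpoint and model names in your provider's documentation:

import os
from atla_insights import configure, instrument_openai
from openai import OpenAI

configure(token=os.environ["ATLA_INSIGHTS_TOKEN"])
instrument_openai()

# Illustrative endpoints only -- confirm the exact URL with your provider
PROVIDER_BASE_URLS = {
    "nebius": "https://api.studio.nebius.ai/v1/",
    "together": "https://api.together.xyz/v1",
    "fireworks": "https://api.fireworks.ai/inference/v1",
}

provider = "together"  # swap this to target a different provider
client = OpenAI(
    base_url=PROVIDER_BASE_URLS[provider],
    api_key=os.environ["PROVIDER_API_KEY"],  # the matching provider key
)

# The call site is identical regardless of provider, and every call is traced
response = client.chat.completions.create(
    model="<PROVIDER_MODEL_NAME>",  # use a model ID your provider serves
    messages=[{"role": "user", "content": "Hello!"}],
)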

Need Help?

If you encounter issues with your provider: