Together AI SDK Integration

Integrate Proxy with the Together AI SDK to automatically capture telemetry. The examples below use Together AI's Python SDK, but the same approach should work with the SDK in most languages.

Base URL

https://gateway.adaline.ai/v1/together-ai/
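If you are not using an SDK, the gateway can in principle be reached over plain HTTP as well. Below is a minimal sketch using Python's standard library, assuming the gateway mirrors Together AI's `/chat/completions` path under the base URL and accepts the Adaline headers shown in the SDK examples below; the request is built but not sent here:

```python
import json
import urllib.request

# Assumed endpoint: the gateway base URL plus Together AI's chat completions path.
url = "https://gateway.adaline.ai/v1/together-ai/chat/completions"

payload = {
    "model": "meta-llama/Llama-2-7b-chat-hf",
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Together AI key goes in the usual Authorization header.
        "Authorization": "Bearer your-together-ai-api-key",
        # Adaline telemetry headers, as in the SDK examples.
        "adaline-api-key": "your-adaline-api-key",
        "adaline-project-id": "your-project-id",
        "adaline-prompt-id": "your-prompt-id",
    },
    method="POST",
)

# Sending is left to the caller, e.g.:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```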

Chat Completions

Complete Chat

from together import Together

client = Together(
    base_url="https://gateway.adaline.ai/v1/together-ai",  # route requests through the gateway
    api_key="your-together-ai-api-key",
    supplied_headers={
        # Adaline telemetry headers, sent with every request
        "adaline-api-key": "your-adaline-api-key",
        "adaline-project-id": "your-project-id",
        "adaline-prompt-id": "your-prompt-id",
    },
)

response = client.chat.completions.create(
    model="meta-llama/Llama-2-7b-chat-hf",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are the benefits of open source AI models?"}
    ],
    extra_headers={
        "adaline-trace-name": "togetherai-chat-completion"  # Optional
    }
)

print(response.choices[0].message.content)
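In practice you will likely want to load the keys from the environment rather than hardcode them. A small sketch, using hypothetical environment variable names (pick whatever fits your deployment); the defaults stand in when a variable is unset:

```python
import os

# Hypothetical environment variable names, not prescribed by the gateway.
together_api_key = os.environ.get("TOGETHER_API_KEY", "your-together-ai-api-key")
adaline_headers = {
    "adaline-api-key": os.environ.get("ADALINE_API_KEY", "your-adaline-api-key"),
    "adaline-project-id": os.environ.get("ADALINE_PROJECT_ID", "your-project-id"),
    "adaline-prompt-id": os.environ.get("ADALINE_PROMPT_ID", "your-prompt-id"),
}

# The client is then constructed exactly as above:
# client = Together(
#     base_url="https://gateway.adaline.ai/v1/together-ai",
#     api_key=together_api_key,
#     supplied_headers=adaline_headers,
# )
```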

Stream Chat

from together import Together

client = Together(
    base_url="https://gateway.adaline.ai/v1/together-ai",  # route requests through the gateway
    api_key="your-together-ai-api-key",
    supplied_headers={
        # Adaline telemetry headers, sent with every request
        "adaline-api-key": "your-adaline-api-key",
        "adaline-project-id": "your-project-id",
        "adaline-prompt-id": "your-prompt-id",
    },
)

stream = client.chat.completions.create(
    model="meta-llama/Llama-2-7b-chat-hf",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain the concept of distributed computing."}
    ],
    stream=True,
    extra_headers={
        "adaline-trace-name": "togetherai-stream-chat"  # Optional
    }
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
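If you also need the full text after streaming (for logging or post-processing), the deltas can be joined as they arrive. A small sketch, using stand-in objects shaped like the SDK's stream chunks for illustration:

```python
from types import SimpleNamespace

def collect_stream(stream):
    """Concatenate the content deltas of a chat completion stream."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta is not None:
            parts.append(delta)
    return "".join(parts)

# Fake chunks shaped like the SDK's stream events, for illustration only.
def _chunk(text):
    return SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

fake_stream = [_chunk("Distributed "), _chunk("computing"), _chunk(None)]
print(collect_stream(fake_stream))  # Distributed computing
```

With the real SDK you would pass the `stream` object returned by `client.chat.completions.create(..., stream=True)` instead of the fake chunks.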

Next Steps