
Keywords-AI/keywordsai

Observability, prompt management, and evals for LLM engineering teams.

Y Combinator W24 · Platform · Documentation · Twitter · Discord

Respan Tracing

Respan's library for sending telemetry from LLM applications in the OpenLLMetry format.

Integrations

OpenAI Agents SDK · LangGraph · Vercel AI SDK

Quickstart

1️⃣ Get an API key

Go to Respan platform and get your API key.

2️⃣ Download package

Python

pip install respan-tracing

TypeScript/JavaScript

npm install @respan/tracing

3️⃣ Initialize Respan tracing processor

Python

import os
from respan_tracing.main import RespanTelemetry

os.environ["RESPAN_BASE_URL"] = "https://api.respan.ai/api" # This is also the default value if not explicitly set
os.environ["RESPAN_API_KEY"] = "YOUR_RESPAN_API_KEY"
k_tl = RespanTelemetry()

TypeScript/JavaScript

import { RespanTelemetry } from '@respan/tracing';

// Initialize clients
// Make sure to set these environment variables or pass them directly
const keywordsAI = new RespanTelemetry({
    apiKey: process.env.RESPAN_API_KEY || "",
    baseUrl: process.env.RESPAN_BASE_URL || "",
    appName: 'test-app',
    disableBatch: true  // For testing, disable batching
});

4️⃣ Trace agent workflows and tasks

Python

You can now trace your LLM applications using the decorators.

A workflow is the whole process of an AI agent run; a workflow may contain several tasks (tool or LLM calls).

In the example below, there is an agent run named my_workflow that contains one task, my_task.

from respan_tracing.decorators import workflow, task

@task(name="my_task")
def my_task():
    pass

@workflow(name="my_workflow")
def my_workflow():
    my_task()
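Conceptually, each decorator opens a span that nests inside its parent: the task span is recorded as a child of the workflow span. Below is a minimal pure-Python sketch of that nesting, using toy stand-in decorators; the real respan_tracing decorators emit OpenTelemetry spans to the Respan backend rather than appending events to a list, so this is illustrative only.

```python
import functools

TRACE: list[str] = []  # stand-in for an exported span list

def span(kind: str, name: str):
    """Toy decorator that records enter/exit events, mimicking span nesting."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            TRACE.append(f"enter {kind}:{name}")
            try:
                return fn(*args, **kwargs)
            finally:
                TRACE.append(f"exit {kind}:{name}")
        return wrapper
    return decorator

@span("task", "my_task")
def my_task():
    return "done"

@span("workflow", "my_workflow")
def my_workflow():
    return my_task()

my_workflow()
# TRACE now shows the task events nested inside the workflow events
```

The `finally` block mirrors how span-based tracers close a span even when the wrapped function raises, so failed tasks still appear in the trace.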

TypeScript/JavaScript

You can now trace your LLM applications by wrapping your functions with the provided wrappers (keywordsAI.withTask in the example below).

A workflow is the whole process of an AI agent run; a workflow may contain several tasks (tool or LLM calls).

In the example below, there is an agent run named pirate_joke_workflow that contains one task, joke_creation.

import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function createJoke() {
    return await keywordsAI.withTask(
        { name: 'joke_creation' },
        async () => {
            const completion = await openai.chat.completions.create({
                messages: [{ role: 'user', content: 'Tell me a joke about TypeScript' }],
                model: 'gpt-3.5-turbo',
                temperature: 0.7
            });
            return completion.choices[0].message.content;
        }
    );
}

async function jokeWorkflow() {
    return await keywordsAI.withWorkflow(
        { name: 'pirate_joke_workflow' },
        async () => {
            const joke = await createJoke();
            return joke;
        }
    );
}

5️⃣ See traces in Respan

⭐️ Star us 🙏

Please star us if you found this helpful!


For a comprehensive example, see the trace example run; a step-by-step guide is given in the Quickstart above.
