traceAI


Open-source observability for AI applications - trace every LLM call, prompt, token, retrieval step, and agent decision.

Built on OpenTelemetry, traceAI sends structured traces to any OTel-compatible backend (Datadog, Grafana, Jaeger, Future AGI, and more). No new vendor. No new dashboard.


Documentation · Examples · Slack · PyPI · npm · NuGet


What is traceAI?

Your agent calls an LLM, retrieves context, invokes a tool, and returns an answer. When that answer is wrong, you need to know exactly where it broke - which retrieval missed, which tool returned stale data, which prompt drifted.

traceAI captures every LLM call, prompt, token count, retrieval step, and agent decision as structured OpenTelemetry traces, so they live natively in Datadog, Grafana, Future AGI, Jaeger, or any other OTel-compatible backend you already run.

  • Drop-in instrumentation for 50+ AI frameworks across 4 languages
  • OpenTelemetry-native - works with any OTel-compatible backend
  • Semantic conventions for LLM calls, agents, tools, retrieval, and vector databases
  • Python, TypeScript, Java, and C# with consistent APIs

Key Features

| Feature | Description |
|---|---|
| Standardized Tracing | Maps AI workflows to consistent OpenTelemetry spans and attributes |
| Drop-in Setup | Add 3 lines to your existing code - no refactoring needed |
| Multi-Framework | 50+ integrations across Python, TypeScript, Java, and C# |
| Vendor Agnostic | Works with any OpenTelemetry-compatible backend |
| Rich Context | Captures prompts, completions, tokens, model params, tool calls, and more |
| Production-grade | Async support, streaming, error handling, and low-overhead tracing |

Quickstart

Python Quickstart

1. Install

pip install traceai-openai

2. Instrument your application

import os
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
import openai

# Set up environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use OpenAI as normal - traces are captured automatically
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}]
)

Tip: Swap traceai-openai for any supported framework (e.g., traceai-langchain, traceai-anthropic) - the pattern stays the same, as sketched below.
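
For instance, tracing a LangChain app changes only the instrumentor class. A minimal sketch, assuming traceai-langchain exposes a LangChainInstrumentor with the same register-then-instrument pattern as the OpenAI quickstart above:

from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_langchain import LangChainInstrumentor  # assumed class name

# Same register() call as the OpenAI quickstart above
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Swap only the instrumentor; LangChain chains, tools, and LLM calls
# are then traced automatically
LangChainInstrumentor().instrument(tracer_provider=trace_provider)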


TypeScript Quickstart

1. Install

npm install @traceai/openai @traceai/fi-core

2. Instrument your application

import { register, ProjectType } from "@traceai/fi-core";
import { OpenAIInstrumentation } from "@traceai/openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import OpenAI from "openai";

// Register tracer provider
const tracerProvider = register({
  projectName: "my_ai_app",
  projectType: ProjectType.OBSERVE,
});

// Register OpenAI instrumentation (before creating client!)
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});

// Use OpenAI as normal - traces are captured automatically
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello!" }],
});

Java Quickstart

1. Add dependency (via JitPack)

<dependency>
    <groupId>com.github.future-agi.traceAI</groupId>
    <artifactId>traceai-java-openai</artifactId>
    <version>v1.0.0</version>
</dependency>

2. Instrument your application

import ai.traceai.TraceAI;
import ai.traceai.TraceConfig;
import ai.traceai.openai.TracedOpenAIClient;

// Initialize tracing
TraceAI.init(TraceConfig.builder()
    .apiKey("your-api-key")
    .secretKey("your-secret-key")
    .projectName("my_ai_app")
    .build());

// Wrap your OpenAI client
var tracedClient = new TracedOpenAIClient(openAIClient);

// Use as normal - traces are captured automatically
var response = tracedClient.createChatCompletion(request);

C# Quickstart

1. Install

dotnet add package fi-instrumentation-otel

2. Instrument your application

using FIInstrumentation;
using FIInstrumentation.Types;

// Register tracer
var tracer = TraceAI.Register(opts =>
{
    opts.ProjectName = "my_ai_app";
    opts.ProjectType = ProjectType.Observe;
    opts.ApiKey = "your-api-key";
    opts.SecretKey = "your-secret-key";
});

// Use the tracer with your AI calls
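
// Illustrative sketch - assumes Register returns a standard
// OpenTelemetry.Trace.Tracer; check the package docs for the exact type.
using (var span = tracer.StartActiveSpan("chat gpt-4.1"))
{
    span.SetAttribute("gen_ai.request.model", "gpt-4.1");
    // ... invoke your LLM client here; the span records timing and attributes
}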

Supported Frameworks

Python

| Package | Description |
|---|---|
| fi-instrumentation-otel | Core instrumentation library |

LLM Providers

| Package | Description |
|---|---|
| traceAI-openai | OpenAI |
| traceAI-anthropic | Anthropic |
| traceAI-google-genai | Google Generative AI |
| traceAI-vertexai | Google Vertex AI |
| traceAI-bedrock | AWS Bedrock |
| traceAI-mistralai | Mistral AI |
| traceAI-groq | Groq |
| traceAI-litellm | LiteLLM |
| traceAI-cohere | Cohere |
| traceAI-ollama | Ollama |
| traceAI-together | Together AI |
| traceAI-deepseek | DeepSeek |
| traceAI-fireworks | Fireworks AI |
| traceAI-cerebras | Cerebras |
| traceAI-huggingface | HuggingFace |
| traceAI-xai | xAI (Grok) |
| traceAI-vllm | vLLM |

Agent Frameworks

| Package | Description |
|---|---|
| traceAI-langchain | LangChain |
| traceAI-llamaindex | LlamaIndex |
| traceAI-crewai | CrewAI |
| traceAI-openai-agents | OpenAI Agents |
| traceAI-smolagents | SmolAgents |
| traceAI-autogen | AutoGen |
| traceAI-google-adk | Google ADK |
| traceAI-agno | Agno |
| traceAI-pydantic-ai | Pydantic AI |
| traceAI-claude-agent-sdk | Claude Agent SDK |
| traceAI-strands | AWS Strands Agents |
| traceAI-beeai | IBM BeeAI |

Tools and Libraries

| Package | Description |
|---|---|
| traceAI-haystack | Haystack |
| traceAI-dspy | DSPy |
| traceAI-guardrails | Guardrails AI |
| traceAI-instructor | Instructor |
| traceAI-portkey | Portkey |
| traceAI-mcp | Model Context Protocol |
| traceAI-pipecat | Pipecat (Voice AI) |
| traceAI-livekit | LiveKit (Real-time) |

Vector Databases

| Package | Description |
|---|---|
| traceAI-pinecone | Pinecone |
| traceAI-chromadb | ChromaDB |
| traceAI-qdrant | Qdrant |
| traceAI-weaviate | Weaviate |
| traceAI-milvus | Milvus |
| traceAI-lancedb | LanceDB |
| traceAI-mongodb | MongoDB Atlas Vector Search |
| traceAI-pgvector | pgvector (PostgreSQL) |
| traceAI-redis | Redis Vector Search |

TypeScript

| Package | Description |
|---|---|
| @traceai/fi-core | Core instrumentation library |
| @traceai/fi-semantic-conventions | Semantic conventions |

LLM Providers

| Package | Description |
|---|---|
| @traceai/openai | OpenAI |
| @traceai/anthropic | Anthropic |
| @traceai/google-genai | Google Generative AI |
| @traceai/fi-instrumentation-vertexai | Google Vertex AI |
| @traceai/bedrock | AWS Bedrock |
| @traceai/mistral | Mistral AI |
| @traceai/groq | Groq |
| @traceai/cohere | Cohere |
| @traceai/ollama | Ollama |
| @traceai/together | Together AI |
| @traceai/deepseek | DeepSeek |
| @traceai/fireworks | Fireworks AI |
| @traceai/cerebras | Cerebras |
| @traceai/huggingface | HuggingFace |
| @traceai/xai | xAI (Grok) |
| @traceai/vllm | vLLM |

Agent Frameworks

| Package | Description |
|---|---|
| @traceai/langchain | LangChain.js |
| @traceai/llamaindex | LlamaIndex |
| @traceai/openai-agents | OpenAI Agents |
| @traceai/fi-instrumentation-google-adk | Google ADK |
| @traceai/mastra | Mastra |
| @traceai/beeai | IBM BeeAI |
| @traceai/strands | AWS Strands Agents |

Tools and Libraries

| Package | Description |
|---|---|
| @traceai/vercel | Vercel AI SDK |
| @traceai/guardrails | Guardrails AI |
| @traceai/instructor | Instructor |
| @traceai/portkey | Portkey |
| @traceai/mcp | Model Context Protocol |
| @traceai/fi-instrumentation-pipecat | Pipecat (Voice AI) |
| @traceai/fi-instrumentation-livekit | LiveKit (Real-time) |

Vector Databases

| Package | Description |
|---|---|
| @traceai/pinecone | Pinecone |
| @traceai/chromadb | ChromaDB |
| @traceai/qdrant | Qdrant |
| @traceai/weaviate | Weaviate |
| @traceai/milvus | Milvus |
| @traceai/lancedb | LanceDB |
| @traceai/mongodb | MongoDB Atlas Vector Search |
| @traceai/pgvector | pgvector (PostgreSQL) |
| @traceai/redis | Redis Vector Search |

Java

Available via JitPack. Add the JitPack repository:

<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>

| Package | Description |
|---|---|
| traceai-java-core | Core instrumentation library |

LLM Providers

| Package | Description |
|---|---|
| traceai-java-openai | OpenAI |
| traceai-java-azure-openai | Azure OpenAI |
| traceai-java-anthropic | Anthropic |
| traceai-java-google-genai | Google Generative AI |
| traceai-java-cohere | Cohere |
| traceai-java-ollama | Ollama |
| traceai-java-bedrock | AWS Bedrock |
| traceai-java-vertexai | Google Vertex AI |
| traceai-java-watsonx | IBM Watsonx |

Agent Frameworks

| Package | Description |
|---|---|
| traceai-langchain4j | LangChain4j |
| traceai-spring-ai | Spring AI |
| traceai-spring-boot-starter | Spring Boot Auto-Configuration |
| traceai-java-semantic-kernel | Microsoft Semantic Kernel |

Vector Databases

| Package | Description |
|---|---|
| traceai-java-pinecone | Pinecone |
| traceai-java-qdrant | Qdrant |
| traceai-java-milvus | Milvus |
| traceai-java-weaviate | Weaviate |
| traceai-java-chromadb | ChromaDB |
| traceai-java-mongodb | MongoDB Atlas Vector Search |
| traceai-java-redis | Redis Vector Search |
| traceai-java-azure-search | Azure AI Search |
| traceai-java-pgvector | pgvector (PostgreSQL) |
| traceai-java-elasticsearch | Elasticsearch |

C#

Available on NuGet.

| Package | Description |
|---|---|
| fi-instrumentation-otel | Core instrumentation library |

Compatibility Matrix

| Category | Framework | Python | TypeScript | Java | C# |
|---|---|---|---|---|---|
| LLM Providers | OpenAI | ✅ | ✅ | ✅ | |
| | Anthropic | ✅ | ✅ | ✅ | |
| | AWS Bedrock | ✅ | ✅ | ✅ | |
| | Google Vertex AI | ✅ | ✅ | ✅ | |
| | Google Generative AI | ✅ | ✅ | ✅ | |
| | Mistral AI | ✅ | ✅ | | |
| | Groq | ✅ | ✅ | | |
| | Cohere | ✅ | ✅ | ✅ | |
| | Ollama | ✅ | ✅ | ✅ | |
| | LiteLLM | ✅ | | | |
| | Together AI | ✅ | ✅ | | |
| | DeepSeek | ✅ | ✅ | | |
| | Fireworks AI | ✅ | ✅ | | |
| | Cerebras | ✅ | ✅ | | |
| | HuggingFace | ✅ | ✅ | | |
| | xAI (Grok) | ✅ | ✅ | | |
| | vLLM | ✅ | ✅ | | |
| | Azure OpenAI | | | ✅ | |
| | IBM Watsonx | | | ✅ | |
| Agent Frameworks | LangChain | ✅ | ✅ | | |
| | LlamaIndex | ✅ | ✅ | | |
| | CrewAI | ✅ | | | |
| | AutoGen | ✅ | | | |
| | OpenAI Agents | ✅ | ✅ | | |
| | SmolAgents | ✅ | | | |
| | Google ADK | ✅ | ✅ | | |
| | Agno | ✅ | | | |
| | Pydantic AI | ✅ | | | |
| | Claude Agent SDK | ✅ | | | |
| | AWS Strands Agents | ✅ | ✅ | | |
| | IBM BeeAI | ✅ | ✅ | | |
| | Mastra | | ✅ | | |
| | LangChain4j | | | ✅ | |
| | Spring AI | | | ✅ | |
| | Semantic Kernel | | | ✅ | |
| Tools & Libraries | Haystack | ✅ | | | |
| | DSPy | ✅ | | | |
| | Guardrails AI | ✅ | ✅ | | |
| | Instructor | ✅ | ✅ | | |
| | Portkey | ✅ | ✅ | | |
| | Vercel AI SDK | | ✅ | | |
| | MCP | ✅ | ✅ | | |
| | Pipecat | ✅ | ✅ | | |
| | LiveKit | ✅ | ✅ | | |
| Vector Databases | Pinecone | ✅ | ✅ | ✅ | |
| | ChromaDB | ✅ | ✅ | ✅ | |
| | Qdrant | ✅ | ✅ | ✅ | |
| | Weaviate | ✅ | ✅ | ✅ | |
| | Milvus | ✅ | ✅ | ✅ | |
| | LanceDB | ✅ | ✅ | | |
| | MongoDB Atlas | ✅ | ✅ | ✅ | |
| | pgvector | ✅ | ✅ | ✅ | |
| | Redis | ✅ | ✅ | ✅ | |
| | Azure AI Search | | | ✅ | |
| | Elasticsearch | | | ✅ | |

Legend: ✅ Supported | blank = not yet available


Architecture

traceAI is built on top of OpenTelemetry and follows standard OTel instrumentation patterns:

Full OpenTelemetry Compatibility

  • Works with any OTel-compatible backend
  • Standard OTLP exporters (HTTP/gRPC)
  • Compatible with existing OTel setups

Bring Your Own Configuration

You can use traceAI with your own OpenTelemetry setup:

Python: Custom TracerProvider & Exporters
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Set up your own tracer provider
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)

# Add custom exporters (example with Future AGI)
otlp_exporter = OTLPSpanExporter(
    endpoint="https://api.futureagi.com/tracer/v1/traces",
    headers={
        "X-API-KEY": "your-api-key",
        "X-SECRET-KEY": "your-secret-key"
    }
)
tracer_provider.add_span_processor(BatchSpanProcessor(otlp_exporter))

# Instrument with traceAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

TypeScript: Custom TracerProvider, Span Processors & Headers
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@traceai/openai";

const provider = new NodeTracerProvider({
  resource: new Resource({ "service.name": "my-ai-service" }),
});

const exporter = new OTLPTraceExporter({
  url: "https://api.futureagi.com/tracer/v1/traces",
  headers: {
    "X-API-KEY": process.env.FI_API_KEY!,
    "X-SECRET-KEY": process.env.FI_SECRET_KEY!,
  },
});

provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

registerInstrumentations({
  tracerProvider: provider,
  instrumentations: [new OpenAIInstrumentation()],
});

What Gets Captured

traceAI automatically captures rich telemetry data:

  • Prompts & Completions: Full request/response content
  • Token Usage: Input, output, and total tokens
  • Model Parameters: Temperature, top_p, max_tokens, etc.
  • Tool Calls: Function/tool names, arguments, and results
  • Streaming: Individual chunks with delta tracking
  • Errors: Detailed error context and stack traces
  • Timing: Latency at each step of the AI workflow

All data follows OpenTelemetry Semantic Conventions for GenAI.
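
To see exactly what lands on a span, you can route traces through the OTel SDK's in-memory exporter and print the attributes. A minimal sketch, assuming the OpenAI instrumentor from the quickstart; the gen_ai.* keys mentioned follow the OTel GenAI conventions, and the exact attribute names traceAI emits may vary by package version:

from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter
from traceai_openai import OpenAIInstrumentor

# Collect finished spans in memory instead of exporting them
exporter = InMemorySpanExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))
OpenAIInstrumentor().instrument(tracer_provider=provider)

# ... make an OpenAI call as in the quickstart, then inspect:
for span in exporter.get_finished_spans():
    print(span.name)
    for key, value in span.attributes.items():
        # e.g. gen_ai.request.model, token counts, prompt content
        print(f"  {key} = {value}")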


Roadmap

  • Go language support
  • Sampling strategies for high-volume production environments
  • Continuous semantic convention updates as the OTel GenAI spec evolves
  • Evaluation integration connecting traces to quality measurement pipelines
  • Expanded agent framework coverage

Request features or report bugs on GitHub Issues.


Contributing

We welcome contributions! Read our Contributing Guide for details.

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Found a bug? Open an issue with a minimal reproduction.


Contributors


Resources

| Resource | Description |
|---|---|
| Website | Learn more about Future AGI |
| Documentation | Complete guides and API reference |
| Cookbooks | Step-by-step implementation examples |
| Feature Requests & Bugs | Request features or report issues |
| Changelog | All release notes and updates |
| Contributing Guide | How to contribute to traceAI |
| Slack | Join our community |
| Issues | Report bugs or request features |
| TraceAI Workshop | Workshop on tracing and evaluation |

Connect With Us

Website · LinkedIn · Twitter · Reddit · Substack


Built with ❤️ by the Future AGI team and contributors.

If traceAI helps you debug faster, a ⭐ helps more teams find us.

🌐 futureagi.com · 📖 docs.futureagi.com · ☁️ app.futureagi.com