
OpenTrace


OpenTrace is an open-source, open-governance Python library for tracing and optimizing workflows with LLM-powered generative optimizers, maintained by the same team of developers behind Trace.

A typical LLM agent workflow is defined by a sequence of operations, which usually involve user-written Python programs, instructions to LLMs (e.g., prompts, few-shot examples), and LLM-generated programs that use external tools (e.g., Wikipedia, databases, Wolfram Alpha). Popular LLM libraries often focus on optimizing only the instructions. For example, libraries like LangChain represent the instructions as special objects and construct pre/post-processing functions to help users get the most out of LLM calls. In the example figure, this approach updates only the brown squares of the agent workflow.

OpenTrace takes a different approach. The user writes the Python program as usual, then uses primitives like node and @bundle to wrap their Python objects and functions and to designate which objects are trainable parameters. This is the declare phase, where the user chooses how to represent the agent workflow as a graph. Once the inputs and operations are declared, OpenTrace captures the execution flow of the program as a graph; this is the forward phase. Finally, the user can optimize the entire program with OpenTrace, for example by updating the LLM instructions; this is the optimize phase.
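To make the three phases concrete, here is a toy sketch in plain Python. This is not the OpenTrace API; the `Node` class, `trainable` flag, and `trainable_leaves` helper are stand-ins invented for illustration, showing how declaring wrapped values lets the forward pass build a graph that an optimizer can later walk back to the trainable parameters.

```python
# Toy illustration of the declare / forward / optimize phases.
# NOT the OpenTrace API -- a minimal stand-in for the concept.

class Node:
    def __init__(self, value, trainable=False):
        self.value = value
        self.trainable = trainable
        self.parents = []

    def __add__(self, other):
        out = Node(self.value + other.value)
        out.parents = [self, other]  # forward phase: record the edge
        return out

# Declare phase: mark which inputs are trainable parameters.
x = Node(2, trainable=True)
y = Node(3)

# Forward phase: running the program builds the graph as a side effect.
z = x + y

# Optimize phase: walk the graph back to the trainable leaves and update them.
def trainable_leaves(node):
    if not node.parents:
        return [node] if node.trainable else []
    return [leaf for p in node.parents for leaf in trainable_leaves(p)]

print([n.value for n in trainable_leaves(z)])  # [2]
```

In OpenTrace the leaves would be prompts or other LLM instructions rather than numbers, and the update would be proposed by a generative optimizer rather than hand-written.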

Platform Overview

  • Execution Graph Tracing


    Record traces of operations on Python objects and functions, automatically constructing execution graphs optimized for LLM workflows.

    Learn about tracing

  • LLM-Powered Optimization


    Use generative optimizers to automatically improve your AI workflows end-to-end without manual prompt engineering.

    Explore optimizers

  • PyTorch-Inspired API


    A familiar gradient-tape design, inspired by PyTorch, reduces the learning curve while providing powerful workflow optimization capabilities.

    See examples

  • Framework Agnostic


    Pure Python implementation with no API dependencies. Composable with any existing libraries and tools in your stack.

    Integration guide
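The gradient-tape mechanism mentioned above can be sketched in a few lines of plain Python (a toy stand-in, not OpenTrace internals): operations are appended to a tape during the forward pass, and feedback is then propagated through them in reverse order. The `record`, `forward`, `call_llm`, and `refine` names here are hypothetical.

```python
# Minimal gradient-tape sketch: record ops forward, revisit them backward.
tape = []

def record(op_name, inputs, output):
    tape.append((op_name, inputs, output))  # log the operation on the tape
    return output

def forward(prompt, query):
    draft = record("call_llm", [prompt, query], f"draft({prompt}, {query})")
    final = record("refine", [draft], f"refine({draft})")
    return final

forward("Be helpful", "What is 2+2?")

# "Backward" pass: visit recorded ops in reverse, attaching feedback to each.
feedback = {output: "answer was wrong" for _, _, output in reversed(tape)}
print(len(tape))  # 2
```

This mirrors how autograd tapes work in PyTorch, except the "gradient" flowing backward is textual feedback rather than a numeric derivative.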


Installation

Get started with OpenTrace in just a few steps:

Quick Installation

pip install trace-opt

Development Installation

For the latest features or to contribute:

git clone https://github.com/AgentOpt/OpenTrace.git
cd OpenTrace
pip install -e .

Requirements

  • Python 3.8+
  • No additional dependencies required for core functionality
  • Optional: OpenAI API key for LLM-powered optimization

Full Installation Guide


Tracing Workflows

OpenTrace captures the execution flow of your Python programs as computational graphs, making it easy to understand and optimize complex AI workflows. Unlike traditional approaches that focus solely on prompt optimization, OpenTrace provides visibility into your entire pipeline.

Key Features

  • Automatic graph construction from Python execution
  • Operation recording for any Python objects and functions
  • Execution flow visualization for debugging and optimization
  • Minimal overhead with pure Python implementation
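One way to see why pure-Python operation recording has minimal overhead is a decorator-based sketch (illustrative only; OpenTrace's actual mechanism differs): wrapping a function simply appends one log entry per call. The `traced` decorator and the `retrieve`/`answer` functions are invented for this example.

```python
# Toy operation recording via a decorator: each wrapped call is logged,
# yielding an execution trace with one list-append of overhead per call.
import functools

trace_log = []

def traced(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        trace_log.append((fn.__name__, args, result))  # record the op
        return result
    return wrapper

@traced
def retrieve(query):
    return f"docs for {query}"

@traced
def answer(docs):
    return f"answer from {docs}"

answer(retrieve("llm tracing"))
print([name for name, _, _ in trace_log])  # ['retrieve', 'answer']
```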

Get Started with Tracing


Optimization System

OpenTrace uses LLM-powered generative optimizers to automatically improve your workflows. The system can optimize prompts, function implementations, and entire execution paths without manual intervention.

Optimization Capabilities

  • End-to-end optimization of complete workflows
  • Automatic prompt tuning using feedback signals
  • Code generation and refinement for better performance
  • Multi-step reasoning improvement
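The feedback-signal idea behind automatic prompt tuning can be sketched as follows. This is a deliberately crude toy: a hand-written scorer stands in for the feedback signal, and "optimization" is just keeping the best-scoring candidate, whereas OpenTrace's generative optimizers use an LLM to propose updates from rich feedback.

```python
# Toy feedback-driven prompt selection (illustrative only).
def score(prompt):
    # Stand-in feedback signal: reward explicit reasoning cues.
    return 2 * ("step by step" in prompt) + ("final answer" in prompt)

candidates = [
    "Answer.",
    "Answer the question briefly.",
    "Answer the question step by step, then give a final answer.",
]

best = max(candidates, key=score)
print(best)  # the step-by-step candidate wins
```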

Learn About Optimizers


Code Examples

OpenTrace features a PyTorch-inspired API design that makes it intuitive for developers familiar with gradient-based optimization. The familiar patterns reduce the learning curve while providing powerful capabilities.

Quick Example

from opto import trace

@trace.model
class MyAgent:
    def __init__(self):
        # Declare the instruction as a trainable parameter node.
        self.instruction = trace.node("Be helpful", trainable=True)

    def __call__(self, query):
        # The call is traced, so an optimizer can later update the instruction.
        return trace.operators.call_llm(self.instruction, query)

View All Examples


Framework Integration

OpenTrace is designed to be composable with existing tools and libraries. Its pure Python implementation requires no external dependencies and makes no API calls for core functionality, so it is easy to integrate into any workflow.

Compatibility

  • No required external API dependencies - core tracing works offline
  • Composable design - integrates with existing codebases
  • Flexible deployment - works in any Python environment
  • Library agnostic - use with any ML/AI frameworks

Integration Examples