Python · Multi-provider · Open source

lazybridge
Your Bridge to Agentic AI.

The fastest path from a single agent to full multi-provider orchestration. Zero boilerplate. Full control. Built-in tracking.


The idea behind it

LazyBridge is a Python framework for building agentic systems without accumulating orchestration debt.

It starts with a single idea: the same interface should work at every level of complexity — a one-line LLM call, a tool-using agent, or a nested multi-agent pipeline. Functions become tools. Agents become tools. Sessions become tools. The system grows by composition, not by rewriting the architecture.

The goal is not abstraction for its own sake. The goal is to keep control as complexity increases.

LazyBridge stays provider-agnostic and treats memory, session tracking, and structured output as first-class primitives.

The API remains readable by both human developers and AI coding assistants.
Start simple. Scale into real orchestration. The grammar of the code never changes.


How agents compose

Example: an orchestrator dispatches a task to a composed pipeline. Three research agents run in parallel (vertical, fan-out), then the merged output flows through a writer and editor in sequence (horizontal chain).

Both the inner parallel session and the outer chain session expose .as_tool() — the orchestrator calls the whole pipeline as a single function and gets a typed result back.

LazyBridge pipeline diagram: an orchestrator (LazyAgent, .loop(task, tools=[pipeline])) invokes pipeline.run({'task': '...'}) on a chain session (LazySession, mode="chain"). Inside it, a parallel sub-session runs three research agents (tech and market on anthropic, opinion on openai, each with web_search), then the merged output flows through a writer (anthropic) and an editor (openai, output=BlogPost), returning a typed BlogPost. The inner session is wrapped via inner_sess.as_tool(mode="parallel") → LazyTool; the outer via outer_sess.as_tool("pipeline", mode="chain"), so the entire diagram is one callable LazyTool.
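Conceptually, the fan-out-then-chain flow above looks like this in plain Python. Stub functions stand in for the real agents; this is an illustration of the shape, not LazyBridge code:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub "agents": plain callables standing in for LLM-backed agents
def tech(task):    return f"tech notes on {task}"
def market(task):  return f"market notes on {task}"
def opinion(task): return f"opinion notes on {task}"
def writer(notes): return "DRAFT: " + " | ".join(notes)
def editor(draft): return draft.replace("DRAFT", "FINAL")

def pipeline(task):
    # Fan-out: the three researchers run in parallel on the same task
    with ThreadPoolExecutor() as pool:
        notes = list(pool.map(lambda fn: fn(task), [tech, market, opinion]))
    # Chain: merged notes flow through writer, then editor, in sequence
    return editor(writer(notes))

result = pipeline("open-source LLMs")
```

From the outside, `pipeline` is just one callable, which is exactly the property `.as_tool()` exploits.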

01

See it in action

One API. Any scale. Five minutes to your first pipeline.

The simplest possible start. Pick a provider, call .text() — you get back a string. No SDK setup, no message arrays, no parsing. One line.
01 — single call
from lazybridge import LazyAgent

# "anthropic" · "openai" · "google" · "deepseek" — same code, swap one string
ai = LazyAgent("anthropic")

# .text() returns a plain string — no response object to unwrap
result = ai.text("Summarize the state of open-source LLMs")
Turn any Python function into a tool — automatically. LazyBridge reads the type hints and docstring to build the JSON schema. .loop() keeps calling tools until the model signals it's done. No manual dispatch loop.
02 — tool loop
from lazybridge import LazyAgent, LazyTool

# A plain Python function — no decorator, no boilerplate
def search(query: str, max_results: int = 5) -> str:
    """Search the web and return results."""
    ...

# Type hints + docstring → JSON schema, auto-generated
search_tool = LazyTool.from_function(search)

ai = LazyAgent("anthropic")
# .loop() runs the tool-call cycle until the model stops calling tools
result = ai.loop("Find recent AI papers and summarize them", tools=[search_tool])
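For contrast, here is roughly the manual dispatch loop that .loop() replaces. A scripted stub stands in for the model; this is an illustration, not LazyBridge internals:

```python
# Scripted stub model: first turn requests a tool call, second turn finishes.
responses = iter([
    {"tool_call": {"name": "search", "args": {"query": "AI papers"}}},
    {"text": "Summary of recent AI papers."},
])

def call_model(messages):
    return next(responses)

def search(query, max_results=5):
    return f"results for {query!r}"

tools = {"search": search}
messages = [{"role": "user", "content": "Find recent AI papers and summarize them"}]

# Manual loop: call the model, run the requested tool, feed the result back,
# and repeat until the model answers with plain text instead of a tool call.
while True:
    reply = call_model(messages)
    if "tool_call" not in reply:
        final = reply["text"]
        break
    call = reply["tool_call"]
    messages.append({"role": "tool", "content": tools[call["name"]](**call["args"])})
```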
Agents calling agents. Any LazyAgent becomes a tool via .as_tool(). The orchestrator decides when to invoke each one — and they can run on different providers. No glue code needed.
03 — agent as tool
from lazybridge import LazyAgent

# Two specialist agents — different providers, same interface
researcher = LazyAgent("anthropic", name="researcher")
analyst    = LazyAgent("openai",    name="analyst")

orchestrator = LazyAgent("anthropic")
# .as_tool() wraps each agent — the orchestrator picks who to call and when
result = orchestrator.loop(
    "Research and analyse the EV market",
    tools=[researcher.as_tool(), analyst.as_tool()]
)
A whole pipeline in one callable. LazySession groups agents together. .as_tool() wraps the entire session — parallel, chain, or mixed — into a single tool that any outer orchestrator can call like a function.
04 — pipeline as tool
from lazybridge import LazyAgent, LazySession

sess = LazySession()

# Inner parallel session with two research agents (run side by side)
inner_sess = LazySession()
LazyAgent("anthropic", name="tech",   session=inner_sess)
LazyAgent("openai",    name="market", session=inner_sess)

# Inner parallel session + one chained editor → wrapped as a single tool
pipeline = sess.as_tool(
    "research_and_edit",
    "Research a topic in parallel, then edit into a report",
    mode="chain",         # steps run in sequence
    participants=[
        inner_sess.as_tool("research", "...", mode="parallel"),  # runs in parallel
        LazyAgent("anthropic", name="editor", session=sess),
    ]
)

# From the outside: just a function call — hides all internal complexity
result = pipeline.run({"task": "Analyse open-source LLM trends"})
Full multi-agent pipeline — ~40 lines. Three parallel searchers (mixed providers) feed into a writer, then a typed-output editor. The inner session becomes a tool. The outer session chains everything. The grammar never changes.
05 — nested pipelines
from lazybridge import LazyAgent, LazySession
# BlogPost below is a Pydantic model (title, body, tags)

# Step 1 — three agents search in parallel (mixed providers)
research_sess = LazySession()
LazyAgent("anthropic", name="tech",    session=research_sess, native_tools=["web_search"])
LazyAgent("anthropic", name="market",  session=research_sess, native_tools=["web_search"])
LazyAgent("openai",    name="opinion", session=research_sess, native_tools=["web_search"])
# Wrap the whole parallel session as a single reusable tool
research_tool = research_sess.as_tool("research", "...", mode="parallel")

# Step 2 — chain: research_tool → writer → editor (typed output)
outer_sess = LazySession()
pipeline = outer_sess.as_tool(
    "full_pipeline", "Parallel research → write → edit", mode="chain",
    participants=[
        research_tool,                                                         # inner session
        LazyAgent("anthropic", name="writer", session=outer_sess),
        LazyAgent("openai",    name="editor", session=outer_sess, output_schema=BlogPost),
    ]
)

# result is a typed BlogPost — structured output, web search, parallel + chain, ~40 lines
post = pipeline.run({"task": "Open-source LLMs in 2025"})

02

Build fast.
Scale cleanly.
Stay flexible.

With LazyBridge you go from a single agent to a multi-agent pipeline without changing your mental model. One grammar at every scale.

0

Zero Boilerplate

Less infrastructure code, more logic. Type hints and docstrings become tool schemas automatically.

Provider Agnostic

Same pattern on OpenAI, Anthropic, Google, DeepSeek. Change one string to switch providers.

Memory · Tracking · Structured Output

Stateful conversations with Memory. Session event logs with LazySession. Typed outputs with Pydantic. All built-in, all composable.

"From single-agent to multi-agent orchestration in the same mental model."

provider-agnostic · session tracking
from lazybridge import LazyAgent, LazySession
from lazybridge import Event

sess = LazySession(db="pipeline.db", tracking="basic")

# Same code. Different provider. One string to change.
agent = LazyAgent("anthropic", session=sess)
# or: "openai" | "google" | "deepseek"

result = agent.text("Summarize the state of open-source LLMs")

# Built-in event log — no external stack needed
calls = sess.events.get(event_type=Event.TOOL_CALL)

03

Everything
is a tool.

The abstraction is concrete: a Python function becomes a tool with LazyTool.from_function(...), an agent becomes a tool with agent.as_tool(), and a session can be packaged as one callable unit.

Once everything shares the same interface, the outer orchestrator can call a function-backed tool, an agent-backed tool, or a pipeline-backed tool in exactly the same loop.

  • Wrap existing Python functions directly
  • Promote agents into reusable specialists
  • Collapse pipelines into one callable surface
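The shared interface can be modelled as a simple protocol. A conceptual sketch in plain Python; these classes are illustrative, not LazyBridge's internal types:

```python
from typing import Protocol

class Tool(Protocol):
    """The shared shape: a name, plus a call that takes kwargs and returns text."""
    name: str
    def __call__(self, **kwargs) -> str: ...

class FunctionTool:
    """A plain function behind the shared interface."""
    def __init__(self, fn):
        self.name, self._fn = fn.__name__, fn
    def __call__(self, **kwargs):
        return self._fn(**kwargs)

class PipelineTool:
    """A sequence of steps behind the same interface."""
    def __init__(self, name, steps):
        self.name, self._steps = name, steps
    def __call__(self, **kwargs):
        out = kwargs["task"]
        for step in self._steps:
            out = step(out)
        return out

def orchestrate(task, tools):
    # One loop, no type checks: every tool is called the same way
    return {t.name: t(task=task) for t in tools}

def shout(task):
    return task.upper()

results = orchestrate("hello world", [
    FunctionTool(shout),
    PipelineTool("title_chain", [str.strip, str.title]),
])
```

Because both wrappers satisfy the same protocol, the orchestrator never needs to know whether a tool is a function or a whole pipeline.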
fn

Function → tool

Type hints become JSON schema. Docstring becomes description. Zero boilerplate.

ag

Agent → tool

agent.as_tool() wraps any agent. An orchestrator calls it by name.

ss

Session → tool

sess.as_tool(mode="parallel") or mode="chain" — N agents become one callable.

pp

Pipeline → tool

An entire nested pipeline — parallel research + chain editorial — is one tool. An outer orchestrator calls it without knowing anything inside.

Composable
Nestable
Typed
Multi-provider
LazyTool.from_function — type hints become schema, docstring becomes description
from lazybridge import LazyAgent, LazyTool

def search_web(query: str, max_results: int = 5) -> str:
    """Search the web. Returns top results as text."""
    ...

# Type hints → JSON schema. Docstring → description. Zero config.
search_tool = LazyTool.from_function(search_web)

agent = LazyAgent("anthropic")
result = agent.loop("What happened in AI this week?", tools=[search_tool])
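Under the hood, deriving a schema from hints and a docstring can be done with stdlib introspection alone. A rough sketch of the idea, not LazyBridge's actual implementation:

```python
import inspect
import typing

# Minimal hint → JSON type mapping (a real implementation covers more cases)
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def schema_from_function(fn):
    sig = inspect.signature(fn)
    hints = typing.get_type_hints(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": _JSON_TYPES.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)       # no default → required parameter
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": props, "required": required},
    }

def search_web(query: str, max_results: int = 5) -> str:
    """Search the web. Returns top results as text."""
    ...

schema = schema_from_function(search_web)
```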
agent.as_tool() — any agent becomes a callable specialist
from lazybridge import LazyAgent

analyst = LazyAgent(
    "anthropic",
    name="analyst",
    system="You are a data analyst. Be concise and quantitative.",
)

# Any agent becomes a named, callable tool for an orchestrator
analyst_tool = analyst.as_tool(
    description="Analyse data and return key insights.",
)

orchestrator = LazyAgent("anthropic")
result = orchestrator.loop("Analyse this dataset: ...", tools=[analyst_tool])
sess.as_tool(mode="chain") — a full pipeline becomes one callable
from lazybridge import LazyAgent, LazySession

inner_sess = LazySession()
pipeline_tool = inner_sess.as_tool(
    "research_and_summarise",
    "Research a topic and return a concise summary.",
    mode="chain",
    participants=[
        LazyAgent("anthropic", name="researcher", session=inner_sess),
        LazyAgent("openai",    name="summariser", session=inner_sess),
    ],
)

# researcher → summariser, wired automatically. One callable surface.
orchestrator = LazyAgent("anthropic")
result = orchestrator.loop("Cover these topics: ...", tools=[pipeline_tool])
fn + agent + pipeline — same interface, any depth, one orchestrator
from lazybridge import LazyAgent, LazySession, LazyTool

# 1. Python function → tool (search_web as defined in the first snippet)
search_tool = LazyTool.from_function(search_web)

# 2. Agent → tool (analyst as defined in the second snippet)
analyst_tool = analyst.as_tool(description="Analyse data and return insights.")

# 3. Pipeline → tool
inner_sess = LazySession()
pipeline_tool = inner_sess.as_tool(
    "research_and_summarise", "Research a topic and summarise it.",
    mode="chain",
    participants=[
        LazyAgent("anthropic", name="researcher", session=inner_sess),
        LazyAgent("openai",    name="summariser", session=inner_sess),
    ],
)

# Same interface. Any depth. One orchestrator.
orchestrator = LazyAgent("anthropic", system="You coordinate research tasks.")
result = orchestrator.loop(
    "Prepare a full report on open-source LLMs.",
    tools=[search_tool, analyst_tool, pipeline_tool],
)

04

Designed for Humans
and AI Assistants.

LazyBridge keeps code readable as orchestration grows. Compact APIs, predictable structure, less boilerplate to paste and adapt.

hu

Human developers

Keep architectural control. The framework is invisible — your system is the product.

ai

AI coding assistants

Generate more coherent, less fragile code. Consistent patterns mean fewer hallucinated APIs.

Readable
Predictable
AI-native
hu

lazy_wiki — Human Guide

developers

Comprehensive documentation written for engineers. Covers every class, method, and pattern — from a first pipeline to advanced multi-agent orchestration. Start with the Quickstart and build from there.

Read the Quickstart →
ai

lazy_wiki — AI Reference

AI-native

A structured reference index built for coding assistants and LLMs. Every class, pattern, and rule is machine-readable and unambiguous. Also available as a native Claude Code skill — install it once and Claude Code understands LazyBridge natively, without pasting docs or explaining the API.

Browse the AI reference →

05

Advanced orchestration,
without complexity debt.

  • Multi-agent sessions with shared store, event tracking, and serializable graph
  • Conditional routing with LazyRouter — send to different agents based on outcome
  • Typed structured output via output_schema=MyModel — Pydantic at any pipeline depth
  • Native provider tools: web search, code execution, file tools — no wrappers needed
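Outcome-based routing can be sketched generically in plain Python. LazyRouter's actual API may differ; the callables below are stubs standing in for agents:

```python
# Stub agents: plain callables standing in for LazyAgent specialists
escalate = lambda text: "escalated: " + text
archive  = lambda text: "archived: " + text
triage   = lambda text: "triaged: " + text

def route(outcome, routes, default):
    """Pick the next agent by matching route keywords against the outcome."""
    for keyword, agent in routes.items():
        if keyword in outcome.lower():
            return agent
    return default

classifier_output = "This report contains an URGENT regression."
handler = route(classifier_output, {"urgent": escalate, "resolved": archive}, triage)
result = handler(classifier_output)
```

The router returns an agent, and the pipeline simply calls whatever comes back, the same "everything is a tool" contract as elsewhere.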

Topology

research_sess (mode="parallel"): tech_scout [WEB] · market_scout [WEB] · opinion_scout [WEB], wrapped as research_tool
research_tool → chain → writer → editor → BlogPost (typed)

Parallel · Chain · Typed output · 2 providers · Web search
from pydantic import BaseModel
from lazybridge import LazyAgent, LazySession
from lazybridge.core.types import NativeTool

class BlogPost(BaseModel):
    title: str
    body: str
    tags: list[str]

# Layer 1 — parallel research
rs = LazySession(tracking="basic", console=True)
LazyAgent("anthropic", name="tech",    session=rs,
          native_tools=[NativeTool.WEB_SEARCH])
LazyAgent("anthropic", name="market",  session=rs,
          native_tools=[NativeTool.WEB_SEARCH])
LazyAgent("openai",    name="opinion", session=rs,
          native_tools=[NativeTool.WEB_SEARCH])
research_tool = rs.as_tool("research", "...", mode="parallel")

# Layer 2 — chain editorial
outer = LazySession(tracking="basic", console=True)  # "outer": avoids shadowing the os module
pipeline = outer.as_tool(
    "blog_pipeline", "Research → write → edit", mode="chain",
    participants=[
        research_tool,
        LazyAgent("anthropic", name="writer", session=outer),
        LazyAgent("openai",    name="editor", session=outer,
                  output_schema=BlogPost),
    ]
)

post = pipeline.run({"task": "Open-source LLMs in 2025"})
# post is a BlogPost instance. Typed. Done.

Framework

The mechanics

Every primitive explained. Every pattern demonstrated. The architecture behind the claim.

Get Started →

Toolbox

Reusable blocks

Tools, pipelines, and templates. Download, study, modify, nest. Everything is a tool.

Browse toolbox →

Install

Get started

One package. Set your API key. The rest is composition.

$ pip install lazybridge

GitHub →