← Back to Blog

Introducing PapiAI

The best standalone AI agent library in PHP.

Today we're releasing PapiAI — an open-source PHP library for building AI agents that are production-ready, type-safe, and framework-agnostic. After months of development and real-world testing, we believe PapiAI is the most complete and thoughtfully designed AI agent library available for PHP developers.

Why another AI library?

PHP is one of the most popular languages in the world, and since 8.0 its type system has matured dramatically. Developers build serious applications with Laravel and Symfony every day — yet when it comes to AI, the tooling has lagged behind Python and JavaScript. The existing options either tie you to a specific framework, wrap a single provider's API, or offer thin HTTP clients that leave you to build the agent runtime yourself.

We wanted something different. We wanted a library where you could swap Anthropic for Google Gemini without changing your agent code. Where tool calling, structured output, and streaming worked identically across every provider. Where you could start with a standalone script and later integrate into Laravel or Symfony without a rewrite.

PapiAI is that library.

Design principles

Interface-first architecture

At the heart of PapiAI is a set of contracts that define what an AI provider can do. The ProviderInterface is the core LLM contract — chat(), stream(), and capability checks. Additional interfaces cover embeddings, image generation, text-to-speech, and transcription. Providers implement only the interfaces they support, and your code depends on abstractions, not concrete classes.
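To make the idea concrete, here is a minimal sketch of what an interface-first design like this looks like. The names `ProviderInterface`, `chat()`, and `stream()` come from the description above; the parameter shapes, the `supports()` method signature, and the stub provider are illustrative assumptions, not PapiAI's actual API.

```php
<?php
// Illustrative sketch of an interface-first provider contract.
// Method names chat()/stream() are from the post; signatures are assumed.

interface ProviderInterface
{
    /** Send a full conversation and get a single completed response. */
    public function chat(array $messages, array $options = []): string;

    /** Stream the response chunk by chunk via a generator. */
    public function stream(array $messages, array $options = []): \Generator;

    /** Capability check, e.g. supports('tools') or supports('vision'). */
    public function supports(string $capability): bool;
}

// A stub provider standing in for a real one (Anthropic, Gemini, ...).
final class EchoProvider implements ProviderInterface
{
    public function chat(array $messages, array $options = []): string
    {
        return 'echo: ' . end($messages)['content'];
    }

    public function stream(array $messages, array $options = []): \Generator
    {
        foreach (str_split($this->chat($messages), 4) as $chunk) {
            yield $chunk;
        }
    }

    public function supports(string $capability): bool
    {
        return in_array($capability, ['chat', 'streaming'], true);
    }
}

// Caller code depends only on the abstraction, so any provider slots in.
function ask(ProviderInterface $provider, string $prompt): string
{
    return $provider->chat([['role' => 'user', 'content' => $prompt]]);
}

echo ask(new EchoProvider(), 'hello'), PHP_EOL; // echo: hello
```

Because `ask()` is typed against the interface rather than a concrete class, swapping `EchoProvider` for any other implementation requires no change to the caller.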

This means you can write an agent that uses tools, streams responses, and validates structured output — then swap the underlying model by changing a single constructor argument. Your business logic never touches a provider-specific API.

Zero runtime dependencies in core

The core package requires nothing beyond PHP 8.2 itself. No Guzzle, no PSR-18 client, no HTTP abstraction layer. Provider packages use ext-curl directly for HTTP, keeping the dependency tree minimal and avoiding version conflicts in projects that already have their own HTTP stack.

For middleware that integrates with PSR standards, the core suggests psr/log and psr/simple-cache but never requires them. If you don't use LoggingMiddleware or CacheMiddleware, you don't install their dependencies.
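In practice that means the opt-in happens in your own `composer.json`, along these lines. The `papi-ai/papi-core` package name appears later in this post; the version constraints here are illustrative, not pinned recommendations.

```json
{
    "require": {
        "papi-ai/papi-core": "^0.9",
        "psr/log": "^3.0",
        "psr/simple-cache": "^3.0"
    }
}
```

Skip the two `psr/*` lines and the core installs with no third-party packages at all.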

The agentic loop

PapiAI implements a proper agentic loop: the agent sends a prompt to the LLM, receives a response, checks for tool calls, executes them, feeds results back, and repeats until the task is complete or maxTurns is reached. This loop runs transparently inside $agent->run() — you send a prompt and get a final answer, even if the agent made ten tool calls behind the scenes.

$agent = Agent::build()
    ->provider(new AnthropicProvider(apiKey: $_ENV['ANTHROPIC_API_KEY']))
    ->model('claude-sonnet-4-20250514')
    ->instructions('You are a research assistant.')
    ->tools([$searchTool, $summarizeTool, $saveTool])
    ->maxTurns(15)
    ->create();

// The agent will search, summarize, and save — autonomously
$response = $agent->run('Research PHP 8.4 features and save a summary');
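Stripped of provider details, the loop inside `run()` reduces to something like the sketch below. This is not PapiAI's internal code — every name here except `maxTurns` is invented for illustration — but it captures the send/check/execute/repeat cycle described above.

```php
<?php
// Minimal sketch of an agentic loop: prompt -> model -> tool call ->
// tool result -> model -> ... until a final answer or maxTurns.
// Illustrative only; not PapiAI's actual implementation.

function runAgentLoop(callable $llm, array $tools, string $prompt, int $maxTurns = 15): string
{
    $messages = [['role' => 'user', 'content' => $prompt]];

    for ($turn = 0; $turn < $maxTurns; $turn++) {
        // Each reply either requests a tool or carries the final answer.
        $reply = $llm($messages);

        if (!isset($reply['tool'])) {
            return $reply['answer']; // no tool call: the task is complete
        }

        // Execute the requested tool and feed its result back to the model.
        $result = $tools[$reply['tool']]($reply['args']);
        $messages[] = ['role' => 'assistant', 'content' => 'call ' . $reply['tool']];
        $messages[] = ['role' => 'tool', 'content' => $result];
    }

    return 'max turns reached';
}

// A scripted fake model: first requests the search tool, then answers
// once it sees a tool result in the conversation.
$fakeLlm = function (array $messages): array {
    foreach ($messages as $m) {
        if ($m['role'] === 'tool') {
            return ['answer' => 'Summary: ' . $m['content']];
        }
    }
    return ['tool' => 'search', 'args' => 'PHP 8.4'];
};

$tools = ['search' => fn (string $q): string => "results for {$q}"];

echo runAgentLoop($fakeLlm, $tools, 'Research PHP 8.4'), PHP_EOL;
// Summary: results for PHP 8.4
```

The caller only ever sees the final string, exactly as with `$agent->run()` above, even though two model turns and one tool execution happened in between.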

What ships today

10 LLM providers

Anthropic (Claude), OpenAI (GPT-4o, o1), Google (Gemini 3.x, 2.x, 1.5), Ollama (local models), Mistral, Groq (LPU inference), Grok (xAI), DeepSeek, Cohere (Command R), and Azure OpenAI. Every provider supports chat, streaming, and tool calling. Most support vision, structured output, and embeddings.

A voice service

ElevenLabs integration for text-to-speech, implementing TextToSpeechProviderInterface. OpenAI's provider also supports TTS and audio transcription.

Structured output

A Zod-inspired schema system for validating LLM responses. Define the exact shape of data you expect — objects, arrays, enums, strings with constraints — and PapiAI ensures the response matches. The schema API is fluent, composable, and works across all providers that support JSON mode.
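As a rough picture of what Zod-style validation looks like in PHP, here is a self-contained toy version. The class names and fluent methods below are invented for illustration — PapiAI's real schema API may differ — but the shape-first, constraint-chaining idea is the same.

```php
<?php
// Toy sketch of a Zod-inspired schema: define the expected shape,
// then validate decoded LLM output against it. Names are illustrative.

final class StringSchema
{
    private ?int $min = null;

    public function min(int $n): self
    {
        $this->min = $n;
        return $this; // fluent: constraints chain onto the schema
    }

    public function check(mixed $v): bool
    {
        return is_string($v) && ($this->min === null || strlen($v) >= $this->min);
    }
}

final class ObjectSchema
{
    /** @param array<string, StringSchema> $shape */
    public function __construct(private array $shape) {}

    public function check(mixed $v): bool
    {
        if (!is_array($v)) {
            return false;
        }
        foreach ($this->shape as $key => $schema) {
            if (!array_key_exists($key, $v) || !$schema->check($v[$key])) {
                return false;
            }
        }
        return true;
    }
}

// Define the exact shape you expect from the model...
$schema = new ObjectSchema([
    'title'   => (new StringSchema())->min(1),
    'summary' => (new StringSchema())->min(10),
]);

// ...then validate the decoded JSON response against it.
$response = json_decode('{"title":"PHP 8.4","summary":"Property hooks and more."}', true);
var_dump($schema->check($response)); // bool(true)
```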

Middleware pipeline

Four built-in middleware classes: RetryMiddleware (exponential backoff), RateLimitMiddleware (token bucket), LoggingMiddleware (PSR-3), and CacheMiddleware (PSR-16). The pipeline is composable — stack them in any order, or implement MiddlewareInterface to create your own.
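The composition idea can be sketched in a few lines. `MiddlewareInterface` and `RetryMiddleware` are named in this post, but the `handle()` signature, the `pipeline()` helper, and the `SimpleRetry` stand-in below are all assumptions made for illustration.

```php
<?php
// Sketch of a composable middleware pipeline. MiddlewareInterface is
// PapiAI's name; this handle() signature is an assumed simplification.

interface MiddlewareInterface
{
    public function handle(string $request, callable $next): string;
}

// A toy stand-in for a retry middleware: re-invoke the next handler
// on failure, up to a fixed number of attempts.
final class SimpleRetry implements MiddlewareInterface
{
    public function __construct(private int $attempts = 3) {}

    public function handle(string $request, callable $next): string
    {
        for ($i = 1; ; $i++) {
            try {
                return $next($request);
            } catch (\RuntimeException $e) {
                if ($i >= $this->attempts) {
                    throw $e;
                }
            }
        }
    }
}

// Fold a stack of middleware around a terminal handler, outermost first.
function pipeline(array $middleware, callable $handler): callable
{
    foreach (array_reverse($middleware) as $m) {
        $next = $handler;
        $handler = fn (string $req): string => $m->handle($req, $next);
    }
    return $handler;
}

$calls = 0;
$flaky = function (string $req) use (&$calls): string {
    if (++$calls < 3) {
        throw new \RuntimeException('transient');
    }
    return "ok: {$req}";
};

$run = pipeline([new SimpleRetry(3)], $flaky);
echo $run('chat request'), PHP_EOL; // ok: chat request
```

Stacking more middleware is just more entries in the array passed to `pipeline()`; each one wraps everything after it, which is why order matters.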

Framework bridges

Official Laravel and Symfony integrations. The Laravel bridge provides a service provider, facade, Eloquent conversation store, and queue adapter. The Symfony bridge is a full bundle with YAML configuration, dependency injection wiring, Doctrine persistence, and Messenger queue support.

What's next

PapiAI is at version 0.9. The road to 1.0 includes stabilizing the public API, expanding the test suite, adding more middleware (circuit breaker, cost tracking), and writing comprehensive documentation with real-world examples.

We're building PapiAI in the open. The code is MIT-licensed, the repos are public, and we welcome contributions. If you're a PHP developer who's been waiting for a proper AI agent library — one that respects PHP's strengths and doesn't try to be a Python port — give PapiAI a try.

composer require papi-ai/papi-core papi-ai/anthropic

That's all you need to start.

PapiAI is open source under the MIT license. Find us on GitHub and Packagist.