How to Use AI with PHP — APIs, Libraries, Tools, and Best Practices

PHP powers a massive chunk of the web — WordPress, Laravel, Symfony, Drupal — and AI is moving fast. The good news is the PHP ecosystem has caught up more than most people realise. Whether you want to call OpenAI or Claude from a Laravel app, add AI to a legacy codebase, or speed up your own PHP development with AI tools, here’s a practical rundown of what’s available and how to actually use it.

Calling AI APIs Directly from PHP

The simplest way to get started is hitting an AI provider’s API directly. All the major providers have REST APIs you can call with curl or Guzzle — no special library needed.

OpenAI API

Send a POST request to https://api.openai.com/v1/chat/completions with your API key and a JSON body. That’s it. You get GPT-4 and GPT-4o for text, DALL-E for images, and Whisper for speech-to-text. If you want something more structured than raw curl, the community-maintained openai-php/client handles that cleanly.
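
To make the pattern concrete, here is a minimal raw-curl sketch of that call. The endpoint, headers, and response shape follow OpenAI's chat completions API; the model name and prompt are illustrative, and the key is read from the environment.

```php
<?php
// Minimal raw-curl call to the OpenAI chat completions endpoint.
// Model name and prompt are illustrative; key comes from the environment.

$payload = [
    'model'    => 'gpt-4o',
    'messages' => [
        ['role' => 'user', 'content' => 'Summarise this article in one sentence.'],
    ],
];

$apiKey = getenv('OPENAI_API_KEY');
if ($apiKey !== false) {
    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_POSTFIELDS     => json_encode($payload),
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    // The generated text lives in choices[0].message.content.
    $data = json_decode((string) $response, true);
    echo $data['choices'][0]['message']['content'] ?? 'No content returned';
}
```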

Anthropic (Claude) API

Same pattern — POST to https://api.anthropic.com/v1/messages with your API key in the x-api-key header and an anthropic-version header. Claude is particularly strong for longer reasoning tasks, structured outputs, and anything where accuracy matters more than raw speed.
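
The same sketch against Anthropic's Messages API looks like this. Note the different headers and that max_tokens is required; the model name is illustrative.

```php
<?php
// Raw-curl call to Anthropic's Messages API. Authentication uses the
// x-api-key header plus an anthropic-version header rather than a
// Bearer token. Model name is illustrative.

$payload = [
    'model'      => 'claude-3-5-sonnet-20241022',
    'max_tokens' => 1024, // required by this API
    'messages'   => [
        ['role' => 'user', 'content' => 'Extract the key dates from this text.'],
    ],
];

$apiKey = getenv('ANTHROPIC_API_KEY');
if ($apiKey !== false) {
    $ch = curl_init('https://api.anthropic.com/v1/messages');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST           => true,
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'x-api-key: ' . $apiKey,
            'anthropic-version: 2023-06-01',
        ],
        CURLOPT_POSTFIELDS     => json_encode($payload),
    ]);
    $response = curl_exec($ch);
    curl_close($ch);

    // The reply text lives in content[0].text.
    $data = json_decode((string) $response, true);
    echo $data['content'][0]['text'] ?? 'No content returned';
}
```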

Google Gemini API

Available via REST or the Google AI PHP client. Good option if you’re already in the Google ecosystem or want multimodal capabilities — text, images, and code in the same model.

PHP Libraries Worth Using

openai-php/client

The most widely used OpenAI client for PHP. Clean API, actively maintained, works with Laravel and Symfony out of the box. Install via Composer:

composer require openai-php/client
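
A short usage sketch with this client (model name and prompt are illustrative; it assumes Composer's autoloader and an API key in the environment):

```php
<?php
require 'vendor/autoload.php';

// Create a client from an API key and send a chat completion request.
$client = OpenAI::client(getenv('OPENAI_API_KEY'));

$result = $client->chat()->create([
    'model'    => 'gpt-4o-mini',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello from PHP!'],
    ],
]);

// Responses come back as typed objects rather than raw JSON.
echo $result->choices[0]->message->content;
```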

Claude PHP SDK

An unofficial but comprehensive PHP SDK for Anthropic’s Claude. Tracks the official API closely, ships with 85+ examples, and covers streaming, tool use, batches, and agentic workflows. Requires PHP 8.1+.

composer require claude-php/claude-php-sdk

orhanerday/open-ai

A community-maintained OpenAI wrapper with over 3 million installs on Packagist. Works with PHP 7.4+, which makes it useful for older codebases where you can’t upgrade to 8.x yet. Good streaming support.

composer require orhanerday/open-ai
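
A hedged sketch of typical usage with this wrapper. Unlike openai-php/client, it returns the raw JSON response as a string, which you decode yourself; model name and prompt are illustrative.

```php
<?php
require 'vendor/autoload.php';

use Orhanerday\OpenAi\OpenAi;

$openAi = new OpenAi(getenv('OPENAI_API_KEY'));

// chat() returns the raw JSON body as a string.
$raw = $openAi->chat([
    'model'    => 'gpt-4o-mini',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello from PHP 7.4!'],
    ],
]);

$data = json_decode($raw, true);
echo $data['choices'][0]['message']['content'] ?? 'No content returned';
```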

openai-php/laravel

The Laravel-specific wrapper around the OpenAI client. Adds a service provider, a facade, and a config file. If you’re on Laravel, this is the path of least resistance.

composer require openai-php/laravel
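
In a Laravel app that might look like the sketch below, using the package's facade inside a route. The route name, model, and prompt are illustrative; the package's config expects your key in the environment.

```php
<?php
// routes/web.php (illustrative). The facade resolves the configured
// client, so no manual setup is needed in the route itself.

use Illuminate\Support\Facades\Route;
use OpenAI\Laravel\Facades\OpenAI;

Route::get('/summarise', function () {
    $result = OpenAI::chat()->create([
        'model'    => 'gpt-4o-mini',
        'messages' => [
            ['role' => 'user', 'content' => 'Summarise the latest post.'],
        ],
    ]);

    return $result->choices[0]->message->content;
});
```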

elliotjreed/php-ai

A unified library that gives you the same interface for both Claude and OpenAI. Useful if you want to switch providers without changing your application code. PHP 8.3+ required.

composer require elliotjreed/php-ai

alle-ai/anthropic-api-php

A straightforward Anthropic integration for PHP projects that don’t need the full SDK. Simple to configure and good for getting Claude into a project quickly.

composer require alle-ai/anthropic-api-php

AI Tools for PHP Development

Cursor

Cursor is a VS Code fork with AI built in — codebase-aware chat, inline editing, agent mode. Works just as well for PHP as it does for JavaScript. If you’re on a Laravel or Symfony project and want to move faster, this is the tool most PHP developers are reaching for right now. Set up a .cursorrules file with your project’s conventions and it gets much more useful.

PhpStorm with AI

PhpStorm 2025.3 ships with native Claude Agent integration alongside its existing Junie coding agent. You can bring your own OpenAI or Anthropic API key (BYOK). If PhpStorm is already your daily driver, the AI features are well integrated and feel natural rather than bolted on.

GitHub Copilot

Works in VS Code, PhpStorm, and most major editors. Good for autocomplete and quick generations. Less context-aware than Cursor for larger codebases, but a solid choice if you want something that just works with minimal setup.

The MCP PHP SDK — Worth Knowing About

Anthropic and the Symfony team released an official PHP SDK for MCP (Model Context Protocol) in 2025. This lets PHP applications act as MCP servers — exposing tools, resources, and data to Claude and other AI clients in a standardised way.

Practically, this means you can build Laravel or Symfony services that AI agents can call directly — database queries, email sending, CMS operations — without building custom integrations for each use case. It installs via Composer and has built-in adapters for Laravel and Symfony. If you’re building anything that needs AI agents to interact with PHP backend logic, this is the clean way to do it.

Best Practices

Store API keys in environment variables

Never hardcode API keys in your PHP files. Use .env and a library like vlucas/phpdotenv to load them. In Laravel this is already built in. Rotate them regularly and keep them out of version control.
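
Outside Laravel, loading keys with vlucas/phpdotenv looks like this sketch (assumes a .env file in the project root and Composer's autoloader):

```php
<?php
require 'vendor/autoload.php';

// Load variables from .env in the project root into $_ENV.
// safeLoad() doesn't throw if the file is missing, which keeps
// production environments (where vars are set directly) working.
$dotenv = Dotenv\Dotenv::createImmutable(__DIR__);
$dotenv->safeLoad();

// Read the key from the environment, never from a hardcoded string.
$apiKey = $_ENV['OPENAI_API_KEY'] ?? getenv('OPENAI_API_KEY');

if (!$apiKey) {
    throw new RuntimeException('OPENAI_API_KEY is not set');
}
```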

Handle errors and timeouts properly

AI API calls can be slow, especially for longer responses. Set sensible timeouts on your curl or Guzzle requests, handle rate limit errors (HTTP 429) with exponential backoff, and always wrap API calls in try-catch. Assuming the API will always respond quickly and successfully will come back to bite you in production.
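
The backoff part can be sketched as a small wrapper, independent of which HTTP client you use. Here the callable is assumed to throw a RuntimeException with code 429 on a rate limit; on Guzzle you'd additionally pass something like 'timeout' => 30 when building the client. The delay parameter exists so the logic is testable without real one-second sleeps.

```php
<?php
// Retry a callable with exponential backoff on rate-limit errors.
// Any other exception, or running out of attempts, is rethrown.

function callWithBackoff(callable $call, int $maxAttempts = 4, int $baseDelayMs = 1000): mixed
{
    for ($attempt = 1; ; $attempt++) {
        try {
            return $call();
        } catch (RuntimeException $e) {
            if ($e->getCode() !== 429 || $attempt >= $maxAttempts) {
                throw $e; // not a rate limit, or out of retries
            }
            // Wait 1x, 2x, 4x... the base delay between attempts.
            usleep($baseDelayMs * 1000 * (2 ** ($attempt - 1)));
        }
    }
}
```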

Use streaming for long responses

If you’re generating long content or building a chat interface, streaming responses give users something to read immediately rather than waiting for the full response. Both OpenAI and Anthropic support server-sent events for streaming. The orhanerday/open-ai package handles this cleanly on the PHP side.
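
With raw curl, streaming means handling each chunk as it arrives via a write callback instead of buffering the whole response. The sketch below follows OpenAI's SSE format ("data: {...}" lines, terminated by "data: [DONE]"); a production parser should also buffer partial lines, since a chunk can end mid-line. Model name and prompt are illustrative.

```php
<?php
// Stream a chat completion and print tokens as they arrive (PHP 8+).

$payload = [
    'model'    => 'gpt-4o-mini',
    'stream'   => true, // ask the API for server-sent events
    'messages' => [['role' => 'user', 'content' => 'Tell me a short story.']],
];

$apiKey = getenv('OPENAI_API_KEY');
if ($apiKey !== false) {
    $ch = curl_init('https://api.openai.com/v1/chat/completions');
    curl_setopt_array($ch, [
        CURLOPT_POST       => true,
        CURLOPT_HTTPHEADER => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiKey,
        ],
        CURLOPT_POSTFIELDS => json_encode($payload),
        // Called for every chunk as it arrives, instead of buffering.
        CURLOPT_WRITEFUNCTION => function ($ch, $chunk) {
            foreach (explode("\n", $chunk) as $line) {
                if (str_starts_with($line, 'data: ') && $line !== 'data: [DONE]') {
                    $json = json_decode(substr($line, 6), true);
                    echo $json['choices'][0]['delta']['content'] ?? '';
                    flush(); // push the token to the browser immediately
                }
            }
            return strlen($chunk); // tell curl the chunk was consumed
        },
    ]);
    curl_exec($ch);
    curl_close($ch);
}
```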

Cache where it makes sense

AI API calls cost money and take time. If you’re generating content that won’t change — a summary, a translation, a set of tags for an article — cache the result rather than regenerating it on every request. Laravel’s cache layer or a simple Redis setup works well for this.
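
The idea can be sketched framework-free: key the cache on a hash of the input so identical inputs never trigger a second API call. This uses a simple file cache for illustration; in Laravel you'd reach for Cache::remember() instead, and the generator callable stands in for whatever API call you make.

```php
<?php
// Return a cached result for $text if one exists; otherwise call
// $generate (your AI API call) once and store the result on disk.

function cachedSummary(string $text, callable $generate, string $cacheDir = '/tmp/ai-cache'): string
{
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0700, true);
    }
    $file = $cacheDir . '/' . hash('sha256', $text) . '.txt';

    if (is_file($file)) {
        return (string) file_get_contents($file); // cache hit: no API call
    }

    $summary = $generate($text); // cache miss: call the API once
    file_put_contents($file, $summary);
    return $summary;
}
```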

Validate and sanitise AI output before using it

AI can return unexpected formats, hallucinated data, or outputs that don’t match your expected structure. If you’re expecting JSON back, validate it before you try to use it. If you’re inserting AI-generated content into your database or displaying it to users, treat it the same way you’d treat any external input — don’t trust it blindly.
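
A sketch of that defensive posture for a model that's supposed to return a JSON object with a "tags" array (the field name is illustrative): parse strictly, check the shape, and escape before output.

```php
<?php
// Validate a model's "JSON" reply before using it: strict parse,
// shape check, then escape each value as untrusted input.

function parseTags(string $raw): array
{
    try {
        $data = json_decode($raw, true, 512, JSON_THROW_ON_ERROR);
    } catch (JsonException $e) {
        throw new InvalidArgumentException('Model did not return valid JSON: ' . $e->getMessage());
    }

    if (!isset($data['tags']) || !is_array($data['tags'])) {
        throw new InvalidArgumentException('Missing expected "tags" array');
    }

    // Escape every tag before it goes anywhere near HTML output.
    return array_map(
        fn ($tag) => htmlspecialchars((string) $tag, ENT_QUOTES),
        $data['tags']
    );
}
```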

Be mindful of what you send to external APIs

User data, personal information, and proprietary content all go to the provider’s servers when you make API calls. Check the terms and privacy policies of the providers you use, especially if you’re building for regulated industries or handling sensitive data. For cases where data can’t leave your infrastructure, look into self-hosted models via Ollama — several PHP libraries support local models too.
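
The same raw-HTTP pattern works against a local Ollama server, so nothing leaves your machine. This sketch assumes Ollama is running on its default port with a model already pulled; the model name is illustrative.

```php
<?php
// Call a locally hosted model via Ollama's generate endpoint.

$payload = [
    'model'  => 'llama3.1',
    'prompt' => 'Classify this support ticket as bug, billing, or other.',
    'stream' => false, // one JSON response instead of streamed chunks
];

$ch = curl_init('http://localhost:11434/api/generate');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => json_encode($payload),
]);
$response = curl_exec($ch);
curl_close($ch);

if ($response !== false) {
    $data = json_decode($response, true);
    echo $data['response'] ?? 'No response field';
}
```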

Track token usage and costs

AI API costs add up fast if you’re not watching them. Log input and output tokens per request, set up budget alerts in your provider dashboard, and think carefully about which model you actually need for each task. GPT-4o mini or Claude Haiku is often perfectly good for straightforward tasks at a fraction of the cost of the bigger models.
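
A tiny helper makes the arithmetic concrete: most providers report usage per request (OpenAI-style responses have a usage block with prompt_tokens and completion_tokens) and price per million tokens. The rates below are placeholders, not real prices; check your provider's pricing page.

```php
<?php
// Estimate the dollar cost of one API call from its usage block,
// given per-million-token input and output rates.

function estimateCostUsd(array $usage, float $inputPerMTok, float $outputPerMTok): float
{
    $input  = $usage['prompt_tokens'] ?? 0;
    $output = $usage['completion_tokens'] ?? 0;
    return ($input / 1_000_000) * $inputPerMTok
         + ($output / 1_000_000) * $outputPerMTok;
}

// e.g. from the 'usage' block of an OpenAI-style response,
// with hypothetical rates of $0.15 / $0.60 per 1M tokens:
$usage = ['prompt_tokens' => 1200, 'completion_tokens' => 300];
$cost  = estimateCostUsd($usage, 0.15, 0.60);
error_log(sprintf('AI call: %d in / %d out tokens, ~$%.6f',
    $usage['prompt_tokens'], $usage['completion_tokens'], $cost));
```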

Where to Go From Here