
Laravel's Pipeline Pattern: The Hidden Gem You're Already Using (Without Knowing It)

Coffee Chat - Episode 9

Published • 9 min read

Software engineer, passionate coder, PHP enthusiast, gamer, geek, and endlessly curious 🎓🎮🎧💻

You write middleware every single day. You rely on Eloquent's attribute casting without thinking about it. You boot your application through the HTTP kernel a hundred times a session.

You have been using the Pipeline pattern the whole time. You just did not know it had a name.

You have faced this situation before: push some data through a series of steps, validate it, transform it, enrich it, do something with the result. How did you handle it? Nested if statements? A long service class method with ten responsibilities? A chain of function calls nobody wants to maintain six months later?

There is a better way. Laravel ships it out of the box.

What Is the Pipeline Pattern?

The simplest mental model: a car factory assembly line. A bare chassis rolls in at one end. Station one: the engine goes in. Station two: doors get attached. Station three: seats. Station four: paint. A finished car rolls out the other end.

Each station does exactly one thing. It receives the object, does its job, and passes it to the next station. No station cares what happened before it. None knows what comes after. They just do their one job.

That is the Pipeline pattern. Send a payload through a series of stages (pipes), each stage transforms or processes the payload, and you get the result at the end.

Charlie Chaplin on a factory assembly line, the original pipeline in action

Laravel's Pipeline Facade

Laravel ships a Pipeline facade with a clean API for building pipelines:

use Illuminate\Support\Facades\Pipeline;

$result = Pipeline::send($payload)
    ->through([
        FirstPipe::class,
        SecondPipe::class,
        ThirdPipe::class,
    ])
    ->thenReturn();

Three methods. That is the whole API.

  • send($payload): what you are passing through the pipeline

  • through([]): the pipes the payload will travel through, in order

  • thenReturn(): execute the pipeline and return the final result

You can also use then() instead of thenReturn() if you want a final callback to run after all the pipes:

$result = Pipeline::send($payload)
    ->through([
        FirstPipe::class,
        SecondPipe::class,
    ])
    ->then(function ($processedPayload) {
        // implementation hidden
    });

The magic is in the pipes themselves.
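Before looking at real pipes, it helps to see what a pipeline actually does with them: it folds the pipes into nested closures, like layers of an onion. Here is a minimal, framework-free sketch of that mechanism, assuming a hypothetical SimplePipeline class (Laravel's real implementation lives in Illuminate\Pipeline\Pipeline and additionally resolves class names out of the container):

```php
<?php
// Hypothetical minimal pipeline, framework-free. Illustrates the
// mechanism only; it is NOT Laravel's actual Pipeline class.
class SimplePipeline
{
    private mixed $payload;
    private array $pipes = [];

    public function send(mixed $payload): static
    {
        $this->payload = $payload;
        return $this;
    }

    public function through(array $pipes): static
    {
        $this->pipes = $pipes;
        return $this;
    }

    public function thenReturn(): mixed
    {
        // Fold the pipes (in reverse) into one nested closure: each
        // layer calls its pipe with the payload and the next layer
        // as $next. The first pipe in the array ends up outermost.
        $core = fn (mixed $payload): mixed => $payload;

        $onion = array_reduce(
            array_reverse($this->pipes),
            fn (Closure $next, callable $pipe): Closure =>
                fn (mixed $payload): mixed => $pipe($payload, $next),
            $core
        );

        return $onion($this->payload);
    }
}

// Usage: each pipe transforms the payload and hands it to $next.
$result = (new SimplePipeline())
    ->send(' hello pipeline ')
    ->through([
        fn ($s, $next) => $next(trim($s)),
        fn ($s, $next) => $next(strtoupper($s)),
        fn ($s, $next) => $next($s . '!'),
    ])
    ->thenReturn();

echo $result; // HELLO PIPELINE!
```

Roughly thirty lines, and you have the whole idea: `array_reduce` builds one big closure out of many small ones, and calling it runs the payload through every stage in order.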

A Real-World Example: Order Processing

Imagine you are building an e-commerce checkout flow. When a customer places an order, you need to:

  1. Validate the order data

  2. Apply any coupon codes

  3. Calculate tax

  4. Charge the customer

  5. Send a receipt email

Without the Pipeline pattern, this logic collapses into a single bloated service method. With it, each step becomes its own small, focused class you can test in complete isolation.

The pipeline call:

namespace App\Http\Controllers;

use App\Http\Requests\StoreOrderRequest;
use App\Models\Order;
use App\Pipes\ValidateOrder;
use App\Pipes\ApplyCoupon;
use App\Pipes\CalculateTax;
use App\Pipes\ChargeCustomer;
use App\Pipes\SendReceipt;
use Illuminate\Http\JsonResponse;
use Illuminate\Support\Facades\Pipeline;

class CheckoutController extends Controller
{
    public function store(StoreOrderRequest $request): JsonResponse
    {
        // implementation hidden
    }
}

The pipeline execution:

$order = Pipeline::send($order)
    ->through([
        ValidateOrder::class,
        ApplyCoupon::class,
        CalculateTax::class,
        ChargeCustomer::class,
        SendReceipt::class,
    ])
    ->thenReturn();

Each pipe class looks like this:

namespace App\Pipes;

use App\Models\Order;
use Closure;

class ApplyCoupon
{
    public function handle(Order $order, Closure $next): Order
    {
        // implementation hidden
    }
}

The handle method receives two things: the current payload and a $next closure. Calling $next($order) passes the payload to the next pipe. You modify the payload before or after calling $next, depending on what you need.
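Where you place the call to $next decides whether your pipe does its work on the way in or on the way out, exactly like middleware. A small, framework-free sketch using plain closures (the array payload and trace log here are illustrative, not the Order model above):

```php
<?php
// Ordering sketch: code before $next runs on the way "in",
// code after $next runs on the way "out".
$trace = [];

$pipe = function (array $payload, Closure $next) use (&$trace): array {
    $trace[] = 'before';        // runs before the rest of the pipeline
    $result = $next($payload);  // hand off to the next stage
    $trace[] = 'after';         // runs after the rest has finished
    return $result;
};

// Stand-in for "the rest of the pipeline".
$core = function (array $payload) use (&$trace): array {
    $trace[] = 'core';
    return $payload;
};

$pipe(['total' => 100], $core);

echo implode(',', $trace); // before,core,after
```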

A tax calculation pipe:

namespace App\Pipes;

use App\Models\Order;
use Closure;

class CalculateTax
{
    public function handle(Order $order, Closure $next): Order
    {
        // implementation hidden
    }
}

A receipt pipe:

namespace App\Pipes;

use App\Models\Order;
use App\Notifications\OrderConfirmation;
use Closure;

class SendReceipt
{
    public function handle(Order $order, Closure $next): Order
    {
        // implementation hidden
    }
}

Each class has one job. Each one is independently testable. If the tax calculation logic changes tomorrow, you open CalculateTax.php, change it, and nothing else in the chain breaks.

The controller does not care how tax is calculated or how the receipt is sent. It just says "run this order through these steps." Clean separation. No tangled dependencies.

How Laravel Uses Pipeline Internally

Here is the part that stopped me cold when I first saw it.

Laravel's own HTTP middleware system is built on Pipeline. When your request comes in and passes through all those middleware classes you register in app/Http/Kernel.php, that is the Pipeline pattern running.

You can see it in Illuminate\Routing\Router:

// Inside Illuminate\Routing\Router

protected function runRouteWithinStack(Route $route, Request $request): Response
{
    // implementation hidden
}

And inside Illuminate\Foundation\Http\Kernel:

// Inside Illuminate\Foundation\Http\Kernel

protected function sendRequestThroughRouter(Request $request): Response
{
    // implementation hidden
}

Every request you have ever handled in Laravel has passed through a Pipeline instance. Your middleware classes are literally pipe classes. They receive the request, optionally modify it, and call $next($request) to pass it along.

The command bus does the same: dispatched jobs and commands travel through their middleware via a Pipeline instance before reaching their handlers.

Mind blown, you have been using Pipeline this whole time

Learning the Pipeline pattern does not just give you a new tool. It gives you a deeper understanding of how the entire framework is wired together.

Creating Reusable Pipe Classes

You can define a contract for your pipe classes to enforce the handle method signature across your application. This is optional, but a clean habit for larger codebases.

namespace App\Contracts;

use Closure;

interface PipeInterface
{
    public function handle(mixed $payload, Closure $next): mixed;
}

Then your pipe classes implement it:

namespace App\Pipes;

use App\Contracts\PipeInterface;
use App\Models\Order;
use Closure;

class ValidateOrder implements PipeInterface
{
    public function handle(mixed $payload, Closure $next): mixed
    {
        // implementation hidden
    }
}

You can also use closures directly in the through() array for quick transformations that do not warrant a full class:

$result = Pipeline::send($payload)
    ->through([
        ValidateOrder::class,
        function ($order, $next) {
            // implementation hidden
        },
        CalculateTax::class,
    ])
    ->thenReturn();

Classes give you better testability and reusability. Closures are fine for one-off transformations that will never be reused.

Conditional Pipes

Sometimes you only want to run certain pipes based on conditions known at runtime. Build your pipes array conditionally before passing it to through():

$pipes = [
    ValidateOrder::class,
    CalculateTax::class,
];

if ($order->hasCoupon()) {
    $pipes[] = ApplyCoupon::class;
}

if ($order->customer->isVip()) {
    $pipes[] = ApplyVipDiscount::class;
}

$pipes[] = ChargeCustomer::class;
$pipes[] = SendReceipt::class;

$result = Pipeline::send($order)
    ->through($pipes)
    ->thenReturn();

Readable, explicit, easy to follow. You look at this code and know exactly which pipes will run for a VIP customer with a coupon.

For more complex pipeline construction logic, a dedicated builder class keeps the controller skinny:

namespace App\Pipelines;

use App\Models\Order;

class OrderProcessingPipeline
{
    public static function pipes(Order $order): array
    {
        // implementation hidden
    }
}

Then in your controller:

$result = Pipeline::send($order)
    ->through(OrderProcessingPipeline::pipes($order))
    ->thenReturn();

Error Handling in Pipelines

If an exception is thrown inside any pipe, it bubbles up through the pipeline like a normal exception. Catch it at the call site:

try {
    $order = Pipeline::send($order)
        ->through([
            ValidateOrder::class,
            ApplyCoupon::class,
            CalculateTax::class,
            ChargeCustomer::class,
            SendReceipt::class,
        ])
        ->thenReturn();
} catch (OrderValidationException $e) {
    // implementation hidden
} catch (PaymentFailedException $e) {
    // implementation hidden
}

You can also handle exceptions inside individual pipes when you want a specific pipe to recover gracefully without stopping the whole pipeline:

namespace App\Pipes;

use App\Models\Order;
use Closure;
use Throwable;

class SendReceipt
{
    public function handle(Order $order, Closure $next): Order
    {
        try {
            // implementation hidden
        } catch (Throwable $e) {
            // implementation hidden
        }

        return $next($order);
    }
}

If sending the receipt email fails, the order still returns as processed rather than throwing and potentially confusing the customer. The failure gets logged quietly. That kind of intentional error handling decision is much cleaner when each pipe is its own class.

Do not wrap everything in try-catch blindly. Think about what a failure in each pipe actually means for the business logic. A failed payment should absolutely throw and halt the pipeline. A failed analytics event probably should not.
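That judgment call can be encoded directly in the pipe: rethrow what must halt the pipeline, catch what must not. A hedged sketch with a hypothetical SendAnalyticsEvent pipe (the array payload, the simulated failure, and the log-and-continue strategy are all illustrative):

```php
<?php
// Hypothetical pipe: a failed analytics call should not break
// checkout, so it is caught and recorded instead of rethrown.
class SendAnalyticsEvent
{
    public function handle(array $order, Closure $next): array
    {
        try {
            // Simulated failure standing in for a real HTTP call.
            throw new RuntimeException('analytics endpoint down');
        } catch (Throwable $e) {
            // In a real app: Log::warning(...). Here we just record it.
            $order['analytics_error'] = $e->getMessage();
        }

        return $next($order); // the pipeline keeps flowing either way
    }
}

$result = (new SendAnalyticsEvent())->handle(
    ['id' => 1],
    fn (array $order): array => $order // stand-in for the next pipe
);

echo $result['analytics_error']; // analytics endpoint down
```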

Pipeline vs Chain of Responsibility

You might be thinking: "this sounds like the Chain of Responsibility pattern." They are related. Here is the distinction.

Chain of Responsibility is about finding the right handler. You pass a request through a chain of handlers, and each handler either handles the request or passes it along. The chain stops when someone handles it.

Pipeline is about transformation. Every pipe in the chain processes the payload. Nothing short-circuits unless you throw or return early. The goal is to get the payload from point A to point Z through a series of transformations.

In practical Laravel terms: your middleware is closer to Chain of Responsibility in intent (a request can be short-circuited by returning a response early), but it uses the Pipeline implementation. The order processing example above is a pure pipeline because every stage must run.

Use Pipeline when every stage must contribute to the result. Use Chain of Responsibility when you are looking for the right handler and only one should act.
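The difference shows up in code as whether a stage is allowed to answer without calling $next. A framework-free sketch with hypothetical approval-limit handlers: in a Chain of Responsibility, the first handler that can act returns a result and the rest never run.

```php
<?php
// Chain of Responsibility sketch: each handler either answers or
// delegates to $next. The amounts and labels are illustrative.
$chainHandlers = [
    fn (int $amount, Closure $next) => $amount < 100 ? 'auto-approved' : $next($amount),
    fn (int $amount, Closure $next) => $amount < 1000 ? 'manager-approved' : $next($amount),
];

$runChain = function (array $handlers, int $amount): string {
    // Fallback for when no handler claims the request.
    $next = fn (int $a): string => 'rejected';
    // Wrap handlers back-to-front so the first one runs outermost.
    foreach (array_reverse($handlers) as $handler) {
        $prev = $next;
        $next = fn (int $a): string => $handler($a, $prev);
    }
    return $next($amount);
};

echo $runChain($chainHandlers, 50);   // auto-approved (second handler never ran)
echo $runChain($chainHandlers, 500);  // manager-approved
echo $runChain($chainHandlers, 5000); // rejected
```

The wiring is identical to a pipeline; the contract differs. A pipeline stage is expected to call $next and contribute to the result, while a chain handler is expected to short-circuit the moment it can answer.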

Key Takeaways

  • The Pipeline pattern sends a payload through a series of stages, each doing one focused job

  • Laravel's Pipeline facade gives you send()->through()->thenReturn() as the core API

  • Pipe classes use the handle($payload, Closure $next) signature, the same as middleware

  • Laravel's own HTTP kernel, route middleware, and job middleware are all built on Pipeline

  • Conditional pipes are handled by building the pipes array dynamically before calling through()

  • Exceptions bubble naturally out of pipelines. Design your error handling intentionally per pipe.

  • Pipeline is about transformation across all stages; Chain of Responsibility is about finding one handler

You Have Been Here Before. Now Build With It Intentionally.

The Pipeline pattern is not some obscure design pattern from a computer science textbook. It is baked into the framework you use every day. The middleware you write, the requests you handle, the models you cast: all of it flows through pipelines.

The difference now is that you can take that same mechanism and use it for your own domain logic. Order processing. Document approval workflows. Data import pipelines. Anywhere you have a multi-step process where each step deserves to be isolated, tested, and named clearly.

Take one of your fat service class methods and break it into pipes. You will be surprised how much cleaner it feels.

In the next article, I will build a more advanced pipeline with rollback support. If one pipe fails, previous pipes automatically undo what they did.

Developer celebrating after clean code finally clicks