Laravel and NodeJS messaging using Redis Pub/Sub

I was recently working on a project composed of two different parts: a web application built in PHP with Laravel, and an AWS Lambda function written in NodeJS. In the past, both applications exchanged data using a common MySQL database. Over time, this setup proved very inefficient. As the number of “messages” sent and received increased, the database struggled to handle the volume of reads and writes required to support both “applications” — the Lambda function is not an application per se, but you know what I mean, right?

The first thing we tried was changing the database schema to focus on performance, rather than on data integrity. We dropped some constraints and changed how the data was stored to achieve that. The updates soon proved insufficient.

In a second iteration, we started playing around with Redis. Being a key/value store rather than a relational database, it’s a lot faster than MySQL for this kind of workload. The first attempt with Redis involved simply moving the data we were storing in the database into a set. It seemed to work well, but after just a few tests on a staging server we realized that approach wouldn’t meet the system’s needs: when retrieving the data with the SCAN command, the order of the returned elements is not guaranteed. That was a deal breaker for us, since the business logic required us to read the data in the same order it was written.

Finally, we got to the setup we have now: both sides — the web app and the Lambda function — were updated to use Redis’ Pub/Sub implementation. Laravel supports Redis out of the box, which was a nice thing to have. For the NodeJS part, we used NodeRedis.

Subscribing to a channel

As I mentioned, Laravel already has an interface to deal with Redis. It still needs an underlying client, but most of the operations are pretty straightforward. You may refer to the Laravel docs for more info. Subscribing to a channel requires a single method call:

Redis::subscribe([ 'channel_name' ], function ($message) {
    /* Do whatever you need with the message */
});

I’m using an Artisan command to start this listener, this way:

class Subscriber extends Command
{
    protected $signature = 'redis:subscriber';

    protected $description = '...';

    public function handle()
    {
        Redis::subscribe([ 'channel_name' ], function ($message) {
            $this->processMessage($message);
        });
    }

    public function processMessage(string $message)
    {
        /* Handles the received message */
        $this->info(sprintf('Message received: %s', $message));
    }
}

Now we simply have to trigger the command to start listening to the channel.

You’ll notice that after a minute without receiving any data, an error will be thrown the next time the subscriber gets a message. That’s because the connection timed out. To fix that, we added the following settings to the config/database.php file, inside the "redis" block:

'read_write_timeout' => 0,
'persistent' => 1,
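For reference, here is a minimal sketch of how those two options might sit alongside the default connection settings in config/database.php. The host, port, and predis client values are assumptions; check them against your own config.

```php
'redis' => [

    'client' => 'predis',

    'default' => [
        'host' => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', 6379),
        'database' => 0,

        // Keep the subscriber connection from timing out.
        'read_write_timeout' => 0,
        'persistent' => 1,
    ],

],
```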

Publishing to the channel

On the NodeJS side, we need the aforementioned library. To install it:

$ npm install redis

After that, we’ll need to write our Lambda function that publishes to the channel. Since the focus is the Pub/Sub flow, I’m not using any particular logic to create the message here, just returning the attribute received with the event.

const redis = require('redis');
const client = redis.createClient();

const handler = (event, context) => {
    const message = processEvent(event);
    client.publish('channel_name', message);
    return context.done(null, { message });
};

const processEvent = (event) => {
    /* Handles the event and return the message to publish */
    return event.message;
};

exports.handler = handler;

Notice I’m not passing any properties to the createClient function. You’ll probably want to set the host or any other custom configuration you have to properly connect to the Redis instance. Check the NodeRedis docs for more info about the available properties.
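As a sketch, the connection settings for an older (v3-style) NodeRedis client might be built up like this before being handed to createClient. The host and port values here are assumptions, not from the original setup; adjust them to your environment.

```javascript
// Hypothetical connection settings; adjust to your environment.
const options = {
    host: process.env.REDIS_HOST || '127.0.0.1',
    port: Number(process.env.REDIS_PORT) || 6379,
};

// With a Redis server available, you would then create the client:
// const client = redis.createClient(options);
console.log(options);
```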

Testing all together

First, start the Artisan command. If you used the same name as in my example above, you should be able to run the following:

$ php artisan redis:subscriber

Then, you have to run your Lambda function to publish messages. You can do that after deploying the code to AWS. Or, you can run it locally with a mockup of the Lambda env. Something like this:

const http = require('http');

// This is where the Lambda function is
const lambda = require('./lambda');

const context = {
    done: (error, success) => {
        if (error) {
            console.error('FAIL:', error);
            return;
        }
        console.log('OK:', success);
    },
};

const server = http.createServer((request, response) => {
    let data = '';
    request.on('data', (chunk) => {
        data += chunk;
    });
    request.on('end', () => {
        if (data) {
            const event = JSON.parse(data);
            lambda.handler(event, context);
        }
        response.end();
    });
});

server.on('clientError', (error, socket) => {
    socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(3000);

This stub is a very basic mockup of the Lambda env. It lacks proper error handling and validation, but for the purpose of this test, it does what we need. I strongly recommend not using this code in production, though.

If you named the script above, for instance, as web.js, you should be able to run it:

$ node web.js

And then invoke the function with cURL:

$ curl -d '{"message":"Hello world!"}' http://localhost:3000

The request body (with the -d param in the command) will be parsed as JSON and sent to the Lambda function as the event. If you check the function again, you’ll notice we’re using the message attribute there.
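To trace that data flow without Redis or HTTP, you can reproduce it in isolation: the body string is parsed into the event, and the handler extracts the message attribute. A standalone sketch (the processEvent here mirrors the one from the Lambda example):

```javascript
// Standalone trace of the request body handling (no Redis, no HTTP).
const body = '{"message":"Hello world!"}';
const event = JSON.parse(body);

// Same logic as processEvent in the Lambda function.
const processEvent = (e) => e.message;

console.log('Message:', processEvent(event)); // prints: Message: Hello world!
```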

After executing that command, you should see two different outputs in your console. One from the Lambda mockup, which may look like this:

OK: { message: 'Hello world!' }

And another from the Artisan command:

Message received: Hello world!

The output will change according to the message in the request body.


In this sample code, I showed the basics of Redis Pub/Sub. You don’t necessarily need AWS Lambda to use it; I just wanted to show a “nearly” real-life use case. Sure, this is still not a real application, but I hope you got the idea.

As you may have noticed, this is a way to build what the cool kids out there call microservices. If this is all new to you, maybe this is an opportunity to give it a chance and try building your first distributed application.

Got comments or questions? Feel free to share them below.


Testing Stripe webhooks when using Laravel Cashier

I’m not sure why this is not in the docs, but if you’re using Laravel Cashier and want to test Stripe webhooks – in test, not live, mode – you have to set the following env var:


I spent some time digging around my code until I found that out.

Injecting controller actions in Laravel views

Disclaimer: Depending on the kind of logic you need, it’s also possible to use View Composers to achieve a similar result.

I’m using Laravel in this new project I’m working on. Some other PHP frameworks have a feature to use controllers as services; Symfony, for instance, has something like that. The project team thought Laravel, which is built on top of several Symfony components, would have something similar. Well, if it has, it’s not clear in the docs.

Another team member ended up with a solution I had never thought of before:

 @inject('someController', 'App\Http\Controllers\SomeController')
 {!! $someController->index() !!}

Now, we’re using Blade’s @inject directive to call controller actions from inside views. That’s useful for reusing actions as widgets, for example.

If you find that interesting and want to use it in your application, remember two things:

  1. Since you’re calling the action method directly, you have to pass all the required params. If it expects a request instance, you can do this: $someController->index(request()).
  2. Probably the method returns a view that contains HTML code. So wrap the call within {!! and !!}. Using the {{ }} regular tag will cause the code to be escaped.
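To illustrate the second point, suppose the hypothetical action returns markup like <em>Hello</em>; the two Blade syntaxes render differently:

```blade
{{ $someController->index() }}   {{-- escaped output: &lt;em&gt;Hello&lt;/em&gt; --}}
{!! $someController->index() !!} {{-- raw output: <em>Hello</em> --}}
```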

Implementing an SDK compatible with Laravel Container Service

When developing an SDK — and by SDK I mean an API implementation — you can leave the task of integrating it with their apps to your users. However, it’s a good idea to make it compatible with trendy frameworks out of the box.

In the PHP world, Laravel is becoming a very popular choice of framework. I want to share how you can make your SDK compatible with its Service Container.

The API client

For the sake of example, let’s imagine your API client class looks like this:

class Client
{
    public function __construct($username, $password)
    {
        // Set up client.
    }
}

Maybe you have a different way to set up the API configuration. Anyway, it’s a good idea to have the settings passed through a method, and not read directly from a config file or another environment-dependent source. Using a method makes it easy to pass the settings from the user’s app to the SDK. It also makes the configuration process more abstract and easier to plug in.

The config file

In the sample client above, we have to pass a username and password to create a new instance. It’s clear we need to get that from somewhere. When using the SDK directly, you may do the following:

 define('API_USER', 'username');
 define('API_PASSWORD', '***');
 $client = new Client(API_USER, API_PASSWORD);

Since our goal is to inject the API implementation into the app, it’s better to configure it the framework’s way. Laravel stores application settings in different files inside a config dir. We can create a file like those to store the API settings:


return [

    // The API user.
    'username' => 'username',

    // The API password.
    'password' => '***',
];


Even better than setting plain values in that config array is using env vars. Laravel ships with PHP dotenv, which allows each environment to have its own settings without any change in the application code. So, let’s change our config file a little bit:


return [

    // The API user.
    'username' => env('API_USER'),

    // The API password.
    'password' => env('API_PASSWORD'),
];


The vars we’re using here have very generic names. You should use more specific names to avoid conflicts with other services. Something like DUMMY_API_USER, for an API called Dummy, for example.
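For completeness, the matching entries in the application’s .env file would look something like this (the values are placeholders):

```
API_USER=username
API_PASSWORD=***
```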

The Service Provider

According to Laravel docs:

Service providers are the central place of all Laravel application bootstrapping. Your own application, as well as all of Laravel’s core services are bootstrapped via service providers.

But, what do we mean by “bootstrapped”? In general, we mean registering things, including registering service container bindings, event listeners, middleware, and even routes. Service providers are the central place to configure your application.

We have to create a service provider to tell Laravel that our API client can be injected as a dependency into application classes and methods. The service provider will also be in charge of merging the API configuration into the application configs.

It will look like this:

use Illuminate\Support\ServiceProvider;

class ApiServiceProvider extends ServiceProvider
{
    public function boot()
    {
        $this->publishes([
            __DIR__ . '/config.php' => config_path('api.php'),
        ]);
    }

    public function register()
    {
        $this->mergeConfigFrom(__DIR__ . '/config.php', 'api');

        $this->app->singleton('api.config', function ($app) {
            return $this->app['config']['api'];
        });

        $this->app->singleton(Client::class, function ($app) {
            $config = $app['api.config'];
            return new Client($config['username'], $config['password']);
        });
    }

    public function provides()
    {
        return [
            'api.config',
            Client::class,
        ];
    }
}

In the boot method, we tell Laravel which config files can be published to the application’s config dir, so users of our API can overwrite those settings.

Within the register method, the service provider binds the config and the API client instance into the service container.

To improve performance, we use the provides method to let the framework know which bindings this service provider offers. This way, it will only resolve a binding when it’s actually needed.

Using the service provider

After adding the SDK to the application, probably using Composer, you have to register its service provider. Open the app’s config/app.php file and add the service provider to the providers array:

'providers' => [
    // ...
    ApiServiceProvider::class,
],

Now you can inject the API client into the app classes, like controllers:

class UserController extends Controller
{
    public function show(Client $client)
    {
        $user = $client->getUser();
        return view('', [ 'user' => $user ]);
    }
}

To set the API username and password, you have to publish the config file:

$ php artisan vendor:publish --provider="ApiServiceProvider"

Then edit the config/api.php file if needed. This file may have another name if you changed it in the service provider (which you should do, to avoid such a generic name).

You also may want to create the env vars inside the application’s .env.example and .env files.


Making your API SDK compatible with Laravel is very simple and requires only one extra class. It’s worth adding that to reach more users and make their work easier.

P.S.: I’ve omitted some implementation details, like namespaces, in the samples above. You can find a complete functioning example on GitHub:

Laravel Logs to Sentry

If text-based logging is all you have, you should give Sentry a try. It’s an amazing way to visualize and get notified about exceptions, and it supports a bunch of languages and frameworks. You can find more on its website; here I want to share a small tip on having your Laravel logs sent to Sentry.

When you set up Sentry to work with Laravel based on their guide, only exceptions will be sent to Sentry. If you have something like this:

try {
    throw new \Exception('Foo');
} catch (\Exception $e) {
    Log::error($e);
}

That exception won’t reach the handler. So, Sentry won’t receive it.

There are a couple ways to forward that exception to Sentry, though. One of them is by doing the following:

try {
    throw new \Exception('Foo');
} catch (\Exception $e) {
    Log::error($e);
    app('sentry')->captureException($e);
}

Yet, if you already wrote a lot of code, seeking for all your logging calls and adding that line can be a pain. To avoid that it’s possible to listen to the log event. Add a listener to your EventServiceProvider:

public function boot()
{
    Event::listen('illuminate.log', function ($level, $message, $context) {
        $sentry = app('sentry');
        if ($message instanceof \Exception) {
            $sentry->captureException($message, $context);
        } else {
            $sentry->captureMessage($message, null, $context);
        }
    });
}

This way, all your logging calls will go to Sentry. You may find that some of them, debug logging for instance, should be left out of this logic. Add a condition to prevent it:

Event::listen('illuminate.log', function ($level, $message, $context) {
    if (in_array($level, [ 'debug', 'info' ])) {
        return;
    }

    // ...
});


BTW, Happy new year! ;)

Debugging requests with cURL

More than once I’ve had to debug HTTP request or response headers and other details. To do that, I use two techniques, both based on the cURL library. I explain them below.

Technique #1: From the command line

This is the easiest way to debug. It doesn’t require writing any actual code. Just call the curl program from the command line, as usual, adding one extra param: -vvv. This will enable verbose output.

$ curl -vvv
* Rebuilt URL to:
* Trying 2800:3f0:4001:802::200e...
* Connected to (2800:3f0:4001:802::200e) port 80 (#0)
> GET / HTTP/1.1
> Host:
> User-Agent: curl/7.43.0
> Accept: */*
< HTTP/1.1 302 Found
< Cache-Control: private
< Content-Type: text/html; charset=UTF-8
< Location:
< Content-Length: 262
< Date: Tue, 23 Aug 2016 12:28:29 GMT
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<H1>302 Moved</H1>
The document has moved
<A HREF=";ei=bUG8V53JGcvK8gfp3L-YBg">here</A>.
* Connection #0 to host left intact

As you can see in the example above, it outputs all request and response info.

It’s possible to send everything to a file. Since the verbose info is written to stderr, redirect both streams by adding > output_file.txt 2>&1 to the end of the command. Using our previous call:

$ curl -vvv > output.txt 2>&1

Well, one may now ask: if this is so easy, why do you need a second way to debug requests? We’ll see why next.

Technique #2: From a PHP script

I’ve written on debugging cURL and PHP at Blog. Let’s say you have to send a dynamic header with the request, like a JWT authorization token. It’s not impossible to do that from the command line, but it’s easier using a script. For those cases, I use the cURL PHP extension. Check out the script below.

$url = '';
$headers = [
    'Accept' => 'application/json',
];

/*
 * We're going to use the output buffer to store the debug info.
 */
ob_start();
$out = fopen('php://output', 'w');

$handler = curl_init($url);

/*
 * Here we set the library verbosity and redirect the error output to the
 * output buffer.
 */
curl_setopt($handler, CURLOPT_VERBOSE, true);
curl_setopt($handler, CURLOPT_STDERR, $out);

$requestHeaders = [];
foreach ($headers as $k => $v) {
    $requestHeaders[] = $k . ': ' . $v;
}
curl_setopt($handler, CURLOPT_HTTPHEADER, $requestHeaders);
curl_setopt($handler, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($handler);

/*
 * Joining debug info and response body.
 */
fclose($out);
$data = ob_get_clean();
$data .= PHP_EOL . $response . PHP_EOL;
echo $data;

Now, you can customize this code to add some dynamic data to a header or any other request part. After doing that, run it using the PHP program from the command line:

$ php curldebug.php

P.S.: I’m assuming that you saved the script as curldebug.php.

As we did with the curl program, it’s possible to output everything to a file. Just append the > output_file.txt to the call.


Debugging requests can be a lifesaver when dealing with third-party APIs and other services. Headers may contain helpful info to find what is going wrong with that weird response body.

Multiple Domain plugin for WordPress

For a few projects in the past, I had to find a way to make WordPress work with more than one domain. There are a couple of plugins to do that, but they are outdated and/or don’t work well. So, I wrote this plugin.

Check this out:

Multiple Domain allows you to have more than one domain in a single WordPress installation. This plugin doesn’t support more than one theme or advanced customizations for each domain; it’s only intended to enable consistent navigation under many domains. For a more complex setup, there is WordPress Multisite (MU).

When there is more than one domain set in your host, all links and resources point to the default domain. This is the default WordPress behavior. With Multiple Domain installed and properly configured, it’ll update all links on the fly, so the user navigation stays under the same domain from end to end.

You can also set an optional base URL. If you want only a subset of URLs available under a given domain, you can use this restriction.

Photo credit: Roya Ann Miller