Implementing an SDK compatible with Laravel's Service Container

When developing an SDK — and by SDK I mean an API client implementation — you can leave the task of integrating it with their apps to your users. However, it’s a good idea to make it compatible with trendy frameworks out of the box.

In the PHP world, Laravel is becoming a very popular choice of framework. I want to share how you can make your SDK compatible with its Service Container.

The API client

For the sake of example, let’s imagine your API client class looks like this:

class Client
{
    public function __construct($username, $password)
    {
        // Set up the client with the given credentials.
    }
}

Maybe you have a different way to set up the API configuration. Anyway, it’s a good idea to have the settings passed in through a method, and not read directly from a config file or other environment-dependent source. Using a method makes it easy to pass the settings from the user’s app to the SDK. It also makes the configuration process more abstract and easier to plug in.

The config file

In the sample client above, we have to pass a username and password to create a new instance. It’s clear we need to get that from somewhere. When using the SDK directly, you may do the following:

 define('API_USER', 'username');
 define('API_PASSWORD', '***');
 $client = new Client(API_USER, API_PASSWORD);

Since our goal is to inject the API implementation into the app, it’s better to configure it the framework’s way. Laravel stores application settings in different files inside a config dir. We can create a file like those to store the API settings:


return [

    // The API user.
    'username' => 'username',

    // The API password.
    'password' => '***',

];

Even better than putting plain values in that config array is using env vars. Laravel ships with PHP dotenv, which allows each environment to have its own settings without any change to the application code. So, let’s change our config file a little bit:


return [

    // The API user.
    'username' => env('API_USER'),

    // The API password.
    'password' => env('API_PASSWORD'),

];


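As a side note, Laravel's env() helper also accepts a second argument, used as a fallback when the variable isn’t set. The default values below are just placeholder assumptions:

```php
return [
    // Falls back to a development user when API_USER is not defined.
    'username' => env('API_USER', 'dev-user'),
    'password' => env('API_PASSWORD', ''),
];
```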
The vars we’re using here have very generic names. You should use more specific ones to avoid conflicts with other services. Something like DUMMY_API_USER, for an API called Dummy, for example.
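For example, assuming the Dummy naming above, the application’s .env file would carry entries like these (the values are placeholders):

```
DUMMY_API_USER=username
DUMMY_API_PASSWORD=secret
```

The config file would then read env('DUMMY_API_USER') and env('DUMMY_API_PASSWORD').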

The Service Provider

According to Laravel docs:

Service providers are the central place of all Laravel application bootstrapping. Your own application, as well as all of Laravel’s core services are bootstrapped via service providers.

But, what do we mean by “bootstrapped”? In general, we mean registering things, including registering service container bindings, event listeners, middleware, and even routes. Service providers are the central place to configure your application.

We have to create a service provider to tell Laravel that our API client can be injected as a dependency into application classes and methods. The service provider will also be in charge of merging the API configuration into the application config.

It will look like this:

use Illuminate\Support\ServiceProvider;

class ApiServiceProvider extends ServiceProvider
{
    public function boot()
    {
        $this->publishes([
            __DIR__ . '/config.php' => config_path('api.php'),
        ]);
    }

    public function register()
    {
        $this->mergeConfigFrom(__DIR__ . '/config.php', 'api');

        $this->app->singleton('api.config', function ($app) {
            return $app['config']['api'];
        });

        $this->app->singleton(Client::class, function ($app) {
            $config = $app['api.config'];
            return new Client($config['username'], $config['password']);
        });
    }

    public function provides()
    {
        return [
            'api.config',
            Client::class,
        ];
    }
}

In the boot method, we tell Laravel which config files can be published to the application’s config dir, so users of our API can overwrite those settings.

Within the register method, the service provider binds the config and the API client instance into the service container.

To improve performance, we use the provides method to let the framework know which bindings this service provider offers. This way, it will only try to resolve them when they’re actually needed. (Note that provides only takes effect for deferred providers.)

Using the service provider

After you’ve added the SDK to the application, probably via Composer, you have to register its service provider. Open the app’s config/app.php file and add the service provider to the providers array:

$providers = [
    // ...
    ApiServiceProvider::class,
];

Now you can inject the API client into the app classes, like controllers:

class UserController extends Controller
{
    public function show(Client $client)
    {
        $user = $client->getUser();
        return view('', [ 'user' => $user ]);
    }
}

To set the API username and password, you have to publish the config file:

$ php artisan vendor:publish --provider="ApiServiceProvider"

Then edit the app’s config/api.php file if needed. This file will have another name if you changed it in the service provider, which you should do to avoid conflicts with other packages.

You may also want to create the env vars inside the application’s .env.example and .env files.


Making your API SDK compatible with Laravel is very simple and requires only one extra class. It’s worth adding that to reach more users and make their work easier.

P.S.: I’ve omitted some implementation details, like namespaces, in the samples above. You can find a complete working example on GitHub:


Documenting or not

There are lots of people talking about the importance of writing code documentation. Others advocate in favor of not documenting at all; they usually say your code must tell the story by itself. I’d like to make my own statement on this matter: I agree with both sides of this discussion.

I love to write code documentation. And sometimes, no matter how much I try to make my code expressive, I need the support of something more textual. I guess that’s due to my code-writing style: I prefer concise names for my symbols. Instead of `getAllOrdersFromUser(user)`, I’d rather use `getOrders(user)`. Unless the context requires a more meaningful name, the latter is expressive enough. It says what the function does — gets orders — and based on what — a user. For this example, writing down what the function actually does is not so important, so I can omit the doc block. In other cases, though, some logic details can be hidden behind a weak name. For those, a statement on what that piece of code does may be essential for understanding.
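To illustrate with a hypothetical sketch (the data shape here is an assumption, not from any real project): the concise name carries the what and the based-on-what, while a one-line doc block covers the detail the name omits.

```javascript
/**
 * Gets the given user's orders, most recent first.
 */
function getOrders(user) {
  // Copy before sorting so the caller's array isn't mutated.
  return [...user.orders].sort((a, b) => b.date - a.date);
}

// Usage with a made-up data shape:
const user = { orders: [{ id: 1, date: 10 }, { id: 2, date: 30 }] };
console.log(getOrders(user).map(o => o.id)); // [ 2, 1 ]
```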

Yet, for some projects, I struggle to keep docs up to date. This happens especially when I’m writing something that must be deployed ASAP, or when working with pairs who don’t care about documenting. You should know, real life doesn’t always allow you to do everything the way you want to. If I don’t plan to keep my code and documentation in sync, I don’t write a comment that won’t say a thing about the code next to it. Having no doc is preferable to something that will confuse another dev, or even myself.

There is a place in the middle of these two points of view where day-by-day programming resides. Writing expressively is the better way to code. But if you’re adding documentation, keep in mind that you must update it constantly as your logic changes.

Laravel Logs to Sentry

If text-based logging is all you have, you should give Sentry a try. It’s an amazing way to visualize and get notified about exceptions, and it supports a bunch of languages and frameworks. You can find more on its website; here I want to share a small tip for having your Laravel log output sent to Sentry.

When you set up Sentry to work with Laravel following their guide, only exceptions that reach the exception handler will be sent to Sentry. If you have something like this:

try {
    throw new \Exception('Foo');
} catch (\Exception $e) {
    Log::error($e);
}
That exception won’t reach the handler. So, Sentry won’t receive it.

There are a couple ways to forward that exception to Sentry, though. One of them is by doing the following:

try {
    throw new \Exception('Foo');
} catch (\Exception $e) {
    app('sentry')->captureException($e);
}

Yet, if you already wrote a lot of code, hunting down all your logging calls and adding that line can be a pain. To avoid that, it’s possible to listen to the log event instead. Add a listener to your EventServiceProvider:

public function boot()
{
    Event::listen('illuminate.log', function ($level, $message, $context) {
        $sentry = app('sentry');
        if ($message instanceof \Exception) {
            $sentry->captureException($message, $context);
        } else {
            $sentry->captureMessage($message, null, $context);
        }
    });
}

This way, all your logging calls will go to Sentry. You may find that some logging, debug entries for instance, should be left out of this logic. Add a condition to prevent it:

Event::listen('illuminate.log', function ($level, $message, $context) {
    if (in_array($level, [ 'debug', 'info' ])) {
        return;
    }

    // ...
});

BTW, Happy new year! ;)

Debugging requests with cURL

More than once I’ve had to debug HTTP request or response headers and other details. To do that, I use two techniques, both based on the cURL library. I explain them below.

Technique #1: From the command line

This is the easiest way to debug. It doesn’t require writing any actual code. Just call the curl program from the command line, as usual, adding a new param: -vvv. This will enable verbose output.

$ curl -vvv
* Rebuilt URL to:
* Trying 2800:3f0:4001:802::200e...
* Connected to (2800:3f0:4001:802::200e) port 80 (#0)
> GET / HTTP/1.1
> Host:
> User-Agent: curl/7.43.0
> Accept: */*
< HTTP/1.1 302 Found
< Cache-Control: private
< Content-Type: text/html; charset=UTF-8
< Location:
< Content-Length: 262
< Date: Tue, 23 Aug 2016 12:28:29 GMT
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<H1>302 Moved</H1>
The document has moved
<A HREF=";ei=bUG8V53JGcvK8gfp3L-YBg">here</A>.
* Connection #0 to host left intact

As you can see in the example above, it outputs all request and response info.

It’s possible to send everything to a file. Since curl writes the verbose info to stderr, redirect both streams by adding > output_file.txt 2>&1 to the end of the command. Using our previous call:

$ curl -vvv > output.txt 2>&1

Well, one may now ask: if this is so easy, why do you have a second way to debug requests? Let’s see why.

Technique #2: From a PHP script

I’ve written on debugging cURL and PHP before. Let’s say you have to send a dynamic header with the request, like a JWT authorization token. It’s not impossible to do that from the command line, but it’s easier with some programming. For those cases, I use the cURL PHP extension. Check out the script below.

$url = '';
$headers = [
    'Accept' => 'application/json',
];

/*
 * We're going to use the output buffer to store the debug info.
 */
ob_start();
$out = fopen('php://output', 'w');

$handler = curl_init($url);

/*
 * Here we set the library verbosity and redirect the error output to the
 * output buffer.
 */
curl_setopt($handler, CURLOPT_VERBOSE, true);
curl_setopt($handler, CURLOPT_STDERR, $out);

$requestHeaders = [];
foreach ($headers as $k => $v) {
    $requestHeaders[] = $k . ': ' . $v;
}
curl_setopt($handler, CURLOPT_HTTPHEADER, $requestHeaders);
curl_setopt($handler, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($handler);

/*
 * Joining debug info and response body.
 */
$data = ob_get_clean();
$data .= PHP_EOL . $response . PHP_EOL;
echo $data;

Now, you can customize this code to add some dynamic data to a header or any other request part. After doing that, run it using the PHP program from the command line:

$ php curldebug.php

P.S.: I’m assuming that you saved the script as curldebug.php.

As we did with the curl program, it’s possible to send everything to a file. Since the script prints all the debug info and the response to stdout, just append > output_file.txt to the call.


Debugging requests can be a lifesaver when dealing with third-party APIs and other services. Headers may contain helpful info to find what is going wrong with that weird response body.

Multiple Domain plugin for WordPress

For a few projects in the past, I had to find a way to make WordPress work with more than one domain. There are a couple of plugins to do that, but they are outdated and/or don’t work well. So, I wrote this plugin.

Check this out:

Multiple Domain allows you having more than one domain in a single WordPress installation. This plugin doesn’t support more than one theme or advanced customizations for each domain. It’s only intended to enable constant navigation under many domains. For a more complex setup, there is WordPress Multisite (MU).

When there is more than one domain set up in your host, all links and resources will point to the default domain. This is the default WordPress behavior. With Multiple Domain installed and properly configured, it’ll update all links on the fly, so the user navigation will stay end-to-end under the same domain.

You can also set an optional base URL. If you want only a set of URLs available under a given domain, you can use this restriction.


Passing Node args to Mocha tests

This is a really quick tip. I was looking around on the internet for a way to pass Node arguments when calling the Mocha binary, and I couldn’t find anything useful. Then I tried the following, and it worked:

$ node --expose-gc ./node_modules/.bin/mocha [...]

The --expose-gc argument is just an example. You can pass any argument accepted by the Node program.
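As a quick sanity check (my own sketch, not from the original tip), you can confirm a flag actually reached Node rather than Mocha. With --expose-gc, Node makes the garbage collector available as global.gc:

```shell
# With --expose-gc, global.gc becomes a callable function.
node --expose-gc -e 'console.log(typeof global.gc)'
# prints "function"
```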

In my specific case, I was trying to load the dotenv config. In the end, the project’s Makefile looked like:

test:
	@NODE_ENV=test node -r dotenv/config ./node_modules/.bin/mocha \
		--require should \
		--reporter spec \
		--harmony \
		--bail

.PHONY: test


Mixing HTTP and WebSocket routes in a Koa-based application

I’ve started to use the koa-websocket package. And it took me some time to figure out how to mix HTTP and WebSocket routes in a single Koa-based application. I’m not sure if this solution is obvious, but I’m sharing it anyway.

First of all, we’ll need two separate routers for regular HTTP and WebSocket routes:

// Creating a Koa app instance.
const app = require('koa')();
// "Websockifying" the application.
const socket = (require('koa-websocket'))(app);
// Loading router package
const router = require('koa-router');

// Here they are, our 2 routers
const http = router();
const ws = router();

Then, we can write our routes, plugging them to the specific router:

http.get('/', function *(next) {
    this.status = 200;
    this.body = 'Hello!';
});

ws.get('/socket', function *(next) {
    this.websocket.on('message', function (message) {
        // Handle the incoming message.
    });
});

Finally, let’s make the app use the routers we created:

app.use(http.routes());
app.ws.use(ws.routes());

Notice that the second router was added to app.ws instead of to app directly.

And… That’s it.
