This post was originally published on the BetterDoc Dev-Blog.
For a while now I’ve been a big fan of using the Elixir library mox for creating test mocks. When using mox, you define behaviours as the basis for your mocks.
You can think of behaviours as interfaces in object oriented languages such as Java: a set of function signatures that a module has to implement.
While this approach is great - check out this blog post on the why - it tends to require a bit of boilerplate code. In this post we are going to explore how I use behaviours for mocking, the reasoning behind it, and how I reduce the necessary boilerplate to an absolute minimum by using metaprogramming.
Mocks and Mox
Recently I changed parts of the authentication flow at work.
For reasons beyond the scope of this post, I needed to authenticate a user by communicating with a separate system and then keep the user’s authentication token around until it expires.
Instead of sprinkling calls to the external system and the storage all over the application, I introduced a central Authentication module which provides high-level functions for everything related to … well, authentication.
As an example, the login function roughly takes care of the following steps (sketched below):
- authenticating the user with the external system
- persisting the tokens
- returning an ID which can be used to fetch the tokens again
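To make the flow a bit more concrete, here is a rough sketch of what such a login function could look like. The Provider and Storage modules anticipate the behaviours introduced below, and Storage.store/1 is a hypothetical function used purely for illustration:

defmodule Authentication do
  # Rough sketch: authenticate, persist the tokens, return an ID for later lookups.
  def login(user, password) do
    with {:ok, tokens} <- Authentication.Provider.login(user, password),
         {:ok, id} <- Authentication.Storage.store(tokens) do
      {:ok, id}
    end
  end
end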
As a good TDD practitioner, I wanted to write proper unit tests for this module. This of course meant that I had to mock out the calls to the external system and the storage.
Which is where mox came into play.
Based on the requirements I came up with two behaviours: Authentication.Provider and Authentication.Storage.
To give you an impression, this is what the Authentication.Provider behaviour looks like (shortened for brevity):
defmodule Authentication.Provider do
alias Authentication.Tokens
@type user :: String.t()
@type password :: String.t()
@type success :: {:ok, Tokens.t()}
@type error :: {:error, reason :: any()}
@callback login(user(), password()) :: success() | error()
@callback refresh(Tokens.t()) :: success() | error()
end
Relatively straightforward, right? With this in place creating a mock was easy:
Mox.defmock(Authentication.Provider.Mock, for: Authentication.Provider)
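With the mock defined, a test can set expectations on it via Mox’s expect and verify_on_exit!. The sketch below gives a rough idea; it assumes the implementation swap described next, uses a plain map as a stand-in for a real Tokens struct, and leaves out the matching expectation on a storage mock which a complete test would also need:

defmodule AuthenticationTest do
  use ExUnit.Case, async: true

  import Mox

  # Verify after each test that every expectation was met.
  setup :verify_on_exit!

  test "login/2 asks the provider for tokens" do
    expect(Authentication.Provider.Mock, :login, fn "jane", "secret" ->
      # Placeholder instead of a real Tokens struct.
      {:ok, %{access_token: "abc", refresh_token: "def"}}
    end)

    assert {:ok, _id} = Authentication.login("jane", "secret")
  end
end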
To make matters complete, I now only needed to swap out the implementations for tests. A number of different approaches exist here, but I chose to put the implementation into the application config.
As such I added a line like this to our config/test.exs:
config :our_app, Authentication.Provider, Authentication.Provider.Mock
And to make the implementation easily accessible I added an implementation function to Authentication.Provider:
defmodule Authentication.Provider do
alias Authentication.Tokens
@type user :: String.t()
@type password :: String.t()
@type success :: {:ok, Tokens.t()}
@type error :: {:error, reason :: any()}
@callback login(user(), password()) :: success() | error()
@callback refresh(Tokens.t()) :: success() | error()
@implementation Application.fetch_env!(:our_app, __MODULE__)
def implementation, do: @implementation
end
Now in Authentication I simply had to replace all calls to Authentication.Provider with Authentication.Provider.implementation().
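Inside Authentication the change is mechanical; the login sketch from earlier would now read roughly like this (again with a hypothetical Authentication.Storage following the same pattern):

defmodule Authentication do
  def login(user, password) do
    # Delegate to whatever implementation the application config points at.
    with {:ok, tokens} <- Authentication.Provider.implementation().login(user, password),
         {:ok, id} <- Authentication.Storage.implementation().store(tokens) do
      {:ok, id}
    end
  end
end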
All nice and dandy, right?
A Story about encapsulation
While this approach works fine, I felt it would be nice to encapsulate the fact that Authentication.Provider is a behaviour even further.
I wanted a way to just call Authentication.Provider.login/2 and have the call magically delegated to the real implementation.
Lucky for us, Elixir provides the nifty defdelegate macro which - as the name suggests - delegates a function call to a given module.
With this in mind, let’s refactor our Authentication.Provider module!
defmodule Authentication.Provider do
alias Authentication.Tokens
@type user :: String.t()
@type password :: String.t()
@type success :: {:ok, Tokens.t()}
@type error :: {:error, reason :: any()}
@callback login(user(), password()) :: success() | error()
@callback refresh(Tokens.t()) :: success() | error()
@implementation Application.fetch_env!(:our_app, __MODULE__)
defdelegate login(user, password), to: @implementation
defdelegate refresh(tokens), to: @implementation
end
Now I can simply call Authentication.Provider.login/2 and the call will be delegated to the implementation.
This works great and is a joy to use … as long as nothing changes.
Let’s assume for a moment that we have a new requirement and need to add a logout/1 function which somehow explicitly invalidates all of the user’s tokens.
Okay, so let’s add a new @callback:
defmodule Authentication.Provider do
alias Authentication.Tokens
@type user :: String.t()
@type password :: String.t()
@type success :: {:ok, Tokens.t()}
@type error :: {:error, reason :: any()}
@callback login(user(), password()) :: success() | error()
@callback refresh(Tokens.t()) :: success() | error()
@callback logout(Tokens.t()) :: success() | error()
@implementation Application.fetch_env!(:our_app, __MODULE__)
defdelegate login(user, password), to: @implementation
defdelegate refresh(tokens), to: @implementation
end
Great, now just add a call to logout in Authentication and … gosh, I forgot the defdelegate!
No big issue, I’m just gonna add it real quick.
While it’s no big deal for a behaviour as small as the one above, it only gets worse for larger behaviours.
What if we need to add a certain argument to a bunch of these callbacks?
We always have to remember to update the defdelegates.
Of course this pattern also bloats the module definition quite a bit. More code means more opportunities to make mistakes.
Seems like this was not such a great idea after all, right?
Teaching the machine to help us
When something like this happens I always ask myself the following question:
Is there a way the machine can help us with this?
And when you look at it, you realize that the major issue is the need for duplication.
Technically, the @callback definitions contain everything I need: the function names and the number of arguments (their arity).
Surely there must be a way to use this knowledge to our advantage and simply generate the delegations, right?
So, as a Friday project I ventured into the world of metaprogramming. And after a few hours of tinkering around (I already have some experience with metaprogramming in Elixir), I had a first rough version:
defmodule Authentication.Provider do
use DelegateGeneration,
implementation: Application.fetch_env!(:our_app, __MODULE__)
alias Authentication.Tokens
@type user :: String.t()
@type password :: String.t()
@type success :: {:ok, Tokens.t()}
@type error :: {:error, reason :: any()}
@callback login(user(), password()) :: success() | error()
@callback refresh(Tokens.t()) :: success() | error()
@callback logout(Tokens.t()) :: success() | error()
end
By “using” DelegateGeneration, the module registers a @before_compile hook which uses the defined @callbacks to generate defdelegates to the given implementation.
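For the curious: the core of such a prototype is surprisingly small. The sketch below is not the actual implementation, but it captures the idea - a __using__ macro remembers the implementation and registers the hook, and the @before_compile callback turns the accumulated @callback attribute into defdelegates. The exact shape of that attribute is an internal detail of Elixir’s typespec handling, which is why the extraction matches rather defensively:

defmodule DelegateGeneration do
  defmacro __using__(opts) do
    quote do
      # Remember the configured implementation and run our hook right
      # before the using module gets compiled.
      @__delegate_implementation__ unquote(opts[:implementation])
      @before_compile DelegateGeneration
    end
  end

  defmacro __before_compile__(env) do
    implementation = Module.get_attribute(env.module, :__delegate_implementation__)

    delegates =
      env.module
      |> Module.get_attribute(:callback)
      |> Enum.map(&callback_to_signature/1)
      |> Enum.map(fn {name, arity} ->
        args = Macro.generate_arguments(arity, env.module)

        quote do
          defdelegate unquote(name)(unquote_splicing(args)), to: unquote(implementation)
        end
      end)

    quote do
      (unquote_splicing(delegates))
    end
  end

  # The accumulated :callback attribute holds the raw @callback ASTs; its
  # exact wrapping differs between Elixir versions, hence the fallback clauses.
  defp callback_to_signature({:callback, spec, _context}), do: spec_to_signature(spec)
  defp callback_to_signature({spec, _context}), do: spec_to_signature(spec)
  defp callback_to_signature(spec), do: spec_to_signature(spec)

  defp spec_to_signature({:when, _, [spec | _]}), do: spec_to_signature(spec)

  defp spec_to_signature({:"::", _, [{name, _, args} | _]}) when is_atom(name) do
    {name, args |> List.wrap() |> length()}
  end
end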
In the last few weeks I’ve continued to work on this and the project is now ready to be published as an open source library.
Introducing Knigge: Teach your behaviours some manners
Knigge is the extended and polished version of the prototype above.
In addition to “using” Knigge in a behaviour directly, you can also pass it an external behaviour and configure a whole bunch of other things, such as defining a “default” implementation for @optional_callbacks.
But see for yourself.
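To give a rough idea - and with the caveat that the project’s README is the authoritative source for the current options - the Provider module from above boils down to something like this when letting Knigge look the implementation up via the application environment:

defmodule Authentication.Provider do
  # Knigge fetches the implementation from the application environment,
  # here the equivalent of Application.fetch_env!(:our_app, Authentication.Provider).
  use Knigge, otp_app: :our_app

  alias Authentication.Tokens

  @type user :: String.t()
  @type password :: String.t()
  @type success :: {:ok, Tokens.t()}
  @type error :: {:error, reason :: any()}

  @callback login(user(), password()) :: success() | error()
  @callback refresh(Tokens.t()) :: success() | error()
  @callback logout(Tokens.t()) :: success() | error()
end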
While this pattern works well for me, it is fairly opinionated. I don’t expect everybody to use it or even like it.
I named the library after the German writer Adolph Freiherr Knigge. Knigge wrote a book named Über den Umgang mit Menschen (On Human Relations) which has the reputation of being the guide for etiquette, politeness, and good behaviour.
Portrait of Adolph Freiherr Knigge
In Germany, the word “Knigge” is pretty much equivalent to “good manners” but a lot of people think that the Knigge rules are excessive and unnecessary.
Simply think of Knigge as an opinionated approach to “good” behaviour(s).