Jan 13, 2016 | 4 minute read
written by Israel Sotomayor
This information is specific to the deprecated version one. For more up-to-date details, see our API Reference.
How we used OpenResty and Lua to improve API performance.
Just to clarify, this isn't a detailed technical guide to building your own authentication layer with OpenResty + Lua; rather, it explains the process behind our solution.
This is a real example of how our API has come to rely on OpenResty + Lua to handle OAuth2 authentication for all users.
The logic used to authenticate a user was originally embedded in our API, which is built with the PHP framework Laravel. That meant a lot of code had to be booted before we could authenticate, reject or validate a user request, which resulted in high latency.
I won't go into detail about how long a PHP framework can take to return even a basic response, but if you compare it to other languages and frameworks, you can probably imagine.
This is roughly how it looked at that time:
public function filter($route, $request)
{
    try {
        // Initiate the Request handler
        $this->request = new OAuthRequest;

        // Initiate the auth server with the models
        $this->server = new OAuthResource(new OAuthSession);

        // Is it a valid token?
        if ($this->accessTokenValid() == false) {
            throw new InvalidAccessTokenException('Unable to validate access token');
        }

        ...
So we decided to move all of this logic one layer up, into OpenResty + Lua.
We wanted, and needed, more control over each request before it hit the actual API, so we looked for something fast enough to pre-process every request and flexible enough to integrate with our existing system. That led us to OpenResty, an Nginx distribution that embeds Lua, the language we use to pre-process those requests. Why Lua? Because it's robust, fast enough for this purpose, and a well-established scripting language used daily by many large companies.
We followed the concept behind Kong, which uses OpenResty + Lua to offer several micro-services that can be plugged into your API project. However, we found that Kong was at a very early stage and was trying to offer more than we needed, so we decided to implement our own auth layer to keep more control over it.
Below is how our infrastructure currently looks:
This is the bit that rules them all.
We have routing in place to process each of the different users' requests, as you can see below:
nginx.conf
location ~/oauth/access_token {
    ...
}

location /v1 {
    ...
}
So for each of those endpoints, we point Nginx at the corresponding Lua script:
...

location ~/oauth/access_token {
    content_by_lua_file "/opt/openresty/nginx/conf/oauth/get_oauth_access.lua";
    ...
}

location /v1 {
    access_by_lua_file "/opt/openresty/nginx/conf/oauth/check_oauth_access.lua";
    ...
}

...
We make use of the OpenResty directives content_by_lua_file and access_by_lua_file.
This is where all the magic happens. We have two scripts to do this:
get_oauth_access.lua
...

ngx.req.read_body()
local args, err = ngx.req.get_post_args()

-- If we don't get any post data fail with a bad request
if not args then
    return api:respondBadRequest()
end

-- Check the grant type and pass off to the correct function
-- Or fail with a bad request
for key, val in pairs(args) do
    if key == "grant_type" then
        if val == "client_credentials" then
            ClientCredentials.new(args)
        elseif val == "password" then
            Password.new(args)
        elseif val == "implicit" then
            Implicit.new(args)
        elseif val == "refresh_token" then
            RefreshToken.new(args)
        else
            return api:respondForbidden()
        end
    end
end

return api:respondOk()

...
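The grant-type handlers referenced above (ClientCredentials, Password, Implicit, RefreshToken) are our own modules and aren't listed in this post. As a rough sketch of the idea only, with hypothetical module and helper names, a client_credentials handler could look something like this:

-- Illustrative sketch; module and helper names are assumptions,
-- not the actual implementation.
local oauth2 = require "oauth2" -- hypothetical token helpers
local api    = require "api"    -- hypothetical response helpers

local ClientCredentials = {}

function ClientCredentials.new(args)
    -- A client_credentials grant needs a valid client id/secret pair
    if not args.client_id or not args.client_secret then
        return api:respondBadRequest()
    end

    if not oauth2.validateClient(args.client_id, args.client_secret) then
        return api:respondForbidden()
    end

    -- Create an access token and store it (e.g. in Redis) so that
    -- check_oauth_access.lua can find it on subsequent requests
    return oauth2.createAndStoreAccessToken(args.client_id)
end

return ClientCredentials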
check_oauth_access.lua
...

local authorization, err = ngx.req.get_headers()["authorization"]

-- If we have no access token forbid the beasts
if not authorization then
    return api:respondUnauthorized()
end

-- Strip the scheme prefix (e.g. "Bearer ") to get the raw token
local token = string.gsub(authorization, "^[Bb]earer%s+", "")

-- Check for the access token
local result = oauth2.getStoredAccessToken(token)

if result == false then
    return api:respondUnauthorized()
end

...
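Both scripts rely on an api response helper (respondOk, respondUnauthorized and friends) that isn't shown here either. A minimal sketch of what such a helper might look like, assuming JSON responses and the standard OpenResty status constants:

-- api.lua (minimal sketch; assumed structure, not the module we actually run)
local cjson = require "cjson"

local api = {}

local function respond(status, body)
    -- Set the status, emit a JSON body and stop processing the request
    ngx.status = status
    ngx.header["Content-Type"] = "application/json"
    ngx.say(cjson.encode(body))
    return ngx.exit(status)
end

function api:respondOk(body)
    return respond(ngx.HTTP_OK, body or { status = "ok" })
end

function api:respondBadRequest()
    return respond(ngx.HTTP_BAD_REQUEST, { error = "bad_request" })
end

function api:respondUnauthorized()
    return respond(ngx.HTTP_UNAUTHORIZED, { error = "unauthorized" })
end

function api:respondForbidden()
    return respond(ngx.HTTP_FORBIDDEN, { error = "forbidden" })
end

return api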
This is where the created access tokens are stored. We can remove, expire or refresh them as we please. We use Redis as a storage layer and we use openresty/lua-resty-redis to connect Lua to Redis.
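The storage code itself isn't listed in this post, but a token lookup along the lines of getStoredAccessToken above would typically be built on lua-resty-redis like this (host, port and key naming are assumptions):

local redis = require "resty.redis"

-- Fetch a stored access token from Redis; returns false when it is
-- missing or has expired. Connection details and key prefix are assumed.
local function getStoredAccessToken(token)
    local red = redis:new()
    red:set_timeout(1000) -- 1 second

    local ok, err = red:connect("127.0.0.1", 6379)
    if not ok then
        ngx.log(ngx.ERR, "failed to connect to redis: ", err)
        return false
    end

    local value = red:get("oauth:access_token:" .. token)

    -- Return the connection to the keepalive pool instead of closing it
    red:set_keepalive(10000, 100)

    if not value or value == ngx.null then
        return false
    end

    return value
end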
The results speak for themselves. Since we moved the authentication layer out of our monolithic API and rebuilt it in Lua, we’ve seen a huge difference. From 250-500ms, we’re now averaging between 4-40ms per access token!
Note: We’re not considering the networking time during these measurements, just internal processing time.
Here are some interesting resources on Lua that we used when creating our authentication layer.