Keycloak integration: Part 4: Integration with Nginx based on Docker

Integrating Keycloak with Nginx is a valuable combination. There are many blog posts on this topic, but I faced many issues getting it to work. I hope this article helps someone.

Custom Nginx with OpenResty dependency

To integrate Nginx with Keycloak, we need Lua support. OpenResty is a web server built on top of Nginx that bundles the Lua module, which simplifies the dependency installation flow. Otherwise, we would need to take a plain Nginx build and add all the dependencies ourselves.

Dockerfile
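
The original Dockerfile is not reproduced here. A minimal sketch, assuming the official openresty/openresty image and the lua-resty-openidc library installed via OPM (OpenResty's package manager), would look like this:

```dockerfile
# Minimal sketch -- image tag is an assumption; pin versions as needed.
# The alpine-fat variant ships with opm, OpenResty's package manager.
FROM openresty/openresty:alpine-fat

# lua-resty-openidc pulls in lua-resty-http, lua-resty-session and
# lua-resty-jwt as dependencies, so one install covers the OIDC flow.
RUN opm get zmartzone/lua-resty-openidc

# Replace the default server configuration with our own (discussed below).
COPY default.conf /etc/nginx/conf.d/default.conf
```

This gives us an Nginx that can run the OpenID Connect relying-party logic in an access phase handler, without compiling any modules by hand.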

Running as docker containers

By default, running Keycloak in one Docker container and Nginx in another, and then trying to connect them, is a little tedious.

We will face errors like: openidc_discover(): accessing discovery url (http://keycloak:8080/auth/realms/test/.well-known/openid-configuration) failed: keycloak could not be resolved (3: Host not found)

Docker comes with an embedded DNS server. We need to configure Nginx to use Docker's resolver instead of its own.

To resolve this issue, we need to add the following line to the Nginx configuration:

resolver 127.0.0.11 valid=1s;

The Nginx configuration file described below contains the complete example configuration.

Keycloak realm

A realm manages a set of users, credentials, roles, and groups. A user belongs to and logs into a realm. Realms are isolated from one another and can only manage and authenticate the users that they control.

Please create a realm and provide its name in the nginx.conf file that we discuss below. I have used myrealm as a placeholder.
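
The realm can be created in the admin console. Alternatively, a sketch using Keycloak's bundled admin CLI (kcadm.sh, found under the server's bin directory; the admin credentials shown are assumptions) looks like this:

```shell
# Authenticate against the master realm first
./kcadm.sh config credentials --server http://localhost:8080/auth \
    --realm master --user admin --password admin

# Create the realm used as a placeholder in this article
./kcadm.sh create realms -s realm=myrealm -s enabled=true
```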

Keycloak client configuration

Clients are entities that can request Keycloak to authenticate a user. Most often, clients are applications and services that want to use Keycloak to secure themselves and provide a single sign-on solution. Clients can also be entities that just want to request identity information or an access token so that they can securely invoke other services on the network that are secured by Keycloak.

Let's configure Nginx as a client in Keycloak. Once it is created, copy the secret from the Credentials tab. This needs to be set in the nginx.conf that we discuss below.
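
As with the realm, the client can also be created from the admin CLI. A hedged sketch (the client id nginx and the redirect URI pattern are assumptions matching the rest of this article):

```shell
# Create a confidential client for nginx in the myrealm realm
./kcadm.sh create clients -r myrealm \
    -s clientId=nginx \
    -s enabled=true \
    -s publicClient=false \
    -s 'redirectUris=["http://localhost/*"]'
```

Because the client is confidential (publicClient=false), Keycloak generates the secret that we copy from the Credentials tab.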

nginx configured as a client

Dockerised keycloak and auth-server-url issue

Earlier versions of Keycloak had two separately configurable parameters: auth-server-url-for-backend-requests and auth-server-url. But an issue was reported against this, and to resolve it, Keycloak removed this flow.

So we need to handle this ourselves at the DNS level or by adding entries to the hosts file.

Nginx config

default.conf
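
The original default.conf is not reproduced here. A sketch of a working configuration, assuming the realm myrealm, the client nginx, the host.docker.internal hostname discussed below, and the secret copied from the Credentials tab:

```nginx
# Use Docker's embedded DNS server so service names resolve in-container.
resolver 127.0.0.11 valid=1s;

# Shared dictionaries used by lua-resty-openidc to cache the discovery
# document and the realm's signing keys.
lua_shared_dict discovery 1m;
lua_shared_dict jwks 1m;

server {
    listen 80;

    location / {
        access_by_lua_block {
            local opts = {
                -- host.docker.internal lets both the browser and this
                -- container reach Keycloak under the same hostname
                discovery = "http://host.docker.internal:8080/auth/realms/myrealm/.well-known/openid-configuration",
                client_id = "nginx",
                client_secret = "<secret copied from the Credentials tab>",
                redirect_uri = "http://localhost/redirect_uri",
                ssl_verify = "no",
            }

            -- authenticate() redirects unauthenticated users to Keycloak
            -- and validates the session on subsequent requests
            local res, err = require("resty.openidc").authenticate(opts)
            if err then
                ngx.status = 500
                ngx.say(err)
                ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
            end
        }

        # Serve the default OpenResty landing page once authenticated;
        # replace with a proxy_pass to your application as needed.
        root /usr/local/openresty/nginx/html;
        index index.html;
    }
}
```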

We need to mount the above Nginx config as a volume. We will do this with the following docker-compose file.

Docker compose with keycloak and nginx
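
A sketch of the compose file, assuming the pre-Quarkus jboss/keycloak image (which serves under /auth, matching the discovery URL above) and an image name for the OpenResty build from the Dockerfile earlier:

```yaml
version: "3.7"

services:
  keycloak:
    image: jboss/keycloak:16.1.1   # tag is an assumption; any /auth-era tag works
    environment:
      # Bootstrap admin credentials -- assumptions, change for real use
      KEYCLOAK_USER: admin
      KEYCLOAK_PASSWORD: admin
    ports:
      - "8080:8080"

  nginx:
    image: custom-openresty:latest  # assumed name; build from the Dockerfile above
    volumes:
      - ./default.conf:/etc/nginx/conf.d/default.conf:ro
    extra_hosts:
      # On Linux, host.docker.internal must be mapped explicitly
      # (host-gateway requires Docker 20.10+)
      - "host.docker.internal:host-gateway"
    ports:
      - "80:80"
    depends_on:
      - keycloak
```

Both containers join the default compose network, which is what lets Docker's embedded DNS resolve the keycloak service name inside the nginx container.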

The above file was tested with docker stack deploy. It should also work with a normal docker-compose up.

Needed change in /etc/hosts file

In the /etc/hosts file, please add the following:

127.0.0.1       host.docker.internal

From Docker 18.03 onwards, the recommendation is to connect to the special DNS name host.docker.internal, which resolves to the internal IP address used by the host.

Now loading http://localhost should take us to Nginx, which redirects to the Keycloak login page. Log in with a user created in Keycloak; on success it should redirect us back to the Nginx home page at http://localhost.

redirect_uri

This comes from the nginx.conf file mentioned above. It needs to point to the URL the user should be taken to on successful login, and it must match the redirect URI configured in the Keycloak client configuration.

Conclusion

Now we have successfully configured Keycloak to secure Nginx. This way, any application reverse-proxied by Nginx is now behind Keycloak. Many of the steps above are listed in neither the Keycloak documentation nor the Docker documentation, and I spent a lot of time making this work. When moving to staging, QA, or production, the Keycloak URL will be a public URL, and that is what goes into nginx.conf; in that case, no /etc/hosts change is needed.

Happy coding.
