
It feels like the word “security” is on everyone’s lips these days. It has never been easy to protect and secure applications, but running them in the cloud presents even more challenges. One concept that seems like a promising solution is “zero trust”, which Gartner defines as:

…an approach where implicit trust is removed from all computing infrastructure. Instead, trust levels are explicitly and continuously calculated and adapted to allow just-in-time, just‑enough access to enterprise resources.

But exactly how does “zero trust” work in a cloud context, and what technologies are available to help you implement it? In this blog, we’ll consider zero trust in the context of a common use case:

You’re an insurance company with a varied set of APIs and services powered by Java. Now that you’ve migrated to the cloud, production workloads are automatically built by a CI/CD pipeline and deployed in a Kubernetes cluster at a public cloud provider. As you are dealing with sensitive customer information, one major requirement is to encrypt all traffic with TLS.

You’ve enabled encryption on the edge load balancer and the Ingress controller, but what’s the best way to encrypt the traffic between the Ingress controller and the application itself? That involves enabling the application server to handle TLS.

Many Java shops use Apache Tomcat as the application server of choice, and Spring Boot as a framework to build stand‑alone and production‑ready Spring applications more easily than with Java itself. In this blog, we show in detail how to configure Spring Boot with Apache Tomcat (and then NGINX Unit) for applications that can handle HTTPS traffic.

Spring Boot: Talk HTTPS to Me…

With almost 60,000 stars on GitHub as of this writing, Spring Boot is the shining star in the Java frameworks sky: easy to get started with, lightweight, and powerful at the same time. A Spring Boot project can be compiled into a self‑contained .jar file with a built‑in application server like Apache Tomcat. To start the Java service, you simply execute the .jar file and start sending traffic to the exposed port (by default, 8080). Simple!
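
For reference, a minimal Spring Boot service of this kind might look like the following sketch (the class name, endpoint, and response text are illustrative, not taken from the example above):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;

    // Minimal Spring Boot service: packaged as a self-contained .jar, it
    // starts an embedded Tomcat that listens for HTTP on port 8080 by default.
    @SpringBootApplication
    @RestController
    public class DemoApplication {

        public static void main(String[] args) {
            SpringApplication.run(DemoApplication.class, args);
        }

        @GetMapping("/hello")
        public String hello() {
            return "Hello from Spring Boot";
        }
    }

Running the packaged .jar with java -jar starts the embedded Tomcat and serves the endpoint on port 8080.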

Following are a few more steps you need to perform for your service to handle TLS connections (HTTPS traffic) properly. For more details about the steps, see the Spring documentation.

These instructions are for a self‑signed certificate and key, but for production environments we strongly recommend that you substitute a certificate‑key pair from an official Certificate Authority (CA).

  1. Create a key store containing a certificate and a key:

    # keytool -genkey -alias tomcat -keyalg RSA -keystore keystore.jks
  2. Place the key store in your container image, in a location that Tomcat can access.

  3. Add these properties to your application.properties file, replacing secret with the appropriate password:

    server.port = 8443
    server.ssl.key-store = classpath:keystore.jks
    server.ssl.key-store-password = secret

With this configuration in place, your Spring Boot application listens on port 8443 for HTTPS connections. But what if you also want to accept HTTP connections? Once you’ve configured HTTPS in the application.properties file, you can’t also configure HTTP there; you must implement HTTP handling in the Java code itself. For a good example, see the spring-projects repo on GitHub.
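
If you do want to serve both, one widely used approach is to register an additional plain‑HTTP connector in code while application.properties continues to configure HTTPS. Here is a minimal sketch, assuming Spring Boot 2.x with embedded Tomcat (the class name and port numbers are illustrative):

    import org.apache.catalina.connector.Connector;
    import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory;
    import org.springframework.boot.web.servlet.server.ServletWebServerFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    // Adds a plain-HTTP connector on port 8080 alongside the HTTPS listener
    // on port 8443 that application.properties configures.
    @Configuration
    public class HttpConnectorConfig {

        @Bean
        public ServletWebServerFactory servletContainer() {
            TomcatServletWebServerFactory factory = new TomcatServletWebServerFactory();
            factory.addAdditionalTomcatConnectors(httpConnector());
            return factory;
        }

        private Connector httpConnector() {
            Connector connector = new Connector(TomcatServletWebServerFactory.DEFAULT_PROTOCOL);
            connector.setScheme("http");
            connector.setPort(8080);
            connector.setSecure(false);
            return connector;
        }
    }

With this in place, the application answers HTTPS requests on port 8443 and plain HTTP requests on port 8080.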

So it turns out that delegating TLS encryption to application frameworks like Spring Boot is possible, but not very straightforward. If you also write applications in other languages and frameworks (like Ruby and Rails, or Python and Flask), the situation is even more complicated – each framework has its own way of configuring listeners and handling keys and certificates. Fortunately, there’s something that makes the situation much simpler!

NGINX Unit to the Rescue!

NGINX Unit is an open source polyglot application server, a reverse proxy, and a static file server, written by the core NGINX engineering team for Unix‑like systems. With NGINX Unit you use a standardized API to simultaneously run and manage applications written in many different languages – as of this writing, it supports six languages in addition to Java: Go, JavaScript (Node.js®), Perl, PHP, Python, and Ruby.

Unit also enables you to configure HTTP and HTTPS interfaces independently of the applications using them. Let’s explore this powerful feature with our Spring Boot API example. First, we have to build the Spring Boot application for our Unit server. As Unit implements the Java Servlet API version 3, the only change is a line added to the Gradle or Maven build definitions. I used Gradle for my testing.

  1. Add the war plug‑in to the build.gradle file:

    plugins {
      id 'org.springframework.boot' version '2.4.4'
      id 'io.spring.dependency-management' version '1.0.11.RELEASE'
      id 'java'
      id 'war'
    }
  2. Build the .war file:

    # ./gradlew build
  3. The resulting file is build/libs/rootProject-version.war, where rootProject and version are the project name and version set in your Gradle build (here, build/libs/demo-0.0.1-SNAPSHOT.war).

  4. Define the Unit configuration in a file called config.json:

    {
        "listeners": {
            "*:8080": {
                "pass": "applications/java"
            }
        },
        "applications": {
            "java": {
                "user": "unit",
                "group": "unit",
                "type": "java",
                "environment": {
                    "Deployment": "0.0.1"
                },
                "classpath": [],
                "webapp": "/path/to/build/libs/demo-0.0.1-SNAPSHOT.war"
            }
        }
    }
  5. Activate the configuration (for details, see the documentation):

    # curl -X PUT --data-binary @config.json --unix-socket \
           /path/to/control.unit.sock http://localhost/config

That’s it! The Spring Boot application is now running on Unit. No Tomcat or other Java application server is needed.
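
One code‑level detail worth noting: if you converted an existing .jar‑packaged Spring Boot project rather than generating a .war project with Spring Initializr (which adds this class for you), the application class typically also needs to extend SpringBootServletInitializer so that a servlet container (here, Unit’s Java module) can bootstrap the .war. A minimal sketch, reusing the illustrative DemoApplication class name from earlier:

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.boot.builder.SpringApplicationBuilder;
    import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

    @SpringBootApplication
    public class DemoApplication extends SpringBootServletInitializer {

        // Entry point when the application runs as a self-contained .jar
        public static void main(String[] args) {
            SpringApplication.run(DemoApplication.class, args);
        }

        // Entry point when the .war is deployed on a servlet container
        @Override
        protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
            return builder.sources(DemoApplication.class);
        }
    }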

Enabling HTTPS

You might be asking, “but what about HTTPS?” Fair enough – let’s enable it! That’s easy with the following steps. (As above, we’re using a self‑signed certificate. In a production environment, make sure you use CA‑signed certificates.)

  1. Create the self‑signed certificate bundle:

    # cat cert.pem ca.pem key.pem > bundle.pem
  2. Upload the bundle to Unit:

    # curl -X PUT --data-binary @bundle.pem --unix-socket \
           /path/to/control.unit.sock http://localhost/certificates/bundle
  3. Define the configuration for the HTTPS listener in a file called listener.json (the listener’s address goes in the API path in the next step):

    {
        "pass": "applications/java",
        "tls": {
            "certificate": "bundle"
        }
    }
  4. Activate the new listener:

    # curl -X PUT --data-binary @listener.json --unix-socket \
           /path/to/control.unit.sock 'http://localhost/config/listeners/127.0.0.1:443'

The application now accepts TLS‑encrypted connections – without restarting or rebooting either the application or Unit. But the most powerful thing is that the preceding process is the same for applications written in any of the languages and frameworks supported by Unit. There’s no need to dig into language‑specific details to configure HTTPS.
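
If you want to verify the new listener from Java itself (any HTTPS client works equally well), one option is to trust the self‑signed cert.pem from the earlier step explicitly and issue a request with the JDK’s built‑in HTTP client. A minimal sketch, assuming Java 11 or later and that the certificate’s subject alternative names cover the host you connect to (here 127.0.0.1):

    import java.io.InputStream;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.KeyStore;
    import java.security.cert.Certificate;
    import java.security.cert.CertificateFactory;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class TlsCheck {
        public static void main(String[] args) throws Exception {
            // Load the self-signed certificate (cert.pem) into an in-memory
            // trust store so the client accepts the listener's certificate.
            CertificateFactory cf = CertificateFactory.getInstance("X.509");
            Certificate cert;
            try (InputStream in = Files.newInputStream(Path.of("cert.pem"))) {
                cert = cf.generateCertificate(in);
            }
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            trustStore.load(null, null);
            trustStore.setCertificateEntry("unit-selfsigned", cert);

            TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);

            SSLContext ssl = SSLContext.getInstance("TLS");
            ssl.init(null, tmf.getTrustManagers(), null);

            // Hostname verification still applies: the certificate must name
            // 127.0.0.1 (or whatever host you request) in its SANs.
            HttpClient client = HttpClient.newBuilder().sslContext(ssl).build();
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://127.0.0.1:443/")).GET().build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
        }
    }

A successful run prints the HTTP status code returned by the application over the encrypted connection.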

Summary

The powerful NGINX Unit listener feature makes supporting HTTP and HTTPS simple and completely application‑agnostic, because encryption is applied at the level of the listener, not the application. To learn about other TLS features like Server Name Indication (SNI) and custom OpenSSL configuration commands, see the NGINX Unit documentation.

To get started with NGINX Unit, see the installation instructions.

NGINX Plus subscribers get support for NGINX Unit at no additional charge. Start your free 30-day trial today or contact us to discuss your use cases.


About The Author

Timo Stark

Product Management Engineer
