Secure your codeless REST API with automatic HTTPS using Data API Builder and Caddy

Introduction

In my previous article, I demonstrated how we can build a codeless REST API with Data API Builder and how the endpoints can be write-protected by introducing roles with the help of Azure AD.

Creating and securing a codeless REST API on Azure using Data API Builder
The previous article describes how to build a codeless REST API using Data API Builder and host it securely on Azure Container Instances.

Unfortunately, the described architecture doesn't provide HTTPS out of the box, which makes its use insecure. This is especially true for any operations requiring an access token.

So in this article, I'll describe an architecture that will protect our codeless REST API with a reverse proxy providing automatic HTTPS to further reduce maintenance!

For this article, the REST API will build on the well-known AdventureWorksLT data set, which we'll host on a slim Azure SQL database.

Then, we will use Caddy as a sidecar to the Data API Builder runtime, and host the container group on Azure Container Instances.

💡 Caddy is a powerful, enterprise-ready, open source web server with automatic HTTPS written in Go. Some benchmarks suggest up to four times the performance of Nginx.

Features & Components

  • Caddy 2, acting as a reverse proxy and providing automatic HTTPS
  • Data API Builder
  • An Azure Container Instance Group
  • Azure SQL Server & Database
  • Azure Storage Account hosting configuration files
  • Let's Encrypt and the ACME protocol
The target architecture featuring automatic HTTPS

Without further ado, let's get started 🧪

Step by step...

🔎 In the sections below, readers of my previous article on Data API Builder may notice some repeated parts. I want my articles to be as easy as possible to follow on their own, which is why I've listed the required sub-steps again...

Azure SQL Database

🪛 First, we'll need an Azure SQL server to host our demo database

az group create `
  --location westeurope `
  --name rg-demo

az sql server create `
  --name sql-azureblue `
  --resource-group rg-demo `
  --admin-password "your-password" `
  --admin-user "sqladmin"
Create an Azure SQL server

🪛 Next, let's set up the database and use the Adventure Works LT sample data.

az sql db create `
  --name sqldb-adventureworks `
  --resource-group rg-demo `
  --server sql-azureblue `
  --backup-storage-redundancy Local `
  --edition Basic `
  --capacity 5 `
  --max-size 2GB `
  --sample-name AdventureWorksLT
Create Azure SQL demo database

🪛 Finally, we need to make sure that all Azure services are able to access our database.

az sql server firewall-rule create `
  --server sql-azureblue `
  --resource-group rg-demo `
  --name AllowAzureServices `
  --start-ip-address 0.0.0.0 `
  --end-ip-address 0.0.0.0
Allow Azure Services to access the Azure SQL Server

Don't worry about the IP range. The command won't open up the server to the entire Internet. Instead, it ticks the checkbox labeled Allow Azure services and resources to access this server.
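
If you want to double-check the result, the following call lists the configured firewall rules; you should see the AllowAzureServices entry with the 0.0.0.0 range.

az sql server firewall-rule list `
  --server sql-azureblue `
  --resource-group rg-demo `
  --output table
List the server firewall rules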

Azure Storage Account & File Shares

For the purpose of this article, we'll need four file shares, which are

  • dab-config
  • proxy-caddyfile
  • proxy-config
  • proxy-data

The dab-config file share will host the dab-config.json file, which provides input to the Data API Builder runtime. The proxy-caddyfile file share will host the Caddyfile, which configures our reverse proxy. proxy-config will host the Caddy configuration directory, and last but not least, proxy-data will persist the Caddy data directory.

🔎 From the Caddy docs: [...] The Caddy data directory stores TLS certificates, private keys, OCSP staples, and other necessary information to the data directory. It should not be purged without an understanding of the implications.

🪛 Okay, let's create the storage account named stdabtlsdemo and the aforementioned file shares.

# Create storage account 
az storage account create `
  --name stdabtlsdemo `
  --resource-group rg-demo `
  --location westeurope

# Store connection string 
$env:AZURE_STORAGE_CONNECTION_STRING = $(az storage account show-connection-string --name stdabtlsdemo --resource-group rg-demo --output tsv)

# Create file shares
az storage share create `
  --name dab-config `
  --account-name stdabtlsdemo

az storage share create `
  --name proxy-caddyfile `
  --account-name stdabtlsdemo

az storage share create `
  --name proxy-config `
  --account-name stdabtlsdemo

az storage share create `
  --name proxy-data `
  --account-name stdabtlsdemo
Create a new storage account and file share

Before we move on and create the Azure Container Instance Group, let's have a closer look at the configuration files.

💡 You can find all configuration files in my GitHub repository.

Data API Builder Runtime Configuration

Here is a basic runtime configuration that exposes a single endpoint, $baseUrl/api/product, for public read access. The endpoint is backed by the table SalesLT.Product.

Further, the connection string is injected via an environment variable called DATABASE_CONNECTION_STRING.

{
    "$schema": "https://dataapibuilder.azureedge.net/schemas/v0.5.35/dab.draft.schema.json",
    "data-source": {
        "database-type": "mssql",
        "options": {
            "set-session-context": false
        },
        "connection-string": "@env('DATABASE_CONNECTION_STRING')"
    },
    "runtime": {
        "rest": {
            "enabled": true,
            "path": "/api"
        },
        "graphql": {
            "allow-introspection": true,
            "enabled": true,
            "path": "/graphql"
        },
        "host": {
            "mode": "development",
            "cors": {
                "origins": [],
                "allow-credentials": false
            },
            "authentication": {
                "provider": "StaticWebApps"
            }
        }
    },
    "entities": {
        "product": {
            "source": "SalesLT.Product",
            "permissions": [
                {
                    "role": "anonymous",
                    "actions": [
                        "read"
                    ]
                }
            ]
        }
    }
}
dab-config.json

🪛 Now is a good time to copy the file dab-config.json to the share called dab-config.
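
As a small sketch of how that could look from the command line (assuming dab-config.json sits in your current working directory and the connection string from above is still set in $env:AZURE_STORAGE_CONNECTION_STRING):

az storage file upload `
  --share-name dab-config `
  --source ./dab-config.json `
  --account-name stdabtlsdemo
Upload dab-config.json to the dab-config file share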

The Caddy runtime configuration

At first glance, configuring Caddy as a reverse proxy seems straightforward. However, there are some implications worth mentioning.

dab-tls-demo-api.westeurope.azurecontainer.io {
	reverse_proxy http://localhost:5000
}
Caddyfile

As we'll run Caddy as a sidecar to the DAB runtime, both containers need to communicate with each other. By default, DAB listens on 5000/TCP and doesn't serve TLS itself, which is why the reverse_proxy directive is prefixed with http.

Also, containers within an ACI group can only communicate with each other via localhost. This contrasts with the setup you might know from docker-compose.yaml files, where containers can reference each other by name; that isn't possible in ACI groups.

Further, the Caddy runtime (read: the ACME client) needs to be reachable under the defined domain name (dab-tls-demo-api.westeurope.azurecontainer.io in my example); otherwise, Let's Encrypt won't be able to issue certificates.

🪛 Now copy the file Caddyfile to the share called proxy-caddyfile.
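
Again as a sketch, assuming the Caddyfile sits in your current working directory and the storage connection string is still set:

az storage file upload `
  --share-name proxy-caddyfile `
  --source ./Caddyfile `
  --account-name stdabtlsdemo
Upload the Caddyfile to the proxy-caddyfile file share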

The ACI Group YAML configuration

The configuration defines two containers called reverse-proxy and data-api-builder, of which only Caddy is exposed to the Internet on 80/TCP and 443/TCP. The instance binds to the DNS name label dab-tls-demo-api, which will later be reachable via dab-tls-demo-api.westeurope.azurecontainer.io.

Then, we mount the aforementioned Azure File Shares into the containers and inject the database connection string into the data-api-builder container as a secure environment variable.

name: ci-adventureworks-tls-api
apiVersion: "2021-10-01"
location: westeurope
properties:
  containers:
    - name: reverse-proxy
      properties:
        image: caddy:2.6
        ports:
          - protocol: TCP
            port: 80
          - protocol: TCP
            port: 443
        resources:
          requests:
            memoryInGB: 1
            cpu: 1
          limits:
            memoryInGB: 1
            cpu: 1
        volumeMounts:
          - name: proxy-caddyfile
            mountPath: /etc/caddy
          - name: proxy-data
            mountPath: /data
          - name: proxy-config
            mountPath: /config

    - name: data-api-builder
      properties:
        image: mcr.microsoft.com/azure-databases/data-api-builder:0.5.35
        resources:
          requests:
            memoryInGB: 1
            cpu: 1
          limits:
            memoryInGB: 1
            cpu: 1
        volumeMounts:
          - name: dab-config
            mountPath: /dab-config
        environmentVariables:
          - name: DATABASE_CONNECTION_STRING
            secureValue: "<your-connection-string>"
          - name: ASPNETCORE_LOGGING__CONSOLE__DISABLECOLORS
            value: "true"
        command:
          - dotnet
          - Azure.DataApiBuilder.Service.dll
          - --ConfigFileName
          - /dab-config/dab-config.json

  ipAddress:
    ports:
      - protocol: TCP
        port: 80
      - protocol: TCP
        port: 443
    type: Public        
    dnsNameLabel: dab-tls-demo-api

  osType: Linux

  volumes:
    - name: proxy-caddyfile
      azureFile: 
        shareName: proxy-caddyfile
        storageAccountName: stdabtlsdemo 
        storageAccountKey: "<your-key>"
    - name: proxy-data
      azureFile: 
        shareName: proxy-data
        storageAccountName: stdabtlsdemo 
        storageAccountKey: "<your-key>"
    - name: proxy-config
      azureFile: 
        shareName: proxy-config
        storageAccountName: stdabtlsdemo 
        storageAccountKey: "<your-key>"
    - name: dab-config
      azureFile: 
        shareName: dab-config
        storageAccountName: stdabtlsdemo 
        storageAccountKey: "<your-key>"
ci-adventureworks-tls-api.yaml

🪛 Obviously, you need to replace the placeholders with your values. You can retrieve the storage account keys as follows...

az storage account keys list `
  --resource-group rg-demo `
  --account-name stdabtlsdemo `
  --query [0].value `
  --output tsv
Retrieve the storage account key

... and the connection string.

az sql db show-connection-string `
  --client ado.net `
  --name sqldb-adventureworks `
  --server sql-azureblue `
  --output tsv 
Retrieve the connection string
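
Note that the returned template still contains the <username> and <password> placeholders. As a quick PowerShell sketch, you could fill them in with the admin credentials created earlier (adjust to your own values) before pasting the result into the YAML file:

$template = az sql db show-connection-string `
  --client ado.net `
  --name sqldb-adventureworks `
  --server sql-azureblue `
  --output tsv

$connectionString = $template.Replace('<username>', 'sqladmin').Replace('<password>', 'your-password')
Fill in the connection string placeholders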

🔎 Unfortunately, for now, we can't mount subfolders of a single Azure File Share into containers, and therefore need a dedicated file share for every configuration file 😞

✅ Checkpoint

By now, your file shares should look as follows.

.
└── File shares/
    ├── dab-config/
    │   └── dab-config.json
    ├── proxy-caddyfile/
    │   └── Caddyfile
    ├── proxy-config/
    │   └── (empty)
    └── proxy-data/
        └── (empty)
File Share structure

Azure Container Instance Group

Now that you have replaced the placeholders, we can finally fire up the container group.

az container create `
  --resource-group rg-demo `
  --file ci-adventureworks-tls-api.yaml
Create the Azure Container Instance Group
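
Caddy requests its certificate from Let's Encrypt right after startup. If you want to follow along, you can inspect the proxy's logs, for example with the container names from the YAML above:

az container logs `
  --resource-group rg-demo `
  --name ci-adventureworks-tls-api `
  --container-name reverse-proxy
Inspect the Caddy logs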

Testing it out

Give the containers some time to start, and then verify that TLS 1.3 is properly set up for our API. The easiest way is to use a browser, so go ahead and paste the URL https://dab-tls-demo-api.westeurope.azurecontainer.io/api/product/ProductID/680 into your favorite browser...
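
Alternatively, if you have curl at hand, a verbose request shows the negotiated TLS version and the certificate issuer (a quick sketch for Windows; replace NUL with /dev/null on Linux or macOS):

curl.exe -sv -o NUL https://dab-tls-demo-api.westeurope.azurecontainer.io/api/product/ProductID/680
Verify the TLS handshake with curl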

Success!!! 🚀🚀🚀 As depicted in the screenshot, the certificate was issued by Let's Encrypt, and our data in transit is secured from prying eyes. 💪🏼

Considerations

  • For production environments, you'd usually create a CNAME record, e.g. api.mydomain.io, which points to dab-tls-demo-api.westeurope.azurecontainer.io. If you do so, remember to use this CNAME record in your Caddyfile, and not the DNS name label managed by ACI (see the sketch below)!
  • Also, you'd usually build your own Docker image instead of mounting the Caddyfile from an Azure Storage Account.
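
For illustration, the Caddyfile from above would then simply reference that custom domain (api.mydomain.io being a hypothetical placeholder):

api.mydomain.io {
	reverse_proxy http://localhost:5000
}
Caddyfile using a custom domain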

Conclusion

With the help of the Data API Builder and Caddy, we have provisioned a TLS-secured, codeless REST API ready to serve our requirements!

Both key components, DAB and Caddy, have helped us to keep development time and maintenance low. 🚀

Again, that was fun! 😎 Stay tuned for more articles!

Further reading

  • Official Caddy Docker image (Docker Hub)
  • Caddy documentation: a powerful, enterprise-ready, open source web server with automatic HTTPS written in Go
  • YAML reference for container groups (Azure Container Instances documentation)
  • Supplement repository with example files: matthiasguentert/data-api-builder-article, folder dab-with-caddy-and-tls (GitHub)