
Directus CMS Load Testing Strategy — Load Testing Directus with k6 and Locust

2025-06-12 · อ. บอม — SiamCafe.net · 1,445 words

What Is Directus CMS and Why Load Test It?

Directus is an open source headless CMS that automatically generates REST and GraphQL APIs from a SQL database. It supports PostgreSQL, MySQL, SQLite, MS SQL, and Oracle, and provides an Admin UI for managing content without writing code, making it a good fit as the backend for websites, mobile apps, and IoT applications.

Load testing matters for Directus because it is an API-first CMS: every request goes through the API. You need to know how many concurrent users it can handle, how response times behave under heavy traffic, how the database queries Directus generates perform, and whether the caching strategy is actually working.

What to test: REST API endpoints (GET, POST, PUT, DELETE), GraphQL queries, file uploads and downloads, authentication flows, real-time subscriptions (WebSocket), complex filter and relation queries, and bulk operations.

Recommended tools for load testing Directus: k6 (Grafana) for scripted load tests, Locust for Python-based distributed testing, Artillery for YAML-based testing, and Apache JMeter for GUI-based complex scenarios.

Installing Directus and Configuring It for Production

Deploying Directus for production:

# === Docker Compose for Directus production ===
# docker-compose.yml

# version: "3.8"
# services:
#   directus:
#     image: directus/directus:10.10
#     ports: ["8055:8055"]
#     volumes:
#       - directus-uploads:/directus/uploads
#       - directus-extensions:/directus/extensions
#     depends_on: [postgres, redis]
#     environment:
#       KEY: "random-secret-key-here"
#       SECRET: "random-secret-here"
#       DB_CLIENT: pg
#       DB_HOST: postgres
#       DB_PORT: 5432
#       DB_DATABASE: directus
#       DB_USER: directus
#       DB_PASSWORD: directus_pass
#       CACHE_ENABLED: "true"
#       CACHE_STORE: redis
#       CACHE_REDIS: "redis://redis:6379"
#       CACHE_AUTO_PURGE: "true"
#       CACHE_TTL: "5m"
#       RATE_LIMITER_ENABLED: "true"
#       RATE_LIMITER_STORE: redis
#       RATE_LIMITER_POINTS: "50"
#       RATE_LIMITER_DURATION: "1"
#       ADMIN_EMAIL: admin@example.com
#       ADMIN_PASSWORD: admin_password
#       PUBLIC_URL: "https://cms.example.com"
#       MAX_PAYLOAD_SIZE: "10mb"
#       ASSETS_TRANSFORM_MAX_CONCURRENT: "10"
#
#   postgres:
#     image: postgres:16
#     volumes: ["pgdata:/var/lib/postgresql/data"]
#     environment:
#       POSTGRES_DB: directus
#       POSTGRES_USER: directus
#       POSTGRES_PASSWORD: directus_pass
#     command: >
#       postgres
#       -c shared_buffers=256MB
#       -c effective_cache_size=1GB
#       -c work_mem=64MB
#       -c max_connections=200
#
#   redis:
#     image: redis:7-alpine
#     command: redis-server --maxmemory 256mb --maxmemory-policy allkeys-lru
#     volumes: ["redis-data:/data"]
#
#   nginx:
#     image: nginx:alpine
#     ports: ["80:80", "443:443"]
#     volumes:
#       - ./nginx.conf:/etc/nginx/nginx.conf
#     depends_on: [directus]
#
# volumes:
#   directus-uploads:
#   directus-extensions:
#   pgdata:
#   redis-data:

# === Nginx Reverse Proxy ===
# nginx.conf (simplified)
# upstream directus {
#     server directus:8055;
# }
# server {
#     listen 80;
#     server_name cms.example.com;
#     
#     location / {
#         proxy_pass http://directus;
#         proxy_set_header Host $host;
#         proxy_set_header X-Real-IP $remote_addr;
#         proxy_cache_valid 200 5m;
#     }
#     
#     location /assets/ {
#         proxy_pass http://directus;
#         proxy_cache_valid 200 1h;
#         expires 1h;
#     }
# }
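The RATE_LIMITER_POINTS=50 / RATE_LIMITER_DURATION=1 pair in the compose file means each client may spend 50 points per second (one point per request by default). Before running a load test it helps to predict how many 429 responses that budget should produce. The sketch below illustrates fixed-window limiter behavior only; it is not Directus's internal implementation:

```python
def expected_429s(request_times, points=50, duration=1.0):
    """Count requests exceeding `points` per `duration`-second window
    (fixed-window illustration, not Directus's actual limiter)."""
    rejected = 0
    window_start = None
    used = 0
    for t in sorted(request_times):
        # open a new window when the previous one has elapsed
        if window_start is None or t - window_start >= duration:
            window_start, used = t, 0
        if used < points:
            used += 1
        else:
            rejected += 1
    return rejected

# 120 requests arriving in the same second against a 50-point budget
print(expected_429s([0.0] * 120))  # 70 rejected
```

If a load test shows a wall of 429s at exactly this rate, the limiter is working as configured; raise RATE_LIMITER_POINTS (or disable the limiter for the test) before interpreting latency numbers.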

# === Create test data ===
# Use the Directus API to create test collections
curl -X POST http://localhost:8055/collections \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "collection": "articles",
    "schema": {},
    "fields": [
      {"field": "title", "type": "string"},
      {"field": "content", "type": "text"},
      {"field": "status", "type": "string", "schema": {"default_value": "draft"}},
      {"field": "author", "type": "string"},
      {"field": "views", "type": "integer", "schema": {"default_value": 0}}
    ]
  }'

# Seed test data
for i in $(seq 1 1000); do
  curl -s -X POST http://localhost:8055/items/articles \
    -H "Authorization: Bearer $TOKEN" \
    -H "Content-Type: application/json" \
    -d "{\"title\":\"Article $i\",\"content\":\"Content for article $i with enough text for testing.\",\"status\":\"published\",\"author\":\"Author $((i % 10))\"}"
done
echo "Seeded 1000 articles"
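The shell loop above fires 1,000 sequential curl calls. A Python payload generator (a convenience sketch, assuming the same articles schema) makes the seed data easy to inspect and reuse, and Directus also accepts an array body for bulk creation:

```python
def make_articles(n=1000):
    """Build seed payloads matching the articles collection created above."""
    return [
        {
            "title": f"Article {i}",
            "content": f"Content for article {i} with enough text for testing.",
            "status": "published",
            "author": f"Author {i % 10}",  # 10 distinct authors, like the shell loop
        }
        for i in range(1, n + 1)
    ]

articles = make_articles(1000)
print(len(articles))  # 1000
# Bulk create in one request instead of 1000 (requests library assumed):
# requests.post(f"{base_url}/items/articles", json=articles, headers=auth_headers)
```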

Building Load Test Scripts with k6

k6 scripts for testing the Directus API:

// k6_directus_test.js — Directus Load Test with k6
import http from 'k6/http';
import { check, sleep, group } from 'k6';
import { Rate, Trend } from 'k6/metrics';

// Custom metrics
const errorRate = new Rate('errors');
const apiDuration = new Trend('api_duration');

// Test configuration
export const options = {
  stages: [
    { duration: '1m', target: 10 },   // Ramp up
    { duration: '3m', target: 50 },   // Sustain
    { duration: '2m', target: 100 },  // Peak
    { duration: '1m', target: 0 },    // Ramp down
  ],
  thresholds: {
    http_req_duration: ['p(95)<500', 'p(99)<1000'],
    errors: ['rate<0.05'],
    api_duration: ['avg<200'],
  },
};

const BASE_URL = __ENV.BASE_URL || 'http://localhost:8055';
const TOKEN = __ENV.DIRECTUS_TOKEN || '';

const headers = {
  'Content-Type': 'application/json',
  'Authorization': `Bearer ${TOKEN}`,
};

// Login and get token
export function setup() {
  const loginRes = http.post(`${BASE_URL}/auth/login`, JSON.stringify({
    email: 'admin@example.com',
    password: 'admin_password',
  }), { headers: { 'Content-Type': 'application/json' } });

  const token = loginRes.json('data.access_token');
  return { token };
}

export default function (data) {
  const authHeaders = {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${data.token}`,
  };

  group('Read Operations', () => {
    // List articles
    const listRes = http.get(
      `${BASE_URL}/items/articles?limit=20&sort=-date_created`,
      { headers: authHeaders }
    );
    check(listRes, { 'list 200': (r) => r.status === 200 });
    errorRate.add(listRes.status !== 200);
    apiDuration.add(listRes.timings.duration);

    // Get single article (ids 1-1000 from the seed step)
    const articleId = Math.floor(Math.random() * 1000) + 1;
    const singleRes = http.get(
      `${BASE_URL}/items/articles/${articleId}`,
      { headers: authHeaders }
    );
    check(singleRes, { 'single 200': (r) => r.status === 200 });

    // Filter query
    const filterRes = http.get(
      `${BASE_URL}/items/articles?filter[status][_eq]=published&limit=10`,
      { headers: authHeaders }
    );
    check(filterRes, { 'filter 200': (r) => r.status === 200 });

    // Search
    const searchRes = http.get(
      `${BASE_URL}/items/articles?search=test&limit=10`,
      { headers: authHeaders }
    );
    check(searchRes, { 'search 200': (r) => r.status === 200 });
  });

  group('Write Operations', () => {
    // Create article (__VU/__ITER are k6 execution-context globals)
    const createRes = http.post(
      `${BASE_URL}/items/articles`,
      JSON.stringify({
        title: `Load Test Article ${__VU}-${__ITER}`,
        content: 'Generated during load testing',
        status: 'draft',
        author: 'load-tester',
      }),
      { headers: authHeaders }
    );
    check(createRes, { 'create 200': (r) => r.status === 200 });

    if (createRes.status === 200) {
      const id = createRes.json('data.id');

      // Update article
      const updateRes = http.patch(
        `${BASE_URL}/items/articles/${id}`,
        JSON.stringify({ title: `Updated ${id}` }),
        { headers: authHeaders }
      );
      check(updateRes, { 'update 200': (r) => r.status === 200 });

      // Delete article
      const deleteRes = http.del(
        `${BASE_URL}/items/articles/${id}`,
        null,
        { headers: authHeaders }
      );
      check(deleteRes, { 'delete 204': (r) => r.status === 204 });
    }
  });

  group('GraphQL', () => {
    const gqlRes = http.post(`${BASE_URL}/graphql`, JSON.stringify({
      query: `{
        articles(limit: 10, sort: ["-date_created"]) {
          id
          title
          status
          author
        }
      }`,
    }), { headers: authHeaders });
    check(gqlRes, { 'graphql 200': (r) => r.status === 200 });
  });

  sleep(Math.random() * 2 + 0.5);
}

// Run: k6 run --env BASE_URL=http://localhost:8055 --env DIRECTUS_TOKEN=xxx k6_directus_test.js
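The thresholds block in the script above (p(95)&lt;500, p(99)&lt;1000, error rate &lt;5%) can also be re-checked offline against an exported k6 summary. A small Python sketch of the same rules, assuming the `{"metrics": ...}` shape that `--summary-export` writes:

```python
def check_thresholds(metrics, p95_max=500, p99_max=1000, err_max=0.05):
    """Evaluate the same limits as the k6 `thresholds` option, offline."""
    failures = []
    dur = metrics.get("http_req_duration", {})
    if dur.get("p(95)", 0) > p95_max:
        failures.append(f"p(95) {dur['p(95)']:.0f}ms > {p95_max}ms")
    if dur.get("p(99)", 0) > p99_max:
        failures.append(f"p(99) {dur['p(99)']:.0f}ms > {p99_max}ms")
    if metrics.get("errors", {}).get("rate", 0) > err_max:
        failures.append(f"error rate {metrics['errors']['rate']:.2%} > {err_max:.0%}")
    return failures

sample = {"http_req_duration": {"p(95)": 420.0, "p(99)": 1200.0},
          "errors": {"rate": 0.01}}
for f in check_thresholds(sample):
    print("FAIL:", f)  # only p(99) exceeds its limit here
```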

Load Testing the Directus API with Locust

Python-based distributed load testing:

#!/usr/bin/env python3
# locustfile.py — Directus Load Test with Locust
from locust import HttpUser, task, between, events
import json
import random
import logging
import time

logging.basicConfig(level=logging.INFO)

class DirectusUser(HttpUser):
    wait_time = between(1, 3)
    weight = 10  # ten regular API users for every admin user
    host = "http://localhost:8055"
    token = None
    
    def on_start(self):
        resp = self.client.post("/auth/login", json={
            "email": "admin@example.com",
            "password": "admin_password",
        })
        if resp.status_code == 200:
            self.token = resp.json()["data"]["access_token"]
            self.headers = {
                "Authorization": f"Bearer {self.token}",
                "Content-Type": "application/json",
            }
        else:
            self.headers = {}  # avoid AttributeError in tasks if login fails
            logging.error(f"Login failed: {resp.status_code}")
    
    @task(5)
    def list_articles(self):
        self.client.get(
            "/items/articles?limit=20&sort=-date_created",
            headers=self.headers,
            name="/items/articles [LIST]",
        )
    
    @task(3)
    def get_article(self):
        article_id = random.randint(1, 1000)
        self.client.get(
            f"/items/articles/{article_id}",
            headers=self.headers,
            name="/items/articles/:id [GET]",
        )
    
    @task(2)
    def filter_articles(self):
        self.client.get(
            "/items/articles?filter[status][_eq]=published&limit=10&fields=id,title,status",
            headers=self.headers,
            name="/items/articles [FILTER]",
        )
    
    @task(2)
    def search_articles(self):
        terms = ["test", "article", "content", "draft", "published"]
        self.client.get(
            f"/items/articles?search={random.choice(terms)}&limit=10",
            headers=self.headers,
            name="/items/articles [SEARCH]",
        )
    
    @task(1)
    def create_article(self):
        resp = self.client.post(
            "/items/articles",
            json={
                "title": f"Locust Test {time.time()}",
                "content": "Created by Locust load test",
                "status": "draft",
                "author": "locust-tester",
            },
            headers=self.headers,
            name="/items/articles [CREATE]",
        )
        
        if resp.status_code == 200:
            article_id = resp.json()["data"]["id"]
            
            self.client.patch(
                f"/items/articles/{article_id}",
                json={"status": "published"},
                headers=self.headers,
                name="/items/articles/:id [UPDATE]",
            )
            
            self.client.delete(
                f"/items/articles/{article_id}",
                headers=self.headers,
                name="/items/articles/:id [DELETE]",
            )
    
    @task(1)
    def graphql_query(self):
        self.client.post(
            "/graphql",
            json={
                "query": """
                {
                    articles(limit: 10, sort: ["-date_created"], filter: {status: {_eq: "published"}}) {
                        id
                        title
                        author
                        date_created
                    }
                }
                """,
            },
            headers=self.headers,
            name="/graphql [QUERY]",
        )
    
    @task(1)
    def get_server_info(self):
        self.client.get("/server/info", headers=self.headers, name="/server/info")

class DirectusAdmin(HttpUser):
    wait_time = between(5, 10)
    weight = 1  # far fewer admin users than API consumers
    host = "http://localhost:8055"
    
    def on_start(self):
        self.headers = {}  # fallback so tasks do not crash if login fails
        resp = self.client.post("/auth/login", json={
            "email": "admin@example.com",
            "password": "admin_password",
        })
        if resp.status_code == 200:
            self.token = resp.json()["data"]["access_token"]
            self.headers = {"Authorization": f"Bearer {self.token}"}
    
    @task
    def admin_dashboard(self):
        self.client.get("/activity?limit=20", headers=self.headers, name="/activity [ADMIN]")
        self.client.get("/collections", headers=self.headers, name="/collections [ADMIN]")
        self.client.get("/fields/articles", headers=self.headers, name="/fields [ADMIN]")

# Run: locust -f locustfile.py --host http://localhost:8055
# Web UI: http://localhost:8089
# Headless: locust -f locustfile.py --headless -u 100 -r 10 -t 5m
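The @task(n) decorators above are relative weights: within DirectusUser, list_articles runs with weight 5 out of a total of 5+3+2+2+1+1+1 = 15. A quick sketch of the resulting traffic mix:

```python
from fractions import Fraction

# Task weights as declared in the locustfile above
weights = {
    "list_articles": 5, "get_article": 3, "filter_articles": 2,
    "search_articles": 2, "create_article": 1, "graphql_query": 1,
    "get_server_info": 1,
}
total = sum(weights.values())  # 15
mix = {name: Fraction(w, total) for name, w in weights.items()}
print(mix["list_articles"])  # 1/3 of requests are list calls
```

Keeping reads dominant (roughly 12:3 read:write here) mirrors typical CMS traffic; skew the weights toward writes only if your production workload really is write-heavy.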

Analyzing Results and Optimizing Performance

Analyzing load test results and optimizing:

#!/usr/bin/env python3
# analyze_results.py — Load Test Results Analysis
import json
import pandas as pd
import numpy as np
from pathlib import Path
from datetime import datetime

class LoadTestAnalyzer:
    def __init__(self):
        self.results = []
    
    def parse_k6_results(self, json_file):
        with open(json_file) as f:
            data = json.load(f)
        
        metrics = data.get("metrics", {})
        
        return {
            "tool": "k6",
            "http_req_duration_avg": metrics.get("http_req_duration", {}).get("avg", 0),
            "http_req_duration_p95": metrics.get("http_req_duration", {}).get("p(95)", 0),
            "http_req_duration_p99": metrics.get("http_req_duration", {}).get("p(99)", 0),
            "http_reqs_total": metrics.get("http_reqs", {}).get("count", 0),
            "http_reqs_rate": metrics.get("http_reqs", {}).get("rate", 0),
            "errors_rate": metrics.get("errors", {}).get("rate", 0),
            "vus_max": metrics.get("vus_max", {}).get("value", 0),
        }
    
    def parse_locust_csv(self, stats_csv):
        df = pd.read_csv(stats_csv)
        aggregated = df[df["Name"] == "Aggregated"].iloc[0] if "Aggregated" in df["Name"].values else df.iloc[-1]
        
        return {
            "tool": "locust",
            "total_requests": int(aggregated.get("Request Count", 0)),
            "failure_count": int(aggregated.get("Failure Count", 0)),
            "avg_response_time": float(aggregated.get("Average Response Time", 0)),
            "p50_response_time": float(aggregated.get("50%", 0)),
            "p95_response_time": float(aggregated.get("95%", 0)),
            "p99_response_time": float(aggregated.get("99%", 0)),
            "requests_per_sec": float(aggregated.get("Requests/s", 0)),
        }
    
    def generate_report(self, results, output_path="load_test_report.html"):
        thresholds = {
            "avg_response_time": 200,
            "p95_response_time": 500,
            "p99_response_time": 1000,
            "error_rate": 0.05,
        }
        html = f"""<html>
<head><title>Directus CMS Load Test Report | SiamCafe</title></head>
<body>
<h1>Directus CMS Load Test Report</h1>
<p>Generated: {datetime.now().strftime('%Y-%m-%d %H:%M')}</p>
<table>
<tr><th>Metric</th><th>Value</th><th>Threshold</th><th>Status</th></tr>
"""
        for key, value in results.items():
            threshold = thresholds.get(key)
            if isinstance(value, float) and threshold is not None:
                status = "PASS" if value <= threshold else "FAIL"
                html += f"<tr><td>{key}</td><td>{value:.2f}</td><td>{threshold}</td><td>{status}</td></tr>\n"
            else:
                html += f"<tr><td>{key}</td><td>{value}</td><td>-</td><td>-</td></tr>\n"
        html += "</table></body></html>"
        Path(output_path).write_text(html)
        print(f"Report saved to {output_path}")
    
    def recommend_optimizations(self, results):
        recommendations = []
        avg_rt = results.get("avg_response_time", 0) or results.get("http_req_duration_avg", 0)
        p95_rt = results.get("p95_response_time", 0) or results.get("http_req_duration_p95", 0)
        
        if avg_rt > 200:
            recommendations.append({
                "priority": "high",
                "area": "Response Time",
                "suggestion": "Enable Redis caching (CACHE_ENABLED=true, CACHE_STORE=redis)",
            })
        if p95_rt > 1000:
            recommendations.append({
                "priority": "high",
                "area": "Database",
                "suggestion": "Add database indexes on frequently queried columns, increase shared_buffers",
            })
        if avg_rt > 100:
            recommendations.append({
                "priority": "medium",
                "area": "Caching",
                "suggestion": "Increase CACHE_TTL, add CDN for static assets",
            })
        recommendations.append({
            "priority": "medium",
            "area": "Connection Pooling",
            "suggestion": "Use PgBouncer for PostgreSQL connection pooling",
        })
        return recommendations

# Usage:
# analyzer = LoadTestAnalyzer()
# results = analyzer.parse_locust_csv("locust_stats.csv")
# analyzer.generate_report(results)
# recs = analyzer.recommend_optimizations(results)
# for r in recs:
#     print(f"[{r['priority'].upper()}] {r['area']}: {r['suggestion']}")
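Percentiles like the p95 used throughout this analysis can also be computed directly from raw per-request timings (for example, from k6's point-by-point JSON output). A common nearest-rank sketch:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile: smallest value with at least p% of samples <= it."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

latencies_ms = [80, 95, 110, 120, 150, 180, 220, 400, 650, 900]
print(percentile(latencies_ms, 50))  # 150
print(percentile(latencies_ms, 95))  # 900
```

Note how one slow outlier dominates p95 while barely moving the average; this is why the budgets below are stated as percentiles, not means.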

CI/CD Integration and Automated Testing

Integrating load testing into the CI/CD pipeline:

# .github/workflows/load-test.yml
name: Directus Load Test

on:
  schedule:
    - cron: '0 2 * * 1'  # Weekly Monday 02:00
  workflow_dispatch:
    inputs:
      target_url:
        description: 'Target Directus URL'
        required: true
        default: 'http://staging.example.com'
      duration:
        description: 'Test duration'
        required: true
        default: '5m'
      users:
        description: 'Max concurrent users'
        required: true
        default: '50'

jobs:
  load-test-k6:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install k6
        run: |
          sudo gpg -k
          sudo gpg --no-default-keyring --keyring /usr/share/keyrings/k6-archive-keyring.gpg --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys C5AD17C747E3415A3642D57D77C6C491D6AC1D68
          echo "deb [signed-by=/usr/share/keyrings/k6-archive-keyring.gpg] https://dl.k6.io/deb stable main" | sudo tee /etc/apt/sources.list.d/k6.list
          sudo apt-get update
          sudo apt-get install k6

      - name: Run k6 Load Test
        run: |
          # DIRECTUS_TOKEN must be defined as a repository secret.
          # --summary-export writes the {"metrics": ...} document parsed below;
          # --out json would stream per-request points instead.
          k6 run \
            --env BASE_URL=${{ github.event.inputs.target_url || 'http://staging.example.com' }} \
            --env DIRECTUS_TOKEN=${{ secrets.DIRECTUS_TOKEN }} \
            --summary-export=results.json \
            --summary-trend-stats="avg,min,med,max,p(90),p(95),p(99)" \
            tests/k6_directus_test.js
        continue-on-error: true

      - name: Upload Results
        uses: actions/upload-artifact@v4
        with:
          name: k6-results
          path: results.json

      - name: Check Thresholds
        run: |
          python3 -c "
          import json, sys
          with open('results.json') as f:
              data = json.load(f)
          metrics = data.get('metrics', {})
          p95 = metrics.get('http_req_duration', {}).get('p(95)', 0)
          errors = metrics.get('errors', {}).get('rate', 0)
          print(f'P95 Response Time: {p95:.0f}ms')
          print(f'Error Rate: {errors:.2%}')
          if p95 > 500 or errors > 0.05:
              print('FAIL: Performance thresholds exceeded!')
              sys.exit(1)
          print('PASS: All thresholds met')
          "

  load-test-locust:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install Locust
        run: pip install locust

      - name: Run Locust Test
        run: |
          locust -f tests/locustfile.py \
            --host ${{ github.event.inputs.target_url || 'http://staging.example.com' }} \
            --headless \
            -u ${{ github.event.inputs.users || '50' }} \
            -r 5 \
            -t ${{ github.event.inputs.duration || '5m' }} \
            --csv=locust_results \
            --html=locust_report.html
        continue-on-error: true

      - name: Upload Reports
        uses: actions/upload-artifact@v4
        with:
          name: locust-results
          path: |
            locust_results_stats.csv
            locust_report.html

# === Performance Budget (Directus) ===
# Endpoint            | P95 Target | Max Error Rate
# GET /items/:coll    | 200ms      | 0.1%
# POST /items/:coll   | 500ms      | 0.5%
# GET /items/:id      | 100ms      | 0.1%
# GraphQL query       | 300ms      | 0.5%
# File upload         | 2000ms     | 1.0%
# Auth login          | 500ms      | 0.5%
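The budget table above can be enforced mechanically in the pipeline. A sketch that maps each endpoint to its (p95 target, max error rate) pair and flags violations:

```python
BUDGET = {  # endpoint -> (p95 target in ms, max error rate)
    "GET /items/:coll": (200, 0.001),
    "POST /items/:coll": (500, 0.005),
    "GET /items/:id": (100, 0.001),
    "GraphQL query": (300, 0.005),
    "File upload": (2000, 0.010),
    "Auth login": (500, 0.005),
}

def over_budget(measured):
    """measured: endpoint -> (p95 ms, error rate). Returns violating endpoints."""
    bad = []
    for endpoint, (p95, err) in measured.items():
        target_p95, target_err = BUDGET[endpoint]
        if p95 > target_p95 or err > target_err:
            bad.append(endpoint)
    return bad

print(over_budget({"GET /items/:coll": (180, 0.0005),
                   "GraphQL query": (450, 0.001)}))  # ['GraphQL query']
```

Exiting non-zero when `over_budget` returns anything turns the budget into a CI gate, the same pattern as the inline threshold check in the k6 job.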

FAQ: Frequently Asked Questions

Q: How many concurrent users can Directus handle?

A: It depends on hardware and configuration. A single instance on 4 CPU / 8 GB RAM handles roughly 100-200 concurrent users for read-heavy workloads; with the Redis cache enabled, this can rise to 500+ concurrent users. For higher traffic, scale horizontally with multiple Directus instances behind a load balancer.
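Capacity figures like these can be sanity-checked with Little's Law: concurrent users ≈ throughput × (response time + think time). A quick sketch with illustrative numbers:

```python
def concurrent_users(rps, response_time_s, think_time_s):
    """Little's Law: L = lambda * W, where W includes user think time."""
    return rps * (response_time_s + think_time_s)

# 100 req/s at 250 ms response time with 1.75 s think time between requests
print(concurrent_users(100, 0.25, 1.75))  # 200.0 users kept busy
```

Run the formula in reverse to size a test: to simulate 500 users with a 2 s think time, the server must sustain roughly 500 / 2.25 ≈ 220 req/s.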

Q: Why is Directus slow with complex relations?

A: Directus generates SQL queries from API requests automatically. When a request includes deep relations (e.g. fields=*.*.*), Directus generates multiple JOINs, which slows the query down. Fix this by limiting the requested fields (fields=id,title,author.name instead of fields=*), using limit/offset pagination, enabling the Redis cache, and adding database indexes on foreign key columns.
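Limiting fields is easiest when the fields parameter is built programmatically, which also avoids the stray spaces that would corrupt the query string. A small helper sketch:

```python
from urllib.parse import urlencode

def fields_param(*paths):
    """Build a Directus `fields` query value, e.g. fields=id,title,author.name."""
    return ",".join(paths)

# safe=",." keeps commas and dots literal in the encoded query string
query = urlencode({
    "fields": fields_param("id", "title", "author.name"),
    "limit": 10,
}, safe=",.")
print(query)  # fields=id,title,author.name&limit=10
```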

Q: k6 or Locust: which one should I choose?

A: k6 tests are written in JavaScript and run from a single binary with no dependencies to install; it is very resource efficient, outputs rich metrics, and integrates well with Grafana, which makes it a good fit for CI/CD. Locust tests are written in Python; it has a clean web UI, makes distributed testing easy, and allows more flexible custom logic, which suits complex scenarios. Recommendation: k6 for automated CI/CD testing, Locust for exploratory testing.

Q: How do I optimize Directus for high traffic?

A: Enable the Redis cache (CACHE_ENABLED=true), use a CDN for assets, add database indexes, use PgBouncer for connection pooling, scale horizontally with multiple instances behind a load balancer, use read replicas for read-heavy workloads, limit fields in API requests, set an appropriate CACHE_TTL, and optimize database queries with EXPLAIN ANALYZE.
