Dynamic Analysis: Integrate Artillery for Performance & Load Testing#53

Merged
1 commit merged into main from tools/artillery-dynamic-analysis on Mar 19, 2026
Conversation


@sliang-code sliang-code commented Mar 12, 2026

Description

This PR introduces Artillery as a dynamic analysis tool for our NodeBB instance. Unlike static analysis which checks code quality, Artillery allows us to simulate real-world user traffic to evaluate the system's performance, stability, and latency under load.

Installation Details

  1. Lockfile: Updated package-lock.json reflects the installation of the Artillery engine and its dependencies.
  2. Configuration: Added artillery-test.yml in the root directory to define the testing scenarios.
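
For reference, a minimal artillery-test.yml matching the setup described in this PR might look like the following (a sketch: the port, warm-up duration, and arrival rate follow the details given elsewhere in this description, while the exact key layout is illustrative):

```yaml
config:
  target: "http://localhost:4567"   # local NodeBB instance
  phases:
    - duration: 30                  # 30-second warm-up phase
      arrivalRate: 5                # 5 new virtual users per second
      name: "Warm up"

scenarios:
  - name: "Land on homepage"
    flow:
      - get:
          url: "/"                  # NodeBB homepage
```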

Execution Details

  1. Terminal Output: Attached below is a JSON summary of the 150-request load test.
  2. Report: The full report.json file is included in the commit as well as the branch.
  3. Screenshots (Partial):
image

Tool Evaluation: Artillery

Pros & Strengths

  1. Node.js Ecosystem Alignment: Being a Node-based tool, it integrates perfectly with our stack and requires zero external runtimes.
  2. Percentile-Based Metrics (Quantitative): Artillery provides p95 and p99 latencies. In our test, while the median was 7ms, the p99 was 46.1ms. This lets us see the "outlier" experiences that averages hide.
  3. CI/CD Ready: The tool is designed to run in headless environments. We can set thresholds (e.g., fail if p95 > 200ms) to automatically block performance-degrading PRs in the future.
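
A sketch of how such a threshold could be expressed with Artillery's `ensure` plugin (syntax per Artillery v2; the specific values are placeholders, not part of this PR's config):

```yaml
config:
  plugins:
    ensure: {}
  ensure:
    thresholds:
      - http.response_time.p95: 200     # fail the run if p95 latency exceeds 200ms
    conditions:
      - expression: vusers.failed == 0  # and if any virtual user failed
```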

Cons & Weaknesses

  1. Protocol Limits: While excellent for HTTP/WebSockets, it requires additional plugins for more complex database-level or protocol-specific stress testing.
  2. Resource Contention (Qualitative): Running the tool on the same machine as the NodeBB server can cause CPU contention, potentially skewing results. For more accurate data, it should eventually be run from a separate runner.

Customization & Maintenance

  1. A Priori (Initial Setup): We customized the target to point to the NodeBB local port and defined a "Warm-up" phase. We also filtered for 2xx response codes to ensure we weren't just measuring the speed of error pages.
  2. Ongoing Maintenance: As we add features like "New Post" or "User Registration," we will need to update the YAML files to include CSRF token handling and JSON body payloads. We should also implement the artillery-plugin-expect to verify content at the same time we measure speed.
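
A possible shape for that artillery-plugin-expect integration (illustrative; assumes the plugin has been installed as a dev dependency):

```yaml
config:
  plugins:
    expect: {}                       # enable artillery-plugin-expect

scenarios:
  - name: "Land on homepage"
    flow:
      - get:
          url: "/"
          expect:
            - statusCode: 200        # verify content while measuring speed
            - contentType: text/html
```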

@sliang-code sliang-code self-assigned this Mar 12, 2026
@sliang-code
Author

report.json
Attached is the report generated by running Artillery against the codebase.

@pebble-fish pebble-fish closed this pull request by merging all changes into main in 74c9773 Mar 19, 2026
@sliang-code
Author

  1. Tool Name & Description
    • Name: Artillery
    • Description: A dynamic load-testing tool used to simulate high volumes of concurrent user traffic. It provides performance metrics like latency, throughput, and error rates to ensure a system remains stable under stress.
    • Source: https://github.com/artilleryio/artillery (this is the upstream repository; the actual integration was installed via npm install)

  2. Static or Dynamic?
    • Type: Dynamic Analysis. It requires the NodeBB server to be running at http://localhost:4567/ (or in GitHub Actions test runs) and tests the system's live response to network requests.

  3. Problems Caught
    • Common/Tail Latency (p50/p999): The median (p50) response time is excellent at 7ms. However, there are extreme outliers at the tail: the p999 value is 133ms. This catches edge-case delays that only a small fraction of users would experience.
    • HTTP Access Success Rates: The tool confirmed a 100% success rate (http.codes.200: 150), meaning that the server didn't crash or drop requests at an arrival rate of 5 users/sec.
    • Bandwidth Bottlenecks: It tracked ~7MB of downloaded data, which helps identify if the homepage payload is too heavy for the network.
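
For context, the metrics cited above live in the aggregate section of Artillery's JSON report; abridged, it looks roughly like this (field names follow Artillery v2's output format, values taken from this test run):

```json
{
  "aggregate": {
    "counters": {
      "http.codes.200": 150,
      "http.requests": 150
    },
    "summaries": {
      "http.response_time": {
        "median": 7,
        "p99": 46.1,
        "p999": 133,
        "max": 324
      }
    }
  }
}
```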

  4. Customization Necessary
    • Target Configuration: I had to point the tool specifically to the NodeBB port (4567).
    • Phase Definition: I customized a 30-second "Warm up" phase with a steady arrivalRate of 5 (meaning 5 user accesses/second). For a production-ready test, I would need to add a higher load phase with much higher numbers (e.g., 50–100 users/sec).
    • Scenario Scripting: I defined a specific "Land on homepage" flow using a GET request to / (the homepage of NodeBB). For a production-ready test, I would need to add more detailed paths such as “/login” (login page) or “/category/2/general-discussion” (general discussion page) to cover other key pages.
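
Those additional paths could be added as extra steps in the scenario's flow, e.g. (a sketch; the category URL is the example quoted above):

```yaml
scenarios:
  - name: "Browse core pages"
    flow:
      - get:
          url: "/"                               # homepage
      - get:
          url: "/login"                          # login page
      - get:
          url: "/category/2/general-discussion"  # example category page
```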

  5. Integration into Development
    • This tool should be integrated into the Continuous Integration & Continuous Deployment (CI/CD) pipeline.
    • The Strategy: Every time a PR is merged to the main branch, Artillery should run a "Smoke Test." If the p95 response time exceeds a specific threshold (e.g., 200ms) or if vusers.failed > 0, the deployment should automatically roll back.
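
One way to wire this into CI is a GitHub Actions job along these lines (a hypothetical sketch; the workflow name, NodeBB start command, and wait step are assumptions, not part of this PR):

```yaml
name: artillery-smoke-test
on:
  push:
    branches: [main]
jobs:
  smoke-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: ./nodebb start                       # assumption: starts NodeBB in the background
      - run: npx wait-on http://localhost:4567    # assumption: wait until the server is up
      # Thresholds defined in artillery-test.yml fail this step on regression
      - run: npx artillery run artillery-test.yml
```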

  6. Accuracy
    • False Positives: If I run this test while my computer is also doing a heavy task, Artillery might report high latency. This is a false positive: the server isn't slow, my testing environment is.
    • True Positives (Don't Care): My report shows a max response time of 324ms. In some cases this happens on the very first request due to cache warming. While it is a "true" slow response, I will ignore it because it doesn't represent the app's long-term performance.
