📚 Performance & Load Testing with JMeter & Java

A Comprehensive Learning Guide (Beginner → Advanced)

Why this chapter matters
Performance testing ensures that your Java application can handle real‑world traffic without degrading user experience. Apache JMeter is the most widely used open‑source tool for this purpose, and Java gives you the power to generate custom data, integrate with your codebase, and extend JMeter’s capabilities.

Goal – By the end of this chapter you will be able to:

  • Build a robust JMeter test plan from scratch.
  • Parameterize tests using Java code (custom samplers, functions, data sets).
  • Run realistic load tests (single machine, distributed, CI‑integrated).
  • Extract and interpret performance metrics.
  • Optimize scripts for speed, reliability, and maintainability.

📑 Table of Contents

| Section | Topics | Deliverables |
|---|---|---|
| 1. Fundamentals | What is performance testing? • JMeter architecture • Core components (Thread Group, Samplers, Listeners, Config Elements) • Basic test plan example | Clear explanations • Simple JMeter GUI walkthrough |
| 2. Implementation | Create a JMeter test plan • Parameterize with Java (CSV, Java Sampler, JSR223) • Run load tests (non‑GUI, ramp‑up, think time) • Analyze metrics (Summary, Aggregate, Dashboard) • Optimize scripts (listeners, properties, multi‑threading) | Step‑by‑step code snippets • Practical JMeter files |
| 3. Advanced Topics | Advanced parameterization (dynamic data, DB, REST) • Distributed JMeter (master/slave, Docker, Kubernetes) • System monitoring (PerfMon, JMX, Grafana) • Custom Java Sampler (interface, code) • Performance dashboard & alerts | Advanced techniques • Performance budget examples |
| 4. Real‑World Applications | E‑commerce site load test • Microservices API test • Database‑heavy workload • CI/CD integration | Case studies with diagrams |
| 5. Exercises | Build, run, analyze, optimize test plans | Project‑style tasks with solutions |

1. Fundamentals

1.1 What is Performance & Load Testing?

| Term | Definition | Typical Goal |
|---|---|---|
| Load Testing | Simulate expected user load to confirm the system handles it. | Verify throughput, response time, error rate. |
| Stress Testing | Push the system beyond expected load to find breaking points. | Determine maximum capacity, identify bottlenecks. |
| Soak Testing | Run under normal load for an extended period. | Detect memory leaks, resource exhaustion. |
| Spike Testing | Abruptly increase/decrease load. | Validate the system’s reaction to sudden load changes. |

Industry Insight – In agile environments, performance testing is usually a continuous activity. It should be part of every sprint, not a one‑off “pre‑launch” task.

1.2 JMeter Overview

Apache JMeter is a pure Java application that runs on the JVM.
Key features:

| Feature | Why it matters | Example |
|---|---|---|
| GUI & Non‑GUI | GUI is for design; non‑GUI (CLI) for production runs. | jmeter -n -t test.jmx -l results.jtl |
| Extensibility | Plugins (PerfMon, JSON, CSV, etc.) | jmeter-plugins-manager |
| Distributed Execution | Master/slave architecture for large loads. | -Rslave1,slave2 |
| Scripting | Beanshell, JSR223, Java Sampler. | org.apache.jmeter.protocol.java.sampler.JSR223Sampler |

Best Practice – Keep the JMeter GUI for development only. Run all production tests in non‑GUI mode to avoid memory leaks and UI overhead.

1.3 Core Components

| Component | Role | Example |
|---|---|---|
| Thread Group | Defines virtual users (threads), ramp‑up, loop count. | Thread Group → 10 users, ramp‑up 30s, 1 loop |
| Samplers | Perform actual requests (HTTP, JDBC, JMS, etc.). | HTTP Request – GET /api/products |
| Listeners | Capture results (View Results Tree, Summary Report, Aggregate Report). | Aggregate Report – Avg, Min, Max, Std Dev |
| Config Elements | Provide shared data (CSV Data Set, HTTP Header Manager). | CSV Data Set – usernames.csv |
| Assertions | Validate responses (Response Assertion, Size Assertion). | Check status code 200 |
| Timers | Simulate think‑time (Constant Timer, Uniform Random Timer). | Uniform Random Timer – 200–800ms |

Tip – Never put heavy listeners (like View Results Tree) in a load test. They consume memory and skew results.

1.4 Simple Test Plan Example

  1. Create a new Test Plan
    File > New > Test Plan
    
  2. Add Thread Group – 5 users, ramp‑up 10s, loop 1
  3. Add HTTP Request
    • Server: example.com
    • Path: /api/products
    • Method: GET
  4. Add Listener – Aggregate Report
  5. Save as simple.jmx

Run it: jmeter -n -t simple.jmx -l results.jtl

The results will show average response time, throughput, error count.


2. Implementation

2.1 Create JMeter Test Plan

Below is a step‑by‑step approach to build a realistic test plan for a REST API.

| Step | Action | JMeter UI |
|---|---|---|
| 1 | Test Plan – Add Test Plan node. | File > New > Test Plan |
| 2 | Thread Group – Add Thread Group (10 threads, ramp‑up 20s). | Add > Threads (Users) > Thread Group |
| 3 | HTTP Header Manager – Add default headers (e.g., Content-Type: application/json). | Add > Config Element > HTTP Header Manager |
| 4 | CSV Data Set Config – Provide dynamic data (user IDs). | Add > Config Element > CSV Data Set Config |
| 5 | HTTP Request – Add a request for each API endpoint. | Add > Sampler > HTTP Request |
| 6 | Assertions – Add Response Assertion to check status 200. | Add > Assertion > Response Assertion |
| 7 | Listeners – Add Aggregate Report and View Results Tree (for debugging). | Add > Listener |
| 8 | Save the plan. | File > Save |

Tip: Use Thread Group → Number of Threads = 10, Ramp-Up Period = 20 seconds. That starts one new user every 2 seconds, which gives a smooth, steady ramp to full load.


2.2 Parameterize JMeter with Java

2.2.1 CSV Data Set vs. Java Sampler

| Approach | Use‑Case | Pros | Cons |
|---|---|---|---|
| CSV Data Set | Static data (e.g., user IDs) – see the example below. | Simple, no code. | Not dynamic; requires a file. |
| Java Sampler | Generate data on‑the‑fly (random strings, timestamps). | Full control, no file I/O. | Requires Java knowledge and an extra jar. |
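For the CSV approach, here is a minimal sketch (the file name users.csv and the variable name userId are just examples). Point a CSV Data Set Config at the file and set Variable Names to userId; each virtual user then picks up the next line:

# users.csv – one user id per line (Variable Names = userId, no header row)
1001
1002
1003

Reference the value in the HTTP Request, e.g. Path = /api/users/${userId}, or inside a JSON body as {"userId":"${userId}"}.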

2.2.2 Java Sampler Example

Create a Java class that implements org.apache.jmeter.protocol.java.sampler.JavaSamplerClient (the interface behind JMeter’s Java Request sampler).

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.UUID;

import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerClient;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
import org.apache.jmeter.samplers.SampleResult;

public class RandomUserSampler implements JavaSamplerClient {

    @Override
    public SampleResult runTest(JavaSamplerContext context) {
        SampleResult sr = new SampleResult();
        sr.setSampleLabel("RandomUserSampler");
        sr.sampleStart();                          // start the response-time clock

        try {
            // 1. Generate a random user id
            String userId = "user-" + UUID.randomUUID();

            // 2. Build the JSON payload
            String payload = "{\"userId\":\"" + userId + "\"}";

            // 3. Call the REST endpoint
            URL url = new URL("https://example.com/api/users");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");
            try (OutputStream os = conn.getOutputStream()) {
                os.write(payload.getBytes(StandardCharsets.UTF_8));
            }

            // 4. Read the response (error stream on 4xx/5xx)
            int status = conn.getResponseCode();
            InputStream is = status < 400 ? conn.getInputStream() : conn.getErrorStream();
            String body = "";
            if (is != null) {
                try (InputStream in = is) {
                    body = new String(in.readAllBytes(), StandardCharsets.UTF_8); // Java 9+
                }
            }
            sr.setSuccessful(status == 200);
            sr.setResponseCode(String.valueOf(status));
            sr.setResponseMessage("HTTP " + status);
            sr.setResponseData(body, StandardCharsets.UTF_8.name());
            sr.setDataType(SampleResult.TEXT);
        } catch (Exception e) {
            sr.setSuccessful(false);
            sr.setResponseMessage(e.getMessage());
        } finally {
            sr.sampleEnd();                        // stop the clock, even on error
        }

        return sr;
    }

    @Override
    public Arguments getDefaultParameters() {      // required by the interface; no GUI parameters here
        return new Arguments();
    }

    @Override
    public void setupTest(JavaSamplerContext context) { /* optional */ }

    @Override
    public void teardownTest(JavaSamplerContext context) { /* optional */ }
}

How to use:

  1. Compile the class, package it into a jar, and copy the jar into JMeter’s lib/ext folder (restart JMeter so it is picked up).
  2. Add a Java Request sampler to your test plan (Add > Sampler > Java Request).
  3. In the sampler’s Classname dropdown, select RandomUserSampler.

Pro Tip – Keep the Java Sampler lightweight; avoid blocking I/O inside runTest.
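If the sampler needs configurable input (say, the target URL), you can expose parameters in the Java Request GUI through getDefaultParameters() and read them from the context. A small sketch – the parameter name baseUrl is illustrative:

@Override
public Arguments getDefaultParameters() {
    Arguments args = new Arguments();                     // org.apache.jmeter.config.Arguments
    args.addArgument("baseUrl", "https://example.com");   // appears as an editable field in the GUI
    return args;
}

// inside runTest(...)
String baseUrl = context.getParameter("baseUrl", "https://example.com"); // second argument is a default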

2.2.3 JSR223 Sampler with Groovy

import java.util.UUID

def userId = "user-" + UUID.randomUUID()
def payload = "{\"userId\":\"${userId}\"}"
def url = new URL("https://example.com/api/users")
def conn = url.openConnection()
conn.setRequestMethod("POST")
conn.setDoOutput(true)
conn.setRequestProperty("Content-Type", "application/json")
conn.getOutputStream().withCloseable { os ->
    os.write(payload.getBytes("UTF-8"))
}
int status = conn.getResponseCode()
// SampleResult is bound automatically inside a JSR223 Sampler
SampleResult.setSuccessful(status == 200)
SampleResult.setResponseMessage("HTTP " + status)
SampleResult.setResponseData(conn.getInputStream().getText("UTF-8"), "UTF-8")

Why Groovy? – Faster to write, no need for compiled jars, still runs on JVM.
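JSR223 elements also come with a few pre-bound objects you will use constantly: vars (JMeter variables), props (JMeter properties), ctx (the sampler context) and log. Continuing the sampler above, a small illustrative sketch:

vars.put("lastUserId", userId)                        // available to later samplers as ${lastUserId}
def env = props.get("env") ?: "staging"               // read a property passed with -Jenv=...
log.info("Created user {} against {}", userId, env)   // written to jmeter.log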


2.3 Run Load Tests

2.3.1 Non‑GUI Mode

jmeter -n -t my_test_plan.jmx -l results.jtl -j jmeter.log
  • -n – Non‑GUI
  • -t – Test plan file
  • -l – Results file (JTL)
  • -j – Log file

2.3.2 Ramp‑Up & Think Time

| Parameter | Meaning | Example | Impact |
|---|---|---|---|
| Ramp‑Up | Time taken to start all threads | 30s for 10 threads → one new thread every 3s | Smooth load increase |
| Think Time | Delay between requests | 200–800ms | Simulates real user pauses |

Add a Uniform Random Timer (200–800ms) after each HTTP Request.

2.3.3 Distributed JMeter

  1. Master – Holds the test plan and orchestrates the run.
  2. Slaves – Run in server mode and execute the requests; the master sends them the test plan.
# Master
jmeter -n -t test.jmx -Rslave1,slave2 -l master_results.jtl

# Each slave – start in server mode (no test plan needed locally)
jmeter -s

Docker Example – Use jmeter-standalone image; map volumes for test plan and results.
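One practical note, worth checking against your JMeter version: since JMeter 4.0 the master/slave RMI traffic is encrypted by default, so either generate a keystore with the bundled create-rmi-keystore script or disable RMI SSL on every node:

# slave
jmeter -s -Jserver.rmi.ssl.disable=true
# master
jmeter -n -t test.jmx -Rslave1,slave2 -Jserver.rmi.ssl.disable=true -l master_results.jtl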


2.4 Analyze Performance Metrics

| Listener | Key Metrics | Typical Threshold |
|---|---|---|
| Aggregate Report | Average, Median, 90th/95th/99th percentile, Error %, Throughput | Avg < 200ms, Throughput > 1000 req/s |
| Summary Report | Sample count, Average, Min, Max, Std Dev, Error %, Throughput | Error % < 1% |
| View Results Tree | Response body, error details | Debug only |

2.4.1 JMeter Dashboard

jmeter -n -t test.jmx -l results.jtl -e -o ./dashboard
  • -e – Generate dashboard
  • -o – Output folder

The dashboard (HTML) contains charts for throughput, latency, errors, and response times.
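You can also build the dashboard from an existing results file without re-running the test (the output folder should be empty or not yet exist):

jmeter -g results.jtl -o ./dashboard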

2.4.2 PerfMon Plugin

| Feature | Use‑Case |
|---|---|
| CPU, Memory, Disk | Monitor system resources on the target machine |
| Network I/O | Detect bandwidth bottlenecks |
| JVM | Monitor heap usage, GC |

Industry Insight – Combine JMeter metrics with system metrics to pinpoint whether the issue is application or infrastructure.


2.5 Optimize Test Scripts

| Issue | Fix | Rationale |
|---|---|---|
| Too many listeners | Remove them or move them to a separate debugging test plan | Reduces memory overhead |
| Hard‑coded data | Use CSV or a Java Sampler | Re‑usability |
| Synchronous waits | Use timers / think‑time | Prevents unrealistic load |
| Large result files | Store results in CSV format | Easier to process |
| Hard‑coded URLs | Use HTTP Request Defaults | Central management |

2.5.1 Use Properties & Variables

# user.properties
threads=20
rampUp=30

In JMeter, reference as ${__P(threads)}.
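To make the plan fully configurable, reference the properties with a default value – e.g. Number of Threads = ${__P(threads,10)} and Ramp-Up = ${__P(rampUp,30)} – and override them per run from the command line (the property names are just the ones used above):

jmeter -n -t my_test_plan.jmx -q user.properties -Jthreads=50 -JrampUp=60 -l results.jtl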

2.5.2 Thread Group vs. Constant Throughput Timer

  • Thread Group – Controls number of threads.
  • Constant Throughput Timer – Controls requests per minute, independent of threads.

Pro Tip – Use Constant Throughput Timer for throughput‑based tests (e.g., 1000 req/min).
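A quick worked example: to hold roughly 1000 requests/minute with 20 threads, either set the timer’s target to 1000 with calculation mode “all active threads”, or set it to 1000 / 20 = 50 samples per minute per thread with “this thread only”. In both cases the Thread Group must supply enough threads (and response times must be short enough) for the target to be reachable.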


3. Advanced Topics

3.1 Advanced Parameterization

| Technique | Description | Example |
|---|---|---|
| Dynamic JSON | Generate JSON on‑the‑fly with Groovy (see the sketch below) | payload = "{\"id\":\"${UUID.randomUUID()}\"}" |
| DB Data | Read from a database via JDBC Sampler | SELECT * FROM users WHERE active=1 |
| REST Auth | Obtain a JWT token in a JSR223 PreProcessor and store it as a variable | vars.put("token", ...) → ${token} |
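A minimal JSR223 PreProcessor sketch for the dynamic‑JSON row (the field names are illustrative): build the body with Groovy’s JsonBuilder and expose it to the following HTTP Request as ${payload}.

import groovy.json.JsonBuilder

def body = new JsonBuilder([
    id       : UUID.randomUUID().toString(),
    createdAt: System.currentTimeMillis()
])
vars.put("payload", body.toString())   // use ${payload} as the HTTP Request body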

3.1.1 Parameterization with JSR223 PreProcessor

def token = getAuthToken()   // placeholder – see the OAuth example in the Exercises section for a full implementation
vars.put("token", token)

Add to HTTP Request as Authorization: Bearer ${token}.


3.2 Distributed JMeter (Advanced)

| Component | Role | Tips |
|---|---|---|
| Master | Orchestrates the slaves | Keep it on a separate machine |
| Slaves | Execute the requests | Start them with -s (server mode) |
| Load Balancer | Distribute load evenly | Use a Constant Throughput Timer on each slave |

Docker Compose Example – Spin up JMeter master + 3 slaves:

services:
  jmeter-master:
    image: jmeter-standalone
    volumes:
      - ./test:/test          # test plan mounted into the container
    command: ["-n", "-t", "/test/test.jmx", "-R", "jmeter-slave1,jmeter-slave2,jmeter-slave3"]
  jmeter-slave1:
    image: jmeter-standalone
    command: ["-s"]
  jmeter-slave2:
    image: jmeter-standalone
    command: ["-s"]
  jmeter-slave3:
    image: jmeter-standalone
    command: ["-s"]

3.3 Performance Dashboard & Alerts

| Tool | Integration | Use‑Case |
|---|---|---|
| Grafana | Visualize JMeter metrics via InfluxDB | Real‑time dashboards |
| Prometheus | Export metrics from JMeter via a plugin | Alerting |
| Slack | Send alerts on threshold breaches | DevOps notifications |

Pro Tip – Use JMeter’s Backend Listener (InfluxDB) to stream metrics during the run, or export the JTL/CSV and ingest it into InfluxDB; Grafana can then plot throughput, latency, and errors.


3.4 System Monitoring (PerfMon)

| Metric | Tool | Threshold (guideline) |
|---|---|---|
| CPU | PerfMon Server Agent | < 80% |
| Memory | PerfMon Server Agent | < 70% |
| GC time | PerfMon Server Agent / JMX | < 5% |
| Disk I/O | PerfMon Server Agent | < 70% |

Industry Insight – If CPU is high but latency is low, the issue may be network. If GC is high and latency spikes, consider heap tuning.
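For a quick, plugin-free look at GC behaviour on the target JVM, the JDK’s jstat tool is enough (replace <pid> with the application’s process id):

jstat -gcutil <pid> 1000    # heap occupancy and GC-time columns, printed every second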


3.5 Custom Java Sampler – Full Example

package com.example.jmeter;

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerClient;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
import org.apache.jmeter.samplers.SampleResult;

public class CustomApiSampler implements JavaSamplerClient {

    @Override
    public SampleResult runTest(JavaSamplerContext context) {
        SampleResult sr = new SampleResult();
        sr.setSampleLabel("CustomApiSampler");
        sr.sampleStart();

        try {
            // 1. Build request
            URL url = new URL("https://example.com/api/compute");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/json");

            // 2. Payload
            String payload = "{\"value\": " + Math.random() + "}";
            try (OutputStream os = conn.getOutputStream()) {
                os.write(payload.getBytes(StandardCharsets.UTF_8));
            }

            // 3. Read response (error stream on 4xx/5xx)
            int status = conn.getResponseCode();
            InputStream is = status < 400 ? conn.getInputStream() : conn.getErrorStream();
            String body = "";
            if (is != null) {
                try (InputStream in = is) {
                    body = new String(in.readAllBytes(), StandardCharsets.UTF_8); // Java 9+
                }
            }
            sr.setSuccessful(status == 200);
            sr.setResponseCode(String.valueOf(status));
            sr.setResponseMessage("HTTP " + status);
            sr.setResponseData(body, StandardCharsets.UTF_8.name());
            sr.setDataType(SampleResult.TEXT);
        } catch (Exception e) {
            sr.setSuccessful(false);
            sr.setResponseMessage(e.getMessage());
        } finally {
            sr.sampleEnd();
        }

        return sr;
    }

    @Override
    public Arguments getDefaultParameters() { return new Arguments(); }

    @Override
    public void setupTest(JavaSamplerContext context) { }

    @Override
    public void teardownTest(JavaSamplerContext context) { }
}

Compile & jar (run from the directory containing com/):

javac -cp "$JMETER_HOME/lib/ext/ApacheJMeter_core.jar:$JMETER_HOME/lib/ext/ApacheJMeter_java.jar" com/example/jmeter/CustomApiSampler.java
jar cvf CustomApiSampler.jar com/example/jmeter/*.class

Add to JMeter – copy CustomApiSampler.jar into lib/ext, restart JMeter, then add a Java Request sampler and select com.example.jmeter.CustomApiSampler.


3.6 Advanced Optimizations

| Optimization | Why | How |
|---|---|---|
| Thread Group vs. Constant Throughput Timer | Control load by requests per minute rather than thread count alone. | Add a Constant Throughput Timer to the Thread Group. |
| JMeter properties | Reduce memory usage and result-file size. | e.g., set jmeter.save.saveservice.output_format=csv and keep jmeter.save.saveservice.response_data=false in jmeter.properties. |
| Result format | Smaller files. | Save results as CSV rather than XML. |
| Listeners | Only the essentials. | Use the Aggregate Report only. |
| Use of JSR223 | Groovy is fast and needs no compiled jar. | Replace Beanshell scripts with JSR223/Groovy. |
| Parallel Controller | Execute sub‑requests concurrently. | Add the Parallel Controller (plugin) inside the Thread Group. |
| Distributed JMeter | Scale to thousands of users. | Add more slave nodes. |

Pro Tip – For millions of requests, use Non‑GUI + Result file + CSV + Parallel Controller.
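At that scale JMeter’s own JVM needs headroom too. On Linux/macOS the bin/jmeter start script honours a HEAP environment variable (check the script shipped with your version), so you can raise the heap without editing anything:

HEAP="-Xms1g -Xmx4g" jmeter -n -t test.jmx -l results.csv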


4. Real‑World Applications

| Scenario | Test Plan Highlights | Metrics Focus | Lessons Learned |
|---|---|---|---|
| E‑commerce site | Load product listing, add to cart, checkout. | Throughput, latency, error rate. | Identify database bottleneck. |
| Microservices API | Simulate 3 services (auth, catalog, payment). | End‑to‑end latency, service errors. | Use distributed tracing (OpenTelemetry). |
| Database‑heavy workload | JDBC Sampler with bulk inserts. | CPU, memory, GC. | Tune the JDBC pool, reuse connections. |
| CI/CD integration | Run JMeter in a Jenkins pipeline. | Build stability, performance regression. | Set performance budgets, fail builds on threshold breaches. |

4.1 E‑commerce Load Test (Example)

  1. Endpoints – /products, /cart, /checkout.
  2. User Flow – Browse → Add to cart → Checkout.
  3. Thread Group – 1000 virtual users, ramp‑up 5 min.
  4. Data – Use CSV for user credentials.
  5. Metrics – 95th percentile < 2s, error % < 0.5%.

Outcome – Identify that checkout endpoint is the slowest due to DB transaction lock.

4.2 Microservices API (Example)

  1. Services – Auth (JWT), Catalog (JSON), Payment (SOAP).
  2. Test Plan – Chain requests with Transaction Controller.
  3. Metrics – End‑to‑end latency, individual service latency.
  4. Result – Payment service is 3× slower due to external gateway.

Lesson – Use Distributed Tracing (e.g., Zipkin) to correlate JMeter metrics with service logs.

4.3 Database‑Heavy Workload (Example)

  1. Sampler – JDBC Sampler, bulk inserts (1000 rows).
  2. Thread Group – 200 threads.
  3. Metrics – CPU > 90%, GC > 5%.
  4. Result – Add connection pooling, tune heap.

Lesson – Performance testing can surface JVM issues before production.

4.4 CI/CD Integration (Example)

  1. Jenkins Pipeline – sh 'jmeter -n -t test.jmx -l results.csv'.
  2. Post‑build – Parse the CSV and check thresholds.
  3. Fail the build – If the 95th percentile > 2s.
  4. Artifacts – Store the dashboard (a minimal Jenkinsfile sketch follows).
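A minimal declarative Jenkinsfile sketch, under these assumptions: JMeter is on the agent’s PATH, and the threshold check is delegated to a script of your own (check_thresholds.sh is a placeholder, not a standard tool):

pipeline {
    agent any
    stages {
        stage('Load test') {
            steps {
                sh 'jmeter -n -t test.jmx -l results.csv -e -o dashboard'
            }
        }
        stage('Check thresholds') {
            steps {
                sh './check_thresholds.sh results.csv'   // exits non-zero if e.g. p95 > 2s, failing the build
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'dashboard/**, results.csv', fingerprint: true
        }
    }
}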

Industry Insight – Performance budgets (e.g., max 1s latency) help maintain quality.


5. Exercises

Goal – Build, run, analyze, and optimize tests.

| # | Task | Deliverable | Difficulty |
|---|---|---|---|
| 1 | Create a simple JMeter plan for GET /api/status. | .jmx file | Beginner |
| 2 | Parameterize the request with user IDs from a CSV file. | CSV Data Set Config | Beginner |
| 3 | Write a Java Sampler that posts random user data to /api/users. | Java class + jar | Intermediate |
| 4 | Run a load test with 50 threads, ramp‑up 20s. | Result CSV | Beginner |
| 5 | Generate a dashboard and identify the 90th percentile. | Dashboard HTML | Intermediate |
| 6 | Optimize the test plan: remove listeners, use a Constant Throughput Timer. | Updated .jmx | Advanced |
| 7 | Create a distributed test: master + 2 slaves. | Docker Compose file | Advanced |
| 8 | Integrate JMeter with a Jenkins pipeline. | Jenkinsfile snippet | Advanced |
| 9 | Write a JSR223 PreProcessor that obtains an OAuth token and stores it in a variable. | Groovy script | Intermediate |
| 10 | Build a custom Java Sampler that reads from a database and posts data. | Java class + DB connection | Advanced |

Solution Snippets (partial)

# 1. Run non‑GUI
jmeter -n -t status.jmx -l status_results.csv
// 9. OAuth token (JSR223 PreProcessor, Groovy)
def conn = new URL("https://auth.example.com/token").openConnection()
conn.setRequestMethod("POST")
conn.setDoOutput(true)
conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded")
conn.getOutputStream().withCloseable { os ->
    os.write("grant_type=client_credentials".getBytes("UTF-8"))
}
def token = new groovy.json.JsonSlurper().parseText(conn.getInputStream().getText("UTF-8")).access_token
vars.put("oauth_token", token)
// 10. Custom Sampler reading from a database (partial)
public class DbPostSampler implements JavaSamplerClient {
    @Override
    public SampleResult runTest(JavaSamplerContext ctx) {
        SampleResult sr = new SampleResult();
        sr.setSampleLabel("DbPostSampler");
        try {
            // Connect to the DB
            Connection conn = DriverManager.getConnection("jdbc:mysql://db:3306/test", "user", "pass");
            // Query
            ResultSet rs = conn.createStatement().executeQuery("SELECT id, name FROM users LIMIT 10");
            // Build the JSON payload
            StringBuilder payload = new StringBuilder("{\"users\":[");
            while (rs.next()) {
                payload.append("{\"id\":").append(rs.getInt("id")).append(",\"name\":\"")
                       .append(rs.getString("name")).append("\"},");
            }
            if (payload.charAt(payload.length() - 1) == ',') {
                payload.setLength(payload.length() - 1); // remove trailing comma
            }
            payload.append("]}");
            // Post the payload to the API (same HttpURLConnection pattern as CustomApiSampler)
            // ...
            sr.setSuccessful(true);
        } catch (Exception e) {
            sr.setSuccessful(false);
            sr.setResponseMessage(e.getMessage());
        }
        return sr;
    }
    // setupTest, teardownTest, getDefaultParameters omitted for brevity
}

🎓 Key Takeaways

| Concept | Why It Matters | Practical Tip |
|---|---|---|
| Start small | Easier to debug. | Build a minimal test plan first. |
| Parameterization | Avoids hard‑coded data. | Use CSV or a Java Sampler. |
| Non‑GUI mode | Saves memory, runs faster. | Use it for production runs. |
| Listeners | Reduce overhead. | Keep only the essential ones. |
| Distributed JMeter | Scales to thousands of users. | Use Docker/K8s for slaves. |
| Dashboard & system monitoring | Correlates application & infrastructure. | Use PerfMon + Grafana. |
| CI/CD | Automates performance‑regression detection. | Fail the build on threshold breaches. |

📚 Further Reading & Resources

| Resource | Focus | Link |
|---|---|---|
| Apache JMeter Official Docs | Comprehensive guide | https://jmeter.apache.org |
| PerfMon Plugin | System metrics | https://jmeter-plugins.org |
| JMeter Plugins Manager | Install plugins easily | https://jmeter-plugins.org |
| Java Sampler API | Extending JMeter | https://jmeter.apache.org/api/org/apache/jmeter/protocol/java/sampler/JavaSamplerClient.html |
| Grafana Dashboards for JMeter | Visualize results | https://grafana.com/grafana/dashboards |
| Microservices Performance Testing | End‑to‑end approach | https://dzone.com/blog/microservices-performance-testing |

🚀 Next Steps

  1. Build a complete end‑to‑end load test for a sample Spring Boot application.
  2. Add distributed JMeter using Docker Compose.
  3. Integrate with Jenkins to trigger tests on each commit.
  4. Set up alerts in Grafana when thresholds are breached.

Happy testing! 🚀