wrk - HTTP Benchmarking Tool
Prerequisites
- Operating System: Linux, macOS, or other Unix-like systems
- Compiler: GCC or Clang (C compiler required)
- Build System: Make
- Optional: LuaJIT (for scripting support)
- System Configuration: Sufficient ephemeral ports and proper socket recycling settings
Installation
Building from Source
1. Clone the repository:
   git clone https://github.com/wg/wrk.git
   cd wrk
2. Build the project:
   make
3. (Optional) Install system-wide:
   sudo make install
System Configuration
For optimal performance, ensure your system has sufficient ephemeral ports:
# Check current ephemeral port range
sysctl net.ipv4.ip_local_port_range
# Increase if necessary
sudo sysctl -w net.ipv4.ip_local_port_range="1024 65535"
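Each open connection also consumes a file descriptor, so it is worth checking the per-process open-file limit alongside the port range. A quick sketch (the 65536 value is an arbitrary example):

```shell
# Each concurrent connection consumes one file descriptor, so the
# open-file limit must exceed the -c value you plan to use.
ulimit -n

# Raise the limit for the current shell if it is too low
# (65536 is an arbitrary example value):
# ulimit -n 65536
```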
Configuration
Command Line Options
-t, --threads: Number of threads to use (default: 2)
-c, --connections: Total HTTP connections to keep open (default: 10)
-d, --duration: Test duration (e.g., 30s, 2m, 2h)
-s, --script: LuaJIT script for custom request generation
-H, --header: Custom HTTP header to add to requests
--latency: Print detailed latency statistics
--timeout: Response timeout threshold
Environment Variables
No specific environment variables are required for basic operation.
Lua Scripting
For advanced usage with Lua scripts:
- Ensure LuaJIT is installed
- Place custom scripts in the scripts/ directory, or pass a script path with the -s flag
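As a sketch of what a wrk Lua script can do, the snippet below turns every request into a JSON POST using wrk's script globals (wrk.method, wrk.body, wrk.headers) and prints a custom summary from the done hook. The payload and latency formatting are illustrative:

```lua
-- post.lua: send a JSON POST on every request.
-- wrk exposes a global `wrk` table whose fields shape each request.
wrk.method = "POST"
wrk.body   = '{"name": "example"}'            -- illustrative payload
wrk.headers["Content-Type"] = "application/json"

-- Optional hook: runs once after the test with aggregated stats.
-- latency:percentile() reports values in microseconds.
done = function(summary, latency, requests)
  io.write(string.format("p99 latency: %.2f ms\n",
                         latency:percentile(99) / 1000))
end
```

Run it by passing the script path: ./wrk -t4 -c64 -d30s -s post.lua http://127.0.0.1:8080/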
Build & Run
Development
1. Build the project:
   make
2. Run a basic benchmark:
   ./wrk -t12 -c400 -d30s http://127.0.0.1:8080/index.html
Production
For production benchmarking:
- Ensure system is properly configured (see Prerequisites)
- Run with appropriate thread and connection counts:
wrk -t$(nproc) -c1000 -d5m --latency http://your-server.com/
Server Configuration
For the server being tested:
- Set listen backlog greater than concurrent connections
- Ensure proper socket recycling
- Monitor system resources during testing
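On Linux, the effective listen backlog is silently capped by the kernel's net.core.somaxconn setting, so it is worth checking it on the server under test. A quick sketch (4096 is an illustrative value):

```shell
# The listen() backlog is capped by this kernel limit, so it should be
# at least as large as the benchmark's -c connection count.
sysctl net.core.somaxconn

# Raise it if it is below your planned connection count
# (4096 is an illustrative value):
# sudo sysctl -w net.core.somaxconn=4096
```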
Deployment
wrk is a command-line tool that doesn't require traditional deployment. For distributed testing:
Local Deployment
- Build and install on each testing machine
- Coordinate tests from a central location
- Aggregate results from multiple nodes
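Each node's wrk report ends with a Requests/sec line, so aggregation can be done with standard text tools. A minimal sketch (the two figures below are made-up sample values, not real measurements):

```shell
# Sum the "Requests/sec" lines from several saved wrk reports.
# The two report snippets here are fabricated sample values.
printf 'Requests/sec:  1000.50\n' > node1.txt
printf 'Requests/sec:  2000.25\n' > node2.txt

awk '/^Requests\/sec:/ { total += $2 } END { printf "%.2f\n", total }' \
    node1.txt node2.txt
# prints 3000.75
```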
Cloud Deployment
For cloud-based testing:
- Deploy to multiple cloud instances
- Use SSH or orchestration tools to coordinate tests
- Consider using containerization for consistent environments
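The SSH coordination step can be as simple as a loop over the test nodes. The sketch below only prints the commands a coordinator would run; the hostnames and target URL are hypothetical placeholders:

```shell
# Print the per-node commands a coordinator would launch over SSH.
# Replace the echo with an actual `ssh "$host" ... &` (plus a `wait`)
# to run them; host names and the URL are hypothetical placeholders.
for host in bench1.example.com bench2.example.com; do
  echo ssh "$host" "wrk -t4 -c500 -d2m --latency http://target.example.com/ > /tmp/wrk-$host.txt"
done
```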
Troubleshooting
Common Issues
1. "Address already in use" errors
- Solution: Increase the ephemeral port range or wait for TIME_WAIT sockets to expire
2. Insufficient connections
- Solution: Check system limits and widen net.ipv4.ip_local_port_range
3. Poor performance
- Check: CPU affinity, network interface saturation, system resource limits
4. Lua script errors
- Verify: LuaJIT installation and script syntax
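For the socket-related issues above, it helps to confirm how many local sockets are actually stuck in TIME_WAIT before widening the port range. On Linux:

```shell
# Count sockets lingering in TIME_WAIT; a number close to the size of
# the ephemeral port range explains "Address already in use" errors.
ss -tan state time-wait | tail -n +2 | wc -l
```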
Performance Tuning
- Thread Count: Match to CPU cores
- Connection Count: Balance with system resources
- Duration: Long enough for steady state (typically 30+ seconds)
Monitoring
During tests, monitor:
- CPU usage
- Network throughput
- System load
- Memory usage
Validation
Verify results by:
- Running multiple tests
- Checking for consistent metrics
- Comparing with other benchmarking tools if needed