Scaling out your load tests using Gatling, Packer, and a cloud provider (in this example, DigitalOcean and Amazon EC2).
*This post assumes that you know what Gatling is all about and need to scale your existing scenario to run from several machines / from different networks.
*Want to try this out? Use this referral link and get $10 free on DigitalOcean (you can fire up many machines with $10).
Why scale out?
- A problem I’ve encountered with Gatling is the maximum number of connections a single machine can open against the server it is stress testing.
- In short, the default limit on an Ubuntu machine is around ~470 concurrent connections (link).
- With some tweaking on the machine (setting up virtual network interfaces and increasing the ephemeral port range) you can get more (link).
- Firing up a new VM in the cloud and running requests from it is easy, fast, and cheap!
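Before deciding to scale out, it helps to see where the single-machine ceiling comes from. A minimal sketch (the paths and values below are standard Linux defaults, not numbers from this post):

```shell
# Ephemeral (source) port range: caps how many outgoing connections one
# local IP can hold open to a single destination ip:port (Linux path)
cat /proc/sys/net/ipv4/ip_local_port_range

# Per-process open file descriptor limit: each socket consumes one
ulimit -n

# Widening the port range (run as root; persist it in /etc/sysctl.conf):
# sysctl -w net.ipv4.ip_local_port_range="1025 65535"
```

When these limits (plus CPU and bandwidth) are exhausted, adding more injector machines is the practical next step.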
First, let's create an image for our “Gatling node”. We can deploy this image multiple times, and the nodes are set up to SSH from one to another. The image template is a Packer JSON file – link to packer json file
The image contains:
- Ubuntu 14.04 as the base image
- all the prerequisites needed to run a Gatling scenario
- SSH keys to enable SSH between the nodes
- a Gatling installation
```json
{
  "builders": [
    {
      "type": "digitalocean",
      "api_token": "XXXX",
      "image": "ubuntu-14-04-x64",
      "region": "nyc3",
      "size": "1gb",
      "droplet_name": "gatlingNode"
    },
    {
      "type": "amazon-ebs",
      "access_key": "XXX",
      "secret_key": "XXX",
      "region": "us-east-1",
      "source_ami": "ami-b8067ed0",
      "instance_type": "t2.micro",
      "ssh_username": "ubuntu",
      "ami_name": "gatlingNode"
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "inline": [
        "sudo apt-get update",
        "sudo apt-get install -y unzip",
        "sudo apt-get install -y default-jre",
        "wget https://repo1.maven.org/maven2/io/gatling/highcharts/gatling-charts-highcharts-bundle/2.1.4/gatling-charts-highcharts-bundle-2.1.4-bundle.zip -P /tmp",
        "mkdir ~/gatling",
        "unzip /tmp/gatling-charts-highcharts-bundle-2.1.4-bundle.zip -d ~/gatling/",
        "ssh-keygen -f ~/.ssh/id_rsa -t rsa -N ''",
        "touch ~/.ssh/authorized_keys",
        "cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys",
        "touch ~/.ssh/config",
        "echo \"Host *\" >> ~/.ssh/config",
        "echo \"  StrictHostKeyChecking no\" >> ~/.ssh/config",
        "sync",
        "sleep 120",
        "echo \"finished running the script\""
      ]
    }
  ]
}
```
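Before handing a template like this to Packer, it's worth a quick sanity check. A minimal sketch, assuming the template is saved as `gatling-node.json` (the filename is illustrative; the stand-in template written below is just for demonstration, in practice you would use the full template above):

```shell
# For demonstration, write a minimal stand-in template; in practice this
# file would be the full gatling-node.json shown above
cat > gatling-node.json <<'EOF'
{
  "builders": [{"type": "digitalocean"}],
  "provisioners": [{"type": "shell", "inline": ["echo hi"]}]
}
EOF

# Quick local JSON syntax check before invoking Packer
python3 -m json.tool gatling-node.json > /dev/null && echo "template JSON OK"

# Then (with Packer installed) validate the builders/provisioners and build:
# packer validate gatling-node.json
# packer build gatling-node.json
```

`packer build` will run both builders, producing a DigitalOcean snapshot and an EC2 AMI from the same provisioning steps.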
After your new cluster is ready, you can use this script to run your scenario on all the nodes: link to script
Main steps of the script:
- Copy the simulation to all remote hosts
- Run the simulation on all remote hosts
- Run the simulation on localhost
- Collect all simulation logs to localhost
- Generate a report from the aggregated data
- Open the report in a browser
The script is pretty straightforward and well documented at each step:
```bash
#!/bin/bash
##################################################################################################################
# Gatling scale-out/cluster run script:
# If you are using the Packer images this should run out of the box.
# Before running this script some assumptions are made:
# 1) Public keys were exchanged in order to ssh with no password prompt (ssh-copy-id on all remotes)
# 2) Check read/write permissions on all folders declared in this script.
# 3) The Gatling installation (GATLING_HOME variable) is the same on all hosts
# 4) All hosts have the same user name (if not, change it in the script)
##################################################################################################################

#Assuming the same user name for all hosts
USER_NAME='nimrod'

#Remote hosts list
HOSTS=( 192.168.28.24 192.123.123.12 180.123.98.1 )

#Assuming all Gatling installations are in the same path (with write permissions)
GATLING_HOME=/gatling/gatling-charts-highcharts-1.5.6
GATLING_SIMULATIONS_DIR=$GATLING_HOME/user-files/simulations
GATLING_RUNNER=$GATLING_HOME/bin/gatling.sh

#Change to your simulation class name
SIMULATION_NAME='nimrodstech.GatlingClusterTest'

#No need to change this
GATLING_REPORT_DIR=$GATLING_HOME/results/
GATHER_REPORTS_DIR=/gatling/reports/

echo "Starting Gatling cluster run for simulation: $SIMULATION_NAME"

echo "Cleaning previous runs from localhost"
rm -rf $GATHER_REPORTS_DIR
mkdir $GATHER_REPORTS_DIR
rm -rf $GATLING_REPORT_DIR

for HOST in "${HOSTS[@]}"
do
  echo "Cleaning previous runs from host: $HOST"
  ssh -n -f $USER_NAME@$HOST "sh -c 'rm -rf $GATLING_REPORT_DIR'"
done

for HOST in "${HOSTS[@]}"
do
  echo "Copying simulations to host: $HOST"
  scp -r $GATLING_SIMULATIONS_DIR/* $USER_NAME@$HOST:$GATLING_SIMULATIONS_DIR
done

for HOST in "${HOSTS[@]}"
do
  echo "Running simulation on host: $HOST"
  ssh -n -f $USER_NAME@$HOST "sh -c 'nohup $GATLING_RUNNER -nr -s $SIMULATION_NAME > /gatling/run.log 2>&1 &'"
done

echo "Running simulation on localhost"
$GATLING_RUNNER -nr -s $SIMULATION_NAME

echo "Gathering result file from localhost"
ls -t $GATLING_REPORT_DIR | head -n 1 | xargs -I {} mv ${GATLING_REPORT_DIR}{} ${GATLING_REPORT_DIR}report
cp ${GATLING_REPORT_DIR}report/simulation.log $GATHER_REPORTS_DIR

for HOST in "${HOSTS[@]}"
do
  echo "Gathering result file from host: $HOST"
  ssh -n -f $USER_NAME@$HOST "sh -c 'ls -t $GATLING_REPORT_DIR | head -n 1 | xargs -I {} mv ${GATLING_REPORT_DIR}{} ${GATLING_REPORT_DIR}report'"
  scp $USER_NAME@$HOST:${GATLING_REPORT_DIR}report/simulation.log ${GATHER_REPORTS_DIR}simulation-$HOST.log
done

mv $GATHER_REPORTS_DIR $GATLING_REPORT_DIR

echo "Aggregating simulations"
$GATLING_RUNNER -ro reports

#using macOSX
open ${GATLING_REPORT_DIR}reports/index.html
#using ubuntu
#google-chrome ${GATLING_REPORT_DIR}reports/index.html
```
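If you are not using the Packer image (which already bakes the SSH keys in), assumption (1) above — passwordless SSH to every host — can be satisfied with a short loop. A sketch, reusing the illustrative user/host values from the script (the loop below only prints the commands; uncomment the `ssh-copy-id` line to actually push your key):

```shell
# Same illustrative values as in the cluster run script; adjust to your hosts
USER_NAME='nimrod'
HOSTS=( 192.168.28.24 192.123.123.12 180.123.98.1 )

TARGETS=()
for HOST in "${HOSTS[@]}"; do
  TARGETS+=("$USER_NAME@$HOST")
  # Uncomment to copy your public key to each node:
  # ssh-copy-id "$USER_NAME@$HOST"
done

# Show what would be run for each node
printf 'would run: ssh-copy-id %s\n' "${TARGETS[@]}"
```

Once every `ssh $USER_NAME@$HOST` works without a password prompt, the cluster script can clean, copy, run, and gather results unattended.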
Gatling report example.