6 Things to Consider When Deploying a Load Balancer
by Donna Toomey, July 19th, 2017
When you have made the sensible decision to implement a load balancer within your infrastructure, there are a few points you should consider to get the best from your solution. Load balancing technology has evolved considerably, and as a result the terminology and architecture surrounding it can be confusing and overwhelming. This blog explores the various features and functionality worth considering when looking for a load balancer.
First off, you need to determine your performance requirements. This will influence the most suitable deployment environment and load balancing platform for your needs, be it hardware, a virtual appliance, or the cloud. If you are looking for dedicated performance, physical hardware is your best bet. If you are looking for ease of deployment and maintenance, then you may wish to deploy a virtual appliance on an existing VMware or Hyper-V estate. If you are looking for complete flexibility, then perhaps the cloud is for you: spin up instances as and when required and select a virtual machine size to suit your needs. Incidentally, edgeNEXUS caters for all of these 😉
Once you have spec’d up and selected your platform, you need to look at your networking to decide where the load balancer fits. Number one rule: always place the load balancer behind your network firewall to protect your business-critical systems. Are you deploying load balancers in multiple data centres? If so, will you run them on stretched VLANs or use a Global Server Load Balancer (GSLB) to provide failover between the sites?
You will need to put the load balancer on a suitable network and assign it an IP address accordingly. Perhaps you plan to sit the load balancer behind a firewall for external access only, with no internal access required. No doubt you have a network architect who can work their magic to ensure a smooth deployment.
Your Application Traffic and Throughput
It’s widely understood that load balancers mitigate the threat of downtime to provide always-available applications. But what about optimising application performance? Applications are the lifeblood of organisations. Workers expect instant access to business applications, and a drop in availability or performance can tank business productivity.
Similarly, consumers demand a responsive shopping or browsing experience. In such competitive times, poorly performing sites can result in a loss of profit, a dent to brand image and a blow to customer retention.
It’s important to consider the number of users you have and scope the number of transactions or connections that will be made; these are your throughput requirements. Depending on the applications you are running, you may have periods where you anticipate spikes in traffic load (think eCommerce sites that get busy for Christmas, or first thing in the morning when all your employees access their email).
You need to select a load balancer capable of managing this number of connections and throughput of traffic to ensure smooth, fast and consistent application delivery.
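As a back-of-the-envelope illustration of the scoping described above, you can turn user counts into rough peak connection and bandwidth figures. The formula and every number here are illustrative assumptions for sizing, not edgeNEXUS specifications:

```python
# Back-of-the-envelope sizing sketch; all figures are illustrative assumptions.

def estimate_peak_load(users, requests_per_user_per_min, avg_response_kb,
                       peak_multiplier=3):
    """Estimate peak requests/sec and bandwidth (Mbit/s) for capacity planning."""
    steady_rps = users * requests_per_user_per_min / 60
    peak_rps = steady_rps * peak_multiplier            # headroom for traffic spikes
    peak_mbps = peak_rps * avg_response_kb * 8 / 1000  # KB -> megabits per second
    return peak_rps, peak_mbps

# Example: 5,000 users, 6 requests/min each, 50 KB average response
rps, mbps = estimate_peak_load(users=5000, requests_per_user_per_min=6,
                               avg_response_kb=50)
# rps -> 1500.0 peak requests/sec, mbps -> 600.0 Mbit/s
```

Numbers like these give you a floor for the connection and throughput ratings to look for on a vendor datasheet, with the peak multiplier covering the seasonal or morning spikes mentioned above.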
Managing Application Delivery
Once you’ve established your business-critical applications and spec’d the performance required from your load balancer, you can plan how to process, manage and direct your data. Load balancers have evolved significantly, from basic layer 4 devices to far more sophisticated layer 7 Application Delivery Controllers (ADCs, as Gartner refers to them), which provide much greater flexibility.
Layer 4 load balancers operate at the transport layer, whilst layer 7 load balancers operate at the application protocol level, affording them greater visibility into, and understanding of, the application traffic being processed. This enables advanced functionality and optimisation features including intelligent traffic management, content caching, security and compression.
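The layer 4 vs layer 7 distinction above can be made concrete with a short sketch: content-based routing is only possible because a layer 7 balancer can read the HTTP request itself, whereas a layer 4 device only sees IPs and ports. The pool names and routing rules below are illustrative assumptions, not a real product configuration:

```python
# Minimal sketch of layer-7 (content-based) routing: the backend pool is
# chosen by inspecting the HTTP request, which a layer-4 balancer cannot do.
# Pool names, addresses and rules are illustrative assumptions.

POOLS = {
    "static":  ["10.0.0.11:80", "10.0.0.12:80"],      # cacheable assets
    "api":     ["10.0.1.21:8080", "10.0.1.22:8080"],  # application servers
    "default": ["10.0.2.31:80"],
}

def choose_pool(path, headers):
    """Route by request content: path prefix first, then the Host header."""
    if path.startswith("/api/"):
        return "api"
    if path.startswith(("/img/", "/css/", "/js/")):
        return "static"
    if headers.get("Host", "").startswith("static."):
        return "static"
    return "default"
```

A layer 4 device, by contrast, would have to send every connection to the same pool because it never parses the request line or headers.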
edgeNEXUS Load Balancers feature a unique web-traffic manipulation engine called “flightPATH” for creating specific rules to deal with real-world application delivery problems, from 404 redirects, to blanking credit card details, to rewriting content on the fly.
When building traffic management rules, you need to define the pool of servers that will handle a particular service, determine the type of health check to run against those servers and, of course, select the routing policy that determines how traffic is distributed. Our tech team can of course assist here.
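The three decisions above (a server pool, a health check, and a distribution policy) can be sketched in a few lines. This is a generic round-robin example with the simplest possible TCP health check, not edgeNEXUS logic; the hosts are illustrative assumptions:

```python
# Sketch of a server pool, a health check, and a round-robin routing policy.
# Addresses and the check logic are illustrative assumptions.
import itertools
import socket

SERVERS = [("10.0.0.11", 80), ("10.0.0.12", 80), ("10.0.0.13", 80)]

def is_healthy(host, port, timeout=1.0):
    """Simplest possible health check: can we open a TCP connection?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def round_robin(servers):
    """Cycle through the pool, skipping servers that fail the health check.
    A real balancer re-checks continuously and on a schedule."""
    for host, port in itertools.cycle(servers):
        if is_healthy(host, port):
            yield host, port
```

Production health checks are usually richer than a TCP connect, for example requesting a known URL and verifying the status code or response body, so that a server that is up but misbehaving is still taken out of rotation.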
Security
Security is still the single largest concern when it comes to publishing an application over the Internet, and it should be paramount for anyone selecting and deploying a load balancer.
Many of the biggest attacks take place at the application layer, especially over HTTP, yet many externally facing applications rely on a standard firewall for protection, which will simply allow ALL application traffic through. To be fully protected, organisations must have an application-layer firewall to prevent attacks such as denial-of-service (DoS) and SQL injection. The edgeNEXUS web application firewall supports both PCI-DSS and OWASP firewall requirements.
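To illustrate why a standard firewall cannot catch these attacks, here is a toy example of application-layer inspection. Real WAFs (for instance those built on the OWASP Core Rule Set) use far more sophisticated rule sets; the patterns below are crude illustrative assumptions, not the edgeNEXUS WAF:

```python
# Toy illustration of application-layer inspection that a plain layer-4
# firewall cannot perform: it never sees inside the HTTP request.
# These SQL-injection signatures are deliberately crude assumptions.
import re

SQLI_PATTERNS = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),
    re.compile(r"(?i)\bor\b\s+1\s*=\s*1"),
    re.compile(r"(?i);\s*drop\s+table\b"),
]

def inspect_request(path, query_string):
    """Return 'block' if a crude SQL-injection signature matches, else 'allow'."""
    payload = f"{path}?{query_string}"
    if any(p.search(payload) for p in SQLI_PATTERNS):
        return "block"
    return "allow"
```

A standard firewall would pass all three of these requests through on port 80 without distinction; only a device that parses HTTP can tell them apart.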
Final Deployment Stage
To maintain resiliency, we always recommend deploying load balancers in a pair, so as not to create a single point of failure. When deploying across multiple data centres, it is advisable to have a pair in each, plus a Global Server Load Balancer (GSLB) at each site, so that you can run in an active-active scenario. This is extremely beneficial, as it makes it easy to distribute global users across multiple data centres.
Once you have considered all of the above, it is time to purchase your load balancer and get your infrastructure project underway. Soon you will wonder how you ever managed without one!
If you would like to find out more about the edgeNEXUS load balancing options available, please click here.