By Gail Seymour
If you have a Web application that connects to a database or runs processor-heavy, server-side scripts, you will soon come to realize that spikes in demand can cause your site to become sluggish, or even worse, unavailable, just when you need it to function most. If your site has reached the stage where you need multiple servers to cope with user demand, it’s time to think about load balancing.
What Is Load Balancing?
Load balancing is a way of distributing the workload across multiple servers. Multiple machines can be linked so they appear as a single virtual server, or organized into clusters, with a hardware switch or router directing requests between them to minimize response times. Load balancing can therefore be implemented in either software or hardware.
- Software-based load balancing requires software to be installed on the servers in a load-balanced cluster. Incoming requests are handled by the software and assigned to a server based on an algorithm. This could be as simple as a “round-robin,” with each incoming request going to the next server on the list, or a more sophisticated one that takes recent performance into account.
- Hardware-based load balancing uses a specialized switch or router with load balancing software installed on it.
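To illustrate the simplest algorithm mentioned above, here is a minimal sketch of a round-robin dispatcher in Python. The server names are placeholders invented for the example, not part of any real configuration:

```python
from itertools import cycle

# Hypothetical backend pool; these hostnames are illustrative only.
SERVERS = ["app1.example.com", "app2.example.com", "app3.example.com"]

class RoundRobinBalancer:
    """Hands each incoming request to the next server in the list,
    wrapping back to the first after the last."""

    def __init__(self, servers):
        self._servers = cycle(servers)

    def pick(self):
        # Each call advances the cycle by one server.
        return next(self._servers)

balancer = RoundRobinBalancer(SERVERS)
print([balancer.pick() for _ in range(5)])
# The first five requests go to app1, app2, app3, app1, app2.
```

A performance-aware balancer would replace `pick` with logic that weighs each server's recent response times before choosing; the round-robin version simply ignores load entirely.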
The Benefits of Load Balanced Servers
- Improved scalability.
A well-configured load balancing setup allows processing power to be used intelligently. The switch should route user requests to the servers with the lowest current demand, as they will be able to provide the quickest response times. It should also be possible to prioritize applications, so that user interactions are handled with the highest priority while background processes are halted and queued for processing during periods of lower demand. When demand is low, some servers may be taken offline, then brought back online when they are needed.
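The "lowest current demand" routing described above is often called least-connections balancing. The following sketch, with invented server names, tracks how many requests each server is currently handling and always picks the least busy one:

```python
class LeastLoadBalancer:
    """Routes each request to the server with the fewest active requests."""

    def __init__(self, servers):
        # Active-request count per server, all starting idle.
        self.active = {s: 0 for s in servers}

    def acquire(self):
        # Pick the server with the smallest active count.
        server = min(self.active, key=self.active.get)
        self.active[server] += 1
        return server

    def release(self, server):
        # Call when the server finishes handling a request.
        self.active[server] -= 1

balancer = LeastLoadBalancer(["web1", "web2"])
first = balancer.acquire()   # an idle server
second = balancer.acquire()  # the other idle server
balancer.release(first)      # first finishes; it is now least loaded again
```

A real balancer would combine this with health checks and, as the article notes, with the ability to drain and remove idle servers when demand drops.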
- High performance.
This ability to draw on multiple processors to handle a single user request results in quicker download times for the end user. It’s a bit like the difference between standing in line at a single checkout behind 10 other shoppers when you only have two items to pay for, and what happens to that queue when three other checkouts are opened. The shopper with a full cart no longer holds up the three or four who have single items to pay for, and they are able to move through another channel. Even better, with an effective load balancing setup, the shopper with a full cart is able to split their load and move through multiple channels to clear the bottleneck quicker.
- Higher availability.
Sticking with the checkout analogy, if you only have one register, and you need to remove the cash drawer to balance the float, no transactions can be processed until the job is complete and the drawer returned to the till. If you have a spare cash drawer, it can simply be swapped out and business can resume while the float is balanced. In the same way, load balancing across multiple servers enables one to be taken offline for routine maintenance, and also provides a level of disaster recovery. Depending on the setup, provisions can be made for alternate routing in the event of a single server or complete site failure.
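The alternate-routing idea above can be sketched very simply: skip any server whose health check has failed. This is a toy example with an invented health map, not a production failover scheme:

```python
def pick_healthy(servers, healthy):
    """Return the first server whose last health check passed,
    or None if every server in the pool is down."""
    for server in servers:
        if healthy.get(server):
            return server
    return None

# Hypothetical pool: web1 is down for maintenance, so traffic
# falls through to web2.
pool = ["web1", "web2", "web3"]
status = {"web1": False, "web2": True, "web3": True}
print(pick_healthy(pool, status))  # web2
```

In practice the health map would be refreshed by periodic probes, and a full-site failure would be handled one level up, by routing to a standby site.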
Load Balancing and Cloud Computing
Load balancing is also important in cloud computing. Since the concept of cloud computing is intended to allow dynamic scaling of resources on demand, there has to be a way to assess and monitor demand, and also the available resources. The load balancer, whether hardware or software based, is the mechanism that allows this to happen.
About the Author
Gail Seymour has been a Web site designer for more than 10 years. During that time she has won three Web site design awards and has provided the content and copy for dozens of Web sites and more than 50,000 Web pages.