By IT Brew Staff
Definition:
Load balancers distribute network traffic as efficiently as possible across multiple servers, ensuring that no single server is hit with too much demand at once. IT pros can use load balancers to increase a network’s capacity and reliability, especially on networks that must handle enormous volumes of traffic each day (such as popular cloud applications or websites).
Load balancers fall into two categories: hardware and software. A hardware load balancer (HLB) is a physical appliance that distributes incoming traffic across multiple servers in a network. An enterprise with multiple HLBs and servers in multiple locations can not only optimize network traffic but also redirect traffic to other servers and data centers if one node fails, minimizing downtime.
Load balancers also allow IT professionals to respond quickly to ebbs and flows in traffic. For example, if traffic to an e-commerce site spikes due to an ultra-popular deal on a product, a load balancer can bring additional servers online to handle the added visitors. Conversely, if traffic falls off, the load balancer can reduce the number of servers in use. Load balancers rely on algorithms to direct traffic, such as “round robin,” which hands requests to each server in sequence.
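As a rough illustration, the Python sketch below shows the round-robin idea: each new request goes to the next server in line, and a server that fails a health check is skipped so traffic is redirected elsewhere. The server addresses and the health check are made up for the example; a production load balancer would discover back ends and probe their health on its own.

```python
from itertools import cycle

# Hypothetical back-end pool; a real load balancer would discover these dynamically.
SERVERS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

def round_robin(servers, is_healthy):
    """Yield servers in sequence, skipping any that fail a health check.

    Assumes at least one server in the pool stays healthy.
    """
    for server in cycle(servers):
        if is_healthy(server):
            yield server

# Example: one server is down, so requests rotate between the remaining two.
down = {"10.0.0.2"}
picker = round_robin(SERVERS, is_healthy=lambda s: s not in down)
for request_id in range(5):
    print(f"request {request_id} -> {next(picker)}")
```

Other common policies work the same way at a high level but use different selection rules, such as sending each request to the server with the fewest active connections.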
Software load balancers (SLBs), on the other hand, run on standard servers (or their virtual equivalents). For some IT pros, software-based load balancing has distinct advantages over its hardware equivalent: software load balancers can operate within hybrid environments and quickly scale to handle unexpected traffic spikes that might overwhelm a fixed number of hardware appliances.