By Bob Rankin
When you click on a link in your Web browser, the browser opens one connection to the destination site and begins downloading the target file through that single connection. It’s like getting water through a single hose; only so much water will arrive in a given amount of time. Download accelerators try to use multiple hoses, exploiting one of the features of the HTTP protocol.
HTTP can request specific “ranges” or segments of a file rather than the entire file. This feature lets you resume an interrupted download at the point where it was interrupted, rather than forcing you to download the entire file again. You can also pause a download deliberately and restart it where you left off.
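To make this concrete, here is a minimal Python sketch of how a client asks for just one segment of a file, using the standard library's urllib and the HTTP "Range" header. The URL and byte offsets are hypothetical, purely for illustration:

```python
import urllib.request

def make_range_request(url, start, end=None):
    """Build an HTTP request that asks for only bytes start..end of a file."""
    req = urllib.request.Request(url)
    # "bytes=1000-1999" requests a 1000-byte segment; "bytes=1000-" means
    # "everything from byte 1000 to the end" -- which is how a resume works.
    byte_range = f"bytes={start}-" if end is None else f"bytes={start}-{end}"
    req.add_header("Range", byte_range)
    return req

# Request the second kilobyte of a (hypothetical) file:
req = make_range_request("https://example.com/bigfile.zip", 1000, 1999)
print(req.get_header("Range"))  # bytes=1000-1999
```

A server that supports ranges replies with status 206 (Partial Content) and only the requested bytes; to resume an interrupted download, the client simply checks how many bytes it already has and asks for `bytes=<that offset>-`.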
Download accelerators open more than one connection to a Web server. Each connection requests a different range of the target file. So you have multiple parts of a file being downloaded in parallel. As the parts arrive they are assembled into the completed file. You have multiple hoses pouring water into one bucket, so the bucket fills faster… in theory. Several factors can render download accelerators ineffective.
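The multiple-hoses idea can be sketched in a few lines of Python: split the file into byte ranges, fetch each range over its own connection in parallel, then join the pieces in order. This is a simplified illustration, not a real accelerator; the URL is hypothetical, and a production tool would also handle errors, retries, and servers that ignore Range headers:

```python
import concurrent.futures
import urllib.request

def split_ranges(total_size, parts):
    """Divide a file of total_size bytes into `parts` contiguous byte ranges."""
    chunk = total_size // parts
    ranges = []
    for i in range(parts):
        start = i * chunk
        # The last part absorbs any leftover bytes from integer division.
        end = total_size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

def fetch_segment(url, start, end):
    """Download one byte range of the file over its own connection."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def accelerated_download(url, total_size, parts=4):
    """Fetch all segments in parallel, then reassemble them in order."""
    ranges = split_ranges(total_size, parts)
    with concurrent.futures.ThreadPoolExecutor(max_workers=parts) as pool:
        segments = list(pool.map(lambda r: fetch_segment(url, *r), ranges))
    return b"".join(segments)  # the "bucket": all parts joined in order

# Four "hoses" for a 1000-byte file would request these ranges:
print(split_ranges(1000, 4))  # [(0, 249), (250, 499), (500, 749), (750, 999)]
```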
Some Web sites are configured to limit the number of simultaneous connections they will make to a given IP address. Webmasters do this to share their limited supply of connections and bandwidth fairly among all users. Traffic congestion between a Web server and your browser also limits the effectiveness of download accelerators. Of course, you cannot download any faster than the maximum speed of your Internet connection, either. If, for example, you're on a 5 Mbps cable modem connection, a download accelerator isn't going to pull in 10 or 20 Mbps. It'll be 5 Mbps max, if you're lucky.
A few Web servers place bandwidth caps on downloading connections. In such cases, a download accelerator is actually effective because it employs multiple connections to get around the per-connection bandwidth cap. But a smart Webmaster will limit connections per IP address as well as capping bandwidth per connection.
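For the curious, here is roughly what that smart Webmaster's setup might look like on an nginx server. This is a hedged sketch, not a recommended configuration; the zone name and the specific limits are made up for illustration:

```nginx
http {
    # Track concurrent connections per client IP address.
    limit_conn_zone $binary_remote_addr zone=perip:10m;

    server {
        location /downloads/ {
            limit_conn perip 2;    # at most 2 simultaneous connections per IP
            limit_rate 500k;       # cap each connection at ~500 KB/s
        }
    }
}
```

With both limits in place, opening extra connections buys an accelerator almost nothing: each client gets at most two hoses, and each hose has the same narrow nozzle.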