Mobile Television Could Not Exist Without Content Delivery Networks

The popular concept of television anytime, anywhere and on any device may be Internet driven, but it could not exist without Content Delivery Networks (CDNs), the speedy enablers that work quietly behind the scenes connecting video programs to end users.

Twenty years ago, CDNs for Internet media didn’t exist — mainly because there was no Internet media. The earliest networks were used to speed up data for large websites and other corporate needs.

It was at NAB 1995, 19 years ago, that Rob Glaser, a former Microsoft executive, announced the first streaming audio product, called RealAudio. On Sept. 5, 1995, just after NAB, Glaser's company, then called Progressive Networks, made history when it broadcast a baseball game between the New York Yankees and Seattle Mariners over the Internet, the first major use of streaming audio online.

Rob Glaser. By 1997, the Seattle-based company had changed its name to RealNetworks and Glaser announced RealVideo at NAB. The era of streaming video had begun.

Within two years, the Seattle-based company had changed its name to RealNetworks and Glaser announced RealVideo at NAB. The era of streaming video had begun. By the end of the 1990s, more than 85 percent of streaming content on the Internet was in Real's format.

In the early days, streaming media was plagued with problems. Programming was difficult to receive due to glitches and constant breakup. This prompted Glaser to build the Real Broadcast Network (RBN), the first CDN designed exclusively for media.

Real had started off with a bang, but it would soon fail. The company's business model depended on the sale of its streaming media server software to distributors of media content. Microsoft and Apple did an end run and began giving their own streaming software away for free. Real's server software business took a major hit and never recovered.

Almost simultaneously, in 1999, a young scientist named Daniel Lewin, a former member of an elite special forces unit in Israel's army, and his MIT professor, Tom Leighton, took their content delivery company, Akamai, public. In the offering, the owners of the Cambridge, Massachusetts-based company became billionaires. Then came the dot-com crash of 2000, which took away many of Akamai's clients.

The basis for Akamai's technology was a mathematical scheme applied to CDNs by Lewin and Leighton called “consistent hashing.” It was an invention that, at least on paper, could get rid of the Internet's “world wide wait,” as it was jokingly called. Just as important, the system could “scale,” meaning it would continue to work no matter how many people used it.

Akamai has servers located in many locations around the globe, all designed to speed delivery.

Consistent hashing, originally devised by Professor David Karger at MIT, is the underlying premise of distributed caching. Caching is accomplished through servers positioned between the source of the content and the users requesting it. An efficient caching system can refresh or replace cached content to reflect changes to the underlying web pages.

Using algorithms, consistent hashing offered a way of organizing files optimally. When a new server was added or a new set of data was introduced, the data would be spread evenly and consistently across the available servers, and only a small fraction of it ever had to move, helping with the efficient delivery of the data.
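To make the idea concrete, here is a minimal sketch of consistent hashing in Python. The server names and replica count are hypothetical, and this illustrates the general technique, not Akamai's actual implementation:

```python
import hashlib
from bisect import bisect

class ConsistentHashRing:
    """Minimal consistent-hash ring: servers and content keys share one hash space."""

    def __init__(self, replicas=100):
        self.replicas = replicas  # virtual nodes per server, to smooth the spread
        self.points = []          # sorted hash positions on the ring
        self.owners = {}          # hash position -> server name

    def _hash(self, value):
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def add_server(self, server):
        # Each server occupies several points ("virtual nodes") on the ring,
        # so content keys spread evenly across servers.
        for i in range(self.replicas):
            h = self._hash(f"{server}#{i}")
            self.points.insert(bisect(self.points, h), h)
            self.owners[h] = server

    def get_server(self, key):
        # A key belongs to the first server clockwise from its own hash.
        # Adding a server moves only the keys between it and its neighbor,
        # which is why the scheme scales.
        h = self._hash(key)
        idx = bisect(self.points, h) % len(self.points)
        return self.owners[self.points[idx]]

ring = ConsistentHashRing()
for name in ("cache-east", "cache-west", "cache-eu"):  # hypothetical servers
    ring.add_server(name)
print(ring.get_server("/video/segment42.ts"))  # the same key always maps to the same cache
```

The key property is in get_server: because both servers and keys hash onto the same ring, removing or adding one server disturbs only its immediate neighborhood rather than reshuffling everything.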

On Sept. 11, 2001, Danny Lewin boarded American Airlines Flight 11 to Los Angeles to visit a potential Akamai client. Unknown to him, hijackers on board intended to fly the plane into the World Trade Center. Lewin, trained in counter-terrorism by the IDF, tried to stop the hijackers during the flight. While fighting with one of the men, he was stabbed in the back by another. Lewin is believed to be the first person killed in the new era of terrorism that began on 9/11.

Danny Lewin, co-founder of Akamai, was among the first killed in the attacks of 9/11. Trained in counter-terrorism by the IDF, he tried to overtake the hijackers of American Airlines Flight 11, but was stabbed in the back by a terrorist.

Lewin's death was a major blow to Akamai, which was already trying to regain ground after the dot-com bubble burst in 2000.

“The loss of Danny was incredibly tragic,” said Jeff Young, who was head of public relations at Akamai on 9/11 and remains there today as vice president of corporate communications. “But ironically, 9/11 also showed that Akamai's technology really worked. It was our biggest traffic day to that date. People were flooding news websites to see what was going on from all over the world.

“We were supporting many of those web sites like CNN and the Washington Post. Many of those who were not our customers completely went under with the huge demand. They could not support the load. This massive breaking news story was the first demonstration of Danny's technology in a global way.”

For two years, Akamai struggled with the tragedy. But it was Lewin and Leighton's technology that eventually turned the company around. After the attack, web traffic for Akamai's global network of clients, including major news media sites, surged by a factor of five. Lewin's technology managed the spike well, and Akamai's customers noticed. The company, which needed new clients after the dot-com bust, added travel companies, airlines and government agencies to its roster on an emergency basis in the days after 9/11.

Then came another major development for CDNs: the rapid success of Apple's iPhone in the summer of 2007, coupled with the introduction of the iPad in 2010. With these two devices, the concept of media anywhere, anytime, on any device really took off. It would take the speed of CDNs, rather than the congested public Internet, to make live video work well on portable wireless Internet-connected devices.

Though Lewin never lived to see it, his technology allowed millions of users to watch streaming video simultaneously. It also kept news websites online during global crises as viewers rushed for the latest information. This video revolution gave birth to the modern Content Delivery Network.

Today's CDNs are web servers distributed across data centers in different geographical regions of the world. They deliver multimedia content to end users based on each user's location. They also mean faster performance for hosted websites and better security against hacker attacks.

This is because CDNs maintain multiple “Points of Presence,” meaning the servers at each location store identical copies of the content and report logs and status information back to the origin servers.
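As a rough sketch of what one Point of Presence does, here is a toy model in Python. The origin URL and time-to-live are hypothetical, and this is an illustration of the cache-or-fetch pattern, not any vendor's software:

```python
import time
import urllib.request

class EdgePoP:
    """Toy Point of Presence: serve a stored copy when fresh, else fetch from origin."""

    def __init__(self, origin, ttl=60):
        self.origin = origin  # hypothetical origin server base URL
        self.ttl = ttl        # seconds a cached copy is considered fresh
        self.store = {}       # path -> (fetched_at, body): the identical local copy
        self.log = []         # access log, reported back to the origin servers

    def get(self, path):
        entry = self.store.get(path)
        if entry and time.time() - entry[0] < self.ttl:
            self.log.append((path, "HIT"))  # served locally, no trip to the origin
            return entry[1]
        # Miss or stale copy: fetch once from the origin, then every nearby
        # user is served from this stored copy until it expires.
        with urllib.request.urlopen(self.origin + path) as resp:
            body = resp.read()
        self.store[path] = (time.time(), body)
        self.log.append((path, "MISS"))
        return body

pop = EdgePoP("https://origin.example.com")  # hypothetical URL
```

Real PoPs add cache-control headers, purging and security layers on top, but the basic bargain is the same: one trip to the origin serves many nearby users.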

Akamai leads the CDN market today, carrying an estimated 30 percent of Internet traffic, with Limelight Networks and Level 3 Communications following. In 2013, Verizon acquired EdgeCast Networks, a leading CDN that operates thousands of content servers in 30 Internet hubs around the world.

Jason Thibeault, senior director of marketing strategy at Limelight Networks, said the major development today in CDNs is the addition of the cloud. “Unlike traditional data center cloud storage, we’ve integrated it with the edge of the network,” he said. “From a customer standpoint, this enables them to get their content much closer to the end user. We feel this is a real game changer in the way we talk to customers about content delivery.”

Jason Thibeault, Senior Director of Marketing Strategy, Limelight Networks.

All CDNs are embracing the cloud in their infrastructures, Thibeault said, but it’s too early to say where it will end as companies experiment and build out client applications.

He described what modern CDNs do in simple language:

“There are three miles to delivering content to an end user. Somebody sits at a computer and clicks on a link on a website. That request has to go across all three miles: the first mile, a middle mile and the last mile,” he said. “In the first mile, that connection goes from the end user to his Internet Service Provider. The middle mile is the distance between the user’s Internet Service Provider, which passes the request along, and the content owner, who responds to it.

“The last mile is that space between the content owner and the physical server where that content is located. What a CDN does is make that round trip much faster. So instead of having to go across the public Internet where there is lots of congestion, the CDN uses a private network to get around that. In essence, it brings that content to the user quicker.”
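The routing decision at the heart of that speed-up can be sketched very simply. The latency figures below are invented for illustration; real CDNs use mechanisms such as DNS-based mapping or anycast to steer each user to the fastest point of presence:

```python
# Hypothetical round-trip times (ms) from one user to candidate servers.
latencies = {
    "distant-origin": 120.0,  # crossing the congested public Internet
    "pop-nyc": 12.0,          # nearby CDN points of presence
    "pop-chicago": 28.0,
}

def fastest(candidates):
    # Steer the request to whichever server answers quickest;
    # this is the round trip the CDN shortens.
    return min(candidates, key=candidates.get)

print(fastest(latencies))  # -> "pop-nyc"
```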

A Level 3 control room. Headquartered in Broomfield, CO, Level 3 is one of only six Tier 1 Internet providers in the world.

Thibeault said the most interesting recent development in the CDN space was the Verizon acquisition of EdgeCast. “It made people see how important CDNs are for delivering the Internet. If we didn’t have CDNs, the Internet would be an entirely different place today. We will probably see more consolidation in the market. The Verizon acquisition was a green light for people to start paying attention to what is happening in the CDN market.”

CDNs and their associated smaller companies continue to grow fast. Cisco projected that the business grew between 40 and 45 percent last year and that the overall market will double from $6 billion to $12 billion by 2015.

“The concept of TV everywhere is really making CDN awareness explode,” Thibeault said. “There is no way to deliver that kind of experience without using a CDN.”
