Not all that long ago, content flowed one way: you posted content to your website, and how much you posted and when was entirely up to you. That model is changing quickly, however, as Web 2.0 concepts flow into the enterprise and companies increasingly provide tools for customers, partners, and other interested parties to post content, including text, video, and pictures. These days, content is a two-way street. The trick is avoiding content traffic jams.
With content flowing in two directions, questions arise not only about which content gets posted, but also about the kinds of service providers organizations need in order to handle fluctuations in the amount of content added to the site at any given time. Certainly, companies like YouTube and MySpace have been designed from the start to deal with unpredictable quantities of user-generated content, but what about organizations that have been doing business on the web the old-fashioned (one-way) way? How do these companies deal with this new paradigm, and what happens when users begin generating content in earnest?
Build It and They Will Come
The thought of letting any number of unknown contributors run loose on your websites may be a daunting proposition to some, but companies selling such diverse items as travel services, motorcycles, and beer are doing just that. Michael Gordon, CEO of Limelight Networks, a company that provides content-network services, says you have to go into this knowing there will be traffic fluctuations. "There are going to be lots of applications and interactions that may take your website usage patterns in unpredictable directions," he says. Not only that, he adds, "The catalog is going to expand in an uncontrolled way because you don't know how much the users will post."
If you are concerned about whether your web infrastructure can handle this content influx, you needn't worry, according to John Blossom, president of Shore Communications Inc. The actual technological impact on a web-content management system is fairly minor, he says, especially compared with the broader impact user-generated content could have on the site. "I don't think it's a situation where the content-management system suddenly has to handle thousands of people uploading video simultaneously, because it's all handled on the back end. If somebody is uploading audio, video, or another rich-media source, chances are the company is sequestering the content on a server and managing it in a back-end database," he says.
Robin Hopper, CEO of iUpload, a company that helps organizations set up blogs and wikis to facilitate user-generated content sharing, says you need to look beyond participation and the number of people who might visit your site at any one time, and look at how the content gets distributed. "As much as Web 2.0 is about very broad participation, encouraging lots of people to participate on the content side, not just the site owner, it also has to be about how you distribute the content and you need to have both broad and targeted distribution to leverage the content gems and put them in front of the right audience," he says.
Lots and Lots of Bandwidth
If the content has to be distributed, and it's not your servers feeling the pressure, where is the content going? Companies like YouTube and MySpace, large media outlets, and others use Content Distribution Networks (CDNs) from companies like Limelight Networks, Akamai, and Mirror Image. In fact, iUpload has its own servers but also relies on Akamai to handle the traffic surges that are inevitable with user-generated content. "To do this well," Hopper says, "you have to have bandwidth, and it needs to be intensive bandwidth."
According to the online computer dictionary, Whatis.com, "a content delivery network is an interconnected system of computers on the internet that provides web content rapidly to numerous users by duplicating the content on multiple servers and directing the content to users based on proximity." In practice, companies take different approaches, but each distributes the load and sets up access points around the globe to give users instant access when they click a link. Gordon says we are a "quick-twitch generation" and users won't stick around without instant gratification, regardless of whether they are visiting a media site or your company's user community site. "Even though it's user-generated content, users are still expecting media-type experiences," he says.
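The "directing the content to users based on proximity" half of that definition can be sketched in a few lines. This is a minimal illustration, not any vendor's actual routing logic; the edge locations and coordinates below are hypothetical examples, and real CDNs weigh network topology and load as well as raw distance.

```python
import math

# Illustrative edge locations only: (name, latitude, longitude).
EDGE_SERVERS = [
    ("us-east", 40.7, -74.0),       # near New York
    ("eu-west", 51.5, -0.1),        # near London
    ("ap-southeast", 1.35, 103.8),  # near Singapore
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, via the haversine formula."""
    r = 6371  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_edge(user_lat, user_lon):
    """Route the request to the edge location closest to the user."""
    return min(EDGE_SERVERS,
               key=lambda s: distance_km(user_lat, user_lon, s[1], s[2]))[0]

# A user near Paris is routed to the European edge.
print(nearest_edge(48.9, 2.4))  # eu-west
```

Because the same content is duplicated at every edge location, whichever server wins this selection can answer the request directly, which is what gives the "quick-twitch generation" its instant click-to-content experience.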
To achieve that, each company sets up thousands of servers around the world. Mirror Image uses a concept called Content Access Points (CAPs). Martin Haywood, director of marketing at Mirror Image, says this approach helps customers deal with traffic spikes. "Each CAP we have around the world has hundreds of servers that will deliver the content, whether it's text, video, or images; and we have a tremendous amount of bandwidth available for customers," he says.
Pedro Sanchez, senior product marketing manager at Akamai, says it's about scaling as demand changes. "The beauty of the Akamai infrastructure is that it scales on demand. If you are setting up for something like a Vista launch or a Super Bowl ad, or you could be a retailer who is going to get flooded with traffic in December, we scale infrastructure on demand. Content that is cacheable goes onto the Akamai network and is distributed on additional servers on the network as usage increases," Sanchez explains.
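The pull-through pattern Sanchez describes, where cacheable content is fetched from the customer's origin once and then served from the network as demand grows, can be illustrated with a toy edge cache. This is a conceptual sketch only; the paths and the in-memory dict standing in for the origin servers are hypothetical, and it ignores real-world concerns like cache expiry and eviction.

```python
# Stand-in for the customer's own (origin) servers.
ORIGIN = {"/ads/superbowl.mp4": b"<video bytes>"}

class EdgeCache:
    """A toy pull-through cache: miss -> fetch from origin, then serve locally."""

    def __init__(self, origin):
        self.origin = origin
        self.cache = {}
        self.origin_hits = 0  # how many times the origin was actually touched

    def get(self, path):
        if path not in self.cache:          # cache miss: pull from the origin
            self.origin_hits += 1
            self.cache[path] = self.origin[path]
        return self.cache[path]             # cache hit: served at the edge

edge = EdgeCache(ORIGIN)
for _ in range(1000):                       # a Super Bowl-style traffic spike
    edge.get("/ads/superbowl.mp4")
print(edge.origin_hits)  # 1 -- the origin answers once; the edge absorbs the rest
```

The point of the sketch is the ratio: a thousand requests cost the customer's infrastructure a single fetch, which is why cacheable content scales so cheaply as usage increases.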
Limelight has a similar structure, with servers spread across the globe, according to Gordon, but it uses what he calls an "edge" strategy to deliver the content to the end user. What that means is that the content is stored on servers all over the globe, and when a user requests a file, it gets delivered from the closest available server. This enables your company to deliver user-generated content in real time without investing in more servers to deal with changing demands, a strategy that Gordon believes makes little sense. "Peak traffic is why a customer picks a company like Limelight. It doesn't make sense for a customer to have 200 servers sitting around when normal traffic could be handled on 20 servers. On the other hand, if you have only 20 servers and get 200 servers' worth of demand, you let your visitors down," Gordon says.
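Note the word "available" in "closest available server": proximity alone isn't enough during a spike, because the nearest edge may be saturated. A minimal sketch of that fallback, with made-up server names and load figures rather than Limelight's actual algorithm, looks like this:

```python
def pick_server(servers_by_proximity, load, capacity):
    """Return the nearest server with spare capacity, or None if all are full.

    servers_by_proximity -- server names, pre-sorted from nearest to farthest
    load                 -- current request load per server (illustrative units)
    capacity             -- per-server capacity in the same units
    """
    for server in servers_by_proximity:
        if load.get(server, 0) < capacity:
            return server
    return None

# Hypothetical example: the nearest edge (Paris) is saturated by a spike,
# so the request falls through to the next-closest edge with headroom.
servers = ["edge-paris", "edge-frankfurt", "edge-newyork"]
load = {"edge-paris": 100, "edge-frankfurt": 40}
print(pick_server(servers, load, capacity=100))  # edge-frankfurt
```

This is the economics behind Gordon's 20-versus-200-servers point: no single customer pays for peak capacity, because overflow simply spills to the next-closest node in the shared network.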