P2P in B2B: Getting Past the "N" Word



Legally Mandated P2P?
That shift may be legislatively endorsed as well. In an ironic twist, some of the same techniques used by software pirates could be central to the U.S. government's plan to share data more effectively among its disparate departments and agencies. The E-Government Act, passed in December 2002, calls for agencies to develop standards for categorizing government information so that it can be searched across agencies. Currently, the Environmental Protection Agency, the U.S. Geological Survey, and the Department of Housing and Urban Development are among the federal groups deploying distributed content systems like NextPage to move the U.S. government to the next level of interdepartmental information sharing. Call it federal Napsterization (oh no, there's that "N" word).

"We have what we call a federal government content network," says Brand Neiman, chair of the XML services working group for the Federal CIO Council, which is testing peer-to-peer arrangements among departmental servers and evangelizing these emerging technologies to others in the government. "It allows us to set up a system of hierarchical folders, and they literally look like they are all on one machine, but they are different content types on different servers. They are using the XML standard and messaging between the servers to communicate what their content is," says Neiman.

Niemann's dream is to use peer-to-peer sharing so that complex inter-agency content projects like the annual Statistical Abstract of the United States could be made into a "distributed living document" online. This printed compilation of data usually lags a year behind the most recent figures because it takes so much time and manpower to aggregate information from 200 locations into 40 chapters and 1,500 tables. "They can't produce the document until the last paragraph of the last table is in," says Niemann. In a distributed system, the raw reports remain in the hands of the people and agencies creating and updating the data, sometimes on a user's own desktop machine. The P2P technology tracks this real-time information across desktops and servers so that the Abstract exists as a unified whole for the end user. "It's one of the best examples we have in the government of how something that originates in 200 locations and groups could be managed in a distributed content network and be extremely effective," says Niemann. "It's not only creating the same document, but making more accessible the background information," because peer sharing lets contributors from other agencies drill down into any piece of data on another server to see the programs and statistics behind the top-line numbers.
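
That architecture is easy to picture as code: each table stays with the agency that produces it, and the compiled document is assembled on demand from live sources, so no chapter waits on "the last paragraph of the last table." The agency servers, table titles, and figures below are entirely hypothetical.

    from datetime import date

    # Each table points at the agency that owns it, rather than a copied value.
    # The lambda stands in for a live fetch from that agency's server.
    SOURCES = {
        "Table 12. Resident Population": lambda: ("census.example.gov", 285_000_000),
        "Table 731. Crude Oil Production": lambda: ("eia.example.gov", 5_800_000),
    }

    def render_abstract():
        """Assemble the document on demand, so it is never stale by design."""
        lines = [f"Statistical Abstract (assembled {date.today()})"]
        for title, fetch in SOURCES.items():
            server, value = fetch()  # pulled live from the owning agency
            lines.append(f"{title}: {value:,}  [drill down: {server}]")
        return "\n".join(lines)

    print(render_abstract())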

The Grid Discount
While real-time content updating and more effective collaboration among people and departments are some of P2P's sexier attributes, nothing is more attractive to IT managers than cost savings. Companies like Kontiki and Bandwiz are getting a lot of attention by blending grid networking with peer-to-peer techniques so that businesses can send massive video and presentation files out to their own people without investing in more bandwidth. "We use bandwidth harvesting techniques to cache content at the outer edge of the network on other peers or nodes on the network," says Mark Szelenyi, director of enterprise marketing at Kontiki. Instead of clogging inter-office data lines or email servers by transferring the same large file again and again, the Kontiki system sends a file from the originator's server only for the first few requests. As this same file gets downloaded to desktops and servers at remote offices, a directory server tracks where the file now resides.
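
A rough sketch of that bookkeeping, assuming a simple registry keyed by file: the origin server answers the first request, each completed download is registered, and later requests are steered to peers that already hold the file. The class and method names here are illustrative, not Kontiki's actual API.

    class DirectoryServer:
        """Tracks where copies of each file reside as downloads complete."""

        def __init__(self, origin):
            self.origin = origin
            self.replicas = {}  # file_id -> set of hosts that hold a copy

        def locate(self, file_id):
            """List every known holder of the file, origin last as a fallback."""
            return sorted(self.replicas.get(file_id, set())) + [self.origin]

        def register(self, file_id, host):
            """Record a completed download, making that host a new source."""
            self.replicas.setdefault(file_id, set()).add(host)

    directory = DirectoryServer(origin="hq-media-server")
    print(directory.locate("training.mpg"))  # first request: only the origin
    directory.register("training.mpg", "chicago-desktop-7")
    print(directory.locate("training.mpg"))  # now a nearby peer is available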

With the same P2P client technology that it uses in consumer-facing download accelerators for CNet, Kontiki lets the end-user machine consult a directory server, which tells the client every location where the content resides, perhaps even on the P2P-connected PC in the next cubicle. "We'll deliver a file for a user from up to twelve sources and piece it together on the receiving end," says Szelenyi. As in distributed computing or grid systems, redundancy is built in: if one node goes offline during a download, the software knows how and where to add another source.
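
The failover logic Szelenyi describes can be sketched in a few lines: split the file into chunks, pull each chunk from whichever peer responds, and reassign a chunk whenever its source drops offline. Everything below, including the simulated 20% failure rate and the peer names, is invented for illustration.

    import random

    def fetch_chunk(peer, chunk_id):
        """Simulated transfer; a peer that has dropped offline raises an error."""
        if random.random() < 0.2:  # assume a 20% chance any node has gone away
            raise ConnectionError(f"{peer} went offline")
        return f"<chunk {chunk_id} from {peer}>"

    def multisource_download(peers, num_chunks):
        chunks = {}
        for chunk_id in range(num_chunks):
            while chunk_id not in chunks:  # keep trying until the chunk lands
                peer = random.choice(peers)  # in practice, the directory assigns
                try:
                    chunks[chunk_id] = fetch_chunk(peer, chunk_id)
                except ConnectionError:
                    pass  # redundancy: swap in another source and retry
        return [chunks[i] for i in range(num_chunks)]  # reassemble in order

    print(multisource_download(["cubicle-pc", "branch-server", "origin"], 4))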

Poised to make a splash later this year, Bandwiz is using a similar approach in proof-of-concept tests to distribute content among 68 remote offices for a major oil company, as well as e-learning materials for an insurance company with thousands of sales offices. "We send this content only once through the WAN to the branches and sites," says Daniel Sapir, VP of business development. "There it gets deposited in one or more desktops and then it goes P2P," he says. "Once you get it to the local site, the bandwidth is ample. Recipients get a notice that it is there. They click on a URL, and it is delivered from the local peer. If another site needs the same thing, it can get it from the nearest physical site."
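
Sapir's pattern reduces to a simple rule: the first copy to a site crosses the WAN, and everything after that stays on the local LAN or comes from the nearest site that already has the file. The sketch below simulates that rule with hypothetical sites, hosts, and relative transfer costs.

    WAN_COST, LAN_COST = 100, 1  # assumed relative transfer costs

    site_copies = {}  # site -> a local host that already holds the content

    def deliver(site, host):
        if site in site_copies:  # a local peer has it: cheap LAN transfer
            print(f"{host}: LAN fetch from {site_copies[site]} (cost {LAN_COST})")
        elif site_copies:  # another site has it: fetch from the nearest one
            nearest = next(iter(site_copies))  # stand-in for a proximity lookup
            print(f"{host}: inter-site fetch via {nearest} (cost {WAN_COST})")
            site_copies[site] = host
        else:  # nobody has it yet: the one and only trip from headquarters
            print(f"{host}: first copy over the WAN from HQ (cost {WAN_COST})")
            site_copies[site] = host

    deliver("houston", "hou-desktop-1")  # WAN: first copy reaches the site
    deliver("houston", "hou-desktop-2")  # LAN: served by the local peer
    deliver("tulsa", "tul-desktop-1")    # fetched from the nearest site, then cached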

The real benefits of P2P content distribution in the enterprise are not always obvious, but they have the potential to change the way people work. Breaking the old bandwidth restrictions opens the door to much richer content, like training or support video, within an existing infrastructure. Before P2P, "these types of applications weren't able to use digital media because of the strain it creates on the network," says Szelenyi. The Kontiki system delivers video training to Nextel's enterprise sales force, and it lets Palm, Inc. create and distribute downloadable videos that answer the top ten questions that usually send customers to costly phone support. "At $15 a phone call, the ROI here is pretty high," says Kontiki's Szelenyi. And at the very least there are clear workplace efficiencies, says Bandwiz's Sapir. "The value is that you make the content available to the specific audience that you want, and they get it immediately on their desktop."
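
For a sense of scale on that claim, a back-of-the-envelope calculation: only the $15-per-call figure comes from Szelenyi; the call volume and deflection rate below are invented for illustration.

    cost_per_call = 15.00   # the per-call figure Szelenyi cites
    monthly_calls = 10_000  # assumed support volume
    deflection_rate = 0.30  # assumed share of calls the videos now answer

    monthly_savings = monthly_calls * deflection_rate * cost_per_call
    print(f"Estimated support savings: ${monthly_savings:,.0f}/month")  # $45,000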
