The Problem With Net Neutrality (II)

(New York) Two years ago, not a single U.S. television network streamed its programming over the Net. Today they all do. In 2005, YouTube didn’t exist. Arbor Networks estimates that YouTube is today responsible for 10 percent of total global online traffic.

And not only is the number of streamed and downloaded videos increasing; the quality is too. The 480p and 640p standards are being eclipsed by 720p and, probably soon, 1080p (approximately Blu-ray quality).

Look at how this affects the data required: on iTunes, a two-minute QuickTime movie trailer in 480p (standard definition) requires 47 megabytes. The 720p file is 78 megabytes, and the 1080p version requires 126 megabytes. Extrapolating from those figures, a two-hour movie will require roughly three to eight gigabytes of data.
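That extrapolation is easy to check. Here is a minimal sketch, assuming file size scales linearly with runtime (a simplification; real encodes vary with the content):

```python
# Scale two-minute trailer sizes up to a two-hour movie, assuming
# file size grows linearly with runtime (a simplification).
TRAILER_MIN = 2
MOVIE_MIN = 120
SCALE = MOVIE_MIN / TRAILER_MIN  # 60x

trailer_mb = {"480p": 47, "720p": 78, "1080p": 126}  # iTunes trailer sizes

for resolution, mb in trailer_mb.items():
    movie_gb = mb * SCALE / 1024  # megabytes -> gigabytes
    print(f"{resolution}: ~{movie_gb:.1f} GB for a two-hour movie")
# 480p: ~2.8 GB, 720p: ~4.6 GB, 1080p: ~7.4 GB
```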

This surge of online data and the accompanying leap in network complexity are worth keeping in mind the next time someone talks of the “light touch” of Net neutrality regulation.

The only way network operators can handle the surging growth in online data from P2P file sharing, video streaming and other applications is through complex new technology and investment – lots of both.

To take one example, in 2008 Cisco unveiled a new generation of router branded “Nexus” (cost: between $75,000 and $200,000) that can process up to 15 terabits per second. As the great technology columnist Mark Stephens, a.k.a. Robert X. Cringely, has written,

“[I]f we imagine a DVD-quality H.264 video stream running typically at one megabit per second, that [router] could seemingly support 15 MILLION such data streams.”
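Cringely’s back-of-the-envelope figure checks out, as a quick sketch shows (the one-megabit-per-second stream rate is his stated assumption):

```python
# How many 1 Mbps video streams fit through a 15 Tbps router?
ROUTER_TBPS = 15   # Cisco Nexus-class throughput
STREAM_MBPS = 1    # DVD-quality H.264, per Cringely

router_mbps = ROUTER_TBPS * 1_000_000  # terabits -> megabits
print(f"~{router_mbps // STREAM_MBPS:,} concurrent streams")  # ~15,000,000
```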

Into this sort of complexity walks a federal program to regulate “neutrality” over such a massive (and growing) amount of data transfer. No doubt many telecom attorneys and lobbyists will earn enough to purchase homes at Rehoboth Beach from all the lobbying… er, “educating” that neutrality regulation will spur.

Not exactly a great way for America to improve its ranking in the global broadband race.