Twitter’s Follies




(Las Vegas) There’s something risibly ironic about Twitter CEO Evan Williams co-signing a letter to the FCC that calls on the Commission to “begin a process to adopt rules that preserve an open Internet.” Leaving aside the irony of a federal agency drafting “openness” regulations, what’s amusing is how Williams seems to have forgotten how easily his own company leapt to an unwarranted conclusion two years ago on precisely this issue. Back in 2007, some T-Mobile customers began having difficulty twittering. On December 14, co-founder Biz Stone blogged: “Hey folks. T-Mobile has definitely turned us off without notification.” Oops! A few days later, the company was forced to admit that it had botched its investigation. There had been no blocking at all; the outage was “purely a technical issue between T-Mobile and Ericsson, the folks who serve our SMS traffic.” Stone acknowledged that his earlier comments had been made on the basis of “limited information.” (Memo to CorpCom departments: Never let your press releases get in front of the facts!)

A similar incident occurred in 2006, when some Cox customers could not access craigslist. At least give Craig Newmark credit for admitting that the issue was “a genuine bug.” Many of the Net neutrality supporters who took up his cause, however, weren’t as dependent on the facts for their conclusions.

Alas, eBay CEO and Dartmouth grad John Donahoe also signed the letter. You’re doing a great job at eBay, John, but resist the temptation to dive into federal regulation. You wouldn’t be the first tech CEO to get burned by a federal policy fight. Remember how Steve Case began lobbying for open access on cable — right around the time AOL was negotiating its merger with Time Warner?

The Problem With Net Neutrality (II)




(New York) Two years ago, not a single U.S. television network streamed its programming over the Net. Today they all do. In 2005, YouTube didn’t exist. Arbor Networks estimates that YouTube is today responsible for 10 percent of total global online traffic.

And not only is the number of streamed and downloaded videos increasing; the quality is too. Standard-definition formats such as 480p are being eclipsed by 720p and, probably soon, 1080p (approximately Blu-ray quality).

Look at how this affects the data required: on iTunes, a two-minute QuickTime movie trailer in 480p (standard definition) requires 47 megabytes. The 720p file is 78 megabytes, and the 1080p version requires 126 megabytes. Extrapolating from that, a two-hour movie will require roughly three to eight gigabytes of data.
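As a back-of-the-envelope check, here is that extrapolation in a few lines of Python (the trailer sizes are the iTunes figures cited above; real movie encodes will vary with the content):

```python
# Scale the iTunes trailer sizes cited above from a two-minute
# trailer to a two-hour feature. Illustrative only: encoders vary
# the bitrate with the content, so treat these as rough estimates.
TRAILER_MIN = 2
MOVIE_MIN = 120

trailer_mb = {"480p": 47, "720p": 78, "1080p": 126}

for resolution, mb in trailer_mb.items():
    movie_gb = mb * (MOVIE_MIN / TRAILER_MIN) / 1024
    print(f"{resolution}: ~{movie_gb:.1f} GB for a two-hour movie")
# 480p: ~2.8 GB, 720p: ~4.6 GB, 1080p: ~7.4 GB
```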

This surge of online data and the quantum increase in network complexity are worth keeping in mind the next time someone talks of the “light touch” of Net neutrality regulation.

The only way network operators can handle the surging growth in online data from P2P file sharing, video streaming and other applications is with complex new technology and investment – lots of both.

To take one example, in 2008 Cisco unveiled a new generation of router branded “Nexus” (cost: between $75,000 and $200,000) that can process up to 15 terabits per second. As the great technology columnist Mark Stephens, a.k.a. Robert X. Cringely, has written:

“[I]f we imagine a DVD-quality H.264 video stream running typically at one megabit per second, that [router] could seemingly support 15 MILLION such data streams.”
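Cringely’s arithmetic holds up, as a quick sanity check confirms (it ignores protocol overhead and assumes the full 15 terabits per second is usable):

```python
# Sanity-check the figure quoted above: how many 1 Mbps H.264
# streams fit through a 15 Tbps router, ignoring protocol overhead?
ROUTER_TBPS = 15
STREAM_MBPS = 1

streams = ROUTER_TBPS * 1_000_000 / STREAM_MBPS
print(f"{streams:,.0f} concurrent streams")  # 15,000,000
```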

Into this sort of complexity walks a federal program to regulate “neutrality” over such a massive (and growing) amount of data transfer. No doubt many telecom attorneys and lobbyists will earn enough to purchase homes at Rehoboth Beach from all the lobbying… er, “educating” that neutrality regulation will spur.

Not exactly a great way for America to improve its ranking in the global broadband race.

The Problem With Net Neutrality




Regarding FCC Chairman Julius Genachowski’s recent proposal to create a “Net neutrality” rule covering the broadband Internet, a few comments:

Let’s begin at the beginning: The Internet is not a single system. It’s a multitude of mostly private networks of varying capacity and speed. True, they share the same basic software protocols, but the networks have such fundamental differences (e.g., shared node vs. dedicated line, fiber optic vs. coaxial cable) that using a single label to describe them gives a misleading impression.

Second, there’s a big difference between Internet Protocol (IP) services, which often require end-to-end delivery over a Virtual Private Network (VPN), and applications that run over the public Internet. Net neutrality proponents, for example, often claim that data sent over “the Internet” has always been treated equally. But carriers have been selling fee-based VPN services that expedite data transmission for 20 years.
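For the curious, here is a minimal sketch of what “expedited” transmission looks like from the application side: marking packets with the DiffServ Expedited Forwarding code point. This is an illustration, not any carrier’s actual service; the address and port are placeholders, and whether routers honor the marking depends entirely on the operator’s (typically paid) QoS policy.

```python
import socket

# Minimal sketch: an application requesting expedited treatment by
# setting the DiffServ "Expedited Forwarding" code point (DSCP 46)
# in the IP header. Routers honor the marking only if the operator's
# QoS policy says so; on the public Internet it is usually ignored.
EF_TOS = 46 << 2  # DSCP occupies the top six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)
sock.sendto(b"latency-sensitive payload", ("192.0.2.10", 5004))  # placeholder endpoint
```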

This is a key distinction when applied to online video, especially IP television. From an engineering perspective, IPTV data cannot touch the public Internet and still maintain quality comparable to cable or satellite. Even if the data spent only part of the journey outside the VPN, the quality would be fatally compromised.
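A rough sketch makes the engineering point concrete (the bitrate and jitter figures below are assumptions for illustration, not measurements): a receiver must buffer enough video to ride out the worst-case delay variation, and every millisecond of buffer is a millisecond of added delay when the viewer changes channels.

```python
# Illustrative only: how much a receiver must buffer to absorb
# network jitter, using assumed (not measured) figures.
STREAM_MBPS = 8  # assumed bitrate for an HD IPTV stream

jitter_ms = {"managed end-to-end network": 5, "public Internet path": 250}

for path, ms in jitter_ms.items():
    buffer_kb = STREAM_MBPS * 1000 / 8 * ms / 1000  # Mbps -> KB buffered
    print(f"{path}: ~{buffer_kb:.0f} KB of buffer, {ms} ms of added delay")
```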

That is why data for many time-sensitive applications (IP television, HD video streams) must travel over a dedicated network end-to-end.

That’s also why, from an economic perspective, it is a pretty radical concept for the FCC to consider mandating comparable-treatment guarantees for multiple video streams. The Net neutrality concept implies that a competitor should be able to offer video service over the public Internet comparable in speed and quality to a private network. Given the huge cost of deploying new technology, no ISP could afford to build that kind of network.

More coming…

Net Neutrality Follies




The New York Times’ recent editorial on Net neutrality sets a new standard for dismaying ignorance about how the Net actually works. It posits that the Verizon-NARAL texting snafu is a reason to support Net neutrality. But that two-year-old issue involved Verizon’s refusal to grant NARAL a short code, which has nothing to do with network operations!

The Times’ editorial also suggests that without federal neutrality regulation, “Businesses could slow down or block their competitors’ Web content.” But as Hogan & Hartson’s Christopher Wolf, one of the nation’s premier Internet attorneys, demonstrated in a detailed analysis more than three years ago:

“If the hypothetical fears of those calling for regulated ‘net neutrality’ actually do come to pass in some fashion, there are legal remedies already available under existing laws and legal doctrine.”

These options include unfair competition law, antitrust law, and common law tort theories. Moreover, as the Supreme Court confirmed in its Brand X decision, Title I of the Communications Act of 1934 gives the FCC power to take regulatory action if presented with unfair business tactics by broadband providers.

The Times’ editorial on Net neutrality shows a gross misunderstanding of the law, network operations, and the precarious dynamic of funding for Internet deployment. Other than that, it’s tolerable.

Facebook Ups Its Privacy Emphasis




(Washington, DC) By one estimate, more than 24 percent of U.S. Internet users have Facebook accounts, as do 43 percent of the online audience in the UK. Facebook’s servers host an amazing 10 billion photos.

So it’s good to see that the company just improved its privacy policies, making them more transparent to the end user. For an excellent analysis by one of the top privacy commentators in the country, click here.

God Save the Queen



(Washington, DC) I’m writing the following with apologies to a certain Proskauer Rose partner who’s an occasional reader, a good friend and a litigator on behalf of the recording industry. The Register (Great Britain) reports that the British Government appears to be succumbing to yet another doomed recording industry-induced effort to stop music piracy through government fiat.

It won’t work. A file sharer can incorporate some pretty simple compression and/or encryption technology to bypass government snooping.
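To see how low the bar is, here is a minimal sketch using Python’s third-party cryptography package (the file names are hypothetical): once the payload is encrypted, traffic inspection sees only opaque bytes.

```python
# Minimal sketch: encrypt a file before sharing it, so that network
# inspection sees only opaque bytes. Requires the third-party
# "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # would be exchanged out-of-band between peers
cipher = Fernet(key)

with open("track.mp3", "rb") as f:  # hypothetical input file
    ciphertext = cipher.encrypt(f.read())

with open("track.enc", "wb") as f:  # what actually crosses the wire
    f.write(ciphertext)
```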

But here’s the bigger issue: The recording labels need to stop looking at file sharing as a legal issue and instead recognize the far bigger consumer issue. The industry’s litigation in the U.S. was not only an abysmal legal failure, it completely backfired. Its heavy-handed tactics crystallized sentiment among the target audiences that the companies were clueless about the Internet. That fed the idea that “It’s OK to steal from them.”

Fortunately, the recording labels have finally begun to recover from this disaster. The moral: Don’t insert your attorneys between your customers and what they want.

And if the Proskauer attorney is still reading, I’ll pick up the lunch tab at our next get-together.

Google’s On2 Something



(Washington, DC) Tech guru Mark Stephens, who writes under the nom de guerre Robert X. Cringely, just posted a great analysis of Google’s recent purchase of On2, a maker of audio and video compression software. The real issue with Google isn’t hard to figure out. Its server farms stretch from Oregon to North Carolina and form the basis of the company’s move into content delivery. Witness YouTube’s increasing attempts to monetize itself through program ads à la Hulu.

That’s also why the company has kept up the drumbeat in Washington for Net neutrality, which undercuts the development of tomorrow’s Internet and – what a coincidence! – helps to freeze in place Google’s advantage in content delivery networks.

Funny how self-interest can seem so high-minded at times.
