© 1998-2017, Arnold Consulting Group, L.L.C.
Offices in Washington, DC
and Vienna, VA
703.629.8552


February 5, 2010

Cisco’s New Kid

Filed under: Net Neutrality, Technology — Peter Arnold

Jim Duffy at Network World reports that Cisco is going to announce a new core router to replace its aging CRS-1. The new one supposedly boasts 120 Gbps per slot, which would best Juniper’s T1600 by about 20 percent.

Let’s put this in a larger context. Cisco’s Nexus switch, which replaced the Catalyst two years ago and works on the Net’s backbone and VPNs, can process up to 15 terabits per second. If you assume that a DVD-quality H.264 video stream runs at about one megabit per second, a single Nexus could handle 15 million streams. Or enough for everyone in Illinois.
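
For the curious, the arithmetic is five lines of Python (the 15 Tbps and one-megabit figures come straight from the paragraph above; the rest is just division):

```python
# Back-of-the-envelope check of the streams-per-Nexus claim.
switch_capacity_bps = 15e12   # 15 terabits per second
stream_rate_bps = 1e6         # ~1 Mbps per DVD-quality H.264 stream

streams = switch_capacity_bps / stream_rate_bps
print(f"{streams:,.0f} simultaneous streams")   # 15,000,000
```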

Small wonder that the telcos, which have bet the farm on pumping HD video to America’s masses, were breathing like Tiger Woods on Bourbon Street when the Nexus came out. Just as they will with Cisco’s new entry.

Look at what they’re up against: AT&T’s 2.5 gigabit backbone lasted about seven years. Its 10 gigabit backbone lasted five years, and the 40 gigabit version will last perhaps three years. Given that trajectory, it’s pretty obvious why AT&T has the largest capex in the country.

At least Goldman Sachs got it right last week, noting that while higher capex reduces free cash flow, it’s still “absolutely the right move in our view.”

December 14, 2009

From the Sublime To…

Filed under: Net Neutrality, Technology, Telecom, Wireless Pricing — Peter Arnold


(Washington) … the ridiculous:

“This notion that customers must now curb their Internet usage or pay up is not only unfair to consumers, it puts up a roadblock to wireless innovation,” says Craig Aaron, senior program director at Free Press, a nonprofit group that advocates for unfettered access to communications.

The year has three more weeks to run, but it’s safe to say that the competition for its most incomprehensible, illogical statement is over.

Beyond the absurdity of Aaron’s statement, the irony is that Free Press styles itself as a consumer advocacy group. But what sort of consumer advocate could possibly support a uniform pricing system in which serial filesharers are subsidized by senior citizens who use their connection to check email twice a day?

Logic would say that Free Press and a consumer advocate like Craig Aaron should support a lower-priced option for occasional users, instead of deriding it. Wonder why they don’t…

December 10, 2009

LA Confidential

Filed under: Apple, Net Neutrality, Technology, Telecom, Wireless Pricing — Peter Arnold

(Las Vegas) The award for the year’s most puzzling headline goes to the Los Angeles Times for this doozy on David Sarno’s story about iPhone usage: “AT&T may penalize iPhone users who hog data.”

It’s a great headline, at least to the extent that “penalize” is a shorthand way of saying, “Those who consume a lot of a product should pay more than those who only use a little.” Actually, Sarno’s article is fairly straightforward and is one more example of how the current uniform data pricing structure among wireless carriers is increasingly untenable.

According to the head of AT&T’s wireless division, three percent of iPhone users are chewing up 40+ percent of the company’s bandwidth. Credit the proliferation of mobile video options. But this imbalance should hardly come as a surprise since a similar imbalance has existed on wired broadband for years.
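
To see just how lopsided that is, turn the two quoted percentages into a per-user comparison (a quick sketch using nothing beyond the figures above):

```python
# Per-user bandwidth: 3% of users take 40% of traffic, 97% take 60%.
heavy_users, heavy_data = 0.03, 0.40
light_users, light_data = 0.97, 0.60

per_heavy = heavy_data / heavy_users   # ~13.3x the overall average
per_light = light_data / light_users   # ~0.62x the overall average
print(f"A heavy user consumes ~{per_heavy / per_light:.0f}x "
      f"as much as everyone else")     # ~22x
```

In other words, the average member of that three percent consumes roughly 22 times the bandwidth of the average member of the other 97.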

Japan, which led the world in wired broadband, faced a similar issue well before America did. Thanks mostly to P2P, fewer than five percent of subscribers consumed almost half the country’s bandwidth. For a good resource on this, here’s the 2007 report on net neutrality from Japan’s Ministry of Internal Affairs and Communications.

Specifically, look at the usage charts on pages 17-19 and the conclusions on pages 59-66.

November 1, 2009

Mozilla’s Follies

Filed under: America Online, Net Neutrality, Technology — Peter Arnold


(New York) Mozilla’s Mitchell Baker and John Lilly are responsible for a great product — Firefox, the Mac version of which has been my default browser for three years. But when it comes to federal Internet regulation, Ms. Baker and Mr. Lilly seem to have fallen into the classic “Steve Case” trap.

Back in the 1990s, Case led America Online when it was the Net’s online Colossus. He let a certain starry-eyed, high-ranking executive talk him into becoming the ISPs’ point person in the cable open-access debate then raging at the FCC. But even as Case was demanding that the FCC regulate cable access, COO Bob Pittman was negotiating a merger with Time Warner. Oops. The resulting confusion delayed the merger and underscored how muddled AOL’s corporate mission had become.

Hard as it is to believe, just ten years ago, the phrase “AOL stock options” was the stuff of retirement dreams, not a punch line.

Back to Mozilla’s Baker and Lilly. In last Friday’s Wall Street Journal, they urge the FCC to regulate “neutrality” over the Net. But by holding up their own company as the reason for new regulation, they actually underscore regulation’s true problem. To see why, look at three common types of downloads:

  • Not time-sensitive. When you click on a website’s .exe or .dmg download link, there’s no real difference to you whether it downloads in 15 seconds or 20 seconds.
  • Partially time-sensitive. Some applications (e.g., file-sharing programs) require a download to be completed within a certain time. But the data transmission doesn’t have to be constant. In other words, if a file needs to be downloaded in a minute, there’s no downside if half the data is downloaded in the first 10 seconds, while the remaining half is stretched over the other 50 seconds.
  • Time- and latency-sensitive. This is the Big Enchilada. For the web’s biggest new apps (primarily movie streams, IPTV and VoIP), you not only need rapid service but also constancy of transmission speed! The one-minute download scenario above won’t work for a call or an HD stream because you need not only a high average speed but also a constant speed (see the sketch just below).
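
Here’s that distinction as a toy simulation. Every number in it is an illustrative assumption, not a measurement: the link delivers the same 80 megabits either way, and its average speed comfortably beats the stream’s playback rate, yet playback still freezes because delivery isn’t constant.

```python
# Toy model: a link that crawls for 50 seconds, then bursts for 10.
PLAYBACK_RATE = 1.0   # Mbps the video player consumes
BUFFER_SECS = 2.0     # seconds of video buffered before playback starts

delivery = [0.5] * 50 + [5.5] * 10   # Mbps delivered in each 1-second slot

total = sum(delivery)
print(f"As a file download: {total:.0f} Mb arrive within the minute. Fine.")

# As a stream: the player drains the buffer at a constant rate and
# freezes whenever the buffer runs dry.
buffered = PLAYBACK_RATE * BUFFER_SECS
stalls = 0
for rate in delivery:
    buffered += rate                 # the network fills the buffer
    if buffered >= PLAYBACK_RATE:
        buffered -= PLAYBACK_RATE    # one second of smooth playback
    else:
        stalls += 1                  # buffer dry: the picture freezes
print(f"As a stream: {stalls} s frozen despite a {total/60:.2f} Mbps average")
```

The file arrives with time to spare; the “stream” freezes for more than a third of the minute. That’s the gap between average speed and constant speed.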

But under Lilly’s and Baker’s theory, the HD stream will only transmit as fast as the Mozilla download — even if the customer is willing to pay more! This is a totally nonsensical view of the Internet. Essentially, Lilly and Baker are trying to expand a regulatory structure meant for dial-up modems to tomorrow’s high-speed systems. Yes, that protects their own turf but it shuts off one of the most obvious ways the country has to finance the broadband build-out.

Lilly and Baker should stick to improving their browser, rather than becoming enmeshed in a nasty policy fight that doesn’t even involve their company. That’s some advice that Steve Case probably wished he’d followed.

October 4, 2009

The Problem With Net Neutrality (II)

Filed under: Net Neutrality, Technology, Telecom — Peter Arnold


(New York) Two years ago, not a single U.S. television network streamed its programming over the Net. Today they all do. In 2005, YouTube didn’t exist. Arbor Networks estimates that YouTube is today responsible for 10 percent of total global online traffic.

And not only is the number of streamed/downloaded videos increasing but the quality is too. The 480p and 640p standards are being eclipsed by 720p and, probably soon, 1080p (approximately Blu-ray quality).

Look at how this impacts the data required: On iTunes, a two-minute QuickTime movie trailer in 480p (standard definition) requires 47 megabytes. The 720p file is 78 megabytes and the 1080p version requires 126 megabytes. Extrapolating from that, a two-hour movie will require between three and eight gigabytes of data.
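
The extrapolation is easy to check: a factor of 60 takes each two-minute trailer to feature length (figures from the paragraph above, assuming the bitrate holds steady):

```python
# Scale the two-minute iTunes trailer sizes up to a two-hour movie.
trailer_mb = {"480p": 47, "720p": 78, "1080p": 126}

for res, mb in trailer_mb.items():
    movie_gb = mb * 60 / 1024        # 120 minutes / 2 minutes = 60x
    print(f"{res}: ~{movie_gb:.1f} GB for two hours")
# 480p: ~2.8 GB, 720p: ~4.6 GB, 1080p: ~7.4 GB
```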

This surge of online data and the quantum increase in network complexity are worth keeping in mind the next time someone talks of the “light touch” of Net neutrality regulation.

The only way that network operators can handle the surging growth in online data from P2P filesharing, video streaming and other applications involves complex new technology and investment – lots of both.

To take one example, in 2008 Cisco unveiled a new generation of router branded “Nexus” (cost: between $75,000 and $200,000) that can process up to 15 terabits per second. As the great technology columnist Mark Stephens, a.k.a. Robert X. Cringely, has written,

“[I]f we imagine a DVD-quality H.264 video stream running typically at one megabit per second, that [router] could seemingly support 15 MILLION such data streams.”

Into this sort of complexity walks a federal program to regulate “neutrality” over such a massive (and growing) amount of data transfer. No doubt many telecom attorneys and lobbyists will earn enough to purchase homes at Rehoboth Beach from all the lobbying… er, “educating” that neutrality regulation will spur.

Not exactly a great way for America to improve its ranking in the global broadband race.

September 28, 2009

The Problem with Net Neutrality

Filed under: Net Neutrality, Technology, Telecom — Peter Arnold


Regarding FCC Chairman Julius Genachowski’s recent proposal to create a “Net neutrality” rule covering the broadband Internet, a few comments:

Let’s begin at the beginning: The Internet is not a single system. It’s a multitude of mostly private networks of varying capacity and speeds. True, they share the same basic software protocols but the networks have such fundamental differences (e.g., shared node vs. dedicated line, fiber optic vs. coaxial cable) that using the same label to describe them gives a misleading impression.

Second, there’s a big difference between Internet Protocol (IP) services, which often require end-to-end service over a Virtual Private Network (VPN), and applications run over the public Internet. For example, Net neutrality proponents often talk about how data sent over “the Internet” has always been treated equally. But VPNs have offered fee-based services that expedite data transmission for 20 years.
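
What does “expediting” look like at the packet level? Here’s a minimal sketch, assuming a Unix-like system and Python’s standard socket module (the address and port are hypothetical placeholders): the sender marks its packets with a priority code point that routers on a managed network can honor.

```python
import socket

# Mark outgoing UDP packets with the Expedited Forwarding (EF) DSCP
# value from RFC 3246, the standard marking for voice/video priority.
EF_DSCP = 46
TOS = EF_DSCP << 2    # DSCP occupies the top six bits of the old TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS)

# Routers inside a managed VPN can queue EF-marked packets ahead of bulk
# traffic; routers on the public Internet typically ignore or rewrite
# the marking at network boundaries.
sock.sendto(b"rtp-payload", ("192.0.2.10", 5004))
```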

This is a key distinction when applied to online video, especially IP television. From an engineering perspective, IP TV data cannot touch the public Internet and maintain quality comparable to cable or satellite. Even if the data spent only part of their transmission outside of the VPN, the quality would be fatally compromised.

That is why data for many time-sensitive applications (IP television, HD video streams) must travel over a dedicated network end-to-end.

That’s also why, from an economic perspective, it is a pretty radical concept for the FCC to consider mandating comparable-treatment guarantees across multiple video streams! The Net neutrality concept implies that a competitor should get video service over the public Internet comparable in speed and quality to a private network’s. Given the huge cost of deploying new technology, no ISP could afford to build that kind of network.

More coming…

September 7, 2009

Net Neutrality Follies

Filed under: Net Neutrality, Technology, Telecom, Wireless Industry — Peter Arnold


The New York Times’ recent editorial on Net neutrality sets a new standard for dismaying ignorance about how the Net actually works. It posits that the Verizon-NARAL texting snafu is a reason to support Net neutrality. But that two-year-old issue involved Verizon’s refusal to grant NARAL a short code, which has nothing to do with network operations!

The Times’ editorial also suggests that without federal neutrality regulation, “Businesses could slow down or block their competitors’ Web content.” But as Hogan & Hartson’s Christopher Wolf, one of the nation’s premier Internet attorneys, demonstrated in a detailed analysis more than three years ago:

“If the hypothetical fears of those calling for regulated ‘net neutrality’ actually do come to pass in some fashion, there are legal remedies already available under existing laws and legal doctrine.”

These options include unfair competition law, antitrust law, and common law tort theories. Moreover, as the Supreme Court confirmed in its Brand X decision, Title I of the Communications Act of 1934 gives the FCC power to take regulatory action if presented with unfair business tactics by broadband providers.

The Times’ editorial on Net neutrality shows a gross misunderstanding of the law, network operations, and the precarious dynamic of funding for Internet deployment. Other than that, it’s tolerable.

August 6, 2009

Google’s On2 Something

Filed under: Net Neutrality, Technology, Telecom, Wireless Industry — Peter Arnold

(Washington, DC) Tech guru Mark Stephens, who writes under the nom de guerre Robert X. Cringely, just posted a great analysis of Google’s recent purchase of On2, a maker of video compression software. The real issue with Google isn’t hard to figure out. Its server farms stretch from Oregon to North Carolina and form the basis of the company’s move into content delivery. Witness YouTube’s increasing attempts to monetize itself through program ads a la Hulu.

That’s also why the company has kept up the drumbeat in Washington for Net neutrality, which undercuts the development of tomorrow’s Internet and – what a coincidence! – helps to freeze in place Google’s advantage in content delivery networks.

Funny how self-interest can seem so high-minded at times…