What a long strange trip it’s been….

Filed under: Technology,White House

A remarkable event happened in DC this week: something was actually accomplished with a minimum of bluster. On Thursday, the Senate voted overwhelmingly to approve a permanent ban on state authority to tax Internet access. The House has already passed the bill, titled the Internet Tax Freedom Act, which now heads to the President; his aides have signaled his support.

I’ve been involved with this issue for more than 15 years, and since it’s Thursday, perhaps a “TBT” is in order – namely, my first MSNBC appearance. It was back in 2001; Sen. Ron Wyden (D-OR) and I discussed taxes and broadband regulation:



An aging Apple?

Filed under: Apple,Wireless Industry

According to a new report from Nielsen SoundScan, annual U.S. digital music sales have declined for the first time since the iTunes Store opened in 2003. Digital track sales fell from 1.34 billion units in 2012 to 1.26 billion in 2013, a decline of about 6%. Digital album sales slipped 0.1%, while CD sales went into free fall – down nearly 15% last year.

The driving force behind these numbers appears to be the growth of streaming music services, which suggests that creative destruction in the music industry is alive and well.

But that aside, the more interesting tech parlor game is how this change will affect Apple.  After all, no other company has used the music industry’s current business model more effectively and more profitably.  Apple has married innovative hardware and elegant software into first-class (read: expensive, high-margin) products.

This strategy worked because everyone needed personal hardware (iPods, iPhones, Macs) to access entertainment.  The “cloud” was not a viable mass-market option for most of the past decade, and Steve Jobs repeatedly dismissed the concept of streaming music services.

But with Spotify, Pandora, Vevo (the web’s #3 video publisher, behind YouTube and Facebook) and others offering attractive services to fit changing demands, the concept of buying single songs no longer has the same allure.  Yes, iTunes Radio is a good entrant, but it was woefully late.

In the past, Apple cannibalized sales in order to create new products.  The iPhone did that with the iPod.  But Apple back then was a different company.  Its near-death experience in 1997 was still recent history when the company began work on the iPhone in 2004, despite protests from the iPod division.

With the music industry seeing the impact of cloud-based services, Apple’s model no longer looks as imposing.  Moreover, a $487 billion company (Apple today) doesn’t necessarily act with the quickness of a $2.3 billion company (Apple in 1997).

The song will not remain the same.  Stay tuned.

Microsoft: “with hollow eye and wrinkled brow”

The slow erosion of Microsoft’s brand continues.  New York Times columnist Randall Stross has a wonderful essay this morning on how the solons in Redmond are using Windows 8 as an excuse to phase out the eye-rolling “Live” appendage to its desktop software:

After so many years of pushing the Windows Live brand in so many products, the company couldn’t easily drop the branding, even if executives had come around to the idea that it was misbegotten. But the imminent arrival of a new version of its flagship PC operating system, Windows 8, seems to provide cover for the change.

While schadenfreude at Microsoft’s expense is a frequent occurrence these days, the import of this is larger than a misbegotten marketing strategy.  Stross covers one key point and passes over another.  First, as Stross rightly notes, the emphasis on “Live” was Microsoft’s attempt to capitalize on real-time integration of software programs and users across the web.  This worked fine for Xbox where the addition of real-time gaming actually lent itself to the “Live” moniker.  But as PC users gravitated toward the web, particularly on mobile phones when 3G became ubiquitous, something else happened: The central focus on the PC and all its attendant software began to fragment. Microsoft couldn’t respond as Apple and Google effectively grabbed the smartphone OS market.

The second point, which derives from this, is that the web (especially the mobile web) shifted the public’s focus to third-party software developers. This is another big Microsoft weak point, since the company has spent nearly 20 years expanding its bundled software in an effort to stamp out third-party competition.  That strategy worked fine when dial-up ruled, but as mobile broadband expanded, it was doomed.

Memo to former AG Reno: You could’ve saved a lot of time and trouble.  The marketplace worked.

May the farce be with you

Wall Street Journal reporter Jeffrey Trachtenberg’s article this week on e-book pricing is a depressing reminder that Karl Marx was right: history does repeat itself, first as tragedy and then as farce.  Trachtenberg’s focus is the inflated pricing structure creeping into e-books, but if you changed the topic from books to music, his article could have appeared in 1998.

To put this in perspective, here’s a thought worth at least passing consideration: how did the music industry, with all the entrenched legal and financial heft of the major labels during the 1980s and 1990s, manage to end up dominated by a computer company?  Call it arrogance, in the sense that the guiding philosophy for too long involved a refusal to give consumers what they were clamoring for.  Meanwhile, the only proactive strategy seemed to be launching lawsuits against tweens and grannies.

Remember Liquid Audio?  No?  OK, here goes: It started in the mid-1990s as a music download service and had the potential to succeed, driving home broadband adoption in the process.  Reportedly Steve Jobs nosed around the company early on, considering purchasing it as a way to jumpstart Apple’s nascent moves into the music industry.

But Liquid Audio never caught on because (gee, this is shocking) it could never secure sufficient licensing agreements from the music labels.  Meanwhile, even as two-hour movies on DVD were selling everywhere for $14, the music industry kept insisting that the same price was reasonable for a 45-minute audio disc with perhaps eight songs – in other words, the same product as Elvis Presley’s 1957 Christmas Album, albeit in a more technologically sophisticated format.

Back to e-books.  With the music industry’s pricing arrogance serving as a case study in how to anger consumers, the publishing conglomerates have a serious need for entry-level marketing help.  This story has been told before, and the ending is inevitable:

Round the decay
Of that colossal wreck, boundless and bare
The lone and level sands stretch far away.

R.I.P., Steve

During the coming days, there will be testimonials to Steve Jobs that border on hagiography. (Erica Ogg at GigaOm is already out with a good one.) He might not have changed “the world,” but he certainly changed large parts of it. More than anyone else, Jobs convinced the music industry to abandon the circular-firing-squad mentality that resulted in the perverse concept of engaging the next generation of buyers by suing, or threatening to sue, them.

But twenty years earlier, he also foresaw the role of the graphical interface in personal computing while most of the industry was fixated on hardware. Think back to 1980: IBM not only rebuffed Bill Gates’ proposal to sell the concept behind Windows’ forerunner, it told Gates that its own operating system would push Microsoft out of business. That was the climate into which Jobs pushed Apple just a few years later.

This author remembers it well. In 1984, I purchased my first computer: a 128 KB Mac, preloaded with basic software (anyone remember MacWrite?), that Dartmouth made available for about $1,000. The word processing was definitely preferable to typing on a Smith-Corona but had its limitations. The maximum size of a text document was about six to seven pages, which meant that my senior thesis on Jean-Paul Sartre and European totalitarianism actually comprised three separate documents.

Apple would later founder without Jobs. It suffered terribly under Gil Amelio, whose lack of foresight can be summed up by his brusque dismissal of the Internet in his memoir, On the Firing Line.

Finally, it’s worth noting that in addition to the iPod, iPhone, Mac and everything else, Jobs had a hand in the greatest commercial in TV history. According to Apple lore, the board was appalled when Jobs previewed the “1984” ad in late 1983, but Jobs persisted. Nearly 30 years later, no other commercial has come close.

R.I.P., Steve Jobs.

Why Apple might want to do the Hulu

Filed under: Apple,IBM,Technology

Why would Steve Jobs be interested in Hulu? The answer may be in Tar Heel country – specifically Maiden, North Carolina, near I-77.

That’s where you’ll find Apple’s mammoth new data center, built to handle iCloud and a lot more.  How much more?  Look at some numbers: the center is either 500,000 square feet (AppleInsider) or a million (Robert Cringely). By comparison, according to Cringely, IBM’s Special Events Web Service, which handled data for the Olympics, ran on three data centers with a combined 2,000 square feet.

Any way you calculate it, iCloud by itself will not generate the data needed to give Apple anything approaching a suitable return on capital.  But wait!  Isn’t Apple trying to fast-track the consumer transition to the cloud by phasing out optical drives and minimizing internal storage?  The new Mac mini, introduced this week, has no optical drive, and neither does the MacBook Air, which replaces Apple’s aging MacBook.  Meanwhile, you’ve never been able to play a Blu-ray disc on your Mac.

But that’s still not nearly enough data to pay for the Maiden center, especially given Moore’s Law (“the number of transistors on a chip will double about every two years”).

Let’s go one step further and factor in compression.  Even aging chips in the Apple line have an H.264 encoder/decoder that can apparently compress a 1080p audio/video stream into about four megabits per second.  That compares with roughly 20 megabits per second in normal circumstances, or perhaps 24 for better HDTV.
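For a rough sense of scale, here’s a back-of-the-envelope sketch of what those bitrates mean for the data a center like Maiden would push. The figures are the ones cited above; the two-hour movie length is an illustrative assumption.

```python
# Back-of-the-envelope: total transfer for a 2-hour 1080p movie
# at the bitrates mentioned above (illustrative figures only).

def stream_size_gb(bitrate_mbps: float, hours: float) -> float:
    """Total transfer in gigabytes for a stream at the given bitrate."""
    bits = bitrate_mbps * 1_000_000 * hours * 3600  # Mbps -> bits
    return bits / 8 / 1_000_000_000                 # bits -> bytes -> GB

h264 = stream_size_gb(4, 2)   # H.264-compressed 1080p stream
hdtv = stream_size_gb(20, 2)  # typical uncompressed-quality HDTV feed

print(f"H.264 @ 4 Mbps:  {h264:.1f} GB")   # 3.6 GB
print(f"HDTV  @ 20 Mbps: {hdtv:.1f} GB")   # 18.0 GB
```

In other words, compression cuts each movie stream by roughly a factor of five, which is exactly why the video-serving math starts to look plausible for a data center of that size.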

Here’s the short answer on Hulu: Apple has pumped a huge amount of money into North Carolina but its returns, even if iCloud takes off, will be paltry-to-negative.  Even if you forget the money (Apple is sitting on $76 billion, after all), the black eye for the company would be a huge embarrassment.

So Steve and his colleagues need to do something fast: buy Hulu and integrate it into iTunes.  It’s not a great solution but, as Richard Dreyfuss demanded of Roy Scheider in Jaws, “You got any better ideas, hot shot?”

Once more unto the… cloud?

Apologies for the long drought in blog updates.  Aaron Burr once said, “Delay may give clearer light as to what is best to be done” which may explain why Alexander Hamilton took the first shot in their duel.  But it’s not great advice for blogging.

Two events in the computing world this past week show that Serendipity is more than just a well-known Manhattan eatery: IBM commemorated its 100th anniversary, and Google made its first foray into consumer hardware since last year’s Nexus One disaster.

First up, IBM.  Thirty years ago, Bill Gates offered to sell IBM the rights to the precursor of the Windows OS for about $100,000.  IBM refused, not comprehending the idea of profits in software outside its own operating system.  The driving principles in IT for the next generation offered a cruel lesson: by the late 1980s, minicomputers had pushed aside mainframes, since VAXes were about a third the cost of an IBM 3090 and smaller, too.  By the mid-1990s, PCs had eclipsed minicomputers since they were, in turn, about a third the cost of a VAX.

IBM’s saving grace was bringing in Dartmouth grad Lou Gerstner in 1993, whose approach to business (and himself) was so amoral that he had served on the board of a well-known cancer advocacy group only to resign to become head of RJR Nabisco.  But Gerstner understood computing’s changing dynamics, quickly reversing John Akers’ disastrous idea of quasi-autonomous divisions pursuing conflicting goals.  Instead, Gerstner killed OS/2 and turned the company’s focus to high-margin client services. (By 2004, his successor had spun off the PC business to Lenovo.)

So here’s where tomorrow’s fun starts: As Walter Mossberg writes in today’s Wall Street Journal, Google’s new Chromebook computers are:

“essentially full-screen Web browsers designed to do everything via the Internet. Instead of using traditional programs, you will rely on ‘Web apps’ accessed through the browser—email programs, word processors or photo editors, for example.”

Among the benefits, notes Mossberg:

Because all your apps, settings and files are stored in the cloud, if you lose your Chromebook, or wish to use someone else’s Chromebook, you can just log into your Google account and all your stuff will appear on the new machine.

Google automatically updates the operating system, so you don’t have to deal with manual updates.

Look at the numbers driving this and the logical outcome.  Unlike certain industry observers, numbers don’t lie.  Depending on a company’s infrastructure (OK, a big “if”), cloud-based computing costs are at least a third cheaper than PC-based computing once you include the time and cost of customizing software, keeping it current, and adding security.

If the mainframe-to-VAX-to-PC experience holds, within another five years the savings will approach 80 percent.  So if you’re in college now with your heart set on joining an IT department, your odds are about as good as beating the house at Caesars.
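The arithmetic behind those claims is worth a quick sketch. The one-third figures are the rough ratios cited in this post and the IBM discussion above, not precise data:

```python
# Rough arithmetic behind the cost claims above (illustrative only).

def savings(cost_ratio: float) -> float:
    """Fractional savings when the new platform costs `cost_ratio`
    times as much as the old one."""
    return 1 - cost_ratio

# Cloud today: "at least a third cheaper" than a PC -> ~33% savings.
today = savings(2 / 3)

# Past shifts (mainframe -> VAX -> PC) settled near "a third the cost"
# of the prior generation -> ~67% savings, on the way toward 80%.
historic = savings(1 / 3)

print(f"Cloud vs. PC today:          {today:.0%}")
print(f"If the historical ratio holds: {historic:.0%}")
```

If the cloud follows the same trajectory as the earlier transitions, the gap between today’s one-third discount and the historical two-thirds discount is exactly the runway that closes over the next five years.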

By that time, server farms such as the one Apple is about to bring online in North Carolina to power iCloud will have proliferated globally, as will fast, low-cost M2M wireless connections.  At that point, your data is going into the cloud, backed by 256-bit encryption, which isn’t perfect but is still far preferable to today’s system.

The Chromebook will not be a commercial success because consumer habits, especially in technology, are not conditioned for changes this radical, unless the company peddling it is based in Cupertino.  But cracks are in the Titanic’s hull and water is seeping in.  In another five to six years, Microsoft’s Office and Windows will evoke memories of Ozymandias, which is too bad insofar as Windows 7 is actually a worthy competitor to Snow Leopard.  But it is too little, too late.

“There is no reason for any individual to have a computer in his home”

Digital Equipment Corporation (DEC) founder and longtime CEO Ken Olsen died this week at 84.  By all accounts, Olsen was a decent man, but his passing is a timely reminder of the problems of linear thinking about technology.

Olsen missed the PC revolution because he didn’t see the oncoming OS improvements that would make the computer a consumer product.  He wasn’t alone.  Around 1980, IBM rejected a young Bill Gates’ offer to sell the forerunner of the Windows OS for about $100,000.  As decisions go, that’s akin to General Pickett telling his troops to take Cemetery Ridge.

Olsen lasted as DEC’s CEO for another 15 years after making the above comment in 1977.  But by then, the company was well into its death spiral.

Fast forward to today: The PC era is ending and the OS as we’ve known it for a generation won’t matter because what made it important to us is quickly migrating to racks in a dark data center. Devices are increasingly coming preloaded with the OS on a chip, while apps and data are stored in the cloud.

Meanwhile, our computer gizmos are falling into two groups: mobile devices, including ultralight laptops, and large displays with integrated receivers for entertainment.

Mainframes lost to DEC’s minicomputers 20+ years ago because the latter were less than half the cost.  (As an added bonus, they didn’t need a special room cooled to the temperature of Detroit in February.)  By the early 1990s, minicomputers had lost out to PCs because the computing cost of the latter was less than half that of a VAX.  Now look forward to what the cloud means: companies can slash their IT staffs because they won’t need to spend the money once employees have stripped-down smart terminals, a small encrypted hard drive and an Ethernet connection.

Welcome to the future!

Spam is gone and other tech truisms

As the American philosopher Yogi Berra once said, “It’s tough to make predictions, especially about the future.”  And as the late Will Rogers remarked, “It isn’t what we don’t know that causes us trouble, it’s what we do know that just ain’t so.”

In that vein, Cody Willard at MarketWatch just published a mildly amusing post, “Top 10 Dumbest Tech Predictions of All Time.”  A few entries are depressingly predictable, such as comments disparaging the utility of the telephone or the computer.  Overall, though, it’s worth reading as a window into the problems of linear thinking when confronting a dynamic industry.

Alas, when it comes to technology, there’s no shortage of bad ideas.  Remember Flooz?  Mercata?  Cyberrebates.com?  Didn’t think so.  Still, it’s worth adding two recent gems to Willard’s list:

“By joining forces with Time Warner, we will fundamentally change the way people get information, communicate with others, buy products and are entertained – providing far-reaching benefits to our customers and shareholders.” That was Steve Case on January 10, 2000, announcing the AOL/Time Warner merger that he and Ken Novack had been pushing.  Further comment superfluous.

“Two years from now, spam will be solved.” That was Bill Gates speaking to the BBC at Davos in 2004.  Alas, his prediction was just slightly off – to the tune of about 70 billion spam messages a day, according to this article in The New York Times.  It must’ve been the altitude.

This farce could go on and on since chronicling absurd tech predictions is like shooting fish in a barrel.  For the ultimate chronology of awful (and mostly hilarious) predictions, check out The Experts Speak by Chris Cerf and Victor Navasky.

Smart TV: A not-so-smart idea

Filed under: Technology

According to The Wall Street Journal, TV manufacturers are increasingly looking at bringing the Internet to your TV in order to create a more interactive experience.  Don’t bet on success.

A basic rule of marketing consumer products is that you won’t succeed by taking something simple and making it complex.  First, there is the ingrained consumer habit that TV watching is a passive experience.  It’s about entertainment.  Adding an Internet component inevitably adds complexity, starting with a new software system that consumers must master.  Combine that with the archaic copyright laws governing retransmission and you have a train wreck in the making.

Look no further than the reporting from The New York Times’ Ashlee Vance and Claire Cain Miller on Google’s recent Google TV disaster to see the problem.

Second, recent consumer electronics trends are clear: it’s all about mobility.  Laptops have surged past desktops.  Smartphones now comprise more than a third of the U.S. mobile phone market and are likely to become a majority by this time next year.  The idea that your flat screen will become the focal point for social networking, commerce and other apps seems fanciful at best.

Third, there’s the shark in the water: broadband connections mean virus and malware threats.  Smart TVs are already attracting hackers’ attention, and the unspoken truth about most current “smart TVs” is that their defenses against this threat are woeful at best.  As a practical matter, that means the Amazon link that suddenly appears on your TV screen may really be routing you to a server in Novosibirsk.

Consumers already need to update their PCs and smartphones.  How many will relish the thought that they need to stay current on yet another product?

As the old marketing adage goes, companies never succeed by taking something simple and making it complex.  (Mr. Ballmer, call your office about Windows 7.)  The Smart TV is a wonderful concept from a technological perspective, and those who enjoy thumbing through user manuals will like it.  But for most consumers?  Doubt it.