In Software We Trust
Friday, February 1, 2002
Now that the “Dot Com” edition of Parker Brothers’ Monopoly is being discontinued, it’s probably too late to discuss winning strategies. Certainly, building a hotel on Excite@Home (their replacement for Park Place) was never a great investment. One thing is clear, though: if you’re going to pick a pewter playing piece before you start the game, eschew the little man at the computer, skip the tiny silver mouse and go for the symbol of true power in technology — the blinking computer cursor — because software and monopoly have become nearly synonymous in the computer industry.

Software Monopolies are the Rule

It’s easy to point to Microsoft as the prime example of this, with Bill Gates sitting on hotels on both Boardwalk and Park Place. But don’t disregard its quieter, more unassuming classmates, each of them sitting on their own color-groups and charging double rent. In fact, monopolists and duopolists dominate most of the popular software categories, with their markets concentrated to a degree that makes their hardware brethren green with envy.

Want to manipulate digital images? Then you’d better have some Adobe software installed on your computer. Doing your finances online? Hope you made sure that your bank outputs files in a Quicken-compatible format. Business diagramming? Visio nearly owns that market. Do you need to instant message someone, file taxes, create 2D or 3D animation, or do small-business accounting? Those businesses are controlled by AOL, TurboTax, Macromedia, 3D Studio MAX and QuickBooks, respectively, each with a majority market share, and most in excess of 80 percent. While market share is not the official test of monopoly power (the ability to control prices is), it’s certainly a leading indicator.

Even less mature software markets are at least duopolies. Microsoft shares streaming media with Real and handheld operating systems with Palm. On the enterprise side of things, where the customers insist on having alternatives, ownership is still heavily concentrated. Of Oracle, Siebel, SAP, Check Point and Veritas, even the weakest of these companies enjoys a quarter of its market, and most more than 40 percent. Finally, some of the best-known hardware companies lead their markets because of software. Cisco benefits greatly from its control of the dominant routing software, IOS, and Intel’s leadership is largely due to having had (until relatively recently) the sole practical platform for Microsoft’s software.

With market share comes many of the benefits of market dominance—especially in an industry where the marginal cost of producing an additional unit is close to zero. Market share then equates almost directly to profitability—an ironic outcome for an industry that, at its birth, had many observers questioning whether it could ever make money.

Users Pay for the Development

In the game of Software Monopoly, the would-be landlord doesn’t have to pay much for the houses and hotels – the tenants (or users) perform most of the upgrades. This is because the heavy exchange of user-generated information stored in proprietary formats naturally leads to a monopoly over time. Users and developers create documents and applications on top of these software products, and then spread them to other people. The more users there are, the more effort goes into creating documents; and the more widely those documents are shared, the more value is built upon the original application. Eventually, the level of investment becomes extremely high and the software monopoly is born, albeit in a very different way from the manner in which traditional monopolies are born.

In the old Standard Oil- or AT&T-type monopoly:

1) The original fixed cost to start the business is so large that the marginal cost of adding another unit is minuscule by comparison.

2) The monopolist uses this ability to undercut the competition and drive them out of business.

3) With the competition gone, the monopolist uses higher prices and lower output to increase profits.


In the new, user-assisted software monopoly:

1) The original fixed cost of getting a software company launched is low. Instead, the heavy cost and investment is assumed by the users and third-party developers, as they build plug-ins and save images and documents in proprietary formats. The marginal cost of creating another copy of the software is zero.

2) The users share the documents with other users, forcing the need for compatibility. They lock themselves in, since they need to be able to access their old files. In this way, they raise the effective price of using a competitive product, and drive the competitor out of business.

3) The newly created monopolist, rather than raising prices and lowering output, keeps prices nearly constant and increases output to saturate the market. The near-zero marginal cost of production leads to a dramatic increase in profits.

Thus, the software monopolist needs little initial capital to get started, relies on its users rather than large investments to drive its competitors out of business, and uses market share rather than higher prices to make money. The low initial capital requirement and the high margins make the software monopoly a bonanza for shareholders, and the benefits of standard and relatively fixed prices make the pain considerably less for the end users. The competitors, who now have the monopolist — and, inadvertently, the users — working against them, get the worst of it.
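The cost structure behind this comparison can be sketched with a few lines of arithmetic. All of the figures below are hypothetical, chosen only to illustrate how a near-zero marginal cost turns market share almost directly into profit; none of them come from the article.

```python
# Hypothetical unit economics: a traditional monopoly versus a software
# monopoly. Profit = units * (price - marginal_cost) - fixed_cost.

def profit(units, price, marginal_cost, fixed_cost):
    """Simple profit model with a one-time fixed cost."""
    return units * (price - marginal_cost) - fixed_cost

# Traditional monopoly: enormous fixed cost, real per-unit cost.
traditional = profit(units=1_000_000, price=100, marginal_cost=60,
                     fixed_cost=30_000_000)

# Software monopoly: modest fixed cost, near-zero marginal cost.
software = profit(units=1_000_000, price=100, marginal_cost=0,
                  fixed_cost=5_000_000)

print(traditional)  # 10000000
print(software)     # 95000000

# With marginal cost near zero, doubling unit sales roughly doubles
# gross profit -- market share converts almost directly to profit.
print(profit(2_000_000, 100, 0, 5_000_000))  # 195000000
```

The point of the sketch is the asymmetry: at the same price and volume, the software vendor keeps nearly the entire price of every additional copy, which is why saturating the market beats raising prices.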

Utilities Can’t Play the Game

Just as in Monopoly, where you can’t build hotels on the Railroads or the Electric Company, the rules of software monopoly don’t apply to the industries that provide the infrastructure. The communications hardware industry provides equipment and protocols on which considerable user data travels, but the industry leaders consistently face stiff competition. In fact, the lower down one goes on the traditional network stack, the less likely one is to find monopoly concentration. The basic transport segments — optical transmission, Ethernet, SONET, ATM and the like — are heavily fragmented, low-margin areas. Higher up the stack, in the TCP/IP domain, we see greater concentration amongst the Ciscos and the Check Points of the world. It’s only very high up, at the application and session protocol layers, that the traditional software companies arrive, complete with their market shares and margins.

Players lower on the OSI stack have to conform to open standards that make them replaceable. Here, the rules of the game are tightly defined by Bellcore, the IEEE, the IETF and so forth. These open standards exist at the lower levels of the stack, but not at the higher levels, for three main reasons: frequency of exchange, complexity of formats and user concentration.

The rate at which users exchange data matters because, at the lower levels of the stack, data exchange is constant and interoperability is the sole determinant of value. An optical transmission link has zero value unless it can talk to the rest of the network. However, a copy of Adobe Photoshop has value even if no one else can read its files, since one can always print out the finished documents. Therefore, the pressure on the optical transmission link to communicate with other links out of the box is immense — and that requires conformance with open standards. The exception occurs when the market starts out as a series of small closed networks that are only later joined together, such as VPNs, intra-office networks and application-specific networks.

As to complexity, standards bodies can define relatively static protocols like SONET, Ethernet and IP. But defining all the different ways in which most software applications communicate with each other (and with user documents) is nearly impossible, because software is constantly adding features and functionality. There are exceptions even within networking: transport protocols are defined quickly, whereas management protocols are defined slowly. The latter require less frequent interaction with competing equipment and are considerably more complex, so the network vendors can define them themselves, rather than letting standards bodies do it for them.

Finally, users of most packaged consumer software are numerous and poorly organized. They have neither sufficient personal incentive nor resources to launch an open standards campaign for their favorite software package. At least enterprise users, who normally like to have a fallback option, have some leverage, and can ask for things such as SQL and POSIX compatibility. Buyers of telecom and networking equipment have it best of all. In most cases, a half-dozen or so purchasers can dictate standards compliance to dozens of hopeful vendors.

Whose Data is it Anyway?

Eventually, we get tired of going around the Software Monopoly board, realizing that we’re paying, rather than receiving, $200 every time we pass Go. Constant upgrades of expensive software packages are irksome in an era of unprecedented success for open standards like TCP/IP and HTTP. The good news is that more software is starting to be built around these open standards. The bad news is that where a standard doesn’t exist, such as for instant messaging, a proprietary solution can achieve market dominance very quickly.

Furthermore, clever software vendors are pushing more and more data storage deeper into the network cloud, where the file format will be closed and locked away forever. Users should, whenever possible, eschew minor features and favor the ubiquity of simple, open standards. There’s no doubt that all of the .txt, .mp3, .jpg, .html and .mpg files on your computer will still be accessible a decade from now, regardless of your hardware or software platform, at no additional software cost. That’s not true of most other files, so don’t pour your time and effort into building value for a vendor — value that is, after all, your own work, and that you’ll pay time and time again to use.

For the trustbusters, the challenge is much more daunting. The current approach — splitting companies and overseeing what can and can’t be integrated — is shortsighted and doesn’t address the root problem. Splitting a company doesn’t bring any real relief for the users in the short term, since we’re stuck in whatever applications we were using. And close oversight results in disastrously stifling and expensive government regulation. It risks curtailing the innovation that comes from integrating formerly disparate components, and destroying the benefits that standardization brings to such a complex industry.

The focus for those who favor intervention should be on the advancement and accurate publication of complete APIs, protocols and formats, and on enforcing adherence to them. As with the patent system, the innovator who establishes the market (be it AOL for instant messaging or eBay for auction trading) can use a proprietary API or file format for a period of time. But once the market has matured, it must publish and adhere to open standards. The extreme case — completely forced open source — exposes the APIs, but at the ruinous cost of allowing no differentiation between products. Forcing open standards at the interface where applications and files meet at least provides interoperability and allows vendors to once again compete for the user’s loyalty. It’s not completely fair to the winners of Software Monopoly, but it protects users, provides choice and removes the “Go Directly to Jail” alternative from the industry.

Naval Ravikant is a founder of Genoa Corporation and Epinions, Inc. He currently invests in communications infrastructure and software companies for August Capital.
