
The Great Bifurcation

Courtesy of Ben Thompson, Stratechery

For the first several years of Stratechery I would write a year end article about “The State of Consumer Technology”; the last one I wrote, though, was in 2018, because consumer technology, dominated as it was by Apple and Google on the device side, and Google and Facebook on the services side, seemed rather stale and destined to descend into the world of politics and regulation (I was more optimistic about the enterprise, both in terms of the ongoing shift to the public cloud and the opportunity for SaaS companies).

That has largely proven to be the case, but it’s not the first time this has happened to technology; the pattern has happened twice before, and in each case the seeds of the next era were planted — usually by incumbents — while the previous era stagnated. And, in every case, the transition was marked by a reduction in lock-in and the devolvement of increasing amounts of autonomy to the individual user.

Tech 1.0: From Invention to IBM

The transistor, the foundation of modern computing, was invented at Bell Labs in 1947 by the solid-state physics group led by William Shockley; nine years later Shockley moved to Mountain View, California, to be close to his ailing mother in Palo Alto, where he started Shockley Semiconductor Laboratory. Eight of the researchers he hired, led by Bob Noyce, left the increasingly erratic Shockley a year later to found Fairchild Semiconductor; in 1968 Noyce and Gordon Moore left to found Intel with the support of Arthur Rock, one of the first venture capitalists.

The West Coast, though, was a sideshow compared to New York, where IBM had switched to transistors for the 7000 Series mainframe (as opposed to the 700 Series’ vacuum tubes); the real breakthrough was the modular and expandable System/360, which was the first computer bought by most companies, including the fictional SC&P from Mad Men.

There certainly was a connection to be drawn between IBM and the moon: IBM helped develop and track NASA’s initial exploratory flights and the eventual lunar mission. Here on earth, though, the Justice Department decided in 1969 that the company was in violation of antitrust laws; the case would be dropped 13 years later, but not before IBM voluntarily unbundled its software and services from its hardware, creating the first market for software.

Tech 2.0: King of the Hill

Notice those dates: by the time the Department of Justice sued IBM in 1969, Intel had already been founded; two years later an Intel engineer named Federico Faggin designed the first microprocessor, the Intel 4004, which shrank many of the functions of IBM’s room-sized computers down to a single chip. Ten years after that IBM released the IBM PC, powered by Intel’s 8088 microprocessor.

The open nature of the IBM PC platform — at least once Compaq reverse-engineered IBM’s BIOS — commoditized PCs; the real points of leverage in the PC value chain were Intel for chips and Windows for the operating system. The latter was a two-sided market: because so many businesses bought the DOS-powered IBM PC, developers were motivated to make software for DOS; the more software for DOS, and later Windows (which was backwards compatible), the more that businesses sought out DOS/Windows-based computers. Over time more and more people who first used computers at work wanted similar functionality at home, which meant that DOS/Windows dominated the consumer market as well.

Thus was born another Justice Department lawsuit, this time over Microsoft’s alleged monopoly; that case was also eventually dismissed (although it lived on in various forms in the E.U. for years). Once again, though, the next paradigm that rendered the seeming monopolist’s lock-in immaterial was already in place: the Internet could be accessed from any computer, no matter its operating system. Moreover, in an echo of IBM’s voluntary unbundling of hardware and software, which created the conditions for tech’s next evolution, it was Microsoft that introduced the XMLHttpRequest API to Internet Explorer, which undergirded the Ajax web app architecture and Tech 3.0.
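To make the Ajax idea concrete, here is a minimal sketch in TypeScript of the pattern XMLHttpRequest enabled: request data asynchronously and update part of the page without a full reload. It assumes a browser environment, and the endpoint, element id, and function name are illustrative, not taken from the article.

// Minimal Ajax sketch: fetch JSON with XMLHttpRequest and re-render one
// element in place, leaving the rest of the page untouched.
// The URL "/api/messages" and id "message-list" are hypothetical.
function loadMessages(url: string, targetId: string): void {
  const xhr = new XMLHttpRequest();
  xhr.open("GET", url, true); // third argument: asynchronous request
  xhr.onreadystatechange = () => {
    // readyState 4 = request finished; status 200 = success
    if (xhr.readyState === 4 && xhr.status === 200) {
      const messages: string[] = JSON.parse(xhr.responseText);
      const target = document.getElementById(targetId);
      if (target) {
        target.innerHTML = messages.map((m) => `<li>${m}</li>`).join("");
      }
    }
  };
  xhr.send();
}

// Example usage: refresh a list without reloading the page.
loadMessages("/api/messages", "message-list");

Before this, nearly every interaction in a web app meant a full page round trip; the asynchronous callback is what let browser-based software begin to feel like installed software.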

Tech 3.0: Software Eats the World

If Tech 1.0 was about hardware and Tech 2.0 about software, Tech 3.0 was about services. On the enterprise side this meant the development of the public cloud and software-as-a-service applications that required nothing more than a browser and a credit card; Marc Andreessen’s famous 2011 essay, “Why Software Is Eating the World,” is really about this transformation from software you installed on your computer to software you accessed over the Internet:

Companies in every industry need to assume that a software revolution is coming. This includes even industries that are software-based today. Great incumbent software companies like Oracle and Microsoft are increasingly threatened with irrelevance by new software offerings like Salesforce.com and Android (especially in a world where Google owns a major handset maker).

In some industries, particularly those with a heavy real-world component such as oil and gas, the software revolution is primarily an opportunity for incumbents. But in many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term “creative destruction,” would be proud.

