
The demise of Moore’s Law, The “Big Flip,” and the crisis of expertise

Is software the new hardware? And when an algorithmic system decides not to give us a job, will we have the right to know why?

[David Brin is an astrophysicist, futurist, and best-selling author. He serves on advisory boards (e.g. NASA's Innovative and Advanced Concepts program) and speaks and consults on a wide range of topics. David's international best-selling novels include The Postman, Earth, and Existence. For more, visit the Contrary Brin blog and David's website.]

Courtesy of David Brin, Contrary Brin

2017 may be viewed as the year of a “big flip,” when our progress in computation and Artificial Intelligence swerved from hardware – which improved for forty years along the exponential curve called Moore’s Law – over to software, which has lagged for decades. Preliminary signs indicate that new methods – plus the availability of prodigious machinery and data sets – may empower software to take some giant leaps. Moreover, this is happening at the same time that Moore’s Law experiences its long-predicted “S-curve” tapering off.

I’ll explain. But first, a related topic. Here’s a deeply thoughtful and well-supported missive on expertise, especially scientific expertise, and the troubled way in which expert views are often over- or under-appreciated: The Crisis of Expertise by Tom Nichols, author of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters.

Tom Nichols doesn’t address vexatious issues like the War on Science, a politically propelled vendetta that has metastasized into a broad-front attack upon all fact-using professions. Nor does he explore the fascinating tradeoffs between two centuries – the 20th, which featured a Professionalization of Everything – and the 21st, whose amazing “Rise of the Amateur” I document elsewhere.

No, this rumination by Nichols zeroes in, thoughtfully, on the difficulty of truth-seeking and reliable verifiability in science, especially when it gives advice to policy.

Beyond Moore's Law

The demise of Moore’s Law: “The computing industry is adjusting to the loss of two things it has relied on for 50 years to keep chips getting more powerful. One is Moore’s Law, which forecast that the number of transistors that could be fitted into a given area of a chip would double every two years. The other is a phenomenon called Dennard scaling, which describes how the amount of power that transistors use scales down as they shrink. Neither holds true today,” writes Tom Simonite in Technology Review. But the article argues that this doesn’t matter: while the pace of hardware improvement has slackened, the long-sluggish state of software has, coincidentally, experienced some rapid surges, with advances by Google and others in the field of Machine Learning.
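
To make the contrast concrete, here is a tiny back-of-the-envelope sketch – my own illustration, not anything from Simonite's article, with every constant an arbitrary stand-in – comparing pure every-two-years doubling with the same growth pushed through an “S-curve” ceiling.

```python
# Illustrative only: idealized Moore's-Law doubling vs. the same growth forced
# through an "S-curve" ceiling. Every constant here is a made-up stand-in.
def moores_law(years, doubling_period=2.0):
    """Idealized exponential: capability doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

def s_curve(years, ceiling=1e7, doubling_period=2.0):
    """Logistic taper: tracks the exponential early on, then flattens toward `ceiling`."""
    g = 2 ** (years / doubling_period)
    return ceiling * g / (ceiling - 1 + g)

for t in (10, 20, 30, 40, 50, 60):
    print(f"year {t:2}: exponential x{moores_law(t):>13,.0f}   s-curve x{s_curve(t):>13,.0f}")
```

Over fifty years, uninterrupted two-year doubling compounds to a factor of 2^25, roughly 33 million; the tapered curve matches it early on and then flattens out, which is the shape now showing up in real chips.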

Is this – as I assert – a “big flip”?  If so, it’s not the first time. In fact, elsewhere I have hypothesized that human evolution saw something very similar, over a vast time scale. I see evidence… and I speak of this elsewhere… that we humans increased our cerebral, linguistic and tool-using skills for half a million years the hard way, by developing the brain itself to become a supreme problem-solver… crunching very crude software by brute force, like using a “Cray” to grind through COBOL code. Whereupon there commenced the first of many rapid software revolutions. Take the relatively sudden transformation that took place about 40,000 years ago, a point when (as described in EXISTENCE) our ancestors suddenly (across a few centuries) expanded their tool kit by a factor of five or more, started elaborate burial rituals and made stunning cave paintings. Subsequent likely upgrades happened roughly 15,000, 6,000, 2,500, 500 and 200 years ago. And especially today.

In other words, the Big Flip (as I call it), from hardware-driven exponential computational growth to software-driven growth, may be only the latest in a series of leaps that followed (roughly) a similar pattern. At least… that is a hypothesis offered up by this science-fictional seer.

Can we still even see what’s going on? 

The Dark Secret at the Heart of AI: Is it important for humans to understand what’s going on under the hood, as we embark on an era when Machine Learning takes off and algorithmic systems determine who makes parole, who’s approved for a loan, and who gets hired for a job? “There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right. Starting in the summer of 2018, the European Union may require that companies be able to give users an explanation for decisions that automated systems reach. This might be impossible, even for systems that seem relatively simple on the surface, such as the apps and websites that use deep learning to serve ads or recommend songs,” writes Will Knight in Technology Review.

“This raises mind-boggling questions. As the technology advances, we might soon cross some threshold beyond which using AI requires a leap of faith. Sure, we humans can’t always truly explain our thought processes either—but we find ways to intuitively trust and gauge people.” 

Hm, well, that intuition thing has always been iffy, even among ourselves. No, what finally got us started was not understanding each other so much as gaining tools to hold each other accountable. Which unleashed (in a few places) flat-fair competition. Which unleashed creativity. But only in those places where accountability could take root.
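
For readers wondering what “an explanation for decisions that automated systems reach” might even look like in practice, here is one minimal, hedged sketch – not the EU's requirement and not any particular company's method – of a common workaround: training a small, readable decision tree to mimic an opaque model, so that its individual verdicts can at least be summarized in human terms. The data and both models below are hypothetical stand-ins.

```python
# Hypothetical sketch: approximate an opaque classifier with a shallow, readable
# "surrogate" decision tree, then report how faithfully the surrogate mimics it.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

# Stand-in data and a stand-in "black box" (nothing here models a real hiring
# or parole system; it only illustrates the surrogate-explanation technique).
X, y = make_classification(n_samples=2000, n_features=6, random_state=0)
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Fit a depth-3 tree to the black box's *predictions*, not the original labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# Fidelity: how often the readable surrogate agrees with the opaque model.
fidelity = (surrogate.predict(X) == black_box.predict(X)).mean()
print(f"surrogate matches the black box on {fidelity:.1%} of cases")

# A human-readable rule summary of the surrogate's decision logic.
print(export_text(surrogate, feature_names=[f"feature_{i}" for i in range(6)]))
```

The fidelity number is exactly Knight's caveat in miniature: if the readable summary agrees with the opaque model only, say, 90 percent of the time, the “explanation” handed to a rejected applicant may not describe what the system actually did.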

Problems and solutions

And these quandaries extend beyond just the cybernetic! For example: those of you who expected designed creatures any day now… “As CRISPR-Cas9 starts to move into clinical trials, a new study published in Nature Methods has found that the gene-editing technology can introduce hundreds of unintended mutations into the genome.” In other words… time for caution, children. Maybe even another Asilomar Conference.

See this explored in more detail in: A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution by Jennifer A. Doudna and Samuel H. Sternberg.

Then there’s the problem of managing our world. You know I’ve mused a lot about this, e.g. in EARTH. And so have many others. For example: Steven Koonin (now at NYU/CUSP but previously Undersecretary for Science at the DOE) was a housemate of mine at Caltech. (We were all a wee bit in awe of Steve.) Koonin has proposed an approach to resolving the (deliberately stirred) fog and murk around Climate Change in much the same way that I’ve proposed for years – by creating an arena for full-frontal debate, subjecting every issue to a systematic, adversarial process. It’s natural that he should choose this path. Top scientists like Koonin and Roger Penrose (with whom I dined last month) are among the most competitive humans our species ever produced. And with good reason, since that is how creative endeavors flourish.

See Koonin's proposal outlined in "A 'Red Team' Exercise Would Strengthen Climate Science," which complements my own articles on reciprocal accountability. Our "arenas" of democracy, science, markets and courts all wither in darkness… and operate best in light.

Of course, I have opinions as to which “side” would ultimately win such a healthy process, and today’s right has the same suspicion, illustrated by their desperate measures to avoid open fact-checking. Still, I am willing to be proved wrong and promise even to be fascinated, when that happens! So, bring on the “disputation arenas” that I’ve called for across 25 years!

Interesting snippets

This study shows that dads are more attentive to their toddler daughters than sons and encourage more analytic thinking. I would reckon this might be partially cultural and perhaps even a bit recent.

A new kind of "flow battery" would let you replace the liquid electrolytes at a service station as fast as you now fill a tank of gas, letting the old fluids get recharged by solar power.

Watch dolphins using a touchscreen.

This entertaining “Periodic Table of Irrational Nonsense” deliberately veers away from anything political – in other words, the conspiracy theories and denialist cults that are harming us the worst. Still, it amusingly categorizes and arranges many of the silly fetishes that your neighbors (some of them, but never you!) indulge in.

Gangs of orcas and sperm whales are robbing halibut fishermen in the Bering Sea. “The orcas will wait all day for a fisher to accumulate a catch of halibut, and then deftly rob them blind. They will relentlessly stalk individual fishing boats, sometimes forcing them back into port.” Oh, but elsewhere on the planet, pods of dolphins will herd schools of fish toward humans who share the catch. Should these halibut fellows study that trick? 

Visit David Brin's Contrary Brin blog, website, biography, books/novels, and short stories. David Brin's nonfiction book about the information age – The Transparent Society – won the Freedom of Speech Award of the American Library Association. Check out his work.
