“Blockchain is one of the major tech stories of the past decade,” said a December 2022 McKinsey article, “What is Blockchain.” “Everyone seems to be talking about it — but beneath the surface chatter there’s not always a clear understanding of what blockchain is or how it works. Despite its reputation for impenetrability, the basic idea behind blockchain is pretty simple,” namely: “Blockchain is a technology that enables the secure sharing of information.”
“The Coming AI Economic Revolution: Can Artificial Intelligence Reverse the Productivity Slowdown?” was recently published in Foreign Affairs by James Manyika and Michael Spence, two authors I’ve long admired. James Manyika is senior VP of research, technology and society at Google, after serving as chairman and director of the McKinsey Global Institute from 2009 to 2022. Michael Spence, a co-recipient of the 2001 Nobel Prize in Economics, is professor in economics and business at NYU’s Stern School of Business, and was previously professor of management and dean of the Stanford Graduate School of Business.
“Since the 1980s, open source has grown from a grassroots movement to a vital driver of technological and societal innovation,” said “Standing Together on Shared Challenges,” a report published by Linux Foundation Research in December 2023. “The idea of making software source code freely available for anyone to view, modify, and distribute comprehensively transformed the global software industry. But it also served as a powerful new model for collaboration and innovation in other domains.”
Since the advent of the Industrial Revolution, general purpose technologies (GPTs) have been the defining technologies of their times. Their ability to support a large variety of applications can, over time, radically transform economies and social institutions. GPTs have great potential from the outset, but realizing their potential takes large tangible and intangible investments and a fundamental rethinking of firms and industries, including new processes, management structures, business models, and worker training. As a result, realizing the potential of a GPT takes considerable time, often decades. Electricity, the internal combustion engine, computers, and the internet are all examples of historically transformative GPTs.
Over the past few decades, powerful AI systems have matched or surpassed human levels of performance in a number of tasks such as image and speech recognition, skin cancer classification, breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on increasingly powerful and inexpensive computing technologies, innovative deep learning (DL) algorithms, and huge amounts of data on almost any subject. More recently, the advent of large language models (LLMs) is taking AI to the next level. And, for many technologists like me, LLMs and their associated chatbots have introduced us to the fascinating world of human language and cognition.
In his seminal 1950 paper, “Computing Machinery and Intelligence,” Alan Turing proposed what’s famously known as the Turing test, a test of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human. If a human at a keyboard couldn’t tell whether they were interacting with a machine or another human, the machine was considered to have passed the Turing test. “Ever since, creating intelligence that matches human intelligence has implicitly or explicitly been the goal of thousands of researchers, engineers and entrepreneurs,” wrote Erik Brynjolfsson, Stanford professor and director of the Stanford Digital Economy Lab, in a recent article, “The Turing Trap: The Promise & Peril of Human-Like Artificial Intelligence.”
Over the past decade, powerful AI systems have matched or surpassed human levels of performance in a number of specific tasks such as image and speech recognition, skin cancer classification and breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on deep learning (DL), a technique loosely based on the network structure of neurons in the human brain that now dominates the field. DL systems acquire knowledge by being trained with millions to billions of texts, images and other data instead of being explicitly programmed.
Artificial intelligence has emerged as the defining technology of our era, as transformative over time as the steam engine, electricity, computers, and the Internet. AI technologies are approaching or surpassing human levels of performance in vision, speech recognition, language translation, and other human domains. Machine learning (ML) advances, like deep learning, have played a central role in AI’s recent achievements, giving computers the ability to be trained by ingesting and analyzing large amounts of data instead of being explicitly programmed.