Posts tagged technology and strategy
The Current State and Future Trends of Enterprise Blockchain Technologies

“Blockchain is one of the major tech stories of the past decade,” said a December 2022 McKinsey article, “What is Blockchain.” “Everyone seems to be talking about it — but beneath the surface chatter there’s not always a clear understanding of what blockchain is or how it works. Despite its reputation for impenetrability, the basic idea behind blockchain is pretty simple,” namely: “Blockchain is a technology that enables the secure sharing of information.”
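To make the “secure sharing of information” idea a bit more concrete, here is a minimal, hypothetical sketch (my own illustration, not from the McKinsey article) of a hash-linked ledger in Python; the data and function names are invented for illustration.

```python
# Toy hash-linked chain: each block stores the hash of the previous block,
# so any later tampering with shared records is easy to detect.
import hashlib
import json

def block_hash(contents: dict) -> str:
    # Deterministic hash of a block's contents (data + previous hash).
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    contents = {"data": data, "prev_hash": prev_hash}
    chain.append({**contents, "hash": block_hash(contents)})

def is_valid(chain: list) -> bool:
    # Every block must hash correctly and point at its predecessor.
    for i, block in enumerate(chain):
        if block["hash"] != block_hash({"data": block["data"], "prev_hash": block["prev_hash"]}):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(is_valid(ledger))                   # True
ledger[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(ledger))                   # False: the chain no longer verifies
```

Real blockchains add consensus protocols, digital signatures, and replication across many parties, but the tamper-evident chain of hashes is the simple idea the article refers to.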

Read More
The Coming AI Revolution

“The Coming AI Economic Revolution: Can Artificial Intelligence Reverse the Productivity Slowdown?” was recently published in Foreign Affairs by James Manyika and Michael Spence, two authors I’ve long admired. James Manyika is senior VP of research, technology and society at Google, after serving as chairman and director of the McKinsey Global Institute from 2009 to 2022. Michael Spence, a co-recipient of the 2001 Nobel Prize in Economics, is a professor of economics and business at NYU’s Stern School of Business, and was previously professor of management and dean of the Stanford Graduate School of Business.

Read More
What’s the Likely Long Term Evolution of AI?

Since the advent of the Industrial Revolution, general purpose technologies (GPTs) have been the defining technologies of their times. Their ability to support a large variety of applications can, over time, radically transform economies and social institutions. GPTs have great potential from the outset, but realizing their potential takes large tangible and intangible investments and a fundamental rethinking of firms and industries, including new processes, management structures, business models, and worker training. As a result, realizing the potential of a GPT takes considerable time, often decades. Electricity, the internal combustion engine, computers, and the internet are all examples of historically transformative GPTs.

Read More
Large Language Models: A Cognitive and Neuroscience Perspective

Over the past few decades, powerful AI systems have matched or surpassed human levels of performance in a number of tasks such as image and speech recognition, skin cancer classification, breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on increasingly powerful and inexpensive computing technologies, innovative deep learning (DL) algorithms, and huge amounts of data on almost any subject. More recently, the advent of large language models (LLMs) is taking AI to the next level. And, for many technologists like me, LLMs and their associated chatbots have introduced us to the fascinating world of human language and cognition.

Read More
The 2022 State of AI in the Enterprise

“Rapidly transforming, but not fully transformed: this is our overarching conclusion on the market, based on the fourth edition of our State of AI in the Enterprise global survey,” said Becoming an AI-fueled organization, the fourth such survey conducted by Deloitte since 2017 to assess the adoption of AI across enterprises. “Very few organizations can claim to be completely AI-fueled, but a significant and growing percentage are starting to display the behaviors that can get them there.”

Read More
The Promise & Peril of Human-Like Artificial Intelligence

In his seminal 1950 paper, Computing Machinery and Intelligence, Alan Turing proposed what’s famously known as the Turing test: a test of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human. If a human at a keyboard cannot tell whether they are interacting with a machine or a human, the machine is considered to have passed the Turing test. “Ever since, creating intelligence that matches human intelligence has implicitly or explicitly been the goal of thousands of researchers, engineers and entrepreneurs,” wrote Erik Brynjolfsson, Stanford professor and Director of the Stanford Digital Economy Lab, in a recent article, The Turing Trap: The Promise & Peril of Human-Like Artificial Intelligence.

Read More
Is Scotland A Major Contender In The AI Space?

The pace of AI and machine learning adoption continues unabated, with widespread usage across the globe. It’s not just companies that are taking note of the tremendous value AI can provide. Countries and governments around the world are also seeking competitive advantages by harnessing the power of AI. Governments that can take advantage of the tremendous transformation presented by AI and cognitive technologies can position themselves for global competitiveness in the future. As a result, countries around the world are adopting national AI strategies to provide the roadmaps, funding, and education needed to differentiate themselves and become leaders in different areas related to AI and cognitive technology.

Read More
Foundation Models: AI’s Exciting New Frontier

Over the past decade, powerful AI systems have matched or surpassed human levels of performance in a number of specific tasks such as image and speech recognition, skin cancer classification and breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on deep learning (DL), a technique loosely based on the network structure of neurons in the human brain that now dominates the field. DL systems acquire knowledge by being trained with millions to billions of texts, images and other data instead of being explicitly programmed.
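As a rough, hypothetical illustration of “trained with data instead of being explicitly programmed” (my own sketch, not from the post, using a simple scikit-learn classifier rather than an actual deep learning system):

```python
# A model learns a decision rule from labelled examples,
# rather than from hand-written if/else logic.
from sklearn.linear_model import LogisticRegression

# Toy "spam" data: [number of links, number of exclamation marks] per message.
X = [[0, 0], [1, 0], [0, 1], [5, 3], [7, 2], [6, 4]]
y = [0, 0, 0, 1, 1, 1]  # 0 = not spam, 1 = spam (labels supplied by people)

model = LogisticRegression()
model.fit(X, y)                 # training: the rule is induced from the data
print(model.predict([[4, 3]]))  # [1] -- learned, not explicitly programmed
```

Deep learning systems work on the same learn-from-examples principle, but with neural networks containing millions or billions of parameters and far larger training sets.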

Read More
How Can You Trust the Predictions of a Large Machine Learning Model?

Artificial intelligence has emerged as the defining technology of our era, as transformative over time as the steam engine, electricity, computers, and the Internet. AI technologies are approaching or surpassing human levels of performance in vision, speech recognition, language translation, and other human domains. Machine learning (ML) advances, like deep learning, have played a central role in AI’s recent achievements, giving computers the ability to be trained by ingesting and analyzing large amounts of data instead of being explicitly programmed.

Read More