Will GenAI scale without enough power to accelerate multi-trillion-parameter LLM training, causing economic fallout and impacting GDP growth? The rapid expansion of Generative AI (GenAI) and Large Language Models (LLMs), like GPT, Gemini, and LLaMA, has drastically increased the need for computational power, especially as models scale from billions to trillions of parameters. Training these multi-trillion-parameter models demands specialized hardware, notably NVIDIA’s H100 and upcoming GB200 GPUs, which are designed to handle the immense computational throughput required by such massive parameter counts and datasets. These GPUs outperform earlier generations in both speed and efficiency, but at the cost of significantly higher power consumption.
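The scale of the power problem can be sketched with a back-of-envelope estimate. The figures below (1 trillion parameters, 10 trillion training tokens, roughly 500 TFLOP/s sustained and roughly 700 W per H100-class GPU) are illustrative assumptions, not vendor specifications; the 6 × parameters × tokens rule of thumb for dense-transformer training compute is a common approximation.

```python
# Back-of-envelope training compute and energy estimate.
# All inputs are illustrative assumptions, not measured figures.
params = 1e12        # assumed 1-trillion-parameter model
tokens = 10e12       # assumed 10T training tokens
flops = 6 * params * tokens          # ~6*N*D rule of thumb for dense transformers

gpu_flops = 500e12   # assumed ~500 TFLOP/s sustained per H100-class GPU
gpu_power_kw = 0.7   # assumed ~700 W per GPU under load

gpu_seconds = flops / gpu_flops
gpu_hours = gpu_seconds / 3600
energy_mwh = gpu_hours * gpu_power_kw / 1000

print(f"{flops:.1e} FLOPs, {gpu_hours:,.0f} GPU-hours, {energy_mwh:,.0f} MWh")
```

Under these assumptions a single training run lands in the tens of millions of GPU-hours and tens of gigawatt-hours, which is why power availability, not just chip supply, is emerging as a constraint.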
“Social media provide a steady diet of dire warnings that artificial intelligence (AI) will make software engineering (SE) irrelevant or obsolete,” wrote CMU computer scientists Eunsuk Kang and Mary Shaw in their September 2024 paper, “tl;dr: Chill, y’all: AI Will Not Devour SE.” [tl;dr: a summary of a longer text]. “To the contrary, the engineering discipline of software is rich and robust; it encompasses the full scope of software design, development, deployment, and practical use; and it has regularly assimilated radical new offerings from AI.”
“Digital twins are fast becoming part of everyday life,” said the lead article in the August 31 issue of The Economist. A digital twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city.
While excitement about generative AI is high, questions persist as to how much value is actually being delivered. Leading firms such as Amazon and Netflix have used AI to improve shopping recommendations, but examples of significant applications that improve overall business performance are not abundant. One area where AI has considerable potential is new product development (NPD). The NPD process has not changed much in most organizations for decades, with fewer than 30% of new product projects becoming commercial successes. Yet only 13% of firms are using AI in NPD.
Earlier this year, the Stanford Institute for Human-Centered Artificial Intelligence released the 2024 AI Index Report, its seventh annual edition tracking the impact, progress, and trends of AI. At over 500 pages and nine chapters, the report aims to be the world’s most authoritative source of data and insights about AI, helping to develop a more thorough understanding of this rapidly advancing field.
Whether it be through our PCs or the smart devices we carry around, computing controls almost everything we do in the modern world. Futurists view computational power and related capacities as key indicators of technological advancement. The computational explosion of data will have a direct impact on the fundamental pillars of society, such as healthcare, security, communications, transportation, and energy.
The 2024 MIT Sloan CIO Symposium took place on Tuesday, May 14 at the Royal Sonesta, a hotel overlooking the Charles River a short walk from the MIT campus in Cambridge, MA. Not surprisingly, AI was the dominant theme in this year’s Symposium, with a number of keynotes and panels on the topic. In addition, a pre-event program was added on the day before the Symposium, which included a number of more informal roundtable discussions on various aspects of AI, such as legal risks in AI deployment, AI as a driver of productivity, and humans’ role in AI-augmented workplaces.
While randomized controlled trials (RCTs) have traditionally been considered the gold standard for drug development, it is widely recognized that RCTs are expensive, lengthy, and burdensome on patients. According to some estimates, it takes more than a billion dollars in funding and a decade of work to bring one new medication to market. Despite exciting advances in genomics, patient-centric awareness, decentralized clinical trials, and the application of artificial intelligence (AI), there is a lack of compelling evidence to date that these trends have had a significant impact on the time and cost of the typical oncology clinical trial.
“Blockchain is one of the major tech stories of the past decade,” said a December 2022 McKinsey article, “What Is Blockchain?” “Everyone seems to be talking about it — but beneath the surface chatter there’s not always a clear understanding of what blockchain is or how it works. Despite its reputation for impenetrability, the basic idea behind blockchain is pretty simple,” namely: “Blockchain is a technology that enables the secure sharing of information.”
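That “secure sharing” rests on hash-chaining: each block stores a cryptographic hash of its predecessor, so any retroactive edit invalidates every later link. A minimal Python sketch of the idea (illustrative only; it omits the consensus, signatures, and networking a real blockchain needs):

```python
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 over the block's serialized contents
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    # Each new block records the hash of the block before it
    chain.append({
        "index": len(chain),
        "data": data,
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    })

def verify(chain):
    # The chain is valid only if every back-reference still matches
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
for entry in ["genesis", "payment: A -> B", "payment: B -> C"]:
    add_block(chain, entry)

print(verify(chain))            # True
chain[1]["data"] = "tampered"   # editing history breaks the hash links
print(verify(chain))            # False
```

Tamper-evidence comes entirely from the hash links: rewriting one block would require recomputing every subsequent block, which distributed consensus is designed to make impractical.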
“The Coming AI Economic Revolution: Can Artificial Intelligence Reverse the Productivity Slowdown?” was recently published in Foreign Affairs by James Manyika and Michael Spence, two authors I’ve long admired. James Manyika is senior VP of research, technology, and society at Google, after serving as chairman and director of the McKinsey Global Institute from 2009 to 2022. Michael Spence, a co-recipient of the 2001 Nobel Prize in Economics, is professor of economics and business at NYU’s Stern School of Business, and was previously professor of management and dean of the Stanford Graduate School of Business.
“Since the 1980s, open source has grown from a grassroots movement to a vital driver of technological and societal innovation,” said “Standing Together on Shared Challenges,” a report published by Linux Foundation Research in December 2023. “The idea of making software source code freely available for anyone to view, modify, and distribute comprehensively transformed the global software industry. But it also served as a powerful new model for collaboration and innovation in other domains.”
Cyberattacks are becoming more common across the digital ecosystems we use for both personal and professional purposes. In the past year alone, hundreds of millions of private records from banks, ISPs, and retailers have been exposed to the public.
Every new year creates a new opportunity for optimism and predictions. In the past couple of years, emerging technology has permeated almost all areas of our lives. There is much to explore! In this article, I focus on three evolving technology areas that are already impacting our future but are only at the early stages of true potential: artificial intelligence, quantum computing, and space systems.
Our evolving digital world is getting trickier and trickier to protect. Every organization is now a target in the present digital environment, and every firm, big or small, has operations, a brand, a reputation, and revenue funnels that could be at significant risk from a breach.
Artificial intelligence has been around since the 1950s, yet even for skeptics, recent advances in Generative AI (GenAI) have significantly moved the needle forward. Given massive early adoption, Goldman Sachs estimates that generative AI could raise global GDP by 7% within 10 years [1]. However, while the focus has been on GenAI, it’s important to note that any GenAI strategy requires the right data and the ability to govern that data effectively with the GenAI tool. A strong GenAI strategy will include the following key components.
Risk Mitigation Strategies for Artificial Intelligence Solutions in Healthcare Management
There are a growing number of examples of how Artificial Intelligence Solutions (AIS) can assist in improving healthcare management: early diagnosis, chronic disease management, hospital readmission reduction, efficient scheduling and billing procedures, and effective patient follow-ups while attempting to achieve healthcare's quintuple aim.
Artificial intelligence’s rising role in our digital world, and the opportunities and challenges that come with it, has been a central topic of discussion at many recent security conferences and events. There is little doubt that humankind is on the verge of an era of exponential technological advancement, and AI is leading the way in the emerging digital world.
In 2021, Booz Allen Hamilton analysts surmised that China would surpass Europe and the US in quantum-related research and development, and that Chinese hackers could soon target heavily encrypted datasets, such as weapon designs or details of undercover intelligence officers, with a view to unlocking them at a later date when quantum computing makes decryption possible.
Leading a digital transformation initiative requires collaboration and alignment between business and IT teams. Both business and IT leaders have crucial roles to play in ensuring the success of the initiative. Business leaders should provide strategic direction, define the goals and objectives of the digital transformation, and ensure that the initiative aligns with the overall business strategy. They have a deep understanding of the organization's operations, customer needs, and industry trends. Business leaders can identify areas where digital technology can drive value and competitive advantage, and they can champion the necessary organizational changes.
The central role of process in digital transformation was recognized by MIT scholars a decade ago. Similarly, McKinsey emphasized that the actions organizations can take to encourage digital process innovation involve mapping and then streamlining selected end-to-end business processes, and gaining a clear view of how information and data are managed across the company.