While artificial intelligence (AI) has the potential to be transformative, the track record to date is disappointing. Although billions have been invested in AI, recent research reveals that only 1 percent of companies surveyed consider themselves "mature," i.e., to have fully integrated AI into workflows in ways that produce better business outcomes. The same research report found that the biggest barrier to scaling AI is not employees but leaders. Mayer, Hannah, Lareina Yee, Michael Chui, and Roger Roberts. "Superagency in the Workplace: Empowering People to Unlock AI's Full Potential." McKinsey & Company, January 28, 2025. https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/superagency-in-the-workplace-empowering-people-to-unlock-ais-full-potential-at-work.
The integration of Artificial Intelligence (AI) into clinical trials has emerged as a transformative force in the United States healthcare system. While AI offers significant benefits in clinical trials, including cost reduction and improved efficiency, it also presents complex challenges in governance, regulation, and ethical implementation. The authors analyze specific AI applications in U.S. clinical trials, focusing on case studies, regulatory frameworks, and ethical considerations.
When we released "AI for the Rest of Us" two years ago, we stood in a liminal space, observing as artificial intelligence seemed poised to transform not only Silicon Valley but the entirety of human experience. In hindsight, we recognize that our predictions have been realized in both expected and surprising ways.
On December 29, the WSJ published "Will AI Help or Hurt Workers?," an article based on a research paper by Aidan Toner-Rodgers, a second-year PhD student in MIT's Economics Department. One of the reasons the WSJ article caught my attention is that it featured a photo of the MIT graduate student in between two of the world's top economists whose research I've closely followed for years: Daron Acemoglu, who in October was named a co-recipient of the 2024 Nobel Memorial Prize in Economic Sciences, and David Autor (along with his dog Shelby), who was a co-chair of a multi-year, MIT-wide task force on the impact of AI on "The Work of the Future."
The information technology landscape has changed significantly in recent years in terms of corporate value creation (and performance). Wherever data is stored, the digital revolution has created new challenges but also new solutions for innovation and efficiency. Unfortunately, data is highly valuable to those with nefarious purposes, and enhancing data protection needs to become a priority for every business and organization.
MIT professor emeritus Rodney Brooks has been posting an annual Predictions Scorecard on rodneybrooks.com since January 1, 2018, in which he predicts future milestones in three technology areas: AI and robotics, self-driving cars, and human space travel. He also reviews actual progress in each of these areas to see how his past predictions have held up. On January 1 he posted his 2025 Predictions Scorecard.
According to a recent Boston Consulting Group (BCG) report, while there is much hype around artificial intelligence (AI), the value is hard to find. Based on recent research involving more than 1,000 companies worldwide, only 22% of companies have advanced beyond the proof-of-concept stage to generate some value, and only 4% are creating substantial value.
“Open source software (OSS) has become a driving force behind innovation, collaboration, and the democratization of technology,” said the 2024 Global Spotlight Insights Report. The report, published last month by Linux Foundation Research, analyzed regional and industry differences in open source opportunities and challenges and tracked year-over-year trends…
In 2022 Congress requested a study by the National Academies on the current and future impact of AI on the US workforce. The report, "Artificial Intelligence and the Future of Work," was released in November 2024. The three-year study was conducted by a committee of experts from universities and private-sector institutions, co-chaired by Stanford professor Erik Brynjolfsson and CMU professor Tom Mitchell.
President Donald Trump’s administration will assume a cybersecurity portfolio that has continued to evolve toward combating digital threats since the Cybersecurity and Infrastructure Security Agency (CISA) was created six years ago within the Department of Homeland Security (DHS). CISA’s mission is a formidable one: the list of hostile threat actors in cyberspace is extensive, including nation-states, organized criminals, terrorists, and hacktivists.
It is well known that artificial intelligence (AI) enables better, faster, and more automated decisions. Indeed, it has been proposed that AI is driving a resurgence of interest in redesigning business processes.[i] That’s partly due to the ability of certain AI tools, such as robotic process automation (RPA), which, when combined with machine learning as “intelligent process automation,” can automate information-intensive processes. It has also been argued that AI fits well into improvement methods such as Lean Six Sigma and can be applied at each stage of the so-called DMAIC process (Define-Measure-Analyze-Improve-Control).[ii] Note that Six Sigma and Lean Six Sigma are highly codified and structured methods of process improvement with a strong bias toward incremental improvement within organizational boundaries. The integration of AI into process improvement may have the potential to reignite interest in more major change, targeted at large enterprise processes, perhaps even reengineering.
There is no doubt that 2023 was a tough year for cybersecurity. The number of data breaches continued to rise from already alarming levels in previous years. There was also an exponential rise in the complexity and intensity of cyberattacks such as social engineering, ransomware, and DDoS attacks, made possible in large part by hackers using AI tools.
Will GenAI scale without enough power to accelerate multi-trillion-parameter LLM training, causing economic fallout and impacting GDP growth? The rapid expansion of Generative AI (GenAI) and Large Language Models (LLMs), like GPT, Gemini, and LLaMA, has drastically increased the need for computational power, especially as models scale from billions to trillions of parameters. Training these multi-trillion-parameter models demands specialized hardware, notably NVIDIA’s H100 and upcoming GB200 GPUs, which are designed to handle the immense teraflop-scale processing requirements of such massive parameter counts and datasets. These GPUs outperform previous generations in both speed and efficiency, but at the cost of significantly higher power consumption.
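To make the scale of those compute and power demands concrete, here is a minimal back-of-envelope sketch using the common ~6·N·D FLOPs approximation for dense transformer training (N parameters, D training tokens). The per-GPU throughput, sustained utilization, and power-draw figures are rough order-of-magnitude assumptions for illustration, not vendor specifications:

```python
# Back-of-envelope estimate of training compute and energy for a large LLM.
# Uses the widely cited ~6*N*D FLOPs approximation for dense transformer
# training. All hardware figures below are assumptions for illustration only.

def training_cost_estimate(params, tokens,
                           gpu_flops=1e15,    # assumed peak FLOP/s per GPU
                           utilization=0.4,   # assumed sustained fraction of peak
                           gpu_power_kw=0.7): # assumed per-GPU draw in kW
    """Return (total FLOPs, GPU-hours, energy in MWh) for one training run."""
    total_flops = 6 * params * tokens
    gpu_seconds = total_flops / (gpu_flops * utilization)
    gpu_hours = gpu_seconds / 3600
    energy_mwh = gpu_hours * gpu_power_kw / 1000
    return total_flops, gpu_hours, energy_mwh

# Hypothetical example: a 2-trillion-parameter model trained on 20T tokens.
flops, hours, mwh = training_cost_estimate(2e12, 20e12)
print(f"{flops:.1e} FLOPs, {hours:.1e} GPU-hours, {mwh:,.0f} MWh")
```

Even under these generous assumptions, the hypothetical run lands in the hundreds of millions of GPU-hours and on the order of a hundred thousand MWh, which is why grid capacity has become a first-order constraint on frontier-scale training.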
“Social media provide a steady diet of dire warnings that artificial intelligence (AI) will make software engineering (SE) irrelevant or obsolete,” wrote CMU computer scientists Eunsuk Kang and Mary Shaw in their September 2024 paper, “tl;dr: Chill, y’all: AI Will Not Devour SE.” [tl;dr: a summary of a longer text]. “To the contrary, the engineering discipline of software is rich and robust; it encompasses the full scope of software design, development, deployment, and practical use; and it has regularly assimilated radical new offerings from AI.”
“Digital twins are fast becoming part of everyday life,” said the lead article in the August 31 issue of The Economist. A digital twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city.
While excitement about generative AI is high, some questions persist as to how much value is being delivered. AI has been used by leading firms such as Amazon and Netflix to improve shopping recommendations, but examples of significant applications to improve overall business performance are not abundant. One area where AI has considerable potential is new product development (NPD). The NPD process has not changed much in most organizations for decades, with fewer than 30% of new product projects becoming commercial successes. Yet only 13% of firms are using AI in NPD.
Earlier this year, the Stanford Institute for Human-Centered Artificial Intelligence released the 2024 AI Index Report, its seventh annual edition of the impact, progress, and trends in AI. At over 500 pages and 9 chapters, the report aims to be the world’s most authoritative source for data and insights about AI in order to help develop a more thorough understanding of the rapidly advancing field of AI.
Whether it be through our PCs or the smart devices we carry around, computing controls almost everything we do in the modern world. Futurists view computational power and related capacities as key indicators of technological advancement. The computational explosion of data will have a direct impact on the fundamental pillars of society, such as healthcare, security, communications, transportation, and energy.
The 2024 MIT Sloan CIO Symposium took place on Tuesday, May 14 at the Royal Sonesta, a hotel overlooking the Charles River a short walk from the MIT campus in Cambridge, MA. Not surprisingly, AI was the dominant theme in this year’s Symposium, with a number of keynotes and panels on the topic. In addition, a pre-event program was added on the day before the Symposium, which included a number of more informal roundtable discussions on various aspects of AI, such as legal risks in AI deployment, AI as a driver for productivity, and humans’ role in AI-augmented workplaces.
While randomized controlled trials (RCTs) have traditionally been considered the gold standard for drug development, it is widely recognized that RCTs are expensive, lengthy, and burdensome on patients. According to some estimates, it takes more than a billion dollars in funding and a decade of work to bring one new medication to market. Despite exciting advances in genomics, patient-centric awareness, decentralized clinical trials, and the application of artificial intelligence (AI), there is a lack of compelling evidence to date that these trends have had a significant impact on the time and cost of the typical oncology clinical trial.