President Donald Trump’s administration will assume a cybersecurity portfolio that has continued to evolve toward combating digital threats since the Cybersecurity and Infrastructure Security Agency (CISA) was created six years ago within the Department of Homeland Security (DHS). CISA’s mission is a formidable one: the list of hostile actors in cyberspace is extensive, encompassing nation-states, organized criminals, terrorists, and hacktivists.
It is well known that artificial intelligence (AI) enables better, faster, and more automated decisions. Indeed, it has been proposed that AI is driving a resurgence of interest in redesigning business processes.[i] That’s partly due to the ability of certain AI tools, such as robotic process automation (RPA), which, when combined with machine learning as “intelligent process automation,” can automate information-intensive processes. It has also been argued that AI fits well into improvement methods such as Lean Six Sigma and can be applied at each stage of the so-called DMAIC process (Define-Measure-Analyze-Improve-Control).[ii] Note that Six Sigma and Lean Six Sigma are highly codified and structured methods of process improvement that have a strong bias toward incremental improvement within organizational boundaries. The integration of AI into process improvement may have the potential to reignite interest in more major change — targeted at large enterprise processes — perhaps even reengineering.
There is no doubt that 2023 was a tough year for cyber security. The number of data breaches kept rising from already alarming prior-year levels. There was also an exponential rise in the complexity and intensity of cyberattacks such as social engineering, ransomware, and DDoS attacks, made possible in large part by hackers using AI tools.
Will GenAI scale without enough power to accelerate multi-trillion-parameter LLM training, causing economic fallout and impacting GDP growth? The rapid expansion of Generative AI (GenAI) and Large Language Models (LLMs), like GPT, Gemini, and LLaMA, has drastically increased the need for computational power, especially as models scale from billions to trillions of parameters. Training these multi-trillion-parameter models demands specialized hardware, notably NVIDIA’s H100 and upcoming GB200 GPUs, which are designed to handle the immense computational requirements of such massive parameter counts and datasets. These GPUs outperform traditional models in both speed and efficiency, but at the cost of significantly higher power consumption.
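To see why power becomes a constraint at this scale, a rough back-of-envelope estimate helps. The sketch below uses the common rule of thumb that dense-transformer training costs roughly 6 FLOPs per parameter per token; the model size, token count, per-GPU throughput, utilization, and per-GPU power draw are all illustrative assumptions for a hypothetical run, not figures from the article or vendor specifications.

```python
# Back-of-envelope estimate of training compute and energy for a large LLM.
# Uses the common ~6 * N * D FLOPs rule of thumb for dense transformers.
# All hardware figures below are illustrative assumptions, not vendor specs.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs: ~6 FLOPs per parameter per token."""
    return 6.0 * params * tokens

def gpu_hours(total_flops: float, flops_per_gpu: float, utilization: float) -> float:
    """Total GPU-hours needed at a given sustained utilization."""
    return total_flops / (flops_per_gpu * utilization) / 3600.0

# Hypothetical run: a 2-trillion-parameter model on a 20-trillion-token dataset.
flops = training_flops(params=2e12, tokens=20e12)              # ~2.4e26 FLOPs
hours = gpu_hours(flops, flops_per_gpu=1e15, utilization=0.4)  # assume ~1 PFLOP/s per GPU
energy_mwh = hours * 700 / 1e6                                 # assume ~700 W per GPU

print(f"{flops:.2e} FLOPs, {hours:.2e} GPU-hours, ~{energy_mwh:,.0f} MWh")
```

Even with generous utilization assumptions, the energy figure lands in the hundreds of gigawatt-hours for a multi-trillion-parameter run, which is why data-center power capacity, not just chip supply, shapes how far GenAI can scale.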
“Social media provide a steady diet of dire warnings that artificial intelligence (AI) will make software engineering (SE) irrelevant or obsolete,” wrote CMU computer scientists Eunsuk Kang and Mary Shaw in their September 2024 paper, “tl;dr: Chill, y’all: AI Will Not Devour SE.” [tl;dr: a summary of a longer text]. “To the contrary, the engineering discipline of software is rich and robust; it encompasses the full scope of software design, development, deployment, and practical use; and it has regularly assimilated radical new offerings from AI.”
“Digital twins are fast becoming part of everyday life,” said the lead article in the August 31 issue of The Economist. A digital twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city.
While excitement about generative AI is high, some questions persist as to how much value is being delivered. AI has been used by leading firms such as Amazon and Netflix to improve shopping recommendations, but examples of significant applications to improve overall business performance are not abundant. One area where AI has considerable potential is new product development (NPD). The NPD process has not changed much in most organizations for decades, with fewer than 30% of new product projects becoming commercial successes. Yet only 13% of firms are using AI in NPD.
Earlier this year, the Stanford Institute for Human-Centered Artificial Intelligence released the 2024 AI Index Report, the seventh annual edition of its survey of the impact, progress, and trends in AI. At over 500 pages and nine chapters, the report aims to be the world’s most authoritative source of data and insights about AI, helping to develop a more thorough understanding of this rapidly advancing field.
Whether through our PCs or the smart devices we carry around, computing controls almost everything we do in the modern world. Futurists view computational power and related capacities as key indicators of technological advancement. The explosion of computational data will have a direct impact on the fundamental pillars of society, such as healthcare, security, communications, transportation, and energy.
The 2024 MIT Sloan CIO Symposium took place on Tuesday, May 14 at the Royal Sonesta, a hotel overlooking the Charles River a short walk from the MIT campus in Cambridge, MA. Not surprisingly, AI was the dominant theme in this year’s Symposium, with a number of keynotes and panels on the topic. In addition, a pre-event program was added on the day before the Symposium, which included a number of more informal roundtable discussions on various aspects of AI, such as legal risks in AI deployment, AI as a driver of productivity, and humans’ role in AI-augmented workplaces.
While randomized controlled trials (RCTs) have traditionally been considered the gold standard for drug development, it is widely recognized that RCTs are expensive, lengthy, and burdensome on patients. According to some estimates, it takes more than a billion dollars in funding and a decade of work to bring one new medication to market. Despite exciting advances in genomics, patient-centric awareness, decentralized clinical trials, and the application of artificial intelligence (AI), there is a lack of compelling evidence to date that these trends have had a significant impact on the time and cost of the typical oncology clinical trial.
“Blockchain is one of the major tech stories of the past decade,” said a December 2022 McKinsey article, “What Is Blockchain?” “Everyone seems to be talking about it — but beneath the surface chatter there’s not always a clear understanding of what blockchain is or how it works. Despite its reputation for impenetrability, the basic idea behind blockchain is pretty simple,” namely: “Blockchain is a technology that enables the secure sharing of information.”
“The Coming AI Economic Revolution: Can Artificial Intelligence Reverse the Productivity Slowdown?” was recently published in Foreign Affairs by James Manyika and Michael Spence, two authors I’ve long admired. James Manyika is senior VP of research, technology, and society at Google, after serving as chairman and director of the McKinsey Global Institute from 2009 to 2022. Michael Spence, a co-recipient of the 2001 Nobel Prize in Economics, is professor of economics and business at NYU’s Stern School of Business, and was previously professor of management and dean of the Stanford Graduate School of Business.
“Since the 1980s, open source has grown from a grassroots movement to a vital driver of technological and societal innovation,” said “Standing Together on Shared Challenges,” a report published by Linux Foundation Research in December of 2023. “The idea of making software source code freely available for anyone to view, modify, and distribute comprehensively transformed the global software industry. But it also served as a powerful new model for collaboration and innovation in other domains.”
Cyberattacks are becoming more common in the digital ecosystems we use for both personal and professional purposes. In the past year alone, hundreds of millions of private records from banks, ISPs, and retail establishments have been exposed to the public.
Every new year creates a new opportunity for optimism and predictions. In the past couple of years, emerging technology has permeated almost all areas of our lives. There is much to explore! In this article, I focus on three evolving technology areas that are already impacting our future but are only at the early stages of true potential: artificial intelligence, quantum computing, and space systems.
Our evolving digital world is getting trickier and trickier to protect. Every organization is now a target in the present digital environment, and every firm, big or small, has operations, a brand, a reputation, and revenue funnels that could be at significant risk from a breach.
Artificial intelligence has been around since the 1950s, yet even for skeptics, recent advances in Generative AI (GenAI) have significantly moved the needle forward. Citing massive early adoption, Goldman Sachs estimates that Generative AI could raise global GDP by 7% within 10 years [1]. However, while the focus has been on GenAI, it’s important to note that any GenAI strategy requires the right data and the ability to govern that data effectively alongside the GenAI tool. A strong GenAI strategy will include the following key components.
Read MoreRisk Mitigation Strategies for Artificial Intelligence Solutions in Healthcare Management
There are a growing number of examples of how Artificial Intelligence Solutions (AIS) can assist in improving healthcare management: early diagnosis, chronic disease management, hospital readmission reduction, efficient scheduling and billing procedures, and effective patient follow-ups while attempting to achieve healthcare's quintuple aim.
Artificial intelligence’s rising role in our digital world, along with its associated opportunities and challenges, has been a main topic of discussion at many security conferences and events in recent times. There is little doubt that humankind is on the verge of an era of exponential technological advancement, and AI is leading the way in the emerging digital world.