“Social media provide a steady diet of dire warnings that artificial intelligence (AI) will make software engineering (SE) irrelevant or obsolete,” wrote CMU computer scientists Eunsuk Kang and Mary Shaw in their September 2024 paper, “tl;dr: Chill, y’all: AI Will Not Devour SE.” [tl;dr: a summary of a longer text]. “To the contrary, the engineering discipline of software is rich and robust; it encompasses the full scope of software design, development, deployment, and practical use; and it has regularly assimilated radical new offerings from AI.”
“Digital twins are fast becoming part of everyday life,” said the lead article in the August 31 issue of The Economist. A digital twin is essentially a computerized companion to a real-world entity, be it an industrial physical asset like a jet engine, an individual’s health profile, or a highly complex system like a city.
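To make the concept a bit more concrete, here is a minimal, purely illustrative Python sketch (not from The Economist article) of a digital twin as a software object kept in sync with telemetry from its physical counterpart; the JetEngineTwin class, its fields, and its thresholds are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class JetEngineTwin:
    """Hypothetical digital twin: a software mirror of one physical jet engine."""
    asset_id: str
    exhaust_gas_temp_c: float = 0.0        # latest sensor reading
    vibration_mm_s: float = 0.0            # latest sensor reading
    history: list = field(default_factory=list)

    def ingest_telemetry(self, exhaust_gas_temp_c: float, vibration_mm_s: float) -> None:
        """Keep the twin in sync with readings streamed from the real engine."""
        self.exhaust_gas_temp_c = exhaust_gas_temp_c
        self.vibration_mm_s = vibration_mm_s
        self.history.append((datetime.now(timezone.utc), exhaust_gas_temp_c, vibration_mm_s))

    def needs_inspection(self, temp_limit_c: float = 900.0, vib_limit_mm_s: float = 7.0) -> bool:
        """Evaluate a simple rule against the virtual copy instead of the physical engine."""
        return self.exhaust_gas_temp_c > temp_limit_c or self.vibration_mm_s > vib_limit_mm_s

# Example: stream one reading into the twin and query its state.
twin = JetEngineTwin(asset_id="engine-42")
twin.ingest_telemetry(exhaust_gas_temp_c=915.0, vibration_mm_s=4.2)
print(twin.needs_inspection())  # True: the temperature threshold was exceeded
```

The point is not the specific rules but the pattern: the twin absorbs a stream of real-world data and lets you query or simulate the asset entirely in software.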
The 2024 MIT Sloan CIO Symposium took place on Tuesday, May 14 at the Royal Sonesta, a hotel overlooking the Charles River a short walk from the MIT campus in Cambridge, MA. Not surprisingly, AI was the dominant theme of this year’s Symposium, with a number of keynotes and panels on the topic. In addition, a pre-event program was added on the day before the Symposium, which included a number of more informal roundtable discussions on various aspects of AI, such as legal risks in AI deployment, AI as a driver of productivity, and the role of humans in AI-augmented workplaces.
“The Coming AI Economic Revolution: Can Artificial Intelligence Reverse the Productivity Slowdown?” was recently published in Foreign Affairs by James Manyika and Michael Spence, two authors I’ve long admired. James Manyika is senior VP of research, technology and society at Google, after serving as chairman and director of the McKinsey Global Institute from 2009 to 2022. Michael Spence, a co-recipient of the 2001 Nobel Prize in Economics, is a professor of economics and business at NYU’s Stern School of Business, and was previously professor of management and dean of the Stanford Graduate School of Business.
“Since the 1980s, open source has grown from a grassroots movement to a vital driver of technological and societal innovation,” said “Standing Together on Shared Challenges,” a report published by Linux Foundation Research in December of 2023. “The idea of making software source code freely available for anyone to view, modify, and distribute comprehensively transformed the global software industry. But it also served as a powerful new model for collaboration and innovation in other domains.”
Every new year creates a new opportunity for optimism and predictions. In the past couple of years, emerging technology has permeated almost all areas of our lives. There is much to explore! In this article, I focus on three evolving technology areas that are already impacting our future but are only at the early stages of their true potential: artificial intelligence, quantum computing, and space systems.
Our evolving digital world is getting trickier and trickier to protect. Every organization is now a target in today’s digital environment, and every firm, big or small, has operations, a brand, a reputation, and revenue funnels that could be at significant risk from a breach.
Read MoreRisk Mitigation Strategies for Artificial Intelligence Solutions in Healthcare Management
There is a growing number of examples of how Artificial Intelligence Solutions (AIS) can assist in improving healthcare management: early diagnosis, chronic disease management, hospital readmission reduction, efficient scheduling and billing procedures, and effective patient follow-ups, all in pursuit of healthcare's quintuple aim.
Artificial intelligence's rising role in our digital world, and its associated opportunities and challenges, has been a main topic of discussion at many security conferences and events in recent times. There is little doubt that humankind is on the verge of an era of exponential technological advancement, and AI is leading the way in the emerging digital world.
Since the advent of the Industrial Revolution, general purpose technologies (GPTs) have been the defining technologies of their times. Their ability to support a large variety of applications can, over time, radically transform economies and social institutions. GPTs have great potential from the outset, but realizing their potential takes large tangible and intangible investments and a fundamental rethinking of firms and industries, including new processes, management structures, business models, and worker training. As a result, realizing the potential of a GPT takes considerable time, often decades. Electricity, the internal combustion engine, computers, and the internet are all examples of historically transformative GPTs.
In heavily regulated industries such as healthcare, digital innovation can be slow to progress. However, once organizations push toward digital transformation and innovation, the resulting benefits, such as revenue growth, increased patient volume, and reduced cost of care, can provide tremendous value. Healthcare organizations are looking for a cost-effective and technically efficient approach to build-out that can help on their digital transformation journeys. With investments shifting from core EMRs to infrastructure solutions that enable flexibility and adaptability, healthcare organizations are looking to digital innovation to solve these key issues. In an upcoming Enterprise Data & AI presentation on May 5, 2022, Vignesh Shetty, SVP & GM of Edison AI and Platform at GE Healthcare Digital, will discuss GE Healthcare’s digital health platform and how it’s helping companies in the healthcare sector on their AI and data journey.
“Machine learning has an AI problem,” wrote author Eric Siegel in a recent Harvard Business Review (HBR) article, “The AI Hype Cycle Is Distracting Companies.” “With new breathtaking capabilities from generative AI released every several months — and AI hype escalating at an even higher rate — it’s high time we differentiate most of today’s practical ML projects from those research advances. This begins by correctly naming such projects: Call them ML, not AI. Including all ML initiatives under the AI umbrella oversells and misleads, contributing to a high failure rate for ML business deployments. For most ML projects, the term AI goes entirely too far — it alludes to human-level capabilities.”
The last winter season witnessed unprecedented weather conditions across the state of California, driven by a series of over 30 atmospheric river storms from October through March. The impact is two-sided. On one hand, the deluge has brought much-needed rain and snow to the drought-stricken state, alleviating its ongoing multi-year drought. The deep snowpack in the Sierra Nevada mountains is also critical for the state's water supply, as it melts in the spring and summer to provide water for agriculture and cities. On the other hand, the storms have taken a severe toll on life and property. Heavy rain and snowfall bring hazards such as flooding, landslides, and mudslides. Forty-one of California’s 58 counties have been placed under a federal emergency declaration, and 3 of them under a major disaster declaration. Within a 3-week period following Christmas 2022, an estimated 32 trillion gallons of water fell across California, enough to fill the state’s largest reservoir, Shasta Lake, approximately 21 times.
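As a quick sanity check on that last figure, here is a back-of-the-envelope calculation in Python (not from the original post), assuming Shasta Lake's commonly cited full capacity of roughly 4.55 million acre-feet and the standard conversion of about 325,851 gallons per acre-foot:

```python
# Back-of-the-envelope check of the "approximately 21 Shasta Lakes" figure.
# Assumptions (not from the post): Shasta Lake's full capacity is about
# 4.55 million acre-feet, and one acre-foot is about 325,851 gallons.
GALLONS_PER_ACRE_FOOT = 325_851
shasta_capacity_gallons = 4.55e6 * GALLONS_PER_ACRE_FOOT   # ~1.48 trillion gallons
rainfall_gallons = 32e12                                    # estimated statewide total

print(rainfall_gallons / shasta_capacity_gallons)  # ~21.6, consistent with "approximately 21 times"
```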
Protecting critical infrastructure, and especially the U.S. energy grid, is certainly a topic that keeps the U.S. Department of Homeland Security (DHS), the U.S. Department of Energy (DOE), the U.S. Department of Defense (DOD), and U.S. intelligence community planners up at night. The threats can come from cybersecurity attacks (by nation-states, criminal gangs, or hacktivists), from physical attacks on utilities or power plants by terrorists (domestic or foreign) or vandals, or from an electromagnetic pulse (EMP) generated by a geomagnetic solar flare or by a terrorist short-range missile detonated in the atmosphere.
The most intelligent creation, the human brain, is unrivaled in its cognitive abilities. It’s capable of processing vast amounts of information quickly, making complex decisions, evaluating, learning, adapting to new situations, and so much more. Further, it has a high degree of plasticity, enabling it to reorganize and adapt to environmental changes such as injuries.
Over the past few decades, powerful AI systems have matched or surpassed human levels of performance in a number of tasks such as image and speech recognition, skin cancer classification, breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on increasingly powerful and inexpensive computing technologies, innovative deep learning (DL) algorithms, and huge amounts of data on almost any subject. More recently, the advent of large language models (LLMs) is taking AI to the next level. And, for many technologists like me, LLMs and their associated chatbots have introduced us to the fascinating world of human language and cognition.
Since its launch in 2020, Generative Pre-trained Transformer 3 (GPT-3) has been the talk of the town. The powerful large language model (LLM) trained on 45 TB of text data has been used to develop new tools across the spectrum — from getting code suggestions and building websites to performing meaning-driven searches. The best part? You just have to enter commands in plain language.
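To illustrate that plain-language interface, here is a minimal sketch using the OpenAI Python client, assuming the library is installed and an API key is set in the OPENAI_API_KEY environment variable; the model name shown is illustrative, since the original GPT-3-era models used a different (completions) endpoint and available models change over time.

```python
# A minimal sketch of prompting an LLM in plain language.
# Assumptions: the openai Python package (v1+) is installed and
# OPENAI_API_KEY is set in the environment; the model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative choice, not the original GPT-3 model
    messages=[
        {"role": "user",
         "content": "Write a Python function that checks whether a string is a palindrome."}
    ],
)

print(response.choices[0].message.content)  # the model's reply to the plain-language request
```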
We are living amid a technological revolution that is transforming the globe. Changes are visible in all aspects of our lives, from transportation and health to communications. As the adage states, yesterday’s science fiction is today’s science. We are now expanding our capabilities in every area of science and engineering, including chemistry, biology, and physics. That includes heightened space exploration, as well as building smart cities and new manufacturing hubs, and developing artificial intelligence and quantum technologies.
By 2032, it is reasonable to assume that the world will be in the midst of a digital and physical transformation beyond our current expectations. It is no exaggeration to say we are on the cusp of scientific and technological advancements that will change how we live and interact.
What should we expect in the coming decade as we begin 2022? While many potentially paradigm-changing technological influences will shape the future, let us explore three specific categories of future transformation: cognitive computing, health and medicine, and autonomous everything.
According to CISA, “the working group will serve as an important mechanism to improve the security and resilience of commercial space systems. It will identify and offer solutions to areas that need improvement in both the government and private sectors and will develop recommendations to effectively manage risk to space based assets and critical functions.” See CISA Launches a Space Systems Critical Infrastructure Working Group | CISA