How NOAA Is Applying AI Beyond Forecasting: Interview With Eric Kihn

When you think of forecasts, the first thing that comes to mind might be the weather. People have been forecasting the weather since time immemorial, using everything from observations of nature to oracles and mystics. Fortunately, today we have the power of big data, geospatial imaging, satellite data, worldwide interconnected sensors, radar and Doppler imaging, and of course, advanced analytics and artificial intelligence (AI) to enable some of the most accurate, longer-term forecasts yet, helping people get a heads-up before weather impacts their daily lives.

Read More
Foundation Models: AI’s Exciting New Frontier

Over the past decade, powerful AI systems have matched or surpassed human levels of performance in a number of specific tasks, such as image and speech recognition, skin cancer classification, breast cancer detection, and highly complex games like Go. These AI breakthroughs have been based on deep learning (DL), a technique loosely based on the network structure of neurons in the human brain that now dominates the field. DL systems acquire knowledge by being trained on millions to billions of texts, images, and other data points instead of being explicitly programmed.
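To make "trained, not programmed" concrete, here is a minimal sketch in plain Python (with made-up toy data; the rule, learning rate, and epoch count are all arbitrary choices for illustration) of a model recovering a relationship from examples via gradient descent rather than having the rule hand-coded:

```python
# Minimal illustration of "trained, not programmed": the relationship
# y = 2x + 1 is never written into the model; it is recovered from data.
# Toy sketch only; real DL systems train millions to billions of
# parameters on millions to billions of examples.

examples = [(x, 2 * x + 1) for x in range(-10, 11)]  # toy training data

w, b = 0.0, 0.0        # model parameters, initially knowing nothing
learning_rate = 0.005  # small step size chosen for stability

for epoch in range(500):
    for x, y_true in examples:
        y_pred = w * x + b              # forward pass
        error = y_pred - y_true
        w -= learning_rate * error * x  # gradient step on the weight
        b -= learning_rate * error      # gradient step on the bias

print(f"learned w={w:.3f}, b={b:.3f}")  # approaches w=2, b=1
```

The point of the sketch is that the rule never appears in the model's code; the parameters converge toward it purely through exposure to examples.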

Read More
The Innovative Ways The United States Space Force Is Leveraging Data And AI

As the newest branch of the United States armed forces, the United States Space Force is at the innovative edge of applying technology and know-how to newly evolving areas of opportunity and threat. As such, it should come as little surprise that advanced uses of data and artificial intelligence are powering many of its innovative applications.

Read More
Cybersecurity in 2022 – A Fresh Look at Some Very Alarming Stats

Last year I wrote two FORBES articles that highlighted some of the more significant cyber statistics associated with our expanding digital ecosystem. In retrospect, 2021 was a very trying year for cybersecurity in so many areas. There were high-profile breaches such as SolarWinds, Colonial Pipeline, and dozens of others that had major economic and security-related impacts. Ransomware came on with a vengeance, targeting many small and medium businesses. Perhaps most worrisome was how critical infrastructure and supply chain security weaknesses were targeted and exploited by adversaries at higher rates than in the past.

Read More
Public-Private Partnerships And The Cybersecurity Challenge Of Protecting Critical Infrastructure

In the U.S., most critical infrastructure, including defense, oil and gas, electric power grids, health care, utilities, communications, transportation, education, banking, and finance, is owned by the private sector (about 85 percent, according to DHS) and regulated by the public sector. Operating and protecting critical infrastructure therefore requires a strong working partnership between the public and private sectors.

Protecting critical infrastructure poses a difficult challenge because democratic societies are by nature interactive, open, and accessible. And because of the growing digital connectivity (and interdependence) of both IT and industrial control systems, critical infrastructure faces an evolving and sophisticated array of cybersecurity threats.

Read More
How Can You Trust the Predictions of a Large Machine Learning Model?

Artificial intelligence has emerged as the defining technology of our era, as transformative over time as the steam engine, electricity, computers, and the Internet. AI technologies are approaching or surpassing human levels of performance in vision, speech recognition, language translation, and other human domains. Machine learning (ML) advances, like deep learning, have played a central role in AI's recent achievements, giving computers the ability to learn by ingesting and analyzing large amounts of data instead of being explicitly programmed.

Read More
Three Habits for Effective Digital Transformation

Large-scale change has never been easy. Nearly three decades ago, leadership guru Dr. John Kotter reported that 70% of all major change efforts in organizations failed. Just a couple of years later, the late Dr. Michael Hammer estimated a 70% failure rate for radical reengineering efforts. Now that transformational efforts are often driven by technology, the recent success rate is equally bleak, according to research by BCG. The root cause of failure in large-scale digital change is captured by George Westerman's first law of digital transformation, which states: "Technology changes quickly, but organizations change much more slowly."

Read More
Catalyzing Innovation via Centers, Labs, and Foundries

The cornerstone of collaboration is knowledge transfer: sharing research tools, methodologies, and findings, and sometimes pooling funding resources to cover the shortfalls involved in building prototypes and commercializing technologies.

Collaborations often involve combinations of government, industry, and academia that work together to meet difficult challenges and cultivate new ideas. A growing trend among leading companies is creating technology-specific innovation centers, labs, and foundries to accelerate collaboration and invention.

Read More
How low-code platforms can aid intelligent business process management

The potential for low-code/no-code platforms is enormous. Low-code increases the productivity of IT developers, sometimes by several orders of magnitude. And no-code empowers subject matter experts, primarily on the business or operations side (as opposed to IT), to become "citizen developers." But as I explained in a previous article, low-code and no-code platforms are not a panacea; they face challenges.

Read More
Intelligent automation depends on these 4 cornerstones

There has been more than a modicum of buzz around what IDC calls intelligent process automation and what Gartner calls hyperautomation. In both cases, these terms refer to the integrated deployment of digital technologies such as robotic process automation (RPA), intelligent business process management suites (iBPMS), artificial intelligence, process mining, etc. Integrating digital technologies is far from a new concept. MIT and Deloitte advocated this approach back in the day when everyone was focused on social, mobile, analytics, and cloud (SMAC).

Read More
High Performance Analysis and Control of Complex Systems Using Dynamically Reconfigurable Silicon and Optical Fiber Memory

This is Lars Wood's only published paper, invited to appear in the proceedings of the first IEEE Workshop on FPGAs for Custom Computing Machines (FCCM) in 1993. The paper is significant in that it demonstrates the expanse of FCCM computing, using a non-numerical paradigm for numerical computations within a completely asynchronous optoelectronic machine. It is referenced as prior art for patents in the space, including one for interconnected optoelectronic FPGAs, and is considered a vision statement for FCCM in 1993. View the original published paper in the IEEE Computer Society Digital Library.

Read More
7 Types Of Artificial Intelligence

Artificial intelligence is probably the most complex and astounding creation of humanity yet. And that is disregarding the fact that the field remains largely unexplored, which means that every amazing AI application we see today represents merely the tip of the AI iceberg, as it were. While this may have been stated and restated numerous times, it is still hard to gain a comprehensive perspective on the potential future impact of AI. The reason is the revolutionary effect AI is already having on society, even at such a relatively early stage in its evolution.

Read More
Infusion of Machine Learning Operations with Internet of Things

With advancements in deep tech, the operationalization of machine learning and deep learning models is burgeoning. In a typical machine learning or deep learning business case, the data science and IT teams collaborate extensively to scale faster and push multiple models to production through continuous training, validation, deployment, and integration with governance. Machine Learning Operations (MLOps) has carved out a new era of the DevOps paradigm in the machine learning/artificial intelligence realm by automating end-to-end workflows.
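As a rough illustration of that train-validate-deploy loop, here is a minimal sketch using scikit-learn and its bundled iris dataset as stand-ins; the 0.90 accuracy gate and the model.joblib file name are arbitrary choices for the example, not part of any particular MLOps product:

```python
# A toy MLOps-style quality gate: retrain, validate against a threshold,
# and only "deploy" (persist) the model if it passes. Real pipelines wire
# this into CI/CD, a model registry, and monitoring; this is a sketch.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

ACCURACY_GATE = 0.90  # arbitrary illustrative threshold

def train_validate_deploy():
    X, y = load_iris(return_X_y=True)
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.2, random_state=42
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    accuracy = accuracy_score(y_val, model.predict(X_val))

    if accuracy >= ACCURACY_GATE:
        joblib.dump(model, "model.joblib")  # stand-in for a model registry
        print(f"deployed: validation accuracy {accuracy:.3f}")
    else:
        print(f"rejected: validation accuracy {accuracy:.3f} below gate")

if __name__ == "__main__":
    train_validate_deploy()
```

In practice the same gate runs automatically on every retrain, which is what turns an ad hoc notebook workflow into continuous training and deployment.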

Read More
Low code: A promising trend or a Pandora’s Box?

The analyst community is having a field day with the hype around "low code." IDC has predicted that low-code use will keep expanding and that the worldwide population of low-code developers will grow at a CAGR of 40.4% from 2021 to 2025. Gartner predicted that the low-code market would grow nearly 30% from 2020 to reach $5.8 billion in 2021. Forrester has also jumped on the low-code hype wagon, forecasting that by the end of 2021, 75% of application development will use low-code platforms.
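Taking those published percentages at face value, a quick back-of-the-envelope computation shows what they compound to:

```python
# Back-of-the-envelope check on the analysts' numbers cited above.

# IDC: low-code developer population growing at a 40.4% CAGR, 2021-2025.
cagr = 0.404
growth_multiple = (1 + cagr) ** 4  # four compounding years
print(f"IDC: {growth_multiple:.2f}x the 2021 population by 2025")  # ~3.89x

# Gartner: ~30% growth from 2020 to $5.8B in 2021 implies a 2020 base of:
base_2020 = 5.8 / 1.30
print(f"Gartner: implied 2020 market of ${base_2020:.2f}B")  # ~$4.46B
```

In other words, IDC's figure implies a low-code developer population nearly four times larger in 2025 than in 2021.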

Read More
Can AI Help With Autism Diagnosis?

In the US, one in five children lives with a diagnosable behavioral health disorder; however, only 21% of those children who are diagnosed receive the treatment they need. Often, parents have to wait one to three years to receive a proper and accurate diagnosis and an individualized treatment plan for their child.

Read More
How AI Is Transforming Agriculture

Agriculture is one of the oldest and most important professions in the world. Humanity has come a long way over the millennia in how we farm and grow crops, thanks to the introduction of various technologies. As the world population continues to grow and land becomes scarcer, we have needed to get creative and more efficient about how we farm, using less land to produce more crops and increasing the productivity and yield of farmed acres.

Read More