Four Emerging Technology Areas Impacting Industry 4.0: Advanced Computing, Artificial Intelligence, Big Data & Materials Science

Last year was a transformative one for technological innovation. The threat of COVID-19 upended our way of living, especially through the shift to remote work, and accelerated the timetable of digital transformation. As we move into another year of unprecedented technological advancement, it is useful to examine some of the trends and technologies that are already shaping 2021 and what may be on the horizon. 

2021 finds us in the early stages of “Industry 4.0,” a transformational technological era that can be described as the fusion of our physical and digital systems. The disruptive technological change we are experiencing is impacting industries including health and medical care, transportation, energy, construction, finance, commerce, and security. 

The emerging technologies impacting Industry 4.0 are numerous. They include industrial applications such as automation, genomics, precision medicine, deep learning, 5G, the Internet of Things, materials science, and autonomous transportation. Most applications fall under general technology areas and are often meshed in techno-fusion with other technologies. Several enabling technology areas impact many verticals in Industry 4.0. Below are descriptions and capsules of insight about four enabling technology categories to explore exponential change.


Advanced Computing

In today's world, computing rules almost all that we do, whether from our PCs or the smart devices attached to our being. What we call classical computing has evolved over the past decades from room-sized processors to microchips. In Industry 4.0, new advanced computing categories have come online and are expanding in capability and practicality. They include high-performance computing/supercomputing, edge computing, and quantum computing. 

High-Performance Computing & Supercomputing

The website TechTarget.com provides a working definition of high-performance computing: the use of parallel processing for running advanced application programs efficiently, reliably, and quickly. 
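The parallel-processing idea behind that definition can be sketched in a few lines. This is a minimal illustration (not from the article): split a large job into chunks and run them on multiple processor cores at once, here with Python's standard library.

```python
# Sketch of the HPC pattern: divide a large computation into chunks
# and run the chunks concurrently across worker processes.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum the squares of the integers in [start, stop)."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

def parallel_sum_of_squares(n, workers=4):
    # One chunk of [0, n) per worker; the last chunk absorbs the remainder.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 1_000_000
    # The parallel result matches the serial one; only the wall-clock time differs.
    assert parallel_sum_of_squares(n) == sum(i * i for i in range(n))
```

Real HPC systems apply the same divide-and-conquer pattern across thousands of nodes rather than a handful of local processes.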

Supercomputing has taken high-performance computing to new levels. Seymour Cray is commonly referred to as the "father of supercomputing," and his company, Cray Research, remains a driving force in the industry. Supercomputers are differentiated from mainframe computers by their vast data storage capacities and expansive computational powers. Supercomputers are the engines for future automation. The National Academy of Sciences, in its study "The Future of Supercomputing," envisions investments in supercomputing as highly beneficial, playing an essential role in national security and scientific discovery.

Edge Computing

Edge computing is a product of our sensor society, in which everything and anything is connected, often referred to as the Internet of Things. Edge computing places computing power and analytics functions close to where data is created. 
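The pattern is easy to illustrate. In this hypothetical sketch, a device analyzes its own sensor readings locally and transmits only a compact summary plus any out-of-range values, rather than streaming every raw sample to a central cloud.

```python
# Sketch of edge computing: analyze data where it is created and
# send upstream only a small payload, not the raw sensor stream.
def summarize_at_edge(readings, alert_threshold):
    """Return aggregate stats plus any readings needing attention."""
    anomalies = [r for r in readings if r > alert_threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
        "anomalies": anomalies,  # only these demand an immediate response
    }

# Five raw samples shrink to one small payload for transmission.
payload = summarize_at_edge([20.1, 20.4, 19.9, 87.5, 20.2],
                            alert_threshold=50.0)
assert payload["anomalies"] == [87.5]
assert payload["count"] == 5
```

The bandwidth and latency savings of this summarize-locally approach are what make edge computing attractive for large IoT deployments.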

Quantum Computing

The research firm Gartner describes quantum computing as: “[T]he use of atomic quantum states to effect computation. Data is held in qubits (quantum bits), which can hold all possible states simultaneously. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as entanglement.” In simplified terms, quantum computers use quantum bits, or qubits, instead of the traditional binary bits of ones and zeros used for digital communications.

A good explanation of what that means comes from Scientific American: “Quantum computers rely on the same physical rules as atoms to manipulate information. Just like traditional, classical, computers execute logical circuits to run software programs, quantum computers use the physics phenomena of superposition, entanglement, and interference to execute quantum circuits.” (Quantum Computing May Be Closer Than You Think - Scientific American)
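Superposition and entanglement can be demonstrated with a toy state-vector simulation (an illustration on a classical machine, not a real quantum computer). A two-qubit state is four complex amplitudes; gates are matrix multiplications; here a Hadamard gate creates superposition and a CNOT gate entangles the qubits into a Bell state.

```python
# Toy simulation: Hadamard on qubit 0, then CNOT, yields the Bell
# state (|00> + |11>)/sqrt(2) -- the textbook example of entanglement.
from math import sqrt

def apply(gate, state):
    """Multiply a 4x4 gate matrix into a 4-amplitude state vector."""
    return [sum(gate[r][c] * state[c] for c in range(4)) for r in range(4)]

h = 1 / sqrt(2)
# Hadamard on qubit 0 (identity on qubit 1): creates superposition.
H0 = [[h, 0,  h,  0],
      [0, h,  0,  h],
      [h, 0, -h,  0],
      [0, h,  0, -h]]
# CNOT (qubit 0 controls qubit 1): entangles the pair.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                    # start in |00>
state = apply(CNOT, apply(H0, state))   # Bell state
probs = [abs(a) ** 2 for a in state]    # measurement probabilities

# Only the correlated outcomes 00 and 11 are possible: measuring one
# qubit instantly determines the other, which is entanglement.
assert abs(probs[0] - 0.5) < 1e-9 and abs(probs[3] - 0.5) < 1e-9
assert probs[1] < 1e-9 and probs[2] < 1e-9
```

Simulating n qubits classically requires tracking 2^n amplitudes, which is exactly why hardware qubits promise computational power that classical machines cannot match.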

Quantum computing will allow for computers that can process massive amounts of data and calculate at amazing speeds. We will be able to download libraries in mere seconds. It will be paradigm-shifting for society in areas of research, learning, and prediction. 

Our civilization is now at the doorstep of quantum computing. Futurist Ray Kurzweil said that mankind will be able to “expand the scope of our intelligence a billion-fold” and that “the power of computing doubles, on average, every two years.” Recent breakthroughs in physics, nanotechnologies, and materials science have brought us into a computing reality that we could not have imagined a decade ago. 

The future of advanced computing will be amazingly interesting and mind-expanding. That digital world will include molecular computing, biocomputing, biochemical computing, and DNA computing. It will feature computers with human-machine interface capabilities that communicate via lightwave transmission and are self-taught and self-assembling via artificial intelligence. In the distant future, computers may be sentient. 

On the horizon, some resources on Advanced Computing

“Five new quantum information science centers will marry the R&D strengths of academia, industry and U.S. national laboratories” Quantum Computing May Be Closer Than You Think https://www.scientificamerican.com/article/quantum-computing-may-be-closer-than-you-think/

“Researchers have developed an electronic chip featuring artificial intelligence (AI) that imitates the way the human brain processes visual information. In addition to capturing and manipulating images, the new chip can now also enhance them, classify numbers, and be trained to recognize patterns and images with an accuracy rate of over 90%.” This AI chip claims to mimic the human brain https://www.techradar.com/news/this-ai-chip-claims-to-mimic-the-human-brain

“An intelligent material that learns by physically changing itself, similar to how the human brain works, could be the foundation of a completely new generation of computers.” The First Steps Toward a Quantum Brain: An Intelligent Material That Learns by Physically Changing Itself (scitechdaily.com) 

“Researchers at the Technion have created a biological computer, constructed within a bacterial cell and capable of monitoring different substances in the environment. Currently, the computer identifies and reports on toxic and other materials. Next up: the ability to warn about hemorrhaging in the human body.” Researchers turn bacterial cell into biological computer https://phys.org/news/2020-02-bacterial-cell-biological.html

“Researchers at the University of Rochester and Cornell University have taken an important step toward developing a communications network that exchanges information across long distances by using photons, mass-less measures of light that are key elements of quantum computing and quantum communications systems.” Using quantum properties of light to transmit information https://phys.org/news/2020-11-quantum-properties-transmit.html


Artificial Intelligence

One of the most exciting and philosophically debated categories of emerging technologies is that of artificial intelligence (AI). AI is no longer a thing of science fiction. Companies are already developing technology to distribute artificial intelligence software to millions of graphics and computer processors around the world. AI, machine learning, and natural language processing can be used to solve a variety of business problems. AI can understand, diagnose, and solve customer problems — without being specifically programmed.

Gartner describes artificial intelligence as a “technology that appears to emulate human performance typically by learning, coming to its own conclusions, appearing to understand complex content, engaging in natural dialogs with people, enhancing human cognitive performance or replacing people on execution of non-routine tasks.”

The scientific and economic promise of AI technologies is certainly monumental. Microsoft UK’s chief envisioning officer Dave Coplin claimed that AI is “the most important technology that anybody on the planet is working on today.” Human/computer interface breakthroughs will extend human brain capacity and memory. The government explores artificial intelligence | TheHill  

Investment in emerging technology areas is a good barometer of importance and promise. According to IDC, global spending on AI is forecast to double over the next four years, growing from $50.1 billion in 2020 to more than $110 billion in 2024. Worldwide AI spend to reach more than $110 billion in 2024 - Help Net Security

Machine Learning

Machine learning is a component of artificial intelligence already being used in industry, especially in design and manufacturing. It is becoming a tool used by most industries, from healthcare to security. In basic terms, it involves getting a computer to act without explicit programming. It often combines with AI and can be thought of as the rapid automation of predictive analytics. Machine learning can provide the fastest way to identify new cyber-attacks, draw statistical inferences, and push that information to endpoint security platforms.
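The "predictive analytics without explicit programming" idea can be shown with a deliberately tiny, hypothetical sketch: fit simple statistics to historical observations, then flag new events that deviate from the learned baseline, as in anomaly-based cyber-attack detection. No attack signature is ever hand-coded; the model comes entirely from the data.

```python
# Minimal anomaly-detection sketch: learn a baseline from past data,
# then flag values that fall outside it (a z-score test).
from statistics import mean, stdev

def fit(baseline):
    """Learn the normal range from historical data; no rules are hand-coded."""
    return mean(baseline), stdev(baseline)

def is_anomaly(value, model, z_cutoff=3.0):
    """Flag a value more than z_cutoff standard deviations from the mean."""
    mu, sigma = model
    return abs(value - mu) > z_cutoff * sigma

# Train on typical request rates, then score new traffic.
model = fit([100, 98, 103, 101, 99, 102, 97, 100])
assert not is_anomaly(104, model)   # within the learned normal range
assert is_anomaly(500, model)       # a spike a signature list might miss
```

Production systems replace the z-score with richer learned models, but the workflow, train on observed behavior and score new events against it, is the same.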

On the horizon, some resources on AI

“Experts say the rise of artificial intelligence will make most people better off over the next decade, but many have concerns about how advances in AI will affect what it means to be human, to be productive and to exercise free will.” Artificial Intelligence and the Future of Humans | Pew Research Center

“The crux of the development and application of AI is a true public/private partnership engine supported by investment, ingenuity and real-world implementation. Of course, with disruptive technologies comes accountability (security and trust) for the many administrative, IP, and regulatory ethical challenges that are arising.” Government, An Integral Partner For Exploring Artificial Intelligence (forbes.com)

The New Techno-Fusion: The Merging Of Technologies Impacting Our Future (forbes.com) 


Big Data

According to the Gartner IT Glossary, Big Data is high-volume, high-velocity, and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making. It is the fuel of digital transformation. Eric Schmidt, the former CEO of Google, surmised that we now produce as much data every two days as we did from the dawn of civilization until the year 2003. 

Organizing, managing, and analyzing data is a transformational effort. It is estimated that by the end of this year there will be 35 zettabytes of digital data. The data governance challenge includes processing geospatial data, 3D data, audio and video, unstructured text, and social media. The challenge is to automate technology and methods to analyze such large amounts of unstructured data with application interfaces and convergence to smart interoperable platforms. Digital data storage will eventually replace magnetic tape.
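The core step in many big-data pipelines is exactly the reduction the text describes: turning high-volume unstructured input into structured, queryable output. This small sketch (an illustration, not a production pipeline) reduces free text to term counts, the word-count pattern popularized by MapReduce-style jobs.

```python
# Sketch of a big-data reduction: unstructured text in, structured
# counts out -- the classic word-count job, scaled down to one machine.
import re
from collections import Counter

def top_terms(documents, k=3):
    """Tokenize free text and return the k most common terms."""
    counts = Counter()
    for doc in documents:
        # "map": split each document into lowercase tokens;
        # "reduce": Counter merges the per-document counts.
        counts.update(re.findall(r"[a-z']+", doc.lower()))
    return counts.most_common(k)

docs = ["Big data needs processing.",
        "Processing big data at scale.",
        "Data, data everywhere."]
assert top_terms(docs, 1) == [("data", 4)]
```

At real big-data scale, the same map and reduce steps are distributed across a cluster (e.g. by Hadoop or Spark), but the shape of the computation is unchanged.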

In the near future, cloud-based platforms, using machine learning algorithms and 5G technologies will play a significant role in making those digital processes operational and efficient. It will likely be a catalyst for a new era of automation impacting all industries and verticals including financial, energy, security, communications, and healthcare.

On the horizon, some resources for Big Data

The ecosystem of big data is continuously evolving, and new technologies enter the picture rapidly, many of them expanding according to demand in the IT industry. 

Top 10 Big Data Technologies | Analytics Steps

Big Data Technologies That Are Expected to Flourish in 2020 (analyticsinsight.net)


Materials Science  

We have examined the potential applications of mostly software in computing and cognitive intelligence. Materials science will also impact Industry 4.0. Exciting research in materials science is creating stronger, more durable, lighter, and even “self-healing” and self-assembling materials. Nanomaterials artificially engineered at the molecular scale, along with synthetic composites, are now being designed at the inter-atomic level. 

The capability to design and manufacture infrastructure such as bridges, roads, and buildings with stronger, adaptable, self-intelligent, and seemingly eternal materials will revolutionize the construction and transportation industries. Advancements in materials science will also affect healthcare, robotics, and the electronics industry. Many emerging materials will be combined with 3-D printing. The application of biomechanical and synthetic materials is one of the most interesting areas of materials science. 

On the horizon, some resources on Materials Science:

As researchers develop sensors and other devices based on conductive microbes’ nanowires, they continue to debate exactly how the organisms conduct electricity. Electricity-Carrying Bacteria Lead to New Applications—and New Questions - Scientific American

Synthetic biology — a new engineering approach to biology that lets us redesign cells — will revolutionize the future by allowing us to create things in laboratories that don't exist in nature. Futurologists Tell Us The Things To Expect In The Future (buzzfeed.com) 

Nanobots are tiny robots that carry out specific tasks. In medicine, they can be used for targeted drug delivery. A remarkable combination of artificial intelligence (AI) and biology has produced the world's first “living robots.” Not bot, not beast: Scientists create first-ever living, programmable organism (phys.org)

I have just touched on a few of the societal implications of our new technological era and what Industry 4.0 may harbor. The positive news is that we also appear to be making exponential gains in our understanding of technology and its applications. With benefits come risks, and the real imperative for society is planning and adaptation, or we will lose control of the promise of technological innovation. My descriptions and overviews are just a starting point for discovering the impact of emerging technology applications on our way of life in 2021 and beyond. I hope you will explore further. 


About Chuck Brooks

Chuck Brooks, President of Brooks Consulting International, is a globally recognized thought leader and evangelist for Cybersecurity and Emerging Technologies. LinkedIn named Chuck as one of “The Top 5 Tech Experts to Follow on LinkedIn.” Chuck was named as a 2020 top leader and influencer in “Who’s Who in Cybersecurity” by Onalytica. He was named by Thompson Reuters as a “Top 50 Global Influencer in Risk, Compliance,” and by IFSEC as the “#2 Global Cybersecurity Influencer.” He was named by The Potomac Officers Club, Executive Mosaic, and GovCon as “One of The Top Five Executives to Watch in GovCon Cybersecurity.” Chuck is a two-time Presidential appointee who was an original member of the Department of Homeland Security. Chuck has been a featured speaker at numerous conferences and events, including presenting before the G20 country meeting on energy cybersecurity.

Chuck is on the Faculty of Georgetown University, where he teaches in the Graduate Applied Intelligence and Cybersecurity Programs. He is also a Cybersecurity Expert for “The Network” at the Washington Post, Visiting Editor at Homeland Security Today, and a Contributor to FORBES. He has also been a featured speaker and author on technology and cybersecurity topics for IBM, AT&T, Microsoft, General Dynamics, Xerox, Checkpoint, Cylance, Malwarebytes, and many others.

Chuck Brooks LinkedIn Profile: https://www.linkedin.com/in/chuckbrooks/

Chuck Brooks on Twitter: @ChuckDBrooks