Artificial Intelligence In Humanoid Robots
Source: COGNITIVE WORLD on FORBES
When people think of Artificial Intelligence (AI), the image that usually comes to mind is a robot gliding around and giving mechanical replies. AI takes many forms, but humanoid robots are among the most recognizable. They have been depicted in several Hollywood movies, and if you are a fan of science fiction, you have probably come across a few humanoids. One of the earliest humanoids was designed in 1495 by Leonardo Da Vinci: a mechanical knight in a suit of armor that could perform human-like movements such as sitting, standing and moving its arms, as though a real person were inside it.
Initially, humanoid robots were developed mainly for research, in particular for studying how to build better prosthetics for humans. Today, humanoids are created for purposes that go well beyond research. Modern humanoids are designed to carry out a variety of human tasks and to fill roles in the workforce, such as personal assistant, receptionist or front-desk officer.
Inventing a humanoid is a complex process that demands a great deal of work and research, and inventors and engineers routinely run into challenges along the way. High-quality sensors and actuators are essential, because these are the components through which a humanoid moves, talks and acts, and even a tiny mistake can cause glitches.
People often assume that humanoid robots are robots that are structurally similar to human beings, with a head, torso, arms and legs. This is not always the case: some humanoids do not completely resemble humans and are modeled on only specific body parts, such as the head. Humanoids are usually classified as either androids or gynoids: an android is a humanoid robot designed to resemble a male human, while a gynoid looks like a female human.
Humanoids work through a set of core components. Sensors let them perceive their environment, and some have cameras that enable them to see. Motors placed at strategic points, usually referred to as actuators, drive their movements and gestures.
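To picture how motors placed at strategic points turn into gestures, here is a minimal, hypothetical sketch in Python. The joint names, angles and the `send_joint_angle` stand-in are illustrative assumptions, not any real robot's API.

```python
import time

# Hypothetical joint (actuator) names for one arm of a humanoid.
ARM_JOINTS = ["shoulder_pitch", "shoulder_roll", "elbow"]

def send_joint_angle(joint: str, degrees: float) -> None:
    """Placeholder: real code would drive a motor controller here."""
    print(f"{joint} -> {degrees:.0f} deg")

def wave_hello() -> None:
    """A gesture is just a timed sequence of joint-angle targets."""
    # Raise the arm.
    send_joint_angle("shoulder_pitch", -90)
    send_joint_angle("elbow", 45)
    # Swing the forearm back and forth a few times.
    for angle in (20, 70, 20, 70):
        send_joint_angle("elbow", angle)
        time.sleep(0.3)

if __name__ == "__main__":
    wave_hello()
```

In a real humanoid, each joint command would be translated into motor currents by dedicated controllers; the idea of composing gestures from sequences of joint targets is the same.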
A great deal of work, funding and research goes into making these humanoid robots. The human body is studied and examined first to get a clear picture of what is to be imitated, and the task or purpose of the humanoid must be determined. Humanoid robots are created for several purposes: some strictly for experiments or research, others for entertainment, and others to carry out specific tasks such as acting as an AI-powered personal assistant or helping out in elderly care homes.
The next step before a fully functional humanoid is ready is to build mechanisms similar to human body parts and test them. Then comes the coding process, one of the most vital stages in creating a humanoid: this is where developers program the instructions that enable the robot to carry out its functions and answer questions it is asked.
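To make this programming stage concrete, here is a minimal, illustrative sketch of the sense-think-act loop that humanoid control code typically implements. It is not any particular robot's real software; all class and function names are hypothetical placeholders.

```python
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified stand-in for one batch of sensor readings.
@dataclass
class SensorReading:
    heard_command: Optional[str]  # from a microphone / speech-to-text stage
    obstacle_ahead: bool          # from touch or proximity sensors
    tilt_degrees: float           # from a balance (IMU-like) sensor

def read_sensors() -> SensorReading:
    """Placeholder: a real robot would poll its hardware drivers here."""
    return SensorReading(heard_command="wave", obstacle_ahead=False, tilt_degrees=1.5)

def decide(reading: SensorReading) -> str:
    """The 'think' step: map sensor input to a single action."""
    if abs(reading.tilt_degrees) > 10:
        return "rebalance"            # staying upright takes priority
    if reading.obstacle_ahead:
        return "stop"
    if reading.heard_command:
        return reading.heard_command  # e.g. "wave" or "walk_forward"
    return "idle"

def act(action: str) -> None:
    """Placeholder: a real robot would command its actuators here."""
    print(f"actuators <- {action}")

if __name__ == "__main__":
    # The sense-think-act loop, run a few times for illustration.
    for _ in range(3):
        act(decide(read_sensors()))
        time.sleep(0.1)
```

Real control software runs loops like this many times per second and layers planning, balance control and speech systems on top, but the basic cycle of reading sensors, deciding and actuating is the same.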
Doesn't sound so difficult, right? However, it would be foolhardy to think that creating a humanoid is as easy as creating a kite or a slingshot in your backyard. Although humanoid robots are becoming very popular, inventors face a few challenges in creating fully functional and realistic ones. Some of these challenges include:
Actuators: These are the motors that drive motion and gestures. The human body is dynamic: you can pick up a rock, toss it across the street, spin seven times and do the waltz, all within ten to fifteen seconds. To make a humanoid robot, you need strong, efficient actuators that can imitate these actions flexibly, within the same time frame or even less, and that can perform a wide range of actions.
Sensors: These are what allow humanoids to sense their environment. To function properly, a humanoid needs analogues of the human senses: touch, smell, sight, hearing and balance. Hearing sensors let it receive instructions, decipher them and carry them out. Touch and proximity sensors keep it from bumping into things and damaging itself. It needs balance sensors to control its movement, and heat and pain-like sensors to detect when it is in harm's way or being damaged. The face also needs its own sensors and actuators so the humanoid can make facial expressions, and these should support a wide range of expressions.
Making sure that these sensors are available and efficient is a hard task.
AI-based Interaction: The level at which humanoid robots can interact with humans is still quite limited. This is where Artificial Intelligence is critical: it can help the robot decipher commands, questions and statements, and might even enable it to give witty, sarcastic replies and make sense of random, ambiguous human ramblings.
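As a rough illustration of the kind of interaction logic involved, here is a minimal, hypothetical sketch of mapping free-form spoken input to a robot "intent". Real systems use far more capable language models; the keyword rules and intent names below are made up purely for illustration.

```python
from typing import Optional

# Hypothetical intent keywords; a production system would use a trained
# language model rather than hand-written rules like these.
INTENT_KEYWORDS = {
    "greet": ["hello", "hi", "good morning"],
    "fetch": ["bring", "fetch", "get me"],
    "weather": ["weather", "forecast", "rain"],
}

def interpret(utterance: str) -> Optional[str]:
    """Return the best-matching intent for a spoken phrase, or None."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return None  # ambiguous rambling: the robot could ask the user to rephrase

if __name__ == "__main__":
    for phrase in ["Hi there!", "Could you bring my glasses?", "blah blah"]:
        print(phrase, "->", interpret(phrase))
```

The gap between rules like these and genuinely understanding sarcasm or rambling speech is exactly why AI-based interaction remains one of the hardest challenges for humanoid builders.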
However, some humanoid robots are so human-like and capable that they have become quite popular. Here are a few of them:
Sophia: This is the world's first robot citizen. She was introduced to the United Nations on October 11, 2017. On October 25th, she was granted Saudi Arabian citizenship, making her the first humanoid robot ever to have a nationality.
Sophia was created by Hanson Robotics and can carry out a wide range of human actions. She is said to be capable of making up to fifty facial expressions and can also express feelings. She has very expressive eyes, and her Artificial Intelligence revolves around human values. She also has a sense of humor. This particular humanoid was designed to look like the late British actress Audrey Hepburn.
Since she was granted citizenship, Sophia has attended several interviews and conferences and is now one of the world's most popular humanoids.
The Kodomoroid TV Presenter: This humanoid robot was invented in Japan. Her name is derived from the Japanese word for child, 'kodomo', and the word 'android'. She speaks a number of languages and is capable of reading the news and giving weather forecasts.
She has been placed at the National Museum of Emerging Science and Innovation (Miraikan) in Tokyo, where she currently works.
Jia Jia: This humanoid robot was developed over three years by a team at the University of Science and Technology of China before her release. She can hold conversations, but her motion is limited and her speech is stilted. She does not yet have a full range of expressions, although the team plans further development and intends to give her learning abilities. Her speech and vocabulary need more work, but she is still fairly realistic.
Humanoid robots are here to stay, and as AI continues to progress, we may soon find them everywhere in our daily lives.
Dr Sanjit Singh Dang has been a successful Venture Capitalist, Corporate Executive, Board Member, Speaker and Writer in Silicon Valley for almost two decades. He is currently the Co-Founder and Chairman of U First Capital, which provides Venture Capital as a Service to corporations by bringing startups, university IP, etc. in the corporation's specific areas of interest (dedicated model). Prior to that, he was at Intel Capital, where he had an excellent track record of driving one exit every year: Orb Intelligence (acquired by Dun and Bradstreet in 2020), Pinterest (IPO 2019), DocuSign (IPO 2018), Body Labs (AI startup, acquired by Amazon in 2017 within 2 years of leading the Series A investment), Voke (acquired by Intel in 2016 within 7 months of leading the Series A investment), Maginatics (acquired by EMC in 2015 within 1 year of leading the investment), and Basis Science (acquired by Intel in 2014 within 1 year of investment).

He has been an investor and board member at several companies, including True Fit (AI for eCommerce, raised $100M), Reflektion (AI recommendation for eCommerce, raised $42M), Helpshift (AI-driven customer service, raised $38M) and Enlighted (smart lighting, acquired by Siemens in 2018). He is also an investor in Mirantis (cloud computing, raised $100M), GoodData (SaaS business intelligence, raised over $100M) and Arcadia Data (Big Data 2.0, raised $26M).

Sanjit has been on a US-level Tech/Innovation Policy Advisory team. He is on the Advisory Council of the UN's World Artificial Intelligence Organization and has also been on the University of California President's Innovation Council. Sanjit has the fastest Engineering PhD from the University of Illinois (2 years 9 months after undergrad), which he received in 2000 with top research awards. He also attended the VC Executive program at the Haas School of Business, UC Berkeley. He is an invited speaker at several top conferences, e.g. SURGE/WebSummit, Venture Summit West, TiECon, Collision, ShopTalk, McKinsey Leadership Summit, Silicon Valley Open Doors and the Global AI Conference, and a mentor at Stanford's and Berkeley's entrepreneurship programs.

Sanjit has over a decade of executive leadership experience in Product Design, Business Development and Strategy across several domains: Big Data, Natural Language Processing, 3D Camera/Apps, Supply-Chain Analytics and Flash Memory. He has managed $2 billion/year accounts and executed 30 partnership deals in the $100M-$2 billion range. Always striving to be ahead of the curve, Sanjit worked on Big Data analytics before the industry created the term 'Big Data'. Similarly, he launched two online courses during graduate school in 1999 and published iconic papers on the learnings, well before the MOOC revolution started.