People have countless fantasies about Artificial Intelligence. It has become one of the most popular themes in novels and movies. When we dream about AI, we often picture a world with Iron Man and his intelligent assistant J.A.R.V.I.S. (or its replacement, FRIDAY); Baymax from Big Hero 6; or the high-tech adult theme park from Westworld.
However, given the current state of technology, humanoid robots that transcend human ability are still far away from us. Today's Artificial Intelligence tools serve as assistants in our lives, solving pain points that could not be addressed before.
So, what is preventing the birth of humanoid robots?
First, let’s take a brief look at the history of AI.
The field of AI was formally established at a conference held at Dartmouth College in 1956, where the term "Artificial Intelligence" was coined. Between 1956 and 1974, universities and government agencies poured money and resources into AI research. Programs were written that could solve algebra and geometry problems, among other tasks. The successes of this period made people overly optimistic, and expectations for AI ran high. However, the technology and hardware of the time could not keep up with those expectations, leading to the first AI winter, which began in 1974.
The winter lasted until the early 1980s, when it ended thanks to the rise of expert systems, coupled with Japan's $850 million attempt to build a computer with supercomputing power and human-level intelligence, known as the Fifth Generation Computer project. Japan has long been fascinated with humanoid robots: the widely known science-fiction franchise Gundam (humanoid vehicles with advanced AI systems) launched in 1979, and the original Ghost in the Shell comic was first released in 1989. To compete with Japan, the British government also began investing in AI again during the same period.
During this period, computer hardware boomed. The machines in question were not the kind known to average joes but specialized systems used by professionals, with no graphical interface, only code. In 1987, Apple and IBM were both producing desktop computers (similar to the ones we use today) with higher performance than those specialized professional machines. This led to the collapse of the $500 million expert system industry and brought on the second AI winter.
It wasn’t until 1997, when IBM’s Deep Blue defeated chess champion Garry Kasparov, that the general public came to realize the power of AI. Around the same time, the tech industry faced the dot-com bubble, and funds invested in AI dried up. However, machine learning continued to advance, mainly thanks to breakthroughs in hardware. Computers’ processing power and storage capacity grew exponentially, enabling companies to store and process large amounts of data. Various companies and government agencies successfully applied AI at larger scale across different applications.
In 2011, IBM’s question-answering system Watson defeated reigning champions Brad Rutter and Ken Jennings on the American quiz show “Jeopardy!”. In 2016 and 2017, AlphaGo defeated 9-dan Go player Lee Sedol and the world champion Ke Jie, respectively.
For the past 15 years, Amazon, Google, and other companies have gained substantial competitive advantages using machine learning. In addition to processing user data to understand consumer behavior, these companies continue to focus on computer vision, natural language processing, and many other AI applications.
After decades of research, four key factors have contributed to the rapid development of AI:
- Moore’s Law — Since the beginning of AI research, computer processing power has roughly doubled every two years, to the point that handling big data and complex algorithms has become possible. It took Google only two years to take AlphaGo from amateur-level play to defeating the world champion. Moreover, the latest version of AlphaGo uses only one-tenth of the computing resources of the earlier version that defeated Lee Sedol;
- Big Data — Machine learning relies on large amounts of raw data. The last two years alone generated 90 percent of all the data in the world;
- Funding — As AI becomes more widely adopted, VC and angel funds are pouring into AI research, hoping to find the next unicorn;
- Algorithm Complexity — Research and funding are driving increasingly sophisticated algorithms that can fully exploit neural networks and cognitive computing.
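The Moore's Law arithmetic above is easy to make concrete. Here is a minimal sketch, assuming an idealized clean two-year doubling period (the real doubling period has varied, so these numbers are illustrative, not measured benchmarks):

```python
def relative_power(years: float, doubling_period: float = 2.0) -> float:
    """Relative processing power after `years`, starting from a baseline of 1.0,
    assuming one doubling every `doubling_period` years (illustrative only)."""
    return 2.0 ** (years / doubling_period)

# From the Dartmouth conference (1956) to AlphaGo's win over Lee Sedol (2016)
# is 60 years, i.e. 30 doublings:
growth = relative_power(2016 - 1956)
print(f"~{growth:,.0f}x the 1956 baseline")  # 2**30 ≈ 1.07 billion times
```

Thirty doublings compound to roughly a billionfold increase, which is why workloads like training AlphaGo, unthinkable in earlier decades, became feasible.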
Many people believe that the development of AI has slowed to the point that we may be facing a third AI winter. Some argue that there has been no breakthrough in strong AI. Others believe that today's enterprises focus on weak-AI applications, which suppresses the growth of the entire AI industry. Still others criticize current AI research as so centralized that it harms society.
However, the stock market today is dominated by the likes of Apple, Google, Amazon, and Facebook. If a third AI winter were to come, not only would the AI industry stop moving forward, but the entire world could also face the next economic crisis.