In 1965, Gordon Moore made a famous observation in the field of semiconductor development, one that the world now recognizes as Moore's Law. His prediction that integrated circuits (ICs) would keep getting smaller and faster has proved true so far. The smartphones we use today have far more computing power than the systems used on the moon landing missions of the late 1960s and early 1970s.
However, in recent times, the race to shrink ICs seems to be coming to an abrupt end. CPU performance is no longer improving at the rate it once did. Chip makers are gradually abandoning the practice of shrinking ICs while raising transistor counts and clock speeds. Instead, they are focusing more on power efficiency and component integration.
The reason for this change is the massive demand for specialized processors that can handle AI and deep learning workloads. A case in point is Nvidia, which reportedly spent $2 billion to launch a chip it says is optimized for machine learning algorithms and other AI-based technologies. One of these AI-based technologies is Natural Language Processing (NLP).
NLP refers to the evolving set of computer and AI-based technologies that allow computers to learn, understand, and produce content in human languages. The technology works closely with speech recognition and text recognition engines. While text/character recognition and speech/voice recognition allow computers to take in information, NLP is what makes sense of that information.
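To make that division of labor concrete, here is a minimal sketch of the "understanding" step, assuming an upstream speech or text recognition engine has already produced a transcript. It uses the open-source spaCy library purely for illustration; neither the library nor the sample sentence comes from the original article.

```python
# A minimal sketch of the recognition-then-understanding split described above.
# Assumes the speech/text recognition stage has already produced a transcript;
# spaCy stands in for the "making sense" step.
#
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

# Transcript that an ASR or OCR engine might hand to the NLP layer.
transcript = "Book a flight from Hangzhou to Singapore next Friday."

doc = nlp(transcript)

# The NLP layer adds structure the recognition layer cannot provide:
# named entities (places, dates) and part-of-speech tags.
print([(ent.text, ent.label_) for ent in doc.ents])
print([(token.text, token.pos_) for token in doc])
```

The recognition layer only delivers raw characters or words; the NLP layer adds entities, parts of speech, and other structure that an application can act on.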
Though scientists and researchers have done a great deal of theoretical work on NLP over the decades, we have only recently started seeing its real-world use cases. NLP-based systems are augmenting both human-to-human communication (e.g., language translation) and human-to-machine communication (e.g., virtual assistants). In this article, we will discuss NLP's evolution, its applications, and Alibaba Cloud's NLP capabilities.
The idea of developing intelligent, human-like machines has intrigued philosophers, writers, artists, and inventors since time immemorial. However, AI as we know it today started taking shape with the development of computers in the mid-twentieth century.
It was Alan Turing's theory of computation that made it possible for scientists to reduce any form of computation to simple binary digits. His contributions to the field of AI are enduring; the Turing Test is still a widely used standard for gauging intelligence in machines. This was also when the concept of NLP started to take shape, at the intersection of artificial intelligence and linguistics.
In the early years, researchers spent a huge amount of time acquainting computers with the vocabulary and grammar of human languages. It was a complex task, as computers had to process the data as well as understand its context. There was a lull in the 1970s after governments limited their investment in AI, having seen no significant breakthroughs.
In 1981, Japan allocated a budget of $850 million to finance the development of Fifth Generation Computers. The Japanese researchers aimed to build computers that could communicate with humans and perform language translation. Researchers subsequently developed a statistical approach that relied on training computers on large quantities of data to achieve notable results. Today, most NLP systems use similar machine learning approaches to understand linguistic structure.
Following another quiet period in the 1990s, the new millennium brought a flurry of breakthroughs in the field of NLP. The massive explosion of online text, abundant and inexpensive computing, and a new approach to building statistical phrase-based machine translation (MT) systems were some of the factors behind these breakthroughs.
In 2006, the Turing Center at the University of Washington's Computer Science & Engineering department published an important paper on NLP. In it, Oren Etzioni, Michele Banko, and Michael J. Cafarella coined the term “Machine Reading” (MR), which they defined as the autonomous, unsupervised understanding of text. With this paper, the researchers prompted the AI community to merge MR with machine learning and machine translation, opening up a new field of AI research.
In 2011, IBM Watson beat its human competitors on the popular US quiz show Jeopardy!, and the news instantly went viral. Unlike board games, Jeopardy! posed major challenges for an AI system: Watson had to answer complex riddles and questions on the show, displaying its prowess in understanding language. The researchers had spent more than three years training Watson for the competition.
Since Watson's accomplishment, NLP and associated AI technologies have entered the consumer realm. Major enterprises now deploy intelligent chatbots for customer support; these chatbots can answer routine queries, help with ticketing, and resolve issues faster. Businesses are also experimenting with recruitment portals that use NLP to sift through large numbers of job applications and surface the most suitable candidates.
Today, NLP-based MT has become highly capable and can deliver translations quickly and at scale. Tourists can use any of several mobile apps that rely on MT to help them understand foreign languages during their travels. Businesses are implementing NLP solutions for social listening, customer communications, and crisis management. NLP is also improving spam filters, protecting users from fraud and unwanted email.
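As a rough illustration of the spam-filtering use case, the sketch below trains a tiny text classifier on a handful of invented example messages. The scikit-learn library and the toy data are assumptions made for illustration, not a description of any particular product.

```python
# A minimal sketch of an NLP-style spam filter, assuming a labelled set of
# example messages (the toy data below is invented for illustration).
# Requires: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    "You have won a free prize, click here now",
    "Claim your reward by sending your bank details",
    "Meeting moved to 3 pm, see the updated agenda",
    "Please review the attached quarterly report",
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words features weighted by TF-IDF, fed into a Naive Bayes classifier.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(messages, labels)

print(model.predict(["Congratulations, you won a free gift card"]))  # likely 'spam'
```

A production spam filter trains the same kind of model on millions of labelled messages and combines it with many other signals, but the underlying idea is the same.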
Many financial market movements are driven by global events, political developments, government policy announcements, and the general economic environment in a region. NLP-based systems can read news, press releases, and other financial reports to assess this environment, which makes automated financial advisors more efficient.
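One small piece of that pipeline is scoring the tone of incoming headlines. The sketch below does this with NLTK's VADER sentiment model; the library choice and the sample headlines are illustrative assumptions, not part of any particular advisory product.

```python
# A rough sketch of the news-reading step: scoring headline sentiment so a
# downstream advisor component can weigh the current environment.
# Requires: pip install nltk  (plus the one-off lexicon download below)
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

headlines = [
    "Central bank signals further rate cuts to support growth",
    "Manufacturing output falls sharply amid trade tensions",
]

for headline in headlines:
    # Compound score ranges from -1 (most negative) to +1 (most positive).
    score = sia.polarity_scores(headline)["compound"]
    print(f"{score:+.2f}  {headline}")
```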
Alibaba Cloud's NLP engine recently topped humans in one of the world's most challenging reading comprehension tests, SQuAD. Developed at Stanford University, SQuAD includes over 100,000 reading comprehension questions.
The competing systems had to answer questions such as “What causes rain?” To answer such questions accurately, the machine learning systems, competing against each other as well as against human readers, had to sift through large amounts of text and locate the precise phrases containing the answers. Alibaba Cloud's Hierarchical Attention Network proved highly effective at this task.
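For readers who want to see what this extractive question-answering task looks like in code, here is a minimal sketch using Hugging Face's transformers library with a generic SQuAD-tuned model. It is not Alibaba Cloud's Hierarchical Attention Network, and the context passage is an invented example.

```python
# A small sketch of the SQuAD-style task: given a passage, extract the span of
# text that answers a question.
# Requires: pip install transformers torch
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = (
    "Rain is caused when water vapour in the atmosphere condenses into droplets "
    "that grow heavy enough to fall under gravity."
)

result = qa(question="What causes rain?", context=context)
print(result["answer"], result["score"])  # extracted span plus a confidence score
```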
While the achievement on SQuAD created a buzz, this is not the first time Alibaba Cloud has made headlines for NLP. Working through its research arm, the Institute of Data Science and Technologies (iDST), Alibaba Cloud has showcased its NLP prowess in English named-entity classification tasks at the Text Analysis Conferences organized by the US National Institute of Standards and Technology. It was also a top scorer at the ACM CIKM Cup. The company has likewise used NLP to handle the large volume of inbound customer inquiries during Alibaba's 11.11 Global Shopping Festival.
In another NLP-related development, Alibaba Cloud created a smart speaker named Tmall Genie, which went on sale on the e-commerce portal Tmall in July 2017. The system uses NLP and the AliGenie voice assistant to handle customers' requests in Mandarin Chinese. Customers can use Tmall Genie to control smart home devices, search for and play music, get the latest news, and perform several other tasks. The speaker activates on hearing “Tmall Genie,” recognizes customers through voiceprint recognition, and allows them to place orders on Tmall.
In recent years, Alibaba Cloud has invested significantly in augmenting its Big Data and AI capabilities. It offers a comprehensive suite of AI-based services, including NLP, intelligent voice recognition, image recognition, and video recognition. To learn more about these AI capabilities, you can explore Alibaba Cloud ET Brain.
Business experts, market intelligence providers, and research and advisory firms are bullish on evolving AI trends. Analysts at Gartner believe that in 2018, numerous companies across all sectors, and even governments, will start implementing AI solutions. NLP will play a crucial role in the development of intelligent real-world applications. Businesses need to innovate and make the most of this emerging technology to improve customer experience and operational efficiency, and to explore new revenue models.
Find similar articles and learn more about Alibaba Cloud solutions at www.alibabacloud.com.