What is the Serverless architecture?
Serverless architecture has become more and more popular over time. With features such as extreme elasticity, pay-as-you-go billing, and low-cost operation and maintenance, it plays an increasingly important role in many fields. Machine learning has also been a hot field in recent years and is being applied in more and more industries.
In fact, machine learning projects have long faced two problems:
● High resource occupation but low utilization: in projects with a large gap between traffic peaks and troughs, resource waste is especially significant;
● High complexity of deployment, updates, and maintenance.
Serverless, meanwhile, offers exactly extreme elasticity, pay-as-you-go billing, and low-cost operation and maintenance. If the Serverless architecture is applied to machine learning projects, it can preserve the performance of those projects while reducing costs and improving resource utilization, which makes it a topic well worth researching and exploring.
Based on this, four experts from Alibaba Cloud and Ant Group, Liu Yu, Tian Chudong, Lu Mengkai, and Wang Renda (in no particular order), have systematically summarized Alibaba's AI experience under the Serverless architecture and, with Alibaba Cloud as official producer, jointly launched a new book, AI Application Development under the Serverless Architecture: Introduction, Practice and Performance Optimization.
The new book will be serialized for free on the Serverless official account (Serverless Devs). You are welcome to subscribe!
Preface
This book is a technical book on machine learning practice under the Serverless architecture. Through a basic introduction to the Serverless architecture, a summary of project development experience, and a study of common machine learning algorithms, models, and frameworks, it explores how to combine different machine learning projects with the Serverless architecture and how to develop machine learning applications based on it. We hope to introduce the basics of the Serverless architecture and machine learning to readers through simple, clear language, real cases, and open source code.
We hope that through this book readers can truly understand the value of combining the Serverless architecture with machine learning, and can successfully develop and launch machine learning projects under the Serverless architecture, so as to more directly obtain the technical dividends brought by cloud computing.
This book contains not only basic theoretical knowledge but also a great deal of shared experience, as well as the practical application of the latest techniques, including but not limited to hands-on use of GPU instances under the Serverless architecture, multi-dimensional cold start optimization schemes, and the multimode debugging capabilities of the Serverless architecture. We hope that through this book readers can gain a more comprehensive and intuitive understanding of the Serverless architecture, and a deeper understanding of machine learning under it.
At the same time, we hope the book will help readers implement machine learning projects under the Serverless architecture and obtain the technical dividends of cloud computing development.
Chapter 1: introduces the foundations of the Serverless architecture, including its development, advantages, and challenges.
Chapter 2: introduces application development under the Serverless architecture, covering its development process, a comparison with the ServerFul development process, and the migration of traditional frameworks.
Chapter 3: introduces explorations related to machine learning, including the study of algorithms and models such as support vector machines and neural networks.
Chapter 4: introduces common machine learning frameworks and their applications in practical projects, so that readers can understand these frameworks and how to deploy them to the Serverless architecture.
Chapter 5: introduces project practice in several widely used fields of machine learning, including image recognition, sentiment analysis, and explorations related to model upgrade and iteration, involving many new functions and features of the Serverless architecture, such as container images, reserved instances, and GPU instances.
Chapters 6 and 7: introduce two complete cases combining the Serverless architecture and AI. From project background through the design of related modules to the development and deployment of the project, they walk through the whole process of starting, developing, and maintaining a machine learning project under the Serverless architecture.
Chapter 8: shares development experience on the Serverless architecture and summarizes optimization methods for Serverless applications, including cold start optimization schemes and development considerations.
Chapter 1 Initial Understanding of Serverless Architecture
This chapter introduces the basics of the Serverless architecture by exploring its concept, analyzing its advantages, values, challenges, and difficulties, and sharing its application scenarios. Through this chapter, readers will gain an understanding of the theoretical foundations of the Serverless architecture.
Concept of Serverless architecture
With the development of cloud services, computing resources have become highly abstracted. From physical machines to cloud servers, and then to container services, the granularity of computing resources has been progressively refined.
In 2012, Ken Fromm, vice president of Iron.io, first proposed the concept of serverless in the article "Why The Future of Software and Apps is Serverless", pointing out that "even though cloud computing has gradually emerged, people are still moving around servers. However, this will not last long. Cloud applications are developing towards serverless, which will have a significant impact on the creation and distribution of applications."
In 2019, UC Berkeley published the paper "Cloud Programming Simplified: A Berkeley View on Serverless Computing", in which the authors made a sharp assertion: "New BaaS storage services will be invented to expand the types of applications that can run more adaptively on Serverless computing. Such storage can match the performance of local block storage, with both temporary and persistent options. The price of Serverless computing will be lower than that of ServerFul computing, or at least no higher. Once Serverless computing makes a technical breakthrough, it will lead to the decline of ServerFul services. Serverless will become the default computing paradigm in the cloud era, replacing ServerFul computing, which also means the end of the server-client model."
The Serverless architecture first came into public view in 2012 and, by 2019, had become the protagonist of UC Berkeley's sharp assertion in the cloud computing field, completing the transition from a "new perspective" to a "high-profile architecture". Over those seven years, the Serverless architecture went from little known, to commercially applied, to being adopted as a cloud computing strategy by leading cloud vendors, gradually becoming a widely known technology paradigm.
Of course, over those seven years, Serverless not only gradually upgraded and improved as a technical architecture, but its concept and development direction also became clearer and clearer. As for the definition of Serverless, Martin Fowler pointed out in the article "Serverless Architectures" that Serverless is actually a combination of BaaS and FaaS.
This simple and clear definition lays the foundation for the composition of the Serverless architecture. As shown in Figure 1-1, Martin Fowler holds that in the Serverless architecture, part of the application's server-side logic is still written by developers, but unlike in the traditional architecture, it runs in stateless computing containers, is event-driven, has a short life cycle (sometimes only a single invocation), and is completely managed by a third party. This is called Functions as a Service (FaaS).
In addition, the Serverless architecture partially relies on third-party (cloud) applications or services to manage server-side logic and state. These applications are usually rich client applications (single-page applications or mobile apps) built on the cloud service ecosystem, including databases (Parse, Firebase), account systems (Auth0, AWS Cognito), and so on. These services were originally called Backend as a Service (BaaS).
Figure 1-1 Composition of the Serverless architecture
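To make the FaaS model described above concrete, here is a minimal sketch of such a stateless, event-driven function in Python. The `handler(event, context)` signature and the event shape are illustrative assumptions modeled on common FaaS platform conventions, not the API of any specific vendor.

```python
import json

def handler(event, context):
    # Stateless: everything the function needs arrives in the event;
    # no local state is guaranteed to survive between invocations.
    body = json.loads(event.get("body", "{}"))
    name = body.get("name", "world")
    # Short-lived: do the work and return; the platform then tears the
    # container down (or reuses it) entirely outside the developer's view.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Local simulation of a single event-driven invocation:
if __name__ == "__main__":
    fake_event = {"body": json.dumps({"name": "Serverless"})}
    print(handler(fake_event, None))
```

The point of the sketch is the shape, not the content: the developer supplies only this function, while scaling, routing, and lifecycle are the platform's job.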
CNCF likewise holds that Serverless is a combination of FaaS and BaaS, and has further refined the definition of the Serverless architecture in CNCF WG Serverless Whitepaper v1.0: Serverless refers to the concept of building and running applications without server management; it describes a finer-grained deployment model in which applications are packaged into one or more functional modules, uploaded to a platform, and then executed, scaled, and billed in response to the exact demand at that moment.
At the same time, UC Berkeley's 2019 article "Cloud Programming Simplified: A Berkeley View on Serverless Computing" gives a supplementary definition of serverless from the perspective of its characteristics: in short, Serverless = FaaS + BaaS, and it must have the characteristics of elastic scaling and pay-as-you-go billing.
The concept of Serverless is also described in the White Paper on Cloud Native Development (2020) released by the Cloud Native Industry Alliance of the China Academy of Information and Communications Technology: serverless (namely, Serverless) is an architectural concept whose core idea is to abstract the infrastructure providing service resources into various services, offered to users as on-demand API calls, truly achieving on-demand scaling and pay-per-use billing.
This architecture eliminates the need for the traditionally required, continuously online server components, reduces the complexity of development and of operation and maintenance, lowers operating costs, and shortens the delivery cycle of business systems, enabling users to focus on developing business logic with higher value density.
So far, the definition of the Serverless architecture in terms of structure, behavior, and characteristics can be summarized as in Figure 1-2.
Figure 1-2 Defining the Serverless architecture from different perspectives
Features of Serverless architecture
As we all know, everything has two sides. Cloud computing has made great progress, but as its latest product, the Serverless architecture, behind its huge advantages, still faces challenges that cannot be ignored.
Advantages and values
Fei Lianghong, chief cloud computing technical consultant at Amazon AWS, once said: today, when most companies develop applications and deploy them on servers, whether they choose a public cloud or a private data center, they need to know in advance how many servers, how much storage capacity, and which database functions are required, and they need to deploy and run the applications and dependent software on that infrastructure.
Suppose we do not want to spend energy on these details; is there a simple architecture that can meet this requirement? Today, with the Serverless architecture gradually "entering the homes of ordinary people", the answer is obvious.
When launching a project, we generally need to apply for host resources, spending considerable time and energy estimating the maximum peak cost. Even if some services apply for resources according to maximum consumption, a dedicated person is still needed to scale resources out or in at different times of day, balancing business stability against cost savings.
For some services, the requested resources sometimes have to be estimated at maximum cost. Even though there may be many traffic troughs and large amounts of wasted resources, this has to be done; for applications that are hard to scale, such as databases, it is "better than being unable to serve due to insufficient resources when the peak arrives".
As Fei Lianghong said, the Serverless architecture solves this problem better. Instead of planning how many resources to use, we request resources based on actual need, pay according to usage time, and pay for the computing resources of each request. Moreover, the billing granularity is smaller, which is more conducive to reducing resource costs.
The Serverless architecture has six potential advantages:
• On-demand, seemingly unlimited computing resources.
• Elimination of up-front commitment by cloud users.
• The ability to pay for computing resources on a short-term, as-needed basis.
• Cost reduction at large scale.
• Simplified operations and higher utilization through resource virtualization.
• Higher hardware utilization by multiplexing workloads from different organizations.
Compared with the traditional architecture, the Serverless architecture does have the advantages of business focus, elastic scaling, pay as you go and so on. These advantages are often important references for developers in technology selection.
1. Business Focus
So-called business focus means letting developers concentrate on their own business logic without devoting attention to the underlying resources.
As we all know, applications in the era of the monolithic architecture were relatively simple, and the resources of physical servers were sufficient to support business deployment. As business complexity soared and functional modules grew complex and huge, the monolithic architecture seriously hindered development and deployment efficiency. As a result, the microservice architecture, with decoupled business functions and separate modules developed and deployed in parallel, gradually became popular. Finer-grained management of business in turn drove up the utilization of basic resources.
As shown in Figure 1-3, virtualization technology has been continuously improved and widely applied, bridging the gaps between physical resources and reducing the burden of managing infrastructure. Containers and PaaS platforms abstract further, providing the services an application depends on, its running environment, and the underlying computing resources it requires. This again improves the overall efficiency of application development, deployment, and operation and maintenance. The Serverless architecture abstracts computing even more thoroughly, delegating the management of all kinds of resources in the application architecture stack to the platform and eliminating infrastructure operation and maintenance, so that users can focus on high-value business areas.
Figure 1-3 Evolution of virtual machines, containers, and the Serverless architecture
2. Elastic scaling
So-called elastic scaling means automatically allocating and destroying resources according to fluctuations in business traffic, so as to balance stability and high performance while improving resource utilization.
As we all know, from IaaS to PaaS to SaaS, de-serverization has become more and more evident. With the Serverless architecture, de-serverization reaches a new height. Compared with ServerFul, Serverless emphasizes a "Noserver" mindset for business users.
"Noserver" does not mean doing without servers; it means removing concern for the server's running state, so that operations that used to require scaling servers out and in no longer need the attention of business personnel and are instead handled entirely by the cloud vendor. As shown in Figure 1-4, the broken line is the traffic trend of a website on a certain day.
Figure 1-4 Traffic and load under the traditional virtual machine architecture versus the Serverless architecture in elastic mode
The analysis of Figure 1-4a is as follows:
• Technicians evaluated the resource usage of the website. The evaluation showed a peak traffic of 800 PV/hour, so they purchased a corresponding ECS instance.
• At 10:00 that day, the operation and maintenance personnel found that traffic was surging, gradually approaching 800 PV/hour. They purchased a new virtual machine online, configured its environment, and finally added the corresponding policies to the master machine, getting through the traffic peak from 10:00 to 15:00.
• After 15:00, they found that traffic had returned to normal, so they stopped the added virtual machine and released the extra resources.
• At 18:00, overload traffic arrived again, and the process above had to be repeated.
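The manual decisions above (buy a machine at 10:00, release it after 15:00) are exactly what a Serverless platform automates. As a toy illustration, the following Python sketch computes how many instances would be needed each hour for a given traffic level; the 800 PV/hour per-instance capacity mirrors the figure above, but the scheduling logic itself is a deliberate simplification, not any vendor's real policy.

```python
import math

def plan_capacity(traffic_pv_per_h, capacity_per_instance=800):
    # Allocate just enough instances to cover the current traffic,
    # keeping at least one instance available.
    return max(1, math.ceil(traffic_pv_per_h / capacity_per_instance))

# Hourly traffic samples loosely following the day described above
day = {9: 600, 10: 790, 11: 1200, 15: 700, 18: 1500, 19: 650}
for hour, pv in sorted(day.items()):
    print(f"{hour:02d}:00  traffic={pv:4d} PV/h  instances={plan_capacity(pv)}")
```

The capacity plan tracks the traffic automatically, hour by hour, which is exactly the behavior Figure 1-4b depicts.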
It is clear from Figure 1-4b that load capacity always matches traffic (admittedly, the figure simplifies: real load capacity may run slightly above the current traffic). That is, under the Serverless architecture, handling the peaks and troughs of traffic requires no intervention from technicians; its elasticity (both scale-out and scale-in) is provided by the cloud vendor.
It is easy to see from the analysis of Figure 1-4 that the elasticity of the Serverless architecture rests, to a certain extent, on the vendor's operation and maintenance technology.
The Serverless architecture advocates "giving more professional things to more professional people so that developers can focus more on their own business logic", which is also a very intuitive embodiment of the elastic model.
3. Pay as you go
So-called pay-as-you-go means that the Serverless architecture lets users pay according to actual resource usage, which maximizes resource utilization efficiency on the user side and reduces costs. Under the traditional virtual machine architecture, once a server is purchased and running, it continuously consumes resources and generates costs. Although each server's available resources are limited and usually fixed, its load differs from moment to moment, and so does its resource utilization, which leads to relatively obvious resource waste under the traditional virtual machine architecture.
In general, resource utilization is relatively high during the day, so less is wasted; at night utilization is low, so waste is relatively high. According to statistics from Forbes magazine, typical servers in commercial and enterprise data centers deliver only 5% to 15% of their maximum processing capacity on average, which corroborates the analysis above of resource utilization and waste under the traditional virtual machine architecture.
The Serverless architecture allows users to entrust servers, databases, applications, and even logic to service providers. On the one hand, this reduces the user's maintenance burden; on the other hand, users pay at the granularity of their actual usage. For service providers, idle resources can be put to better use. This is very good in terms of both cost and "green" computing.
Figure 1-5 Traffic and cost under the traditional virtual machine architecture versus the Serverless architecture in elastic mode
As shown in Figure 1-5, the broken line is a traffic trend chart of a website on a certain day.
Figure 1-5a shows the traffic and cost under the traditional virtual machine architecture. Generally, resource usage must be estimated before the business goes online. After evaluating the website's resource usage, the staff purchased a server that can withstand at most 1300 PV per hour.
Over a whole day, the total computing power provided by this server is the shaded area, and the cost incurred corresponds to that computing power. However, it is obvious that the truly effective resource use and cost expenditure is only the area under the traffic curve, while the shaded part above the traffic curve is resource loss and extra expenditure.
Figure 1-5b shows the cost under the elastic mode of the Serverless architecture. It can be clearly seen that expenditure is basically proportional to traffic: when traffic is low, resource usage and the corresponding expenditure are relatively small; when traffic is high, resource usage and expenditure rise with it.
Throughout the process, it is clear that the Serverless architecture does not produce the significant resource waste and extra costs of the traditional virtual machine architecture.
Through the analysis of Figure 1-5, it is not hard to see that the elastic scalability of the Serverless architecture, organically combined with the pay-as-you-go model, can minimize resource waste and reduce business costs.
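The cost relationship in Figure 1-5 can be put into rough numbers. The Python sketch below uses an invented 24-hour traffic curve, a hypothetical fixed-server price, and a per-request price derived from it; every figure is made up for illustration, so only the shape of the comparison matters, not the absolute values.

```python
# Toy cost comparison in the spirit of Figure 1-5: a fixed server sized
# for the 1300 PV/h peak versus paying per request. All prices invented.

hourly_pv = [100, 80, 60, 50, 50, 80, 200, 400, 700, 1000,
             1300, 1200, 900, 700, 600, 500, 600, 900, 1100,
             1000, 800, 500, 300, 150]          # requests per hour, 24 hours

server_cost_per_hour = 1.0                      # fixed price, capacity 1300 PV/h
price_per_pv = server_cost_per_hour / 1300      # hypothetical pay-per-use price

fixed_cost = server_cost_per_hour * len(hourly_pv)
elastic_cost = sum(pv * price_per_pv for pv in hourly_pv)
utilization = sum(hourly_pv) / (1300 * len(hourly_pv))

print(f"fixed provisioning cost: {fixed_cost:.2f}")
print(f"pay-as-you-go cost:      {elastic_cost:.2f}")
print(f"utilization under fixed provisioning: {utilization:.0%}")
```

Under any traffic curve with peaks and troughs, the pay-as-you-go total is the area under the curve, while the fixed cost is the full rectangle, so the elastic model can only come out cheaper or equal.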
4. Other advantages
In addition to the business focus, elastic scaling, pay as you go and other advantages mentioned above, the Serverless architecture also has other advantages.
● Shorter business innovation cycles: since the Serverless architecture is, to a certain extent, a model in which "cloud vendors strive to do more so that developers can pay more attention to their own business", developers spend less time and attention on the OS level, virtual machine level, and system environment level of the ServerFul architecture, and more on their own business logic. The direct effect is more efficient project launches, shorter business innovation cycles, and faster R&D delivery.
● Higher system security: although the Serverless architecture feels like a "black box" to a certain extent, precisely because of this, it typically provides no way to log in to instances and exposes no system details externally. At the same time, maintenance at the operating system level and below is handed to the cloud vendor, which makes the Serverless architecture more secure to a certain extent: on the one hand, it exposes only the intended services and interfaces, which, compared with virtual machines, reduces the risk of brute-force attacks; on the other hand, cloud vendors have more professional security and server operation teams to help developers ensure overall business security and service stability.
● More stable business changes: the Serverless architecture is a naturally distributed architecture provided by cloud service providers. At the same time, because the Noserver characteristic removes developers' concern for the server's running state, developers can easily change business code and configuration with the tools provided by the cloud vendor, and once the new business logic takes effect smoothly, they need pay it no further attention. The Serverless architecture therefore has great advantages in smooth business upgrades, changes, agile development, feature iteration, and grayscale release.
Of course, even with the many advantages illustrated above, we still cannot enumerate all the advantages and values of the Serverless architecture. It is undeniable, though, that it is being noticed, accepted, and applied by more and more teams and individuals, and its value is quickly coming to the fore.
In fact, there are always two problems in machine learning projects:
● High resource occupation rate and low utilization rate, especially in projects with large difference between flow peaks and troughs, resource waste is more significant;
● High complexity of deployment, update and maintenance;
While Serverless has the characteristics of extreme elasticity, pay as you go, low-cost operation and maintenance, etc. If the Serverless architecture is applied to machine learning projects, it will not only ensure the performance of machine learning projects, but also reduce costs and improve resource utilization, which is a topic worthy of research and exploration.
Based on this, officially produced by Alibaba Cloud, four experts from Alibaba Cloud and Ant Group, Liu Yu, Tian Chudong, Lu Mengkai and Wang Renda (in no particular order) systematically summarized Alibaba's AI experience under the Serverless architecture, and jointly launched a new book, AI Application Development under the Serverless Architecture: Introduction, Practice and Performance Optimization.
The new book will be serialized on the official account of Serverless (serverless devs) for free. Welcome to subscribe!
preface
This book is a technical book on machine learning practice under the Serverless architecture. Through the basic introduction of the Serverless architecture, the summary of project development experience, and the learning of common machine learning algorithms, models, and frameworks, the book aims to apply machine learning projects to the Serverless architecture The combination of different machine learning projects with Serverless architecture and the development of machine learning applications based on Serverless architecture are explored. We hope to introduce the basic knowledge of Serverless architecture and machine learning to readers through simple and clear language, real cases, and open source code.
I hope that readers can truly understand the important value of the combination of Serverless architecture and machine learning through this book, and can successfully develop and launch machine learning projects under the Serverless architecture, so as to more directly obtain the technical dividends brought by cloud computing.
This book not only has basic theoretical knowledge, but also has a lot of experience sharing, as well as the practical application of the latest technology points, including but not limited to the hands-on GPU instances under the Serverless architecture, multi-dimensional cold start optimization scheme, and the multimode debugging capability of the Serverless architecture. We hope that through the study of this book, readers can have a more comprehensive and intuitive understanding of the Serverless architecture, and have a more in-depth understanding of machine learning under the Serverless architecture.
At the same time, it is hoped that the book will help readers implement machine learning projects under the Serverless architecture and obtain technical dividends from cloud computing development.
Chapter 1: Introduce the foundation of Serverless architecture, including its development, advantages, challenges, etc;
Chapter 2 introduces the application development under the Serverless architecture from the aspects of the development process of the Serverless architecture, the comparison with the development process of ServerFul, and the migration of traditional frameworks;
Chapter 3: Introduce the exploration related to machine learning, including the learning and research of algorithms and models such as support vector machines and neural networks;
Chapter 4: This chapter introduces common machine learning frameworks and their applications in practical projects, so that readers can understand common machine learning frameworks and the plans to deploy them to Serverless architecture;
Chapter 5: This chapter introduces the project practice in several widely used fields of machine learning, including image recognition, emotion analysis and exploration in the fields related to model upgrade iteration, involving many new functions and features of Serverless architecture, such as container images, reserved instances, GPU instances, etc;
Chapters 6 and 7: Two complete cases of the combination of Serverless architecture and AI are introduced. From the project background to the design of related modules, the development and deployment of the project, the whole process of starting, developing and maintaining machine learning projects under the Serverless architecture is explained through a complete process;
Chapter 8: Share relevant development experience on Serverless architecture and summarize the optimization method of Serverless application, including cold start optimization scheme under Serverless architecture and development considerations.
Chapter 1 Initial Understanding of Serverless Architecture
This chapter introduces the basic content of Serverless architecture to readers by exploring the concept of Serverless architecture, analyzing its advantages, values, challenges and difficulties, and sharing its application scenarios. Through the study of this chapter, readers will have a certain understanding of the theoretical basis of Serverless architecture.
Concept of Serverless architecture
With the development of cloud services, computing resources are highly abstracted. From physical machines to cloud servers, and then to container services, computing resources are gradually refined.
In 2012, Iron In the article "Why The Future of Software and Apps is Serverless", Ken Form, vice president of io, first proposed the concept of serverless, and pointed out that "even though cloud computing has gradually emerged, people are still moving around servers. However, this will not last long. Cloud applications are developing towards serverless, which will have a significant impact on the creation and distribution of applications.".
In 2019, UC Berkeley published a paper "Cloud Programming Simplified: A Berkeley View on Serverless Computing". In the article, The author's sharp assertion "The new BaaS storage service will be invented to expand the types of applications that can run more adaptively on Serverless computing. Such storage can match the performance of local block storage, and has temporary and persistent options. The price based on Serverless computing will be lower than that of ServerFul computing, at least not higher than that of ServerFul computing. Once Serverless computing makes a technical breakthrough, it will lead to Serve The decline of rFul service. Serverless will become the default computing paradigm in the cloud era, replacing ServerFul computing, which also means the end of the server client model.
The Serverless architecture first came into the public's view in 2012 and became the leading role of UC Berkeley's sharp assertion in the cloud computing field in 2019, completing the transition from a "new perspective" to a "high-profile architecture". In the past seven years, the Serverless architecture has gradually become a new technology paradigm known to all, from being little known to being commercially applied, to being deployed as a cloud computing strategy by leading cloud manufacturers.
Of course, in the past seven years, Serverless has not only gradually upgraded and improved its technical architecture, but also become more and more clear in its concept and development direction. As for the definition of Serverless, Martin Fowler pointed out in the article "Serverless Architectures" that Serverless is actually a combination of BaaS and FaaS.
This simple and clear definition lays the foundation for the composition of the Serverless architecture. As shown in Figure 1-1, the article argues that in the Serverless architecture, part of the application's server-side logic is still written by developers, but unlike in the traditional architecture, it runs in stateless compute containers, is event-driven, has a short life cycle (sometimes only a single invocation), and is fully managed by a third party. This part is called Functions as a Service (FaaS).
In addition, the Serverless architecture partly relies on third-party (cloud) applications or services to manage server-side logic and state. The corresponding client applications are typically rich clients (single-page applications or mobile apps) built on a cloud service ecosystem that includes databases (e.g., Parse, Firebase) and account systems (e.g., Auth0, AWS Cognito). These services were originally called Backend as a Service (BaaS).
Figure 1-1 Composition of the Serverless architecture
CNCF likewise regards Serverless as a combination of FaaS and BaaS, and further refines the definition in CNCF WG Serverless Whitepaper v1.0: Serverless refers to the concept of building and running applications that do not require server management. It describes a finer-grained deployment model in which an application is packaged as one or more functional modules, uploaded to a platform, and then executed, scaled, and billed in response to the exact demand at that moment.
Meanwhile, UC Berkeley's 2019 paper "Cloud Programming Simplified: A Berkeley View on Serverless Computing" supplements the definition from the perspective of the architecture's characteristics: in short, Serverless = FaaS + BaaS, and it must provide elastic scaling and pay-as-you-go billing.
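The FaaS half of the Serverless = FaaS + BaaS equation can be illustrated with a minimal sketch: a stateless, event-driven function. The handler signature below is a generic assumption for illustration only, not any specific provider's API (real platforms such as Alibaba Cloud Function Compute or AWS Lambda each define their own event and context types).

```python
import json

def handler(event, context=None):
    """A stateless, event-driven function: it keeps no state between
    invocations and lives only as long as a single call."""
    # The platform may deliver the event as a JSON string or a dict.
    body = json.loads(event) if isinstance(event, str) else event
    name = body.get("name", "world")
    # Any persistent state (sessions, user data) would live in BaaS
    # services such as a hosted database, not in this process.
    return {"statusCode": 200, "body": f"Hello, {name}!"}

if __name__ == "__main__":
    # Simulate the platform invoking the function with an event payload.
    print(handler({"name": "Serverless"}))
```

Because the function is stateless and short-lived, the platform is free to spin up many copies of it in parallel, or none at all, which is precisely what makes elastic scaling and per-invocation billing possible.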
The concept of Serverless is also described in the White Paper on Cloud Native Development (2020) released by the Cloud Native Industry Alliance of the China Academy of Information and Communications Technology: Serverless is an architectural concept whose core idea is to abstract the infrastructure that provides service resources into various services, offered to users as API interfaces to be called on demand, thus truly achieving on-demand scaling and pay-by-use billing.
Such an architecture eliminates the need for the traditionally large fleet of continuously online servers, reduces the complexity of development and operations, lowers operating costs, and shortens the delivery cycle of business systems, enabling users to focus on developing business logic with higher value density.
So far, the definition of the Serverless architecture in terms of structure, behavior, and characteristics can be summarized as in Figure 1-2.
Figure 1-2 Defining the Serverless architecture from different perspectives
Features of Serverless architecture
As we all know, everything has two sides. Cloud computing has made great progress, but the Serverless architecture, as its latest product, still faces challenges that cannot be ignored behind its huge advantages.
Advantages and values
Fei Lianghong, chief cloud computing technical consultant at Amazon AWS, once said: today, when most companies develop applications and deploy them on servers, whether on a public cloud or in a private data center, they need to know in advance how many servers, how much storage capacity, and which database functions will be required, and they need to deploy and run the application and its dependent software on that infrastructure.
Suppose we do not want to spend energy on these details: is there a simple architecture that meets this requirement? Today, as the Serverless architecture gradually "enters the homes of ordinary people", the answer has become obvious.
When launching a project, we generally need to apply for host resources, and at that point a great deal of time and energy goes into estimating the maximum peak load. Even when services apply for resources based on maximum consumption, a dedicated person still has to scale resources up and down across different time periods to balance business stability against cost savings.
For some services, resources sometimes have to be requested at the level of the estimated maximum load. Even though there may be long traffic troughs and a great deal of wasted resources, there is no choice: for applications that are difficult to scale, such as databases, wasting resources is better than being unable to serve when the peak arrives.
As Fei Lianghong said, the Serverless architecture solves this problem much better. Instead of planning how many resources to use, we request resources based on actual needs, pay according to usage time, and pay for the computing resources of each request. The billing granularity is also much finer, which further helps reduce resource costs.
The Serverless architecture has six potential advantages:
• Unlimited computing resources on demand.
• Eliminate the early commitment of cloud users.
• The ability to pay for computing resources in the short term as needed.
• Reduce costs on a large scale.
• Simplify operations and improve utilization through resource virtualization.
• Improve hardware utilization by reusing workloads from different organizations.
Compared with the traditional architecture, the Serverless architecture does have the advantages of business focus, elastic scaling, pay as you go, and so on. These advantages are often important references for developers during technology selection.
1. Business Focus
The so-called business focus means letting developers concentrate on their own business logic without spending attention on the underlying resources.
As we all know, applications in the monolithic era were relatively simple, and the resources of a physical server were sufficient to support business deployment. As business complexity soared and functional modules became complex and huge, the monolithic architecture seriously hindered development and deployment efficiency. As a result, the microservice architecture, in which business functions are decoupled and separate modules are developed and deployed in parallel, gradually became popular. Finer-grained management of the business in turn drove improvements in the utilization of underlying resources.
As shown in Figure 1-3, virtualization technology has been continuously improved and widely applied, bridging the gaps between physical resources and easing the burden of managing infrastructure. Containers and PaaS platforms abstract further, providing the application's dependent services, runtime environment, and underlying computing resources, which again raises the overall efficiency of application development, deployment, and operations. The Serverless architecture abstracts computing even more thoroughly, delegating the management of every resource in the application architecture stack to the platform and eliminating infrastructure operations, so that users can focus on high-value business areas.
Figure 1-3 Evolution of virtual machines, containers, and the Serverless architecture
2. Elastic Scaling
The so-called elastic scaling means that resources are automatically allocated and destroyed according to fluctuations in business traffic, so as to best balance stability and high performance while improving resource utilization.
As we all know, from IaaS to PaaS to SaaS, the trend of de-serverization has become more and more obvious, and the Serverless architecture takes it to a new height. Compared with ServerFul, Serverless emphasizes a "NoServer" mindset for business users.
NoServer does not mean doing away with servers or not needing them; it means removing concern for the server's running state. Operations that previously required scaling servers out and in no longer need the attention of business staff: they are all handled by the cloud vendor. As shown in Figure 1-4, the broken line is the traffic trend of a website on a certain day.
1-4 Comparison of traffic and load between traditional virtual machine architecture and Serverless architecture in elastic mode
The analysis of Figure 1-4a is as follows:
• Technicians evaluated the website's resource usage in advance. The evaluation showed a maximum peak traffic of 800 PV/hour, so they purchased a correspondingly sized ECS instance.
• At 10:00 that day, however, the operations staff found that traffic was surging and gradually approaching 800 PV/hour. They purchased a new virtual machine online, configured its environment, and finally added the corresponding policies to the master machine, getting through the traffic peak between 10:00 and 15:00.
• After 15:00, the operations staff found that traffic had returned to normal, so they stopped the additionally provisioned virtual machine and released the extra resources.
• At 18:00, overload traffic arrived again, and the manual scaling process had to be repeated.
It is clear from Figure 1-4b that the load capacity always matches the traffic (of course, the figure is idealized: in reality the provisioned capacity would be slightly higher than the current traffic). In other words, unlike the traditional virtual machine architecture, the peaks and valleys of traffic are handled without the intervention of technicians; the elasticity (both scaling out and scaling in) is provided by the cloud vendor.
It is easy to see from the analysis of Figure 1-4 that the elasticity of the Serverless architecture comes, to a large extent, from the cloud vendor's operations and technical support.
The Serverless architecture advocates "giving more professional things to more professional people so that developers can focus more on their own business logic", which is also a very intuitive embodiment of the elastic model.
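The contrast between Figures 1-4a and 1-4b can be sketched as a toy simulation. The hourly traffic numbers and the 200 PV-per-instance capacity below are invented for illustration: fixed provisioning keeps capacity flat at the estimated 800 PV/hour peak, while an elastic platform keeps capacity tracking the load hour by hour.

```python
import math

def fixed_capacity(traffic, peak=800):
    """Traditional mode: capacity is provisioned for the estimated peak."""
    return [peak for _ in traffic]

def elastic_capacity(traffic, per_instance=200):
    """Serverless mode: the platform scales instances to match each hour's load."""
    return [math.ceil(t / per_instance) * per_instance for t in traffic]

hourly_pv = [100, 300, 750, 800, 400, 150]   # invented traffic curve
fixed = fixed_capacity(hourly_pv)
elastic = elastic_capacity(hourly_pv)

# Idle capacity = provisioned capacity minus actual traffic, summed over the day.
idle_fixed = sum(c - t for c, t in zip(fixed, hourly_pv))
idle_elastic = sum(c - t for c, t in zip(elastic, hourly_pv))
print(f"idle capacity, fixed: {idle_fixed} PV; elastic: {idle_elastic} PV")
```

With these invented numbers, the fixed deployment sits idle for most of the day, while the elastic deployment's unused capacity is only the rounding slack of the last partially used instance, which is exactly the gap Figure 1-4 illustrates.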
3. Pay as you go
The so-called pay as you go means that the Serverless architecture lets users pay according to actual resource usage, which maximizes resource-utilization efficiency on the user side and reduces costs. Under the traditional virtual machine architecture, once a server is purchased and put into operation, it continuously consumes resources and generates costs. Although each server's available resources are limited and usually fixed, its load differs from moment to moment, and so does its resource utilization, which leads to fairly obvious resource waste under the traditional virtual machine architecture.
In general, resource utilization is relatively high during the day, so less is wasted; at night utilization is low, so waste is relatively high. According to statistics in Forbes, typical servers in commercial and enterprise data centers deliver only 5%~15% of their maximum processing capacity on average, which corroborates this analysis of resource utilization and waste under the traditional virtual machine architecture.
The Serverless architecture lets users entrust servers, databases, applications, and even logic to the service provider. On the one hand, this reduces the user's maintenance burden; on the other hand, users pay at the granularity of their actual usage. For service providers, idle resources can be scheduled more efficiently. This is very good both for cost and for "green" computing.
Figure 1-5 Comparison of traffic and cost between the traditional virtual machine architecture and the Serverless architecture in elastic mode
As shown in Figure 1-5, the broken line is a traffic trend chart of a website on a certain day.
Figure 1-5a shows the traffic and cost under the traditional virtual machine architecture. Resource usage generally has to be evaluated before the business goes online; after evaluating the website's resource usage, the staff purchased a server that can withstand at most 1300 PV per hour.
Over the whole day, the total computing power this server provides corresponds to the shaded area, and the cost incurred is the cost of that computing power. Obviously, however, the truly effective resource usage and spending is only the area under the traffic curve, while the shaded part above the curve is resource loss and extra expenditure.
Figure 1-5b shows the cost under the elastic mode of the Serverless architecture. It is clear that spending is basically proportional to traffic: when traffic is low, resource usage is small and so is spending; when traffic is high, resource usage and spending rise with it.
Throughout the whole process, it is clear that the Serverless architecture does not generate the significant resource waste and extra costs of the traditional virtual machine architecture.
From the analysis of Figure 1-5, it is not difficult to see that the elastic scalability of the Serverless architecture combines organically with the pay-as-you-go model, minimizing resource waste and reducing business costs.
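The cost relationship in Figure 1-5 can be expressed as simple arithmetic. The unit price and the traffic curve below are invented for illustration; only the 1300 PV/hour peak comes from the example above. The fixed server is billed for its full rectangle (capacity times time), while pay-as-you-go billing charges only for the area under the traffic curve.

```python
hourly_pv = [200, 600, 1300, 1100, 500, 300]   # invented daily traffic curve
peak_capacity = 1300                            # server sized for the peak, in PV/hour
price_per_pv_hour = 0.001                       # invented unit price

# Traditional mode: pay for the full rectangle (capacity x hours),
# whether or not the capacity is actually used.
fixed_cost = peak_capacity * len(hourly_pv) * price_per_pv_hour

# Serverless mode: pay only for the area under the traffic curve.
serverless_cost = sum(hourly_pv) * price_per_pv_hour

print(f"fixed: {fixed_cost:.2f}, pay-as-you-go: {serverless_cost:.2f}")
```

Under these invented numbers the fixed deployment costs nearly twice as much as the pay-as-you-go deployment for the same traffic; the gap between the two is exactly the shaded area above the traffic curve in Figure 1-5a.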
4. Other advantages
In addition to the business focus, elastic scaling, pay as you go and other advantages mentioned above, the Serverless architecture also has other advantages.
● Shorter business innovation cycle: Since the Serverless architecture is, to a certain extent, a model in which "cloud vendors strive to do more so that developers can focus more on their own business", developers spend less time and attention on the OS, virtual machine, and system environment levels of the ServerFul architecture, and more on their own business logic. The direct effect is higher project launch efficiency, a shorter business innovation cycle, and faster R&D delivery.
● Higher system security: Although the Serverless architecture has something of a "black box" feel, for that very reason it usually offers no way to log in to instances and exposes no system details externally. Maintenance at the operating-system level is also handed over to the cloud vendor, which makes the Serverless architecture more secure to a certain extent: on the one hand, it exposes only the predetermined services and interfaces, which, compared with virtual machines, reduces the risk of brute-force attacks; on the other hand, cloud vendors have more professional security and server-operations teams to help developers ensure overall business security and service stability.
● More stable business changes: The Serverless architecture is a naturally distributed architecture provided by cloud vendors. At the same time, because the NoServer characteristic removes developers' concern for the server's running state, developers can easily change business code and configuration with the tools the cloud vendor provides; once the new business logic takes effect smoothly, they need pay it no further attention. The Serverless architecture therefore has great advantages in smooth upgrades, changes, agile development, feature iteration, grayscale release, and other respects.
Of course, even with the many advantages illustrated above, we still cannot enumerate every advantage and value of the Serverless architecture. It is undeniable, however, that it is attracting attention from, and being accepted and applied by, more and more teams and individuals, and its value is quickly being highlighted.