By Du Wan (Yixian)
First, let's take a quick look at a key concept mentioned in this article:
Fun: a deployment tool for serverless applications. It deploys Function Compute resources by describing them in a template.yml file. For more information about Fun, see here.
Note: The operations in this article are applicable to Fun 2.16.0 and all later versions.
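For orientation, here is a rough sketch of what a template.yml for this kind of project might look like. It uses the Aliyun::Serverless transform that Fun reads; the concrete values (service and function names, NAS mount point, memory size, and so on) are illustrative assumptions, not the actual template of the poetry project.
ROSTemplateFormatVersion: '2015-09-01'
Transform: 'Aliyun::Serverless-2018-04-03'
Resources:
  poetry:                                  # the service
    Type: 'Aliyun::Serverless::Service'
    Properties:
      Description: TensorFlow CharRNN poetry serving
      NasConfig:                           # mounts the NAS file system into the function
        UserId: 10003
        GroupId: 10003
        MountPoints:
          - ServerAddr: xxx-xxx.cn-shanghai.nas.aliyuncs.com:/
            MountDir: /mnt/nas
    poetry:                                # the function
      Type: 'Aliyun::Serverless::Function'
      Properties:
        Handler: index.handler
        Runtime: python3
        MemorySize: 2048
        Timeout: 60
        CodeUri: ./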
This project was developed on macOS, but the tools involved are platform-independent and also work on Linux and Windows. Before proceeding with the example, make sure that the following tools are correctly installed, updated to the latest version, and properly configured.
Fun uses Docker to simulate the function runtime environment locally.
macOS users can use Homebrew to install these tools:
brew tap vangie/formula
brew install fun
brew install fcli
Windows and Linux users must read the following articles to install these tools:
After the installation, first run the fun config command to initialize the configuration.
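fun config prompts for your Alibaba Cloud account ID, AccessKey pair, and default region, and stores them in a configuration file (typically ~/.fcli/config.yaml) that both Fun and fcli read. As a hedged sketch, the resulting file looks roughly like the following; field names can vary between versions, and all values below are placeholders.
endpoint: 'https://<your-account-id>.cn-shanghai.fc.aliyuncs.com'
api_version: '2016-08-15'
access_key_id: '<your-access-key-id>'
access_key_secret: '<your-access-key-secret>'
security_token: ''
debug: false
timeout: 60
retries: 3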
Note: Make sure you are using Fun version 2.16.0 or any later version.
$ fun --version
2.16.0
AI model serving is a typical scenario for Function Compute. Data engineers train a model, and software engineers then turn that model into a system or service; this process is called model serving. Function Compute is O&M-free and auto-scaling, which satisfies model serving's requirement for a highly available distributed system. This article walks through deploying a TensorFlow CharRNN model that automatically composes ancient Chinese poems in the five-character quatrain style to Function Compute. The Python TensorFlow dependency libraries and the trained model files add up to several hundred MB, which exceeds the code package size limit (50 MB) of Function Compute. For such large files, the NAS file system is the best choice. This article shows how to use Fun and NAS for TensorFlow serving.
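To make the setup concrete, below is a minimal sketch of what a Function Compute Python handler serving a model from NAS could look like. The NAS paths and the generate_poem helper are hypothetical names used only for illustration; the actual handler in the poetry project may differ.
# -*- coding: utf-8 -*-
# Minimal sketch: serve a CharRNN model whose dependencies and checkpoints live on NAS.
# The /mnt/nas paths and generate_poem() are illustrative assumptions, not the project's real code.
import sys

# Dependencies installed by `fun install` live on the NAS volume, so they must be
# added to the import path before TensorFlow can be imported inside the function.
sys.path.insert(0, '/mnt/nas/lib/python3.6/site-packages')

MODEL_DIR = '/mnt/nas/model'  # checkpoints produced by the training step


def handler(event, context):
    # Import lazily so the heavy TensorFlow load happens inside the function sandbox.
    from char_rnn import generate_poem  # hypothetical module from the project
    poem = generate_poem(MODEL_DIR)
    return poem.encode('utf-8')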
git clone https://github.com/vangie/poetry.git
Run the fun install command to install the dependencies.
The fun.yml file declares the TensorFlow dependencies and contains the script commands for training the model, so the command takes a while to finish. Note that the target property is specified in the fun.yml file, so the dependencies are installed into the locally simulated NAS directory.
The following figure shows an animated demonstration. For a faster demonstration, set the max_steps parameter to 100.
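As a rough, hedged sketch of what such a fun.yml declaration can look like (the values below, including the target path and the training command, are assumptions; the poetry project's actual fun.yml may differ):
runtime: python3
tasks:
  - pip: tensorflow
    target: nas://poetry:/mnt/auto/lib/python3.6/site-packages   # install into the (simulated) NAS directory
  - shell: python train.py --max_steps 100                       # hypothetical training step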
Run the fun local invoke command to call the function locally. The following output shows a successful invocation.
$ fun local invoke poetry
Reading event data from stdin, which can be ended with Enter then Ctrl+D
(you can also pass it from file with -e)
mouting local nas mock dir /Users/vangie/Desktop/poetry/.fun/nas/3be7b4835d-pvs14.cn-shanghai.nas.aliyuncs.com/ into container /mnt/nas
skip pulling image aliyunfc/runtime-python3.6:1.5.2...
FunctionCompute python3 runtime inited.
FC Invoke Start RequestId: 938334c4-5407-4a72-93e1-6d59e52774d8
....... (part of the log is omitted here)
不见江中客，无言此别归。
江风秋雨落，山色夜山长。
不问江南客，孤舟在故乡。
一年如远别，何处是归人。
一夜无人
RequestId: 938334c4-5407-4a72-93e1-6d59e52774d8 Billed Duration: 14074 ms Memory Size: 1998 MB Max Memory Used: 226 MB
The following figure shows an animated demonstration.
In the previous step, we ran the fun local invoke command to call the function locally. However, the NAS resources used by the function are still stored on the local machine, so we need to upload them to the NAS service in the cloud.
The following sections describe two upload methods: using the fun nas commands and using an ECS instance.
1) Run the fun nas init command to initialize the NAS configuration.
2) Run the fun nas info command to view the local NAS directory. You can skip this step here because fun install with the target property already specified the location in the previous step.
3) Run the fun nas sync command to upload the local NAS resources to the NAS service in the cloud.
4) Run the fun nas ls nas://poetry:/mnt/auto/ command to check whether the local files have been uploaded to the NAS service.
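Putting the four steps above together, the command sequence is simply (the service name poetry and the nas:// path are taken from this project):
fun nas init
fun nas info    # optional here, since fun install already targeted the NAS directory
fun nas sync
fun nas ls nas://poetry:/mnt/auto/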
The following figure shows an animated demonstration of running the fun nas sync command. Here, we reduce the size of the dependencies for a better demonstration effect.
Alternatively, upload local NAS resources through an Elastic Compute Service (ECS) instance.
Run the following commands to compress the local files, copy them to the NAS directory on the ECS instance, and decompress them there, as shown in the following figure.
# On the ECS instance: mount the NAS file system
mount -t nfs -o vers=4.0 3be7b4835d-pvs14.cn-shanghai.nas.aliyuncs.com:/ /mnt/nas
# On the local machine: compress the directory to be uploaded
cd .fun/nas/3be7b4835d-pvs14.cn-shanghai.nas.aliyuncs.com/
tar -czvf nas.tar.gz lib model
# On the local machine: copy the archive to the NAS directory on the ECS instance
scp nas.tar.gz root@47.103.83.174:/mnt/nas
# On the ECS instance: decompress the archive
cd /mnt/nas && tar -xvf nas.tar.gz
Run the fun deploy command to deploy the function to the cloud.
$ fun deploy
using region: cn-shanghai
using accountId: ***********4733
using accessKeyId: ***********EUz3
using timeout: 60
Waiting for service poetry to be deployed...
Waiting for function poetry to be deployed...
Waiting for packaging function poetry code...
package function poetry code done
function poetry deploy success
service poetry deploy success
Use fcli to invoke the deployed function remotely. Alternatively, invoke the function from the Function Compute console.
$ fcli function invoke -s poetry -f poetry
The ancient-poetry generation program is now deployed to Function Compute.
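If you prefer to invoke the function programmatically instead of through fcli or the console, the Function Compute Python SDK (fc2) can be used. This is a hedged sketch: the endpoint and credential values are placeholders, and the response handling assumes the function returns the poem as plain text.
# Invoke the deployed poetry function with the Function Compute Python SDK.
# pip install aliyun-fc2; all credential/endpoint values below are placeholders.
import fc2

client = fc2.Client(
    endpoint='https://<your-account-id>.cn-shanghai.fc.aliyuncs.com',
    accessKeyID='<your-access-key-id>',
    accessKeySecret='<your-access-key-secret>')

resp = client.invoke_function('poetry', 'poetry')   # service name, function name
print(resp.data.decode('utf-8'))                    # the generated poem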