You can configure service deployment for a model when you register the model. The system automatically applies the configuration when you deploy the model, which saves you from manually editing deployment configurations each time and makes deployment more efficient. This topic describes how to configure service deployment and deploy model services with a few clicks.
Model service deployment
You can customize the service deployment configuration in the Model Service Deployment section when you register a new model.
Select Custom Configuration and enter the deployment configuration information.
Sample configurations for models deployed by using a custom image:
{ "containers": [ { "image": "registry-vpc.cn-shanghai.aliyuncs.com/xxx/yyy:zzz", "env": [ { "name": "var_name", "value": "var_value" } ], "command": "/data/eas/ENV/bin/python /data/eas/app.py", "port": 8000 } ], "storage": [ { "oss": { "readOnly": false }, "properties": { "resource_type": "model" } } ] }
For more information about the parameters, see Deploy a model service by using a custom image. For more information about model deployment in Elastic Algorithm Service (EAS), see Service deployment.
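The command in the preceding sample starts an HTTP server inside the custom image. The following is a minimal sketch of what such an entry script (the app.py referenced in the command) might look like. The echo logic is hypothetical; the point illustrated is that the container must listen on the port declared in the configuration (8000 in the sample).

# Minimal sketch of an entry script such as the app.py referenced in the
# sample command. The echo logic is a placeholder for real inference code.
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body sent to the service.
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)

        # Placeholder inference logic: echo the payload back.
        result = payload

        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(result)


if __name__ == "__main__":
    # Listen on the port declared in the "port" field of the container spec.
    HTTPServer(("0.0.0.0", 8000), Handler).serve_forever()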
Sample configurations for models deployed by using a preset processor:
{ "processor": "tensorflow_gpu_1.12" }
Sample configurations for models deployed by using a custom processor:
{ "processor_entry": "./service.py", "processor_type": "python", "processor_path": "http://eas-data.oss-cn-shanghai.aliyuncs.com/demo/service.py", "data_image": "registry.cn-shanghai.aliyuncs.com/eas-service/develop:latest" }
Model deployment
You can deploy a registered model to EAS based on the configuration that you specified in the Model Service Deployment section. Perform the following steps:
On the Model Management page, find the model that you want to deploy and click Deploy to EAS in the Operation column. Follow the on-screen instructions to confirm the operation. The Deploy Service page appears.
On the Deploy Service page, the key parameters in the Model Service Information section are automatically populated based on the service deployment configuration of the model. Configure the remaining required parameters and start the deployment. For more information, see Model service deployment by using the PAI console and Machine Learning Designer.
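After the service is deployed, you can send an HTTP request to its endpoint to test it. The following sketch uses placeholder values for the endpoint URL, token, and request payload; copy the real values from the service details page in the EAS console.

# Minimal sketch of calling the deployed EAS service over HTTP.
# The endpoint, token, and payload below are placeholders.
import requests

ENDPOINT = "http://<service-endpoint>/api/predict/<service-name>"  # placeholder
TOKEN = "<service-token>"  # placeholder

response = requests.post(
    ENDPOINT,
    headers={"Authorization": TOKEN},
    data=b"<request payload expected by the model>",  # placeholder payload
    timeout=30,
)
print(response.status_code, response.content)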