Elastic container instances can be started within seconds and scaled out based on your business requirements. You can run short-lived jobs on elastic container instances to reduce computing costs and improve the elasticity and resource utilization of your clusters while still meeting business requirements.
Background information
Most Kubernetes clusters must concurrently support a variety of online and offline workloads. The traffic volume of online workloads fluctuates, and the time required to complete offline workloads is unpredictable, so resource demands vary over time. For example, many enterprises perform intensive computing on weekends and in the middle or at the end of each month, and their demand for computing resources increases sharply during these periods.
Typically, a Kubernetes cluster uses an autoscaler to scale out temporary nodes until all pods are scheduled. After the pods finish running, the temporary nodes are automatically released. In this scale-out mode, a pod must wait two or more minutes before it can be scheduled.
In this scenario, we recommend that you use elastic container instances to run jobs. You can connect elastic container instances to a Kubernetes cluster by deploying virtual nodes in the cluster. Elastic container instances can be started within seconds and scaled out on demand, which makes the cluster more elastic. You do not need to estimate the traffic volume of your business or reserve idle resources before you run jobs on elastic container instances. This reduces usage and O&M costs without compromising performance.
Operations
To run jobs on elastic container instances, perform the following operations based on the type of your Kubernetes cluster:
If you use Container Service for Kubernetes (ACK) clusters, you must deploy virtual nodes in the clusters. Then, you can create elastic container instances on the virtual nodes to run jobs, as illustrated by the sample manifest after this list. For more information, see Use an elastic container instance to run a job.
If you use ACK Serverless clusters, you can directly use elastic container instances to run jobs. For more information, see Use an ACK Serverless cluster to run jobs.
If you use self-managed Kubernetes clusters on the cloud or in data centers, you can deploy virtual nodes within the clusters. Then, you can create elastic container instances on the virtual nodes and schedule jobs to the elastic container instances. For more information about how to use elastic container instances in self-managed Kubernetes clusters, see Overview.
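The following manifest is a minimal sketch of a Job whose pods are scheduled to a virtual node and therefore run on elastic container instances. The nodeSelector label and the toleration key are assumptions based on a typical virtual-kubelet deployment; replace them with the actual label and taint of the virtual node in your cluster, which you can check by describing the virtual node with kubectl.

```yaml
# Sketch: a Job whose pods are scheduled to a virtual node so that each pod
# runs on an elastic container instance. The label and taint values below are
# assumptions; adjust them to match the virtual node in your cluster.
apiVersion: batch/v1
kind: Job
metadata:
  name: pi-on-eci
spec:
  completions: 4
  parallelism: 4
  backoffLimit: 4
  template:
    spec:
      nodeSelector:
        type: virtual-kubelet              # assumed label of the virtual node
      tolerations:
      - key: virtual-kubelet.io/provider   # assumed taint added to the virtual node
        operator: Exists
        effect: NoSchedule
      containers:
      - name: pi
        image: perl
        command: ["perl", "-Mbignum=bpi", "-wle", "print bpi(2000)"]
        resources:
          requests:
            cpu: "2"
            memory: 4Gi
      restartPolicy: Never
```

After you submit the Job, the pods that land on the virtual node are created as elastic container instances and are released when the Job completes, so no long-running nodes need to be reserved for the workload.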
References
You can also use preemptible elastic container instances to run jobs at reduced costs. For more information, see Run jobs on a preemptible instance.
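As a sketch of how a Job can request preemptible elastic container instances, the following pod template adds spot-related annotations. The annotation keys and values are assumptions based on common ECI usage; confirm the supported annotations and their exact values in the Run jobs on a preemptible instance topic.

```yaml
# Sketch: a Job that requests preemptible (spot) elastic container instances.
# The annotation names and values are assumptions; verify them against the
# "Run jobs on a preemptible instance" documentation before use.
apiVersion: batch/v1
kind: Job
metadata:
  name: spot-job
spec:
  template:
    metadata:
      annotations:
        k8s.aliyun.com/eci-spot-strategy: "SpotWithPriceLimit"   # assumed annotation: bid with a price cap
        k8s.aliyun.com/eci-spot-price-limit: "0.25"              # assumed annotation: maximum hourly price
    spec:
      nodeSelector:
        type: virtual-kubelet              # assumed label of the virtual node
      tolerations:
      - key: virtual-kubelet.io/provider   # assumed taint added to the virtual node
        operator: Exists
        effect: NoSchedule
      containers:
      - name: worker
        image: busybox
        command: ["sh", "-c", "echo done"]
      restartPolicy: Never
```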