You can manage your jobs in the EMR on ACK console, or manage them directly with Kubernetes tools or APIs. This topic describes how to manage Spark jobs by using kubectl.

Prerequisites

A Spark cluster has been created in the E-MapReduce on ACK console. For more information, see Quick Start.

Procedure

  1. Connect to the Kubernetes cluster by using kubectl. For more information, see Connect to a cluster by using kubectl. A quick connectivity check is sketched after this procedure.
    You can also connect to the Kubernetes cluster in other ways, such as by using the Kubernetes API. For more information, see Use the Kubernetes API.
  2. Run the following commands to manage jobs.
    • Run the following command to view the status of a job:
      kubectl describe SparkApplication <job name> --namespace <namespace of the cluster>
      The following output is returned:
      Name:         spark-pi-simple
      Namespace:    c-48e779e0d9ad****
      Labels:       <none>
      Annotations:  <none>
      API Version:  sparkoperator.k8s.io/v1beta2
      Kind:         SparkApplication
      Metadata:
        Creation Timestamp:  2021-07-22T06:25:33Z
        Generation:          1
        Resource Version:  7503740
        UID:               930874ad-bb17-47f1-a556-55118c1d****
      Spec:
        Arguments:
          1000
        Driver:
          Core Limit:  1000m
          Cores:       1
          Memory:      4g
        Executor:
          Core Limit:           1000m
          Cores:                1
          Instances:            1
          Memory:               8g
          Memory Overhead:      1g
        Image:                  registry-vpc.cn-hangzhou.aliyuncs.com/emr/spark:emr-2.4.5-1.0.0
        Main Application File:  local:///opt/spark/examples/target/scala-2.11/jars/spark-examples_2.11-2.4.5.jar
        Main Class:             org.apache.spark.examples.SparkPi
        Spark Version:          2.4.5
        Type:                   Scala
      Status:
        Application State:
          State:  RUNNING
        Driver Info:
          Pod Name:                spark-pi-simple-driver
          Web UI Address:          172.16.230.240:4040
          Web UI Ingress Address:  spark-pi-simple.c-48e779e0d9ad4bfd.c7f6b768c34764c27ab740bdb1fc2a3ff.cn-hangzhou.alicontainer.com
          Web UI Ingress Name:     spark-pi-simple-ui-ingress
          Web UI Port:             4040
          Web UI Service Name:     spark-pi-simple-ui-svc
        Execution Attempts:        1
        Executor State:
          spark-pi-1626935142670-exec-1:  RUNNING
        Last Submission Attempt Time:     2021-07-22T06:25:33Z
        Spark Application Id:             spark-15b44f956ecc40b1ae59a27ca18d****
        Submission Attempts:              1
        Submission ID:                    d71f30e2-9bf8-4da1-8412-b585fd45****
        Termination Time:                 <nil>
      Events:
        Type    Reason                     Age   From            Message
        ----    ------                     ----  ----            -------
        Normal  SparkApplicationAdded      17s   spark-operator  SparkApplication spark-pi-simple was added, enqueuing it for submission
        Normal  SparkApplicationSubmitted  14s   spark-operator  SparkApplication spark-pi-simple was submitted successfully
        Normal  SparkDriverRunning         13s   spark-operator  Driver spark-pi-simple-driver is running
        Normal  SparkExecutorPending       7s    spark-operator  Executor spark-pi-1626935142670-exec-1 is pending
        Normal  SparkExecutorRunning       6s    spark-operator  Executor spark-pi-1626935142670-exec-1 is running

      Replace <namespace of the cluster> in the sample code with the namespace of your cluster. You can view the namespace on the cluster details page of the E-MapReduce on ACK console.

      Replace <job name> in the sample code with the name of a created job, which you can view in the Jobs section of the cluster details page in the E-MapReduce on ACK console. You can also list jobs from the command line, as sketched after this procedure.

    • Run the following command to stop and delete a job:
      kubectl delete SparkApplication <job name> -n <namespace of the cluster>
      The following output is returned:
      sparkapplication.sparkoperator.k8s.io "spark-pi-simple" deleted
    • Run the following command to view the logs of a job:
      kubectl logs <job name>-driver -n <namespace of the cluster>
      Note: For example, if the job name is spark-pi-simple and the namespace of the cluster is c-d2232227b95145d3, the command is kubectl logs spark-pi-simple-driver -n c-d2232227b95145d3. To stream the driver log in real time or view executor logs, see the sketch after this procedure.
      The following output is returned:
      ......
      Pi is roughly 3.141488791414888
      21/07/22 14:37:57 INFO SparkContext: Successfully stopped SparkContext
      21/07/22 14:37:57 INFO ShutdownHookManager: Shutdown hook called
      21/07/22 14:37:57 INFO ShutdownHookManager: Deleting directory /var/data/spark-b6a43b55-a354-44d7-ae5e-45b8b1493edb/spark-56aae0d1-37b9-4a7d-9c99-4e4ca12deb4b
      21/07/22 14:37:57 INFO ShutdownHookManager: Deleting directory /tmp/spark-e2500491-6ed7-48d7-b94e-a9ebeb899320
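
Additional command-line examples

The following sketches are illustrative only. The placeholders <job name> and <namespace of the cluster> are the same as in the procedure above.

After you configure kubectl (for example, with the kubeconfig file you obtained when connecting to the cluster), you can verify that it is talking to the intended cluster before you run job commands:

  # Show which context kubectl is currently using.
  kubectl config current-context
  # List the nodes of the cluster to confirm connectivity.
  kubectl get nodes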
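In addition to checking job names in the console, you can usually list SparkApplication resources directly, because the Spark Operator registers them as a custom resource. This is a minimal sketch; the exact output columns depend on your operator version:

  # List all SparkApplication resources (Spark jobs) in the cluster's namespace.
  kubectl get SparkApplication -n <namespace of the cluster>
  # Watch a single job and print its status as it changes (Ctrl+C to stop).
  kubectl get SparkApplication <job name> -n <namespace of the cluster> --watch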
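To follow the driver log while a job is still running, add the -f flag to kubectl logs. The executor example below assumes the spark-role=executor label that Spark on Kubernetes applies to executor pods and reuses the executor pod name from the describe output above; verify the labels on your own pods before relying on them.

  # Stream the driver log in real time (Ctrl+C to stop).
  kubectl logs -f <job name>-driver -n <namespace of the cluster>
  # List the executor pods in the namespace.
  # Assumption: executor pods carry the spark-role=executor label.
  kubectl get pods -n <namespace of the cluster> -l spark-role=executor
  # View the log of a specific executor pod, for example the one shown above.
  kubectl logs spark-pi-1626935142670-exec-1 -n <namespace of the cluster>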