
AnalyticDB:ListSparkAppAttempts

Last Updated:Aug 30, 2024

Queries the information about retry attempts of a Spark application.

Operation description

  • Regional public endpoint: adb.<region-id>.aliyuncs.com. Example: adb.cn-hangzhou.aliyuncs.com.
  • Regional Virtual Private Cloud (VPC) endpoint: adb-vpc.<region-id>.aliyuncs.com. Example: adb-vpc.cn-hangzhou.aliyuncs.com.
Note If HTTP status code 409 is returned when you call this operation in the China (Qingdao), China (Shenzhen), China (Guangzhou), or China (Hong Kong) region, contact technical support.
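The endpoint patterns above can be sketched as a small helper. This is an illustrative sketch, not part of any official SDK; the function name is ours.

```python
# Sketch: building the regional AnalyticDB endpoint (public or VPC)
# from the patterns listed above. The helper name is an assumption.
def adb_endpoint(region_id: str, use_vpc: bool = False) -> str:
    """Return the regional endpoint for the given region ID."""
    prefix = "adb-vpc" if use_vpc else "adb"
    return f"{prefix}.{region_id}.aliyuncs.com"

print(adb_endpoint("cn-hangzhou"))                # public endpoint
print(adb_endpoint("cn-hangzhou", use_vpc=True))  # VPC endpoint
```

A client running inside a VPC would pass `use_vpc=True` to avoid routing traffic over the public network.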

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer.

Authorization information

The following table shows the authorization information for this operation. You can use this information in the Action element of a RAM policy to grant a RAM user or RAM role the permissions to call this operation. Description of the columns:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • The required resource types are displayed in bold characters.
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition Key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform before this operation can complete.
Operation: adb:ListSparkAppAttempts
Access level: list
Resource type: DBClusterLakeVersion (required)
  acs:adb:{#regionId}:{#accountId}:dbcluster/{#DBClusterId}
Condition key: none
Associated operation: none
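The authorization information above can be expressed as a RAM policy document. The sketch below builds one as a Python dict; the region, account ID, and cluster ID are placeholders, not real values.

```python
import json

# Sketch: a RAM policy statement granting ListSparkAppAttempts on one cluster.
# The region, account, and cluster identifiers are hypothetical placeholders.
policy = {
    "Version": "1",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "adb:ListSparkAppAttempts",
            # Resource follows the pattern from the authorization table above.
            "Resource": "acs:adb:cn-hangzhou:123456789012****:dbcluster/amv-bp11q28kvl688****",
        }
    ],
}
print(json.dumps(policy, indent=2))
```

Attaching a policy like this to a RAM user or role grants only the list-level permission described in the table.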

Request parameters

AppId (string, required)

  The ID of the Spark application.

  Note You can call the ListSparkApps operation to query all application IDs.

  Example: s202204132018hzprec1ac****

PageNumber (long, required)

  The page number. The value must be an integer that is greater than 0. Default value: 1.

  Example: 1

PageSize (long, optional)

  The number of entries per page. Valid values: 10 (default), 50, and 100.

  Example: 10

DBClusterId (string, optional)

  The ID of the AnalyticDB for MySQL Data Lakehouse Edition cluster.

  Example: amv-uf6o6m8p6x***
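The request parameters above can be assembled as a simple mapping. This is a sketch of the parameter payload only, using the example values from this page; it is not a signed API request.

```python
# Sketch: the ListSparkAppAttempts request parameters described above.
# Values are the example values from this page; signing and transport
# (handled by an SDK or OpenAPI Explorer) are out of scope here.
params = {
    "Action": "ListSparkAppAttempts",
    "AppId": "s202204132018hzprec1ac****",  # required
    "PageNumber": 1,                        # required, must be > 0
    "PageSize": 10,                         # optional: 10 (default), 50, or 100
    "DBClusterId": "amv-uf6o6m8p6x***",     # optional
}
print(params)
```

An SDK client would take these key-value pairs, add the common parameters, and sign the request before sending it to the regional endpoint.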

Response parameters

The response is an object that contains the following parameters.

RequestId (string)

  The request ID.

  Example: 1AD222E9-E606-4A42-BF6D-8A4442913CEF

Data (object)

  The returned data.
AttemptInfoList (array)

  The queried attempts. Each element contains the following fields:

  • AttemptId: the attempt ID.

  • State: the state of the Spark application. Valid values:

    • SUBMITTED
    • STARTING
    • RUNNING
    • FAILING
    • FAILED
    • KILLING
    • KILLED
    • SUCCEEDING
    • COMPLETED
    • FATAL
    • UNKNOWN
  • Message: the alert message that is returned. If no alert is generated, null is returned.

  • Data: the data of the Spark application template.

  • EstimateExecutionCpuTimeInSeconds: the estimated amount of CPU time consumed to run the Spark application. Unit: milliseconds.

  • LogRootPath: the storage path of log files.

  • LastAttemptId: the ID of the last attempt.

  • WebUiAddress: the web UI address.

  • SubmittedTimeInMillis: the time when the Spark application was submitted. The value is a UNIX timestamp in milliseconds.

  • StartedTimeInMillis: the time when the Spark application was started. The value is a UNIX timestamp in milliseconds.

  • LastUpdatedTimeInMillis: the time when the Spark application was last updated. The value is a UNIX timestamp in milliseconds.

  • TerminatedTimeInMillis: the time when the Spark application was terminated. The value is a UNIX timestamp in milliseconds.

  • DBClusterId: the ID of the cluster on which the Spark application runs.

  • ResourceGroupName: the name of the job resource group.

  • DurationInMillis: the amount of time it takes to run the Spark application. Unit: milliseconds.

SparkAttemptInfo (object)

  An element of AttemptInfoList. The fields are the same as those described for AttemptInfoList above.
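The *TimeInMillis fields are UNIX timestamps in milliseconds, so converting them for display takes one division. A minimal sketch, using the SubmittedTimeInMillis value from the sample response below:

```python
from datetime import datetime, timezone

# Sketch: converting a *TimeInMillis field (UNIX milliseconds) to a
# timezone-aware datetime. The helper name is ours, not an API field.
def millis_to_datetime(ms: int) -> datetime:
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

submitted = millis_to_datetime(1651213645000)  # SubmittedTimeInMillis
print(submitted.isoformat())
```

The same conversion applies to StartedTimeInMillis, LastUpdatedTimeInMillis, and TerminatedTimeInMillis.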

PageNumber (long)

  The page number.

  Example: 1

PageSize (long)

  The number of entries per page.

  Example: 10

TotalCount (long)

  The total number of entries returned.

  Example: 3
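A client that wants every attempt can use PageNumber, PageSize, and TotalCount to page through the results. The sketch below only computes the page count; a real client would loop PageNumber from 1 to that count, calling the API each time.

```python
import math

# Sketch: how many pages are needed to cover TotalCount entries,
# given the PageSize used in the request.
def total_pages(total_count: int, page_size: int) -> int:
    """Number of pages needed to retrieve total_count entries."""
    return math.ceil(total_count / page_size) if total_count > 0 else 0

# Sample response below: TotalCount=3, PageSize=10 -> a single page.
print(total_pages(3, 10))
```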

Examples

Sample success responses

JSON format

{
  "RequestId": "1AD222E9-E606-4A42-BF6D-8A4442913CEF",
  "Data": {
    "AttemptInfoList": [
      {
        "AttemptId": "s202207151211hz****-0001",
        "Priority": "NORMAL",
        "State": "SUBMITTED",
        "Message": "WARN: Disk is full",
        "Detail": {
          "Data": "{     \"name\": \"SparkPi\",     \"file\": \"local:///tmp/spark-examples.jar\",     \"className\": \"org.apache.spark.examples.SparkPi\",     \"args\": [         \"1000000\"     ],     \"conf\": {         \"spark.driver.resourceSpec\": \"small\",         \"spark.executor.instances\": 1,         \"spark.executor.resourceSpec\": \"small\"     } }",
          "EstimateExecutionCpuTimeInSeconds": 100,
          "LogRootPath": "oss://<bucket-name>/logs/driver",
          "LastAttemptId": "s202204291426hzpre60****-0003",
          "WebUiAddress": "https://adbsparkui-cn-hangzhou.aliyuncs.com/?token=****",
          "SubmittedTimeInMillis": 1651213645000,
          "StartedTimeInMillis": 1651213645010,
          "LastUpdatedTimeInMillis": 1651213645200,
          "TerminatedTimeInMillis": 1651213645300,
          "DBClusterId": "amv-bp11q28kvl688****",
          "ResourceGroupName": "spark-rg",
          "DurationInMillis": 100
        }
      }
    ],
    "PageNumber": 1,
    "PageSize": 10,
    "TotalCount": 3
  }
}
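Extracting fields from a response like the one above is plain JSON traversal. The sketch below parses a trimmed copy of the sample response; the field names come from this page.

```python
import json

# Sketch: pulling attempt details out of a ListSparkAppAttempts response.
# `body` is a trimmed copy of the sample success response above.
body = json.loads("""
{
  "RequestId": "1AD222E9-E606-4A42-BF6D-8A4442913CEF",
  "Data": {
    "AttemptInfoList": [
      {
        "AttemptId": "s202207151211hz****-0001",
        "State": "SUBMITTED",
        "Detail": {
          "DBClusterId": "amv-bp11q28kvl688****",
          "DurationInMillis": 100
        }
      }
    ],
    "PageNumber": 1,
    "PageSize": 10,
    "TotalCount": 3
  }
}
""")

for attempt in body["Data"]["AttemptInfoList"]:
    detail = attempt["Detail"]
    print(attempt["AttemptId"], attempt["State"], detail["DurationInMillis"])
```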

Error codes

400 Spark.InvalidParameter
  Error message: Invalid parameter value: %s
  Description: The specified parameter is invalid.

400 Spark.InvalidState
  Error message: The object of the operation is in an invalid state: %s
  Description: The operation object is in an invalid state.

403 Spark.Forbidden
  Error message: No permissions to access the resources: %s
  Description: Insufficient permissions to access the related resources. Information that you want to access: %s.

404 Spark.App.ContentNotFound
  Error message: The requested content %s of the Spark application is not found.
  Description: -

404 Spark.ObjectNotFound
  Error message: The object is not found. More information: %s
  Description: -

500 Spark.ServerError
  Error message: The Spark control component system encountered an error, please create a ticket to solve the problem or concat the supported engineer on duty. Error message: %s
  Description: An error occurred in the Spark control component system. Submit a ticket or contact technical support.

For a list of error codes, see Service error codes.
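A caller can use the error codes above to decide whether to retry. The classification below is our reading of the table, not official guidance: the 4xx codes indicate problems in the request or its target, while the 500 code is server-side and may be transient.

```python
# Sketch: classifying the documented error codes. This grouping is an
# assumption based on the table above, not an official retry policy.
RETRYABLE = {"Spark.ServerError"}  # 500: server-side, may be transient
CLIENT_ERRORS = {
    "Spark.InvalidParameter",
    "Spark.InvalidState",
    "Spark.Forbidden",
    "Spark.App.ContentNotFound",
    "Spark.ObjectNotFound",
}

def should_retry(error_code: str) -> bool:
    """Retry only on server-side errors; fix the request for client errors."""
    return error_code in RETRYABLE

print(should_retry("Spark.ServerError"))      # server error: retry
print(should_retry("Spark.InvalidParameter")) # client error: do not retry
```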

Change history

2023-11-24: The error codes and the request parameters of the API operation have changed.
2023-06-28: The error codes have changed.