
AnalyticDB:ListSparkApps

Last Updated: Jan 14, 2026

Lists Spark applications.

Operation description

  • Public endpoint: adb.<region-id>.aliyuncs.com. For example, adb.cn-hangzhou.aliyuncs.com.

  • VPC endpoint: adb-vpc.<region-id>.aliyuncs.com. For example, adb-vpc.cn-hangzhou.aliyuncs.com.
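
Based on the two patterns above, the endpoint host can be derived from the region ID. The sketch below is a minimal illustration in Python; the region value is a placeholder.

def adb_endpoint(region_id: str, use_vpc: bool = False) -> str:
    # Public endpoint: adb.<region-id>.aliyuncs.com
    # VPC endpoint:    adb-vpc.<region-id>.aliyuncs.com
    prefix = "adb-vpc" if use_vpc else "adb"
    return f"{prefix}.{region_id}.aliyuncs.com"

# adb_endpoint("cn-hangzhou") returns "adb.cn-hangzhou.aliyuncs.com"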

Note

If you receive a 409 error when you send a request from the China North 1 (Qingdao), China South 1 (Shenzhen), China South 3 (Guangzhou), or China (Hong Kong) region, contact technical support.

Try it now

You can try out this API in OpenAPI Explorer without having to manually sign requests. After a call succeeds, OpenAPI Explorer automatically generates SDK sample code that matches your request parameters. You can download the sample code, which includes built-in credential management, and run it locally.

RAM authorization

The following describes the authorization that is required to call this API operation. You can define the authorization in a Resource Access Management (RAM) policy. The fields are described as follows:

  • Action: The action that you can specify in the Action element of a RAM policy statement to grant permission to perform the operation.

  • API: The API that you can call to perform the action.

  • Access level: The predefined level of access granted for each API. Valid values: create, list, get, update, and delete.

  • Resource type: The type of the resource that supports authorization for the action. It indicates whether the action supports resource-level permissions. The specified resource must be compatible with the action; otherwise, the policy is ineffective.

    • For APIs with resource-level permissions, required resource types are marked with an asterisk (*). Specify the corresponding Alibaba Cloud Resource Name (ARN) in the Resource element of the policy.

    • For APIs without resource-level permissions, it is shown as All Resources. Use an asterisk (*) in the Resource element of the policy.

  • Condition key: The condition keys defined by the service. The key allows for granular control, applying to either actions alone or actions associated with specific resources. In addition to service-specific condition keys, Alibaba Cloud provides a set of common condition keys applicable across all RAM-supported services.

  • Dependent action: The dependent actions required to run the action. To complete the action, the RAM user or the RAM role must have the permissions to perform all dependent actions.

Action: adb:ListSparkApps

Access level: list

Resource type: DBClusterLakeVersion*
    acs:adb:{#regionId}:{#accountId}:dbcluster/{#DBClusterId}

Condition key: None

Dependent action: None
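
The following is a minimal sketch of a RAM policy statement that grants this action on a single cluster. The region ID, account ID, and cluster ID in the Resource ARN are placeholders; replace them with your own values, or use an asterisk (*) to cover all clusters.

{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "adb:ListSparkApps",
      "Resource": "acs:adb:cn-hangzhou:<your-account-id>:dbcluster/amv-bp11q28kvl688****"
    }
  ]
}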

Request parameters

DBClusterId (string, required)
The ID of the Data Lakehouse Edition cluster.
Example: amv-bp11q28kvl688****

ResourceGroupName (string, optional)
The name of the job resource group.
Example: test_instance

PageNumber (integer, required)
The page number. The value must be a positive integer. Default value: 1.
Example: 1

PageSize (integer, optional)
The number of entries per page. Valid values: 10 (default), 50, and 100.
Example: 10

Filters (string, optional)
The filter conditions, specified as a JSON string. The following keys are supported:

  • SubmittedTimeRange: The time range during which the Spark applications were submitted.

  • TerminatedTimeRange: The time range during which the Spark applications were terminated.

  • AppStates: The states of the Spark applications.

  • AppId: The ID of a Spark application.

  • AppNameRegex: A regular expression that matches the names of Spark applications.

  • Tag: The tag information.

  • ResourceGroupName: The name of the job resource group.

The SubmittedTimeRange and TerminatedTimeRange keys use the following substructure to specify a range:

  • Min: The lower bound of the time range. `null` indicates no lower bound.

  • Max: The upper bound of the time range. `null` indicates no upper bound.

Example: { "SubmittedTimeRange": { "Max": 10000, "Min": 0 }, "TerminatedTimeRange": { "Max": 10000, "Min": 0 }, "AppStates": ["STARTING"], "AppId": "adc", "AppNameRegex": "cde", "AttemptId": "abc-001" }

Response elements

Response body (object)
The schema of the response.

PageNumber (integer)
The page number.
Example: 1

PageSize (integer)
The number of entries per page.
Example: 10

TotalCount (integer)
The total number of entries.
Example: 1

RequestId (string)
The request ID.
Example: D65A809F-34CE-4550-9BC1-0ED21ETG380

Data (object)
The returned data.

Data.AppInfoList (array of SparkAppInfo)
The list of application information. Each SparkAppInfo object contains the following fields:

  • Data: The data of the Spark application template.

  • EstimateExecutionCpuTimeInSeconds: The CPU time consumed to execute the Spark application, in milliseconds (ms).

  • LogRootPath: The path where the log file is stored.

  • LastAttemptId: The retry ID.

  • WebUiAddress: The web UI address.

  • SubmittedTimeInMillis: The time when the Spark application was submitted. This is a UNIX timestamp in milliseconds (ms).

  • StartedTimeInMillis: The time when the Spark application was created. This is a UNIX timestamp in milliseconds (ms).

  • LastUpdatedTimeInMillis: The time when the Spark application was last updated. This is a UNIX timestamp in milliseconds (ms).

  • TerminatedTimeInMillis: The time when the Spark application stopped running. This is a UNIX timestamp in milliseconds (ms).

  • DBClusterId: The ID of the cluster on which the Spark application is running.

  • ResourceGroupName: The name of the job resource group.

  • DurationInMillis: The execution duration of the Spark application, in milliseconds (ms).

Example: { "AppId": "test-app-id", "State": "Running", "Detail": { "LastAttemptId": "0001", "WebUiAddress": "http://spark-ui:4040", "SubmittedTimeInMillis": 1644805200260, "DBClusterId": "db", "EstimateExecutionCpuTimeInSeconds": 1644812400, "AppConf": "{}", "StartedTimeInMillis": 1644806400260, "LastUpdatedTimeInMillis": 1644808800260, "ResourceGroupName": "rg" }, "AppName": "test app name" }

Data.PageNumber (integer)
The page number.
Example: 1

Data.PageSize (integer)
The number of entries per page.
Example: 10

Data.TotalCount (integer)
The total number of entries.
Example: 1
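
Because the results are paginated, a caller typically increments PageNumber until TotalCount entries have been collected. The following is a minimal sketch of that loop; `list_page` is a hypothetical callable that issues the ListSparkApps request for one page and returns the parsed response body.

def fetch_all_spark_apps(list_page, page_size=10):
    # `list_page(page_number)` is assumed to return a dict shaped like the
    # response body documented above, including Data.AppInfoList and Data.TotalCount.
    apps = []
    page_number = 1
    while True:
        data = list_page(page_number).get("Data", {})
        batch = data.get("AppInfoList", [])
        apps.extend(batch)
        total = data.get("TotalCount", 0)
        if not batch or page_number * page_size >= total:
            break
        page_number += 1
    return apps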

Examples

Success response

JSON format

{
  "PageNumber": 1,
  "PageSize": 10,
  "TotalCount": 1,
  "RequestId": "D65A809F-34CE-4550-9BC1-0ED21ETG380",
  "Data": {
    "AppInfoList": [
      {
        "AppId": "s202207151211hz0c****",
        "AppName": "SparkTest",
        "Priority": "NORMAL",
        "State": "SUBMITTED",
        "Message": "WARN: Disk is full.",
        "Detail": {
          "Data": "{     \"name\": \"SparkPi\",     \"file\": \"local:///tmp/spark-examples.jar\",     \"className\": \"org.apache.spark.examples.SparkPi\",     \"args\": [         \"1000000\"     ],     \"conf\": {         \"spark.driver.resourceSpec\": \"small\",         \"spark.executor.instances\": 1,         \"spark.executor.resourceSpec\": \"small\"     } }",
          "EstimateExecutionCpuTimeInSeconds": 100,
          "LogRootPath": "oss:///logs/driver",
          "LastAttemptId": "s202204291426hzpre60****-0003",
          "WebUiAddress": "https://adbsparkui-cn-hangzhou.aliyuncs.com/?token=****",
          "SubmittedTimeInMillis": 1651213645000,
          "StartedTimeInMillis": 1651213645010,
          "LastUpdatedTimeInMillis": 1651213645200,
          "TerminatedTimeInMillis": 1651213645300,
          "DBClusterId": "amv-bp11q28kvl688****",
          "ResourceGroupName": "spark-rg",
          "DurationInMillis": 100,
          "AppType": "BATCH"
        },
        "DBClusterId": "amv-23xxxx"
      }
    ],
    "PageNumber": 1,
    "PageSize": 10,
    "TotalCount": 1
  }
}

Error codes

HTTP 400: Spark.InvalidParameter
Error message: Invalid parameter value: %s
Description: Incorrect input parameter: %s.

HTTP 400: Spark.InvalidState
Error message: The object of the operation is in an invalid state: %s
Description: The operation object is invalid.

HTTP 500: Spark.ServerError
Error message: The Spark control component system encountered an error, please create a ticket to solve the problem or concat the supported engineer on duty. Error message: %s
Description: An error occurred on the Spark control component system. Submit a ticket or contact technical support.

HTTP 403: Spark.Forbidden
Error message: No permissions to access the resources: %s
Description: Insufficient permissions to access the related resources. Information that you want to access: %s.

HTTP 404: Spark.ObjectNotFound
Error message: The object is not found. More information: %s

See Error Codes for a complete list.
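
When a call fails, the SDK raises an exception that carries the HTTP status code, error code, and error message listed above. The following is a minimal sketch, assuming the aliyunsdkcore exception class used in the request example in the Request parameters section; `client` and `request` are prepared as in that sketch.

from aliyunsdkcore.acs_exception.exceptions import ServerException

def call_with_diagnostics(client, request):
    try:
        return client.do_action_with_exception(request)
    except ServerException as err:
        # Server-side failures surface the documented error codes, for example
        # Spark.InvalidParameter, Spark.Forbidden, or Spark.ServerError.
        print("HTTP status:", err.get_http_status())
        print("Error code:", err.get_error_code())
        print("Error message:", err.get_error_msg())
        raise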

Release notes

See Release Notes for a complete list.