GetSparkAppInfo

Updated at: 2024-12-05 06:06

Queries the information about a Spark application.

Operation description

  • Regional public endpoint: adb.<region-id>.aliyuncs.com. Example: adb.cn-hangzhou.aliyuncs.com.
  • Regional Virtual Private Cloud (VPC) endpoint: adb-vpc.<region-id>.aliyuncs.com. Example: adb-vpc.cn-hangzhou.aliyuncs.com.
Note
If HTTP status code 409 is returned when you call this operation in the China (Qingdao), China (Shenzhen), China (Guangzhou), or China (Hong Kong) region, contact technical support.
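
The endpoint for a specific region is assembled from the region ID. A minimal Python sketch (the region ID is illustrative):

# Assemble the public and VPC endpoints for this service from a region ID.
region_id = "cn-hangzhou"  # example region; substitute your own
public_endpoint = f"adb.{region_id}.aliyuncs.com"     # adb.cn-hangzhou.aliyuncs.com
vpc_endpoint = f"adb-vpc.{region_id}.aliyuncs.com"    # adb-vpc.cn-hangzhou.aliyuncs.com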

Debugging

You can call this operation directly in OpenAPI Explorer, which saves you the trouble of calculating signatures. After the call succeeds, OpenAPI Explorer can automatically generate SDK code samples.

Authorization information

The following table shows the authorization information for this operation. You can use this information in the Action element of a RAM policy to grant a RAM user or RAM role the permissions to call this operation. A sample policy statement follows the table. Description:

  • Operation: the value that you can use in the Action element to specify the operation on a resource.
  • Access level: the access level of each operation. The levels are read, write, and list.
  • Resource type: the type of the resource on which you can authorize the RAM user or the RAM role to perform the operation. Take note of the following items:
    • The required resource types are displayed in bold characters.
    • If the permissions cannot be granted at the resource level, All Resources is used in the Resource type column of the operation.
  • Condition Key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform before this operation can complete.
Operation: adb:GetSparkAppInfo
Access level: get
Resource type (required): DBClusterLakeVersion
  acs:adb:{#regionId}:{#accountId}:dbcluster/{#DBClusterId}/resourcegroup/{#ResourceGroup}/sparkapp/{#SparkAppId}
Condition key: none
Associated operation: none
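
For illustration, a minimal RAM policy statement that grants this operation on a single Spark application could look as follows; the region ID, account ID, cluster ID, resource group, and application ID are all placeholders:

{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "adb:GetSparkAppInfo",
      "Resource": "acs:adb:cn-hangzhou:1234567890****:dbcluster/amv-bp11q28kvl688****/resourcegroup/spark-rg/sparkapp/s202205201533hz1209892000****"
    }
  ]
}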

Request parameters

AppId (string, required)

  The application ID.

  Note: You can call the ListSparkApps operation to query the Spark application IDs.

  Example: s202205201533hz1209892000****

DBClusterId (string, optional)

  The ID of the AnalyticDB for MySQL Data Lakehouse Edition cluster.

  Note: You can call the DescribeDBClusters operation to query the IDs of all AnalyticDB for MySQL clusters within a region.

  Example: am-bp11q28kvl688****
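
The following Python sketch shows one way to pass these parameters through the Alibaba Cloud SDK. It assumes the alibabacloud_adb20211201 package; class and field names follow that SDK's generated naming conventions and should be verified against your installed version, and all credentials and IDs are placeholders:

# Hedged sketch: call GetSparkAppInfo via the Alibaba Cloud Python SDK.
# Assumes: pip install alibabacloud_adb20211201
from alibabacloud_adb20211201.client import Client
from alibabacloud_adb20211201 import models as adb_models
from alibabacloud_tea_openapi.models import Config

config = Config(
    access_key_id="<your-access-key-id>",          # placeholder credentials
    access_key_secret="<your-access-key-secret>",
    endpoint="adb.cn-hangzhou.aliyuncs.com",       # or the VPC endpoint in a VPC
)
client = Client(config)

request = adb_models.GetSparkAppInfoRequest(
    app_id="s202205201533hz1209892000****",        # required: the application ID
    dbcluster_id="amv-bp11q28kvl688****",          # optional: the cluster ID
)
response = client.get_spark_app_info(request)
print(response.body.request_id)
print(response.body.data)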

Response parameters

RequestId (string)

  The request ID.

  Example: D65A809F-34CE-4550-9BC1-0ED21ETG380

Data (SparkAppInfo)

  The queried Spark application. The value contains the following fields:

  • Data: the data of the Spark application template.
  • EstimateExecutionCpuTimeInSeconds: the amount of CPU time consumed to run the Spark application. Unit: milliseconds.
  • LogRootPath: the storage path of log files.
  • LastAttemptId: the most recent attempt ID.
  • WebUiAddress: the web UI URL.
  • SubmittedTimeInMillis: the time when the Spark application was submitted. This value is a UNIX timestamp representing the number of milliseconds that have elapsed since January 1, 1970, 00:00:00 UTC.
  • StartedTimeInMillis: the time when the Spark application was created. This value is a UNIX timestamp representing the number of milliseconds that have elapsed since January 1, 1970, 00:00:00 UTC.
  • LastUpdatedTimeInMillis: the time when the Spark application was last updated. This value is a UNIX timestamp representing the number of milliseconds that have elapsed since January 1, 1970, 00:00:00 UTC.
  • TerminatedTimeInMillis: the time when the Spark application was terminated. This value is a UNIX timestamp representing the number of milliseconds that have elapsed since January 1, 1970, 00:00:00 UTC.
  • DBClusterId: the ID of the cluster on which the Spark application runs.
  • ResourceGroupName: the name of the job resource group.
  • DurationInMillis: the amount of time that is required to run the Spark application. Unit: milliseconds.
{ \"name\": \"SparkPi\", \"file\": \"local:///tmp/spark-examples.jar\", \"className\": \"org.apache.spark.examples.SparkPi\", \"args\": [ \"1000000\" ], \"conf\": { \"spark.driver.resourceSpec\": \"small\", \"spark.executor.instances\": 1, \"spark.executor.resourceSpec\": \"small\" } }", "EstimateExecutionCpuTimeInSeconds" : 100, "LogRootPath" : "oss://test/logs/driver", "LastAttemptId" : "s202204291426hzpre60cfabb0000004-0003", "WebUiAddress" : "https://sparkui.aliyuncs.com/token=xxx", "SubmittedTimeInMillis" : 1651213645000, "StartedTimeInMillis" : 1651213645010, "LastUpdatedTimeInMillis" : 1651213645200, "TerminatedTimeInMillis" : 1651213645300, "DBClusterId" : "am-dbclusterid", "ResourceGroupName" : "spark-rg", "DurationInMillis" : 100 }

Examples

Sample success responses

JSON format

{
  "RequestId": "D65A809F-34CE-4550-9BC1-0ED21ETG380",
  "Data": {
    "AppId": "s202207151211hz0c****",
    "AppName": "SparkTest",
    "Priority": "NORMAL",
    "State": "SUBMITTED",
    "Message": "WARN: Disk is full.",
    "Detail": {
      "Data": "{     \"name\": \"SparkPi\",     \"file\": \"local:///tmp/spark-examples.jar\",     \"className\": \"org.apache.spark.examples.SparkPi\",     \"args\": [         \"1000000\"     ],     \"conf\": {         \"spark.driver.resourceSpec\": \"small\",         \"spark.executor.instances\": 1,         \"spark.executor.resourceSpec\": \"small\"     } }",
      "EstimateExecutionCpuTimeInSeconds": 100,
      "LogRootPath": "oss://<bucket-name>/logs/driver",
      "LastAttemptId": "s202204291426hzpre60****-0003",
      "WebUiAddress": "https://adbsparkui-cn-hangzhou.aliyuncs.com/?token=****",
      "SubmittedTimeInMillis": 1651213645000,
      "StartedTimeInMillis": 1651213645010,
      "LastUpdatedTimeInMillis": 1651213645200,
      "TerminatedTimeInMillis": 1651213645300,
      "DBClusterId": "amv-bp11q28kvl688****",
      "ResourceGroupName": "spark-rg",
      "DurationInMillis": 100
    },
    "DBClusterId": "amv-23xxxx"
  }
}

Error codes

HTTP status code | Error code | Error message | Description
400 | Spark.InvalidParameter | Invalid parameter value: %s | Incorrect input parameter: %s.
400 | Spark.InvalidState | The object of the operation is in an invalid state: %s | The operation object is invalid.
400 | Spark.AnalyzeTask.AppStateNotAccepted | Only Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | -
400 | Spark.AnalyzeTask.FailedToKill | Failed to terminate the Spark log analysis task %s, because the task status has changed. Obtain the latest status and try again. | Failed to terminate the Spark log analysis task because the state of the task has changed. Obtain the latest state of the task and try again.
400 | Spark.AnalyzeTask.InvalidStateWhenAnalyzingApp | Only logs of Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | You can perform log analysis only on Spark jobs that are in the terminated state.
400 | Spark.App.InvalidResourceSpec | The requested resource type is not supported:\n %s | -
400 | Spark.App.KillOperationFailed | Failed to kill the application %s, please retry in a few seconds. | -
400 | Spark.App.ParameterConflict | Conflicting parameters submitted:\n %s | -
400 | Spark.Config.invalidConnectors | The spark.adb.connectors configuration is invalid: %s | -
400 | Spark.Config.RoleArnVerifyFailed | RoleARN parameter verification failed. Error msg: %s when verify RoleArn %s | -
400 | Spark.Log.IllegalPath | Invalid job log URI: %s. | The Spark log path is invalid.
400 | Spark.Log.InvalidState | Failed to obtain the logs of the Spark job %s in the %s state. | -
400 | Spark.Oss.InternalError | An OSS internal error occurred: %s | -
400 | Spark.RoleArn.Invalid | %s is not found, or the RAM role has not been authorized. | -
400 | Spark.SQL.NotFoundExecutableSQLError | No executable statements are submitted. Please check the input SQL. | -
400 | Spark.SQL.ParserError | Failed to parse the SQL %s. Error message: %s. | -
400 | Spark.TemplateFile.BadFileType | The requested template %s is not a file. | The specified template file ID is not of the file type.
403 | Spark.Forbidden | No permissions to access the resources: %s | Insufficient permissions to access the related resources. Information that you want to access: %s.
404 | Spark.AnalyzeTask.NotFound | The requested analysis task %s is not found. | The log analysis task you requested to view does not exist. %s.
404 | Spark.App.ContentNotFound | The requested content %s of the Spark application is not found. | -
404 | Spark.Log.PodLogNotFound | Can't find logs of the pod by podName[%s] in the namespace[%s]. | Unable to find the logs of the pod with the specified podName. %s.
404 | Spark.ObjectNotFound | The object is not found. More information: %s | -
404 | Spark.TemplateFile.FileNotFound | The template file %s is not found. | Failed to find the specified template file.
404 | Spark.TemplateFile.TemplateNotFound | The template %s is not found. | -
406 | Spark.App.KillNotAcceptable | Can't kill the application %s in %s state. | -
500 | Spark.ServerError | The Spark control component system encountered an error, please create a ticket to solve the problem or concat the supported engineer on duty. Error message: %s | An error occurred in the Spark control component system. Submit a ticket or contact technical support.
500 | Spark.Resources.LoadFileFromClasspathFailed | Can't load the content from file: %s | -

For a list of error codes, visit the Service error codes.
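
When this operation is called through the Python SDK sketched above, these errors typically surface as exceptions carrying the error code and message. A hedged example, assuming the Tea framework's TeaException used by Alibaba Cloud Python SDKs and the client and request objects from the earlier sketch:

# Hedged sketch: map service error codes (e.g. Spark.ObjectNotFound) to handling logic.
from Tea.exceptions import TeaException

try:
    response = client.get_spark_app_info(request)
except TeaException as e:
    # e.code holds the error code, e.message holds the error message.
    if e.code == "Spark.ObjectNotFound":
        print("Spark application not found:", e.message)
    elif e.code == "Spark.Forbidden":
        print("Insufficient permissions:", e.message)
    else:
        raise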

Change history

Change time | Summary of changes
2023-11-24 | The error codes have changed. The request parameters of the API have changed.
2023-06-28 | The error codes have changed.