
GetSparkDefinitions

Updated at: 2024-08-30 10:10

Queries the common definitions of Spark applications.

Operation description

You can call this operation by using one of the following endpoints:

  • Regional public endpoint: adb.<region-id>.aliyuncs.com. Example: adb.cn-hangzhou.aliyuncs.com.
  • Regional Virtual Private Cloud (VPC) endpoint: adb-vpc.<region-id>.aliyuncs.com. Example: adb-vpc.cn-hangzhou.aliyuncs.com.

Note: If HTTP status code 409 is returned when you call this operation in the China (Qingdao), China (Shenzhen), China (Guangzhou), or China (Hong Kong) region, contact technical support.

Debugging

OpenAPI Explorer automatically calculates the signature value. For your convenience, we recommend that you call this operation in OpenAPI Explorer.

Authorization information

The following table describes the authorization information for this operation. You can use this information in the Action element of a RAM policy statement to grant a RAM user or RAM role the permissions to call this operation. Description of the columns:

  • Operation: the value that you use in the Action element to specify the operation on a resource.
  • Access level: the access level of the operation. Valid values: read, write, and list.
  • Resource type: the type of the resource on which you can grant the RAM user or RAM role the permissions to perform the operation. Take note of the following items:
    • Required resource types are displayed in bold.
    • If the permissions cannot be granted at the resource level, All Resources is displayed in the Resource type column.
  • Condition key: the condition key that is defined by the cloud service.
  • Associated operation: other operations that the RAM user or RAM role must be authorized to perform before it can call this operation.
Operation | Access level | Resource type | Condition key | Associated operation
adb:GetSparkDefinitions | get | DBClusterLakeVersion: acs:adb:{#regionId}:{#accountId}:dbcluster/{#DBClusterId} | none | none
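The table above can be expressed as a RAM policy statement. A minimal sketch, assuming you attach it to a RAM user or role; the region, account ID, and cluster ID in the resource ARN are placeholder values that you would replace with your own:

```json
{
  "Version": "1",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "adb:GetSparkDefinitions",
      "Resource": "acs:adb:cn-hangzhou:123456789012****:dbcluster/amv-clusterxxx"
    }
  ]
}
```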

Request parameters

Parameter | Type | Required | Description | Example
DBClusterId | string | No | The ID of the AnalyticDB for MySQL Data Lakehouse Edition cluster. | amv-clusterxxx

Response parameters

Parameter | Type | Description | Example
(response) | object | The schema of the response. |
RequestId | string | The ID of the request. | D65A809F-34CE-4550-9BC1-0ED21ETG380
Data | string | The common definitions of Spark applications, returned as a JSON-encoded string that contains an SQLTemplateExample and a BatchTemplateExample. See the sample success response. |

Examples

Sample success responses

JSON format

{
  "RequestId": "D65A809F-34CE-4550-9BC1-0ED21ETG380",
  "Data": "{\"SQLTemplateExample\": \"-- Here is just an example of SparkSQL. Modify the content and run your spark program.\nconf spark.driver.resourceSpec=medium;\nconf spark.executor.instances=2;\nconf spark.executor.resourceSpec=medium;\nconf spark.app.name=Spark SQL Test;\nconf spark.adb.connectors=oss;\n\n-- Here are your sql statements\nshow databases;\",\n                 \"BatchTemplateExample\": \"{\n    \"comments\": [\n        \"-- Here is just an example of SparkPi. Modify the content and run your spark program.\"\n    ],\n    \"args\": [\"1000\"],\n  \"file\":\"local:///tmp/spark-examples.jar\",\n    \"name\": \"SparkPi\",\n    \"className\": \"org.apache.spark.examples.SparkPi\",\n    \"conf\": {      \"spark.driver.resourceSpec\": \"medium\",\n        \"spark.executor.instances\": 2,\n        \"spark.executor.resourceSpec\": \"medium\"\n    }\n}\"\n"
}
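Because Data is itself a JSON-encoded string nested inside the JSON response body, it must be decoded twice. A minimal sketch, using a simplified well-formed stand-in for the body above (the field names match the sample, but the template contents are abbreviated):

```python
import json

# Simplified stand-in for the response body shown above; the real Data
# payload carries the full SQL and batch templates.
raw_body = json.dumps({
    "RequestId": "D65A809F-34CE-4550-9BC1-0ED21ETG380",
    "Data": json.dumps({
        "SQLTemplateExample": "-- Here is just an example of SparkSQL.\nshow databases;",
        "BatchTemplateExample": "{\"name\": \"SparkPi\", \"className\": \"org.apache.spark.examples.SparkPi\"}",
    }),
})

body = json.loads(raw_body)           # first decode: the response object
templates = json.loads(body["Data"])  # second decode: the Data string

print(sorted(templates))  # → ['BatchTemplateExample', 'SQLTemplateExample']
```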

Error codes

HTTP status code | Error code | Error message | Description
400 | Spark.InvalidParameter | Invalid parameter value: %s | The specified parameter is invalid.
400 | Spark.InvalidState | The object of the operation is in an invalid state: %s | The operation object is invalid.
400 | Spark.AnalyzeTask.AppStateNotAccepted | Only Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | -
400 | Spark.AnalyzeTask.FailedToKill | Failed to terminate the Spark log analysis task %s, because the task status has changed. Obtain the latest status and try again. | Failed to terminate the Spark log analysis task because the state of the task has changed. Obtain the latest state of the task and try again.
400 | Spark.AnalyzeTask.InvalidStateWhenAnalyzingApp | Only logs of Spark applications in the terminated state can be analyzed. The specified application %s does not meet the requirement. | You can perform log analysis only on Spark jobs that are in the terminated state.
400 | Spark.App.InvalidResourceSpec | The requested resource type is not supported: %s | -
400 | Spark.App.ParameterConflict | Conflicting parameters submitted: %s | -
400 | Spark.Config.invalidConnectors | The spark.adb.connectors configuration is invalid: %s | -
400 | Spark.Config.RoleArnVerifyFailed | RoleARN parameter verification failed. Error msg: %s when verify RoleArn %s | -
400 | Spark.Log.InvalidState | Failed to obtain the logs of the Spark job %s in the %s state. | -
400 | Spark.SQL.NotFoundExecutableSQLError | No executable statements are submitted. Please check the input SQL. | -
400 | Spark.SQL.ParserError | Failed to parse the SQL %s. Error message: %s. | -
404 | Spark.AnalyzeTask.NotFound | The requested analysis task %s is not found. | The log analysis task you requested to view does not exist. %s.
404 | Spark.App.ContentNotFound | The requested content %s of the Spark application is not found. | -
404 | Spark.ObjectNotFound | The object is not found. More information: %s | -
404 | Spark.TemplateFile.FileNotFound | The template file %s is not found. | Failed to find the specified template file.
404 | Spark.TemplateFile.TemplateNotFound | The template %s is not found. | -
500 | Spark.ServerError | The Spark control component system encountered an error, please create a ticket to solve the problem or contact the support engineer on duty. Error message: %s | An error occurred on the Spark control component system. Submit a ticket or contact technical support.
500 | Spark.Resources.LoadFileFromClasspathFailed | Can't load the content from file: %s | -

For a list of error codes, visit the Service error codes.

Change history

Change time | Summary of changes | Operation
2023-06-28 | The error codes have changed. | View Change Details