
E-MapReduce: Get started with Spark Submit development

Last Updated: Feb 11, 2026

EMR Serverless Spark supports spark-submit command-line parameters. This support simplifies task execution. This topic walks you through an example to help you quickly get started with Spark Submit development.

Prerequisites

  • A workspace has been created. For more information, see Workspace Management.

  • The business application has been developed in advance, and the JAR package is ready.

Procedure

Step 1: Develop a JAR package

This Quick Start helps you quickly become familiar with Spark Submit tasks. It provides project files and a test JAR package that you can download for use in the following steps.

Click spark-examples_2.12-3.5.2.jar to download the test JAR package.

Note

spark-examples_2.12-3.5.2.jar is a simple example provided with Spark that calculates the value of Pi (π). You must submit it with an esr-4.x engine version. If you use an esr-5.x engine version, download spark-examples_2.13-4.0.1.jar instead for validation in this topic.
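SparkPi estimates π with a Monte Carlo method: it samples random points in a square and counts how many land inside the inscribed unit circle. The same idea can be sketched locally in plain shell (awk), independent of Spark; the seed and sample count below are arbitrary illustrative choices:

```shell
# Monte Carlo estimate of Pi -- the same idea SparkPi runs on a cluster.
# Sample points uniformly in the square [-1,1]x[-1,1]; the fraction that
# lands inside the unit circle approximates pi/4.
awk 'BEGIN {
  srand(42)                      # fixed seed for repeatability (arbitrary)
  n = 200000; inside = 0
  for (i = 0; i < n; i++) {
    x = 2 * rand() - 1
    y = 2 * rand() - 1
    if (x * x + y * y <= 1) inside++
  }
  printf "Pi is roughly %f\n", 4 * inside / n
}'
```

With 200,000 samples the estimate typically lands within about 0.01 of π; SparkPi distributes the same sampling loop across executors.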

Step 2: Upload the JAR package to OSS

This example uploads spark-examples_2.12-3.5.2.jar. For upload instructions, see Simple upload.
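As an alternative to the console upload, the same step can be done with Alibaba Cloud's ossutil command-line tool. This is a sketch that assumes ossutil is already installed and configured with your credentials; `<YourBucket>` is a placeholder for your bucket name:

```shell
# Upload the test JAR to your OSS bucket (sketch; <YourBucket> is a placeholder).
# Assumes credentials were set up beforehand with `ossutil config`.
ossutil cp spark-examples_2.12-3.5.2.jar oss://<YourBucket>/spark-examples_2.12-3.5.2.jar
```

Note the resulting `oss://` path: you will reference it in the Spark Submit script in the next step.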

Step 3: Develop and run a task

  1. On the EMR Serverless Spark page, click Data Development in the navigation pane on the left.

  2. On the Development tab, click the icon for creating a task.

  3. Enter a name. Set Type to Batch Job > Spark Submit. Then click OK.

  4. Select a queue in the upper-right corner.

    For instructions on adding a queue, see Manage resource queues.

  5. In the new task editor, configure the following parameter. Leave other parameters unchanged. Then click Run.

    Parameter: Script

    Description: Enter your Spark Submit script.

    The following code provides an example:

    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.memory=2g \
    oss://<YourBucket>/spark-examples_2.12-3.5.2.jar
  6. In the Execution Records section below, click Log Exploration in the Actions column.

  7. On the Log Exploration tab, view related log information.

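Putting the parameters together, a slightly fuller Script field might look like the sketch below. It is pasted into the Script field, not run in a terminal; the extra executor setting and the trailing argument (the number of sampling slices SparkPi accepts) are illustrative values, not requirements:

```shell
# Spark Submit script sketch for the Script field (not a terminal command).
--class org.apache.spark.examples.SparkPi \
--conf spark.executor.memory=2g \
--conf spark.executor.cores=2 \
oss://<YourBucket>/spark-examples_2.12-3.5.2.jar \
100
```

A larger slice count gives a more accurate π estimate at the cost of a longer run.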

Step 4: Publish the task

Important

Published tasks can be used as workflow nodes.

  1. After the task runs successfully, click Publish on the right.

  2. In the Publish dialog box, enter release notes and click OK.

(Optional) Step 5: View the Spark UI

After the task runs successfully, you can view its execution details in the Spark UI.

  1. In the navigation pane on the left, click Job History.

  2. On the Application page, click Spark UI in the Actions column of the target task.

    The Spark UI page opens automatically. View task details there.

References

After publishing a task, you can schedule it in workflows. For more information, see Manage workflows. For a complete example of task orchestration and development, see Get started with SparkSQL development.