Lindorm Distributed Processing System (LDPS) provides a Spark web UI for each Spark job. On the Spark web UI page of a job, you can view information about the job, such as its status, the time at which it was submitted, and its resource usage.
Go to the Spark web UI
Note
- For more information about how to obtain the address of the Spark web UI, see Manage jobs in the Lindorm console.
- For more information about the Spark web UI of a Spark job that is created by using open source Apache Spark, see Web UI.
Copy the address of the Spark web UI of your Spark job and paste it into the address bar of your browser to open the Spark web UI page. The following table describes the tabs on the page.
Tab | Description |
---|---|
Jobs | Displays information about all Spark jobs. |
Stages | Displays the status of each stage of the Spark job. |
Storage | Displays information about persisted resilient distributed datasets (RDDs) and DataFrames. |
Environment | Displays information about the environment in which the Spark job runs. The environment is built when the Spark job starts, based on the runtime environment files, configuration files, and parameters that you specify. |
Executors | Displays information about the executors that run the Spark job. |
SQL | Displays information about SQL queries that are processed by LDPS. |
Kyuubi Query Engine | Displays information about Java Database Connectivity (JDBC) sessions. |
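The Environment, Storage, and SQL tabs are easiest to interpret alongside a concrete job. The following Scala sketch is a minimal, hypothetical example (the application name, configuration value, and data are placeholders, not values defined by LDPS): the configured property appears on the Environment tab, the cached DataFrame on the Storage tab, and the query on the SQL tab.

```scala
import org.apache.spark.sql.SparkSession

object WebUiDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical job: configuration values appear on the Environment tab.
    val spark = SparkSession.builder()
      .appName("web-ui-demo")                 // placeholder application name
      .config("spark.executor.memory", "4g")  // example value; adjust to your workload
      .getOrCreate()

    import spark.implicits._

    // Persisted datasets are listed on the Storage tab.
    val orders = Seq((1, 30.0), (2, 45.5)).toDF("order_id", "amount").cache()

    // SQL statements and their execution plans are listed on the SQL tab.
    orders.createOrReplaceTempView("orders")
    spark.sql("SELECT order_id, amount FROM orders WHERE amount > 40").show()

    spark.stop()
  }
}
```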
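The sessions counted on the Kyuubi Query Engine tab are JDBC connections. Assuming that your client connects to LDPS through the standard Hive JDBC driver, a session can be opened as in the following sketch; the endpoint, user name, and password are placeholders that you would replace with the values of your own instance.

```scala
import java.sql.DriverManager

object JdbcSessionDemo {
  def main(args: Array[String]): Unit = {
    // Register the Hive JDBC driver (assumption: the client uses this driver).
    Class.forName("org.apache.hive.jdbc.HiveDriver")

    // Placeholder JDBC address and credentials; replace them with your own.
    val url = "jdbc:hive2://<your-ldps-endpoint>:10009/default"
    val conn = DriverManager.getConnection(url, "<user>", "<password>")
    try {
      // Each open connection is shown as a session on the Kyuubi Query Engine tab.
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SELECT 1")
      while (rs.next()) println(rs.getInt(1))
    } finally {
      conn.close()
    }
  }
}
```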