Setting the Priority for a Job

Scenario

In real-world job operations, jobs differ in importance and urgency, so the compute resources required by important and urgent jobs must be guaranteed.

DLI allows you to set a priority for each Spark job, Spark SQL job, and Flink job. When compute resources are insufficient, they are preferentially allocated to jobs with higher priorities.

Note

  • Priorities can be set only for jobs running in elastic resource pools.

  • SQL jobs running in elastic resource pools also support priorities.

  • Priorities can be set only for jobs running Spark 2.4.5 or later.

  • Priorities can be set only for jobs running Flink 1.12 or later.

Notes

  • You can set a priority for each job. The value ranges from 1 to 10, and a larger value indicates a higher priority. Compute resources are preferentially allocated to high-priority jobs: if the resources required by high-priority jobs are insufficient, the resources allocated to low-priority jobs are reduced.

  • The default priority of Flink jobs running on a general-purpose queue is 5.

  • The default priority of Spark jobs running on a general-purpose queue is 3.

  • The default priority of jobs running on a SQL queue is 3.

  • A change to a job's priority takes effect only after the job is stopped, edited, and submitted again.

  • For Flink jobs, first enable dynamic scaling by setting flink.dli.job.scale.enable to true by referring to Enabling Dynamic Scaling for Flink Jobs, and then set the job priority. A configuration sketch follows this list.
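
The lines below sketch what this pair of settings could look like as key-value runtime parameters. The flink.dli.job.scale.enable key comes from this section; the flink.dli.job.priority key used for the priority value itself is an assumption made for illustration, so verify the actual parameter name against the Flink job documentation for your DLI version.

  flink.dli.job.scale.enable=true
  # Assumed parameter name for the Flink job priority (1 to 10); not confirmed by this section.
  flink.dli.job.priority=8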

Procedure for Setting the Priority for Spark Jobs

  1. Log in to the DLI management console.

  2. In the navigation pane on the left, choose Job Management > Spark Jobs.

  3. Locate the row containing the job for which you want to set the priority and click Edit in the Operation column.

  4. In the Spark Arguments(--conf) text box, enter the following statement, where x indicates the priority value:

     spark.dli.job.priority=x
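
For example, to give a Spark job a priority of 8 (the documented range is 1 to 10), you would enter:

  spark.dli.job.priority=8

If your workflow submits Spark workloads through a spark-submit-style command line rather than the console, the same setting would be passed as a --conf flag. This is only a sketch under that assumption, and the class and JAR names are placeholders:

  # com.example.MyJob and my-job.jar are hypothetical placeholders.
  spark-submit --class com.example.MyJob --conf spark.dli.job.priority=8 my-job.jar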

Procedure for Setting the Priority for Spark SQL Jobs

  1. Log in to the DLI management console.

  2. In the navigation pane on the left, choose Job Management > SQL Jobs.

  3. Locate the row containing the job for which you want to set the priority and click Edit in the Operation column.

  4. Click Settings. In the Parameter Settings area, configure the following parameter, where x indicates the priority value:

     spark.sql.dli.job.priority=x
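
For example, to raise a SQL job above its default priority of 3, you could add the following key-value pair in the Parameter Settings area:

  spark.sql.dli.job.priority=9

As noted above, the new priority takes effect only after the job is stopped, edited, and submitted again.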