DLI Spark

Functions

The DLI Spark node is used to execute a predefined Spark job.

Parameters

Table 1, Table 2, and Table 3 describe the parameters of the DLI Spark node.

Table 1 Parameters of DLI Spark nodes

Parameter

Mandatory

Description

Node Name

Yes

Name of a node. The name must contain 1 to 128 characters, including only letters, numbers, underscores (_), hyphens (-), slashes (/), less-than signs (<), and greater-than signs (>).

DLI Queue

Yes

Select a queue from the drop-down list box.

Custom Image

No

Select a custom image and the corresponding version. This parameter is available only when the DLI queue is a containerized queue.

A custom image is a feature of DLI. You can use the Spark or Flink basic images provided by DLI to pack the dependencies (files, JAR packages, or software) required into an image using Dockerfile, generate a custom image, and release the image to SWR. Then, select the generated image and run the job.

Custom images can change the container runtime environments of Spark and Flink jobs. You can embed private capabilities into custom images to enhance the functions and performance of jobs.

Job Name

Yes

Name of the DLI Spark job. The name must contain 1 to 64 characters, including only letters, numbers, and underscores (_). The default value is the same as the node name.

Job Running Resources

No

Select the running resource specifications of the job.

  • 8-core, 32 GB memory

  • 16-core, 64 GB memory

  • 32-core, 128 GB memory

Major Job Class

Yes

Name of the major class of the Spark job. If the application is a JAR package, the main class name cannot be empty.

Spark Program Resource Package

Yes

JAR file on which the Spark job depends. You can enter the JAR package name or the corresponding OBS path. The format is as follows: obs://Bucket name/Folder name/Package name. Before selecting a resource package, upload the JAR package and its dependency packages to the OBS bucket and create resources on the Manage Resource page. For details, see Creating a Resource.
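The OBS path convention above can be sketched as a small validation helper. This is a hypothetical illustration, not part of DLI or any SDK; only the obs://Bucket name/Folder name/Package name format comes from the documentation.

```python
import re

# Hypothetical helper (not part of DLI): checks that a resource package
# reference follows the obs://<bucket>/<folder>/<package>.jar format above.
OBS_JAR_PATTERN = re.compile(
    r"^obs://(?P<bucket>[^/]+)/(?P<path>.+/)?(?P<package>[^/]+\.jar)$"
)

def parse_obs_jar_path(path: str):
    """Return (bucket, package) for a well-formed OBS JAR path, else None."""
    match = OBS_JAR_PATTERN.match(path)
    if not match:
        return None
    return match.group("bucket"), match.group("package")
```

A path such as obs://my-bucket/jars/app.jar would parse into the bucket my-bucket and the package app.jar.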

Major-Class Entry Parameters

No

User-defined parameters. Press Enter to separate multiple parameters.

These parameters can be replaced by global variables. For example, if you create a global variable batch_num on the Global Configuration > Global Variables page, you can use {{batch_num}} to replace a parameter with this variable after the job is submitted.
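The {{batch_num}} substitution described above can be illustrated with a short sketch. The substitution itself is performed internally by the platform after the job is submitted; the helper below is purely hypothetical and only mirrors the {{name}} placeholder convention.

```python
import re

def substitute_globals(argument: str, global_vars: dict) -> str:
    """Replace {{name}} placeholders with values from a mapping.

    Hypothetical illustration of the global-variable substitution the
    service performs after job submission; unknown names are left as-is.
    """
    def repl(match):
        name = match.group(1)
        if name in global_vars:
            return str(global_vars[name])
        return match.group(0)  # keep unresolved placeholders unchanged
    return re.sub(r"\{\{(\w+)\}\}", repl, argument)
```

For example, with the global variable batch_num set to 42, the entry parameter --batch {{batch_num}} would become --batch 42.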

Spark Job Running Parameters

No

Enter a parameter in the format of key/value. Press Enter to separate multiple key-value pairs. For details about the parameters, see Spark Configuration.

These parameters can be replaced by global variables. For example, if you create a global variable custom_class on the Global Configuration > Global Variables page, you can use "spark.sql.catalog"={{custom_class}} to replace a parameter with this variable after the job is submitted.

Note

The JVM garbage collection algorithm cannot be customized for Spark jobs.
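The key=value convention for this field can be sketched as a small parser. This helper is hypothetical; only the newline-separated key=value pair convention comes from the documentation above.

```python
def parse_spark_conf(text: str) -> dict:
    """Parse newline-separated key=value pairs into a configuration mapping.

    Hypothetical helper mirroring how the console field is filled in;
    blank or malformed lines are skipped, and quoted keys are unwrapped.
    """
    conf = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blank or malformed lines
        key, _, value = line.partition("=")
        conf[key.strip().strip('"')] = value.strip()
    return conf
```

For example, entering spark.executor.memory=4g and spark.executor.cores=2 on separate lines yields a two-entry configuration mapping.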

Module Name

No

Dependency modules provided by DLI for executing datasource connection jobs. To access different services, you need to select different modules.

  • CloudTable/MRS HBase: sys.datasource.hbase

  • DDS: sys.datasource.mongo

  • CloudTable/MRS OpenTSDB: sys.datasource.opentsdb

  • DWS: sys.datasource.dws

  • RDS MySQL: sys.datasource.rds

  • RDS PostgreSQL: sys.datasource.rds

  • DCS: sys.datasource.redis

  • CSS: sys.datasource.css

DLI internal modules include:

  • sys.res.dli-v2

  • sys.res.dli

  • sys.datasource.dli-inner-table
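The service-to-module mapping above can be captured as a simple lookup table. The module names are taken directly from the list above; the lookup helper itself is hypothetical.

```python
# Mapping from target service to the DLI dependency module, per the list
# above. The module_for helper is a hypothetical convenience, not a DLI API.
DATASOURCE_MODULES = {
    "CloudTable/MRS HBase": "sys.datasource.hbase",
    "DDS": "sys.datasource.mongo",
    "CloudTable/MRS OpenTSDB": "sys.datasource.opentsdb",
    "DWS": "sys.datasource.dws",
    "RDS MySQL": "sys.datasource.rds",
    "RDS PostgreSQL": "sys.datasource.rds",
    "DCS": "sys.datasource.redis",
    "CSS": "sys.datasource.css",
}

def module_for(service: str) -> str:
    """Return the dependency module for a datasource service."""
    try:
        return DATASOURCE_MODULES[service]
    except KeyError:
        raise ValueError(f"No DLI dependency module known for {service!r}")
```

Note that both RDS MySQL and RDS PostgreSQL map to the same module, sys.datasource.rds.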

Table 2 Advanced parameters

Parameter

Mandatory

Description

Node Status Polling Interval (s)

Yes

Specifies how often the system checks whether the node task is complete. The value ranges from 1 to 60 seconds.

Max. Node Execution Duration

Yes

Execution timeout interval for the node. If retry is configured and the execution is not complete within the timeout interval, the node will not be retried and is set to the failed state.

Retry upon Failure

Yes

Indicates whether to re-execute a node task if its execution fails. Possible values:

  • Yes: The node task will be re-executed, and the following parameters must be configured:

    • Maximum Retries

    • Retry Interval (seconds)

  • No: The node task will not be re-executed. This is the default setting.

Note

If Timeout Interval is configured for the node, the node will not be executed again after the execution times out. Instead, the node is set to the failure state.
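The interaction between polling, the execution timeout, and retries described above can be sketched as follows. This is an illustrative model only, assuming a hypothetical task object with start/done/succeeded operations; the real scheduler is internal to the platform.

```python
import time

def run_with_retries(task, poll_interval_s=10, max_duration_s=300,
                     retry_on_failure=False, max_retries=1, retry_interval_s=30):
    """Illustrative sketch of the scheduling behaviour described above.

    - Task status is polled every poll_interval_s seconds.
    - A run that exceeds max_duration_s is treated as failed and, per the
      note above, is NOT retried even if retry_on_failure is set.
    - A run that fails within the time limit is retried up to max_retries
      times, waiting retry_interval_s between attempts.
    """
    attempts = 1 + (max_retries if retry_on_failure else 0)
    for attempt in range(attempts):
        handle = task.start()
        deadline = time.monotonic() + max_duration_s
        while not handle.done():
            if time.monotonic() > deadline:
                return "failed"  # timed out: no further retries
            time.sleep(poll_interval_s)
        if handle.succeeded():
            return "succeeded"
        if attempt < attempts - 1:
            time.sleep(retry_interval_s)  # wait before re-executing
    return "failed"
```

With Retry upon Failure set to Yes, a transient failure on the first attempt is followed by a re-execution; a timeout ends the node immediately with the failed state.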

Failure Policy

Yes

Operation that will be performed if the node task fails to be executed. Possible values:

  • End the current job execution plan: stops running the current job. The job instance status is Failed.

  • Go to the next node: ignores the execution failure of the current node. The job instance status is Failure ignored.

  • Suspend current job execution plan: suspends running the current job. The job instance status is Waiting.

  • Suspend execution plans of the subsequent nodes: stops running subsequent nodes. The job instance status is Failed.

Table 3 Lineage

Parameter

Description

Input

Add

Click Add. In the Type drop-down list, select the type to be created. The value can be DWS, OBS, CSS, HIVE, DLI, or CUSTOM.

  • DWS

    • Connection Name: Click the selection icon. In the displayed dialog box, select a DWS data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a DWS database.

    • Schema: Click the selection icon. In the displayed dialog box, select a DWS schema.

    • Table Name: Click the selection icon. In the displayed dialog box, select a DWS table.

  • OBS

    • Path: Click the selection icon. In the displayed dialog box, select an OBS path.

  • CSS

    • Cluster Name: Click the selection icon. In the displayed dialog box, select a CSS cluster.

    • Index: Enter a CSS index name.

  • HIVE

    • Connection Name: Click the selection icon. In the displayed dialog box, select a HIVE data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a HIVE database.

    • Table Name: Click the selection icon. In the displayed dialog box, select a HIVE table.

  • CUSTOM

    • Name: Enter a name of the CUSTOM type.

    • Attribute: Enter an attribute of the CUSTOM type. You can add more than one attribute.

  • DLI

    • Connection Name: Click the selection icon. In the displayed dialog box, select a DLI data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a DLI database.

    • Table Name: Click the selection icon. In the displayed dialog box, select a DLI table.

OK

Click OK to save the parameter settings.

Cancel

Click Cancel to cancel the parameter settings.

Modify

Click the edit icon to modify the parameter settings. After the modification, save the settings.

Delete

Click the delete icon to delete the parameter settings.

View Details

Click the view icon to view details about the table created based on the input lineage.

Output

Add

Click Add. In the Type drop-down list, select the type to be created. The value can be DWS, OBS, CSS, HIVE, DLI, or CUSTOM.

  • DWS

    • Connection Name: Click the selection icon. In the displayed dialog box, select a DWS data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a DWS database.

    • Schema: Click the selection icon. In the displayed dialog box, select a DWS schema.

    • Table Name: Click the selection icon. In the displayed dialog box, select a DWS table.

  • OBS

    • Path: Click the selection icon. In the displayed dialog box, select an OBS path.

  • CSS

    • Cluster Name: Click the selection icon. In the displayed dialog box, select a CSS cluster.

    • Index: Enter a CSS index name.

  • HIVE

    • Connection Name: Click the selection icon. In the displayed dialog box, select a HIVE data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a HIVE database.

    • Table Name: Click the selection icon. In the displayed dialog box, select a HIVE table.

  • CUSTOM

    • Name: Enter a name of the CUSTOM type.

    • Attribute: Enter an attribute of the CUSTOM type. You can add more than one attribute.

  • DLI

    • Connection Name: Click the selection icon. In the displayed dialog box, select a DLI data connection.

    • Database: Click the selection icon. In the displayed dialog box, select a DLI database.

    • Table Name: Click the selection icon. In the displayed dialog box, select a DLI table.

OK

Click OK to save the parameter settings.

Cancel

Click Cancel to cancel the parameter settings.

Modify

Click the edit icon to modify the parameter settings. After the modification, save the settings.

Delete

Click the delete icon to delete the parameter settings.

View Details

Click the view icon to view details about the table created based on the output lineage.