Data Lake Insight

Managing Flink Job Templates

Flink templates include sample templates and custom templates. You can modify an existing sample template to match your job logic and save the time you would otherwise spend writing SQL statements from scratch. You can also save your own job templates so that you can reuse or adapt them in later jobs.

Flink template management provides the following functions:

  • Flink SQL Sample Template

  • Flink OpenSource SQL Sample Template

  • Custom Templates

  • Creating a Template

  • Creating a Job Based on a Template

  • Modifying a Template

  • Deleting a Template

Flink SQL Sample Template

The template list displays the existing sample templates for Flink SQL jobs. Table 1 describes the parameters in the template list.

The scenarios covered by the sample templates may vary; the templates displayed on the console prevail.

Table 1 Parameters in the Flink SQL sample template list

  • Name: Template name. Enter 1 to 64 characters. Only letters, numbers, hyphens (-), and underscores (_) are allowed.

  • Description: Description of the template. It can contain 0 to 512 characters.

  • Operation: Create Job creates a job directly from the template. After the job is created, the system switches to the Edit page under Job Management.
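
For orientation, a Flink SQL sample template typically contains statements in DLI's Flink SQL dialect: one or more source streams, a sink stream, and the processing logic between them. The sketch below is illustrative only; the stream names, DIS channel names, and the region are placeholders, and the authoritative syntax and connector parameters are described in the Data Lake Insight SQL Syntax Reference.

  CREATE SOURCE STREAM car_infos (
    car_id STRING,
    car_owner STRING,
    car_speed INT
  ) WITH (
    type = "dis",             -- placeholder source connector
    region = "eu-de",         -- placeholder region
    channel = "dis-input",    -- placeholder DIS channel
    encode = "json"
  );

  CREATE SINK STREAM speeding_cars (
    car_id STRING,
    car_owner STRING,
    car_speed INT
  ) WITH (
    type = "dis",
    region = "eu-de",
    channel = "dis-output",
    encode = "json"
  );

  -- Job logic: forward only records above the speed threshold
  INSERT INTO speeding_cars
  SELECT car_id, car_owner, car_speed
  FROM car_infos
  WHERE car_speed > 120;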

Flink OpenSource SQL Sample Template

The template list displays the existing sample templates for Flink OpenSource SQL jobs. Table 2 describes the parameters in the template list.

Table 2 Parameters in the sample template list for Flink OpenSource SQL jobs

  • Name: Template name. Enter 1 to 64 characters. Only letters, numbers, hyphens (-), and underscores (_) are allowed.

  • Description: Description of the template. It can contain 0 to 512 characters.

  • Operation: Create Job creates a job directly from the template. After the job is created, the system switches to the Edit page under Job Management.

Custom Templates

The custom template list displays all Jar job templates. Table 3 describes the parameters in the custom template list.

Table 3 Parameters in the custom template list

  • Name: Template name. Enter 1 to 64 characters. Only letters, numbers, hyphens (-), and underscores (_) are allowed.

  • Description: Description of the template. It can contain 0 to 512 characters.

  • Created: Time when the template was created.

  • Updated: Time when the template was last modified.

  • Operation:

    • Edit: Modify a template that has been created.

    • Create Job: Create a job directly by using the template. After the job is created, the system switches to the Edit page under Job Management.

    • More > Delete: Delete a created template.

Creating a Template

You can create a template using any of the following methods:

  • Creating a template on the Template Management page

    1. In the left navigation pane of the DLI management console, choose Job Templates > Flink Templates.

    2. Click Create Template in the upper right corner of the page. The Create Template dialog box is displayed.

    3. Specify Name and Description.

      Table 4 Template parameters

      • Type: Template type. The options are:

        • Flink SQL job template

        • Flink OpenSource SQL job template

      • Name: Template name. Enter 1 to 64 characters. Only letters, numbers, hyphens (-), and underscores (_) are allowed. The template name must be unique.

      • Description: Description of the template. It can contain 0 to 512 characters.

      • Tags: Tags used to identify cloud resources. A tag consists of a tag key and a tag value. If you want to use the same tag to identify multiple cloud resources, that is, to select the same tag from the drop-down list box for all services, you are advised to create predefined tags in Tag Management Service (TMS).

        Note:

        • A maximum of 20 tags can be added.

        • Only one tag value can be added to a tag key.

        • The key name in each resource must be unique.

        Tag key: Enter a tag key name in the text box. A tag key can contain a maximum of 128 characters. Only letters, numbers, spaces, and special characters (_.:+-@) are allowed, but the value cannot start or end with a space or start with _sys_.

        Tag value: Enter a tag value in the text box. A tag value can contain a maximum of 255 characters. Only letters, numbers, spaces, and special characters (_.:+-@) are allowed.

    4. Click OK to enter the editing page.

      Table 5 describes the parameters on the template editing page.

      Table 5 Template parameters

      • Name: You can modify the template name.

      • Description: You can modify the template description.

      • Saving Mode:

        • Save Here: Save the modifications to the current template.

        • Save as New: Save the modifications as a new template.

      • SQL statement editing area: Area where you enter the SQL statements that implement the business logic. For details about how to write SQL statements, see the Data Lake Insight SQL Syntax Reference.

      • Save: Save the modifications.

      • Create Job: Use the current template to create a job.

      • Format: Format the SQL statements. After the SQL statements are formatted, you need to review and edit them again.

      • Theme Settings: Change the font size, word wrap, and page style (black or white background).

    5. In the SQL statement editing area, enter the SQL statements that implement your service logic. For details about how to write SQL statements, see the Data Lake Insight SQL Syntax Reference. A minimal example is sketched at the end of this section.

    6. After the SQL statement is edited, click Save in the upper right corner to complete the template creation.

    7. (Optional) If you do not need to modify the template, click Create Job in the upper right corner to create a job based on the current template. For how to create a job, see Creating a Flink Jar Job.

  • Creating a template based on an existing job template

    1. In the left navigation pane of the DLI management console, choose Job Templates > Flink Templates. Click the Custom Templates tab.

    2. In the row where the desired template is located in the custom template list, click Edit under Operation to enter the Edit page.

    3. After the modification is complete, set Saving Mode to Save as New.

    4. Click Save in the upper right corner to save the template as a new one.

  • Creating a template while creating a job

    1. In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.

    2. Click Create Job in the upper right corner. The Create Job page is displayed.

    3. Specify parameters as required.

    4. Click OK to enter the editing page.

    5. After you finish editing the SQL statements, click Set as Template.

    6. In the Set as Template dialog box that is displayed, specify Name and Description and click OK.

  • Creating a template from an existing job

    1. In the left navigation pane of the DLI management console, choose Job Management > Flink Jobs. The Flink Jobs page is displayed.

    2. In the job list, locate the row where the job that you want to set as a template resides, and click Edit in the Operation column.

    3. After you finish editing the SQL statements, click Set as Template.

    4. In the Set as Template dialog box that is displayed, specify Name and Description and click OK.
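
The following minimal sketch shows the kind of Flink OpenSource SQL statements you might enter in the SQL statement editing area (see step 5 of the first method above). It uses the open-source DataGen and Print connectors so that it has no external dependencies; the table and field names are placeholders, and a real template would use your actual source and sink connectors as described in the Data Lake Insight SQL Syntax Reference.

  -- Source table: generated test data (no external dependency)
  CREATE TABLE orders_source (
    order_id STRING,
    amount DOUBLE
  ) WITH (
    'connector' = 'datagen',
    'rows-per-second' = '1'
  );

  -- Sink table: write matching rows to the job logs
  CREATE TABLE orders_sink (
    order_id STRING,
    amount DOUBLE
  ) WITH (
    'connector' = 'print'
  );

  -- Business logic: keep only large orders
  INSERT INTO orders_sink
  SELECT order_id, amount
  FROM orders_source
  WHERE amount > 100;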

Creating a Job Based on a Template

You can create jobs based on sample templates or custom templates.

  1. In the left navigation pane of the DLI management console, choose Job Templates > Flink Templates.

  2. In the template list, click Create Job in the Operation column of the target template. For details about how to create a job, see Creating a Flink OpenSource SQL Job and Creating a Flink Jar Job.

Modifying a Template

After creating a custom template, you can modify it as required. Sample templates cannot be modified, but you can view their details.

  1. In the left navigation pane of the DLI management console, choose Job Templates > Flink Templates. Click the Custom Templates tab.

  2. In the row where the template you want to modify is located in the custom template list, click Edit in the Operation column to enter the Edit page.

  3. In the SQL statement editing area, modify the SQL statements as required.

  4. Set Saving Mode to Save Here.

  5. Click Save in the upper right corner to save the modification.

Deleting a Template

You can delete a custom template as required. The sample templates cannot be deleted. Deleted templates cannot be restored. Exercise caution when performing this operation.

  1. In the left navigation pane of the DLI management console, choose Job Templates > Flink Templates. Click the Custom Templates tab.

  2. In the custom template list, select the templates you want to delete and click Delete in the upper left of the custom template list.

    Alternatively, you can delete a template by performing the following operations: In the custom template list, locate the row where the template you want to delete resides, and click More > Delete in the Operation column.

  3. In the displayed dialog box, click Yes.
