
Data Lake Insight

Flink OpenSource SQL 1.15 Usage

When switching job execution from Flink 1.12 to Flink 1.15, keep the following considerations in mind when using Flink OpenSource SQL 1.15:

  • Flink SQL uses the SQL client submission method. In Flink 1.15, this is configured with SET 'key'='value' statements in your SQL script, which replace the optimization parameters used in Flink 1.12. For details about the syntax, see SQL Client Configuration; a short sketch is also given after this list.

  • The following Flink connectors are added to Flink 1.15: Doris Connector and Hive Connector. For details, see Overview.

  • In Flink 1.15, you need to configure a custom agency on the tenant plane and configure the agency information in the job. The permissions included in an agency should be configured based on the specific service scenario requirements of the job.

  • Methods to manage credentials for Flink 1.15 jobs:

    • You are advised to use DEW to manage access credentials, such as passwords and keys, in Flink OpenSource SQL.

    • Use DEW to manage the fixed AKs/SKs used by Flink Jar jobs to access OBS, the temporary AKs/SKs used by Flink Jar jobs to obtain agencies, and the temporary AKs/SKs used by Flink SQL UDFs to obtain agencies.

  • There are differences in the way Flink 1.15 Jar reads custom configuration files compared to Flink 1.12.

  • The Flink 1.15 Jar program uses a child-first (reverse) class loading mechanism. To have certain dependency packages loaded by the parent class loader instead, set the parent.first.classloader.jars parameter to the names of the desired JAR files, for example, test1.jar,test2.jar (see the sketch after this list).

  • To check the list of JAR files built into Flink 1.15, obtain information about the Flink 1.15 dependency packages from the Flink job logs:

    1. Check the logs of a Flink job.

      1. Log in to the DLI management console. In the navigation pane on the left, choose Job Management > Flink Jobs.

      2. Click the name of the desired job. On the displayed page, click the Run Log tab.

      3. Check the latest run logs. For more logs, check the OBS bucket where the job logs are stored.

    2. Search for dependency information in the logs.

      Search for Classpath: in the logs to check the dependencies (a grep sketch is given after this list).

  • Flink 1.15 no longer supports DLI package management. To upload dependency packages and files, select the OBS path directly when editing the job.
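
Example: setting a job parameter with the SET syntax. This is a minimal sketch of the SET 'key'='value' form described above; pipeline.name and its value are used here purely for illustration, so consult SQL Client Configuration for the parameters DLI actually supports.

    -- Set a job configuration parameter at the top of the SQL script (illustrative key and value)
    SET 'pipeline.name' = 'my-flink-115-job';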
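
Example: loading specific JARs through the parent class loader. This sketch uses the placeholder names test1.jar and test2.jar from the text and assumes the parameter is supplied as a key-value entry in the job's runtime configuration when editing the Flink 1.15 Jar job.

    parent.first.classloader.jars=test1.jar,test2.jar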
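
Example: locating the dependency list in a downloaded log file. Assuming the job logs have been downloaded from the OBS bucket to a local machine, a plain grep is enough to find the line listing the built-in dependencies; jobmanager.log is an assumed file name, so substitute whichever log file the bucket contains.

    # Print the first line containing the classpath, which lists the built-in dependency JARs
    grep -m 1 "Classpath:" jobmanager.log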
