This topic describes the Hive built-in functions supported by Realtime Compute for Apache Flink and how to use them.
Limits
Only Realtime Compute for Apache Flink jobs that use Ververica Runtime (VVR) 8.0.11 or later support Hive built-in functions.
Hive user-defined functions (UDFs) are not supported.
Use Hive built-in functions
Hive SQL drafts
When developing a Hive SQL draft, you can call Hive built-in functions directly through a Hive catalog. If a Hive built-in function shares a name with a Flink system function, VVR executes the Hive built-in function by default to keep function behavior consistent across engines. The Flink job therefore has the same function semantics as a native Hive job, with no additional configuration. For more information, see Get started with a Hive SQL deployment.
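The following is a minimal sketch of such a call. The table name orders and the column payload are hypothetical, and the example assumes that they exist in the Hive catalog used by the draft.
-- get_json_object is a Hive built-in function; in a Hive SQL draft it can be called
-- directly through the Hive catalog with no extra configuration.
-- The table orders and the column payload are hypothetical.
SELECT get_json_object(payload, '$.user_id') AS user_id
FROM orders;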
Flink SQL drafts
In a Flink SQL draft, you can invoke a Hive built-in function through the Hive connector. The procedure is as follows:
Configure the dependency.
Obtain the Hive connector JAR.
Choose a Hive connector JAR based on your Flink job's VVR version and Hive metastore version. For example, if your job uses VVR 8.0.11 and Hive 2.3.x, choose flink-sql-connector-hive-2.3.9_2.12-1.17.2.jar. You can download flink-sql-connector-hive-2.3.9_2.12 version 1.17.2 from the Maven Central Repository.
Important: For information about the mappings between VVR and Apache Flink versions, see Engine updates.
Flink supports the following Hive versions: 2.0.0 - 2.3.9 and 3.1.0 - 3.1.3. Mappings between Hive metastore and Hive connector versions:
Hive metastore version        Hive connector version
2.0.x, 2.1.x, or 2.2.x        2.2.0
2.3.x                         2.3.9
3.1.x                         3.1.3
Upload and configure the dependency.
Log on to the Realtime Compute for Apache Flink console.
In the left-side navigation pane of the development console, choose Artifacts.
Click Upload Artifact to upload the Hive connector JAR.
In the left-side navigation pane, choose .
Click New to create a draft.
In the New Draft dialog box, select an engine version compatible with the Apache Flink version of the uploaded Hive connector JAR.
Click the Configurations tab on the right side of the SQL editor. In the Additional Dependencies field, select the Hive connector JAR you uploaded.

Load the HiveModule.
Run the following command to load the HiveModule of a specific version:
LOAD MODULE hive WITH('hive-version' = '2.3.9'); -- The Hive version must align with the version specified in the filename of the Hive connector JAR.
Important: The value of hive-version must align with the version specified in the filename of the Hive connector JAR so that the function loading mechanism is compatible with that of the Hive metastore. For instance, when using flink-sql-connector-hive-2.3.9_2.12-1.17.2.jar, configure hive-version='2.3.9'. The supported features of the HiveModule may vary across Apache Flink versions. For details on supported Hive functions and syntax limitations, see the Apache Flink documentation for your version, such as Hive Module of Flink 1.17 and Hive Module of Flink 1.20.
Verify the status of the HiveModule.
Copy the following SQL statements to the SQL editor.
LOAD MODULE hive WITH('hive-version' = '2.3.9');
SHOW MODULES;
Select and right-click the SQL statements, and choose Run. Verify that the HiveModule is loaded and that the resolution order of the modules is correctly configured.

Because Flink's default module resolution order is (core, hive), a function name that exists in both modules resolves to the Flink system function in the core module by default. You can change the resolution order with the following statement:
USE MODULES `sql-gateway-module`, `hive`, `core`;
Note: sql-gateway-module is VVR's internal compilation module, and its resolution order cannot be adjusted. You can only change the resolution order of the hive and core modules.
After the statement is executed, Flink resolves such function names to the HiveModule first.
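For example, the following sketch (the table t and the column s are hypothetical) relies on the adjusted order: lower exists in both the hive and core modules, so it now resolves to the Hive built-in implementation, while function names that exist only in the core module are still resolved there.
-- Place the hive module before the core module (sql-gateway-module must stay first).
USE MODULES `sql-gateway-module`, `hive`, `core`;
-- lower exists in both modules; with this order it resolves to the Hive built-in function.
-- The table t and the column s are hypothetical.
SELECT lower(s) FROM t;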

Supported functions
Hive built-in functions are mapped to Flink scalar functions, table functions, or aggregate functions during resolution and execution. The supported Hive functions may vary across Flink versions. For more information, see the Apache Flink documentation for your version, such as Hive Functions of Flink 1.17 and Hive Functions of Flink 1.20. For a list of Hive native functions, see LanguageManual UDF. Note that Flink supports only a subset of these functions.
To check the complete list of Hive functions supported by your Flink session, execute the following statements:
LOAD MODULE hive WITH('hive-version' = '2.3.9');
SHOW FUNCTIONS;
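As a usage sketch, once the HiveModule is loaded you can call a Hive built-in function that has no counterpart in the core module directly in a query. The table t and the column payload below are hypothetical.
LOAD MODULE hive WITH('hive-version' = '2.3.9');
-- get_json_object is a Hive built-in function exposed through the HiveModule;
-- it is not one of Flink's core built-in functions.
-- The table t and the column payload are hypothetical.
SELECT get_json_object(payload, '$.id') AS id FROM t;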