Realtime Compute for Apache Flink:Manage custom connectors

Last updated: Aug 22, 2024

If built-in connectors cannot meet your business requirements, you can use custom connectors. This topic describes how to upload, use, and update a custom connector.

Precautions

  • To allow Realtime Compute for Apache Flink to identify and use your custom connector, you must develop the connector based on the connector standards that are defined by the Apache Flink community. In particular, the JAR file of the connector must contain a service metadata file that declares the factory class of the connector. For more information about how to develop custom connectors, see User-defined Sources & Sinks.

  • For connectors of the same type, you can upload only one JAR file.

  • To avoid JAR file dependency conflicts, take note of the following points:

    • Make sure that the version of the Flink image is the same as the Flink version in the Project Object Model (POM) dependencies.

    • Do not package dependencies that are provided by the Flink runtime into your JAR file. To exclude such dependencies, add <scope>provided</scope> to them in the POM file.

    • Use the Shade plug-in to package third-party dependencies. For more information, see Apache Maven Shade plug-in.
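The two dependency rules above can be sketched as a Maven POM fragment. The version numbers below are illustrative; use the Flink version that matches your image:

```xml
<!-- Flink APIs are provided by the runtime and must not be packaged. -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-common</artifactId>
    <version>1.17.2</version> <!-- illustrative; match your Flink image -->
    <scope>provided</scope>
  </dependency>
</dependencies>

<!-- Third-party dependencies are packaged (shaded) into the connector JAR. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.4.1</version> <!-- illustrative -->
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```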

Upload and use a custom connector

  1. Go to the Create custom connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. Find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. Create a custom connector.

    1. On the Connectors page, click Create Custom Connector.

    2. Upload the JAR file of the custom connector.

      You can use one of the following methods to upload the JAR file of a custom connector:

      • Upload File: Click Click to Select and select the desired JAR file.

      • Use External URL: Enter the external URL from which the JAR file can be downloaded. Use this method if the JAR file is stored in another service. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

        Note

        Only the following two types of external URLs are supported:

        • The endpoint of an Object Storage Service (OSS) bucket that you specify when you purchase a Realtime Compute for Apache Flink workspace. You can view this endpoint in the Workspace Details message in the management console of Realtime Compute for Apache Flink.

        • The endpoint of another external storage system that Realtime Compute for Apache Flink can access. The access control list (ACL) of the external storage system must be public-read, or Realtime Compute for Apache Flink must be granted the permissions to access the storage system.

    3. After you upload the JAR file, click Next.

      The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.
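The parsing step relies on the factory discovery mechanism that the Apache Flink community defines for connectors, which is based on the Java Service Provider Interface (SPI). As a minimal sketch, the JAR file typically contains a service file that lists the fully qualified name of your factory class (the package and class name below are hypothetical):

```
META-INF/services/org.apache.flink.table.factories.Factory
----------------------------------------------------------
com.example.connector.MyDynamicTableFactory
```

If this file is missing, or if it lists a class that does not exist in the JAR file, parsing fails.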

    4. Click Finish.

      The custom connector that you create appears in the connector list.

  3. Use the connector in the DDL statement of your draft.

    For more information about draft development, see Develop an SQL draft.

    Note

    You must set the connector parameter in the WITH clause to the identifier that is defined by the DynamicTableFactory in the JAR file of the custom connector. The other parameters in the WITH clause and their definitions vary based on the custom connector that you create.
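For example, assuming a custom connector whose DynamicTableFactory defines the identifier my-connector (the identifier and the endpoint option below are hypothetical), the DDL statement might look like the following:

```sql
CREATE TEMPORARY TABLE my_sink (
  id BIGINT,
  name VARCHAR
) WITH (
  -- Must match the identifier defined by your DynamicTableFactory.
  'connector' = 'my-connector',
  -- Other options are defined by your connector implementation.
  'endpoint' = 'https://example.com/api'
);
```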

Update a custom connector

Note

After you update a custom connector, running deployments that use the connector are not affected. The update takes effect on a deployment only after the deployment is restarted.

  1. Go to the Edit connector dialog box.

    1. Log on to the Realtime Compute for Apache Flink console.

    2. Find the workspace that you want to manage and click Console in the Actions column.

    3. In the left-side navigation pane, click Connectors.

  2. On the Connectors tab, click the Custom Connectors tab. On the Custom Connectors tab, find the desired custom connector and click Edit to the right of the name of the custom connector.

  3. Upload the JAR file of the custom connector.

    You can use one of the following methods to upload the JAR file of a custom connector:

    • Upload File: Click Click to Select and select the desired JAR file.

    • Use External URL: Enter an external URL. For example, you can enter https://ossbucket/artifacts/namespaces/flink-default/flink-jobs-1.0-SNAPSHOT.jar.

      Note

      Only the following two types of external URLs are supported:

      • The endpoint of an Object Storage Service (OSS) bucket that you specify when you purchase a Realtime Compute for Apache Flink workspace. You can view this endpoint in the Workspace Details message in the management console of Realtime Compute for Apache Flink.

      • The endpoint of another external storage system that Realtime Compute for Apache Flink can access. The access control list (ACL) of the external storage system must be public-read, or Realtime Compute for Apache Flink must be granted the permissions to access the storage system.

  4. After you upload the JAR file, click Next.

    The system parses the content of the JAR file that you uploaded. If file parsing is successful, proceed to the next step. If file parsing fails, check whether the code of your custom connector complies with the standards that are defined by the Apache Flink community.

  5. Click Finish.

References

  • Realtime Compute for Apache Flink provides various built-in connectors. For more information, see Supported connectors.

  • For more information about how to use the Change Data Capture (CDC) connectors for Apache Flink, see Use a CDC connector for Apache Flink.

  • Realtime Compute for Apache Flink supports metadata management. You can create a catalog to manage and access metadata. For more information, see Manage catalogs.