DM data sources serve as a data hub. DataWorks provides DM Reader and DM Writer for you to read data from and write data to DM data sources, which helps you quickly resolve computing issues that involve large amounts of data. This topic describes the capabilities of synchronizing data from or to DM data sources.
Limits
Data of views can be read during batch synchronization.
DM Reader and DM Writer support only exclusive resource groups for Data Integration.
Data type mappings
DM Reader and DM Writer support most data types of common relational databases, such as numeric and string data types. Make sure that the data types of your database are supported.
The following table lists the data type mappings based on which DM Reader converts data types.
Category | DM data type |
Integer | INT, TINYINT, SMALLINT, and BIGINT |
Floating point | REAL, FLOAT, DOUBLE, NUMBER, and DECIMAL |
String | CHAR, VARCHAR, LONGVARCHAR, and TEXT |
Date and time | DATE, DATETIME, TIMESTAMP, and TIME |
Boolean | BIT |
Binary | BINARY, VARBINARY, and BLOB |
Develop a data synchronization task
For information about the entry point and the procedure for configuring a data synchronization task, see the following sections. For information about the parameter settings, view the infotip of each parameter on the configuration tab of the task.
Add a data source
Before you configure a data synchronization task to synchronize data from or to a specific data source, you must add the data source to DataWorks. For more information, see Add and manage data sources.
Configure a batch synchronization task to synchronize data of a single table
For more information about the configuration procedure, see Configure a batch synchronization task by using the codeless UI and Configure a batch synchronization task by using the code editor.
For information about all parameters that are configured and the code that is run when you use the code editor to configure a batch synchronization task, see Appendix: Code and parameters.
Configure synchronization settings to implement batch synchronization of all data in a database
For more information about the configuration procedure, see Configure a synchronization task in Data Integration.
Appendix: Code and parameters
Appendix: Configure a batch synchronization task by using the code editor
If you use the code editor to configure a batch synchronization task, you must configure parameters for the reader and writer of the related data source based on the format requirements in the code editor. For more information about the format requirements, see Configure a batch synchronization task by using the code editor. The following information describes the configuration details of parameters for the reader and writer in the code editor.
Code for DM Reader
{
    "order": {
        "hops": [
            {
                "from": "Reader",
                "to": "Writer"
            }
        ]
    },
    "setting": {
        "errorLimit": {
            "record": "0"
        },
        "speed": {
            "throttle": true, // Specifies whether to enable throttling. The value false indicates that throttling is disabled, and the value true indicates that throttling is enabled. The mbps parameter takes effect only when the throttle parameter is set to true.
            "concurrent": 1, // The maximum number of parallel threads.
            "mbps": "12" // The maximum transmission rate. Unit: MB/s.
        }
    },
    "steps": [
        {
            "category": "reader",
            "name": "Reader",
            "parameter": {
                "datasource": "dm_datasource",
                "table": "table",
                "column": [
                    "*"
                ],
                "preSql": [
                    "delete from XXX;"
                ],
                "fetchSize": 2048
            },
            "stepType": "dm"
        },
        {
            "category": "writer",
            "name": "Writer",
            "parameter": {},
            "stepType": "stream"
        }
    ],
    "type": "job",
    "version": "2.0"
}
Parameters in code for DM Reader
Parameter | Description | Required | Default value |
datasource | The name of the data source from which you want to read data. For more information about how to add a data source, see Add a DM data source. | Yes | No default value |
table | The name of the table from which you want to read data. | Yes | No default value |
column | The names of the columns from which you want to read data. Specify the names in a JSON array. The default value is [ * ], which indicates all the columns in the source table. | Yes | No default value
splitPk | The field that is used for data sharding when DM Reader reads data. If you specify this parameter, the table is sharded based on the value of this parameter. Data Integration then runs parallel threads to read data. This way, data can be synchronized more efficiently. | No | No default value
where | The WHERE clause. DM Reader generates an SQL statement based on the settings of the column, table, and where parameters and uses the generated statement to read data. For example, when you perform a test, you can set the where parameter to limit 10. To read the data that is generated on the current day, you can set the where parameter to a condition such as gmt_create > $bizdate. | No | No default value
querySql | The SQL statement that is used for refined data filtering. If you specify this parameter, data is filtered based only on the value of this parameter. For example, if you want to join multiple tables for data synchronization, set this parameter to a statement such as select a,b from table_a join table_b on table_a.id = table_b.id. | No | No default value
fetchSize | The number of data records to read at a time. This parameter determines the number of interactions between Data Integration and the source database and affects read efficiency. Note If you set this parameter to a value greater than 2048, an out of memory (OOM) error may occur during data synchronization. | No | 1,024 |
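The following snippet is a minimal sketch of how these parameters can be combined in a reader step in the code editor. The data source name dm_datasource follows the preceding example, whereas the table sales_order, its columns, and the filter condition are hypothetical and used only for illustration. If you specify the querySql parameter instead, data is read based only on that statement.
{
    "stepType": "dm",
    "category": "reader",
    "name": "Reader",
    "parameter": {
        "datasource": "dm_datasource", // The name of an added DM data source.
        "table": "sales_order", // A hypothetical source table.
        "column": [
            "id",
            "buyer_id",
            "gmt_create"
        ],
        "splitPk": "id", // Shard the table by a numeric field so that parallel threads can read data.
        "where": "gmt_create > $bizdate", // Read only the data that is generated on the current day.
        "fetchSize": 1024 // Read 1,024 records at a time. Larger values may cause OOM errors.
    }
}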
Code for DM Writer
{
    "type": "job",
    "steps": [
        {
            "stepType": "oracle",
            "parameter": {
                "datasource": "aaa",
                "column": [
                    "PROD_ID",
                    "name"
                ],
                "where": "",
                "splitPk": "",
                "encoding": "UTF-8",
                "table": "PENGXI.SALES"
            },
            "name": "Reader",
            "category": "reader"
        },
        {
            "stepType": "dm",
            "parameter": {
                "datasource": "dm_datasource",
                "table": "table",
                "column": [
                    "id",
                    "name"
                ],
                "preSql": [
                    "delete from XXX;"
                ]
            },
            "name": "Writer",
            "category": "writer"
        }
    ],
    "version": "2.0",
    "order": {
        "hops": [
            {
                "from": "Reader",
                "to": "Writer"
            }
        ]
    },
    "setting": {
        "errorLimit": {
            "record": ""
        },
        "speed": {
            "throttle": true, // Specifies whether to enable throttling. The value false indicates that throttling is disabled, and the value true indicates that throttling is enabled. The mbps parameter takes effect only when the throttle parameter is set to true.
            "concurrent": 2, // The maximum number of parallel threads.
            "mbps": "12" // The maximum transmission rate. Unit: MB/s.
        }
    }
}
Parameters in code for DM Writer
Parameter | Description | Required | Default value |
datasource | The name of the data source to which you want to write data. For more information about how to add a data source, see Add a DM data source. | Yes | No default value |
table | The name of the table to which you want to write data. If the table uses the default schema for the destination database, the value of this parameter consists of only the name of the table. If the table uses a custom schema, the value of this parameter consists of two parts: the name of the custom schema and the name of the table. Specify the two parts in the schema.table format. | Yes | No default value
column | The names of the columns to which you want to write data. Separate the names with commas (,). Note We recommend that you do not leave this parameter empty. | Yes | No default value |
preSql | The SQL statement that you want to execute before the synchronization task is run. For example, you can set this parameter to the SQL statement that is used to delete outdated data. You can execute only one SQL statement in a transaction. Note If you specify multiple SQL statements, the statements are not executed in the same transaction. | No | No default value |
postSql | The SQL statement that you want to execute after the synchronization task is run. For example, you can set this parameter to the SQL statement that is used to add a timestamp. You can execute only one SQL statement in a transaction. Note If you specify multiple SQL statements, the statements are not executed in the same transaction. | No | No default value |
batchSize | The number of data records to write at a time. Set this parameter to an appropriate value based on your business requirements. This greatly reduces the interactions between Data Integration and the destination database and increases throughput. If you set this parameter to an excessively large value, an out of memory (OOM) error may occur during data synchronization. | No | 1024 |
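The following snippet is a minimal sketch of a writer step that uses the preSql, postSql, and batchSize parameters together in the code editor. The data source name dm_datasource follows the preceding example, whereas the tables ODS.SALES and ODS.SYNC_LOG and the SQL statements are hypothetical and used only for illustration.
{
    "stepType": "dm",
    "category": "writer",
    "name": "Writer",
    "parameter": {
        "datasource": "dm_datasource", // The name of an added DM data source.
        "table": "ODS.SALES", // A hypothetical table, specified in the schema.table format.
        "column": [
            "id",
            "name",
            "gmt_modified"
        ],
        "preSql": [
            "delete from ODS.SALES;" // Delete outdated data before the synchronization task is run.
        ],
        "postSql": [
            "insert into ODS.SYNC_LOG(task_name) values ('dm_writer_demo');" // Record a marker row after the task is run.
        ],
        "batchSize": 1024 // Write 1,024 records at a time to reduce database interactions.
    }
}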