Data integration combines data from different sources and in different formats to provide your business with comprehensive data sharing and a unified, more valuable view of all your data. This helps you make better decisions based on complex and heterogeneous data sources in a fast and stable data environment.
Data integration helps you fully utilize data assets to make smart decisions and drive your business.
You can sync data from one or multiple tables from the source application database (MySQL in this example) to the target application database.
Configure the Scheduler. For more information, see Scheduler.
Click the Trigger section in the integration designer canvas and select the Scheduler component. On the configuration panel that pops up on the right, enter the expression for scheduled execution and configure the scheduling rule; for example, you can enter 0 0 12 * * ? to execute the data integration task at 12:00 every day.
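To make the expression format concrete, here is a minimal sketch (not part of the product) of how a Quartz-style cron expression such as 0 0 12 * * ? is interpreted. The field order is second, minute, hour, day-of-month, month, day-of-week; only *, ?, and plain numbers are handled here, while a real scheduler also supports ranges, steps, and lists.

```javascript
// Minimal sketch: does a Quartz-style cron expression match a given time?
// Field order: second minute hour day-of-month month day-of-week.
function cronMatches(expr, date) {
  const [sec, min, hour, dom, mon, dow] = expr.trim().split(/\s+/);
  const ok = (field, value) =>
    field === "*" || field === "?" || Number(field) === value;
  return (
    ok(sec, date.getSeconds()) &&
    ok(min, date.getMinutes()) &&
    ok(hour, date.getHours()) &&
    ok(dom, date.getDate()) &&
    ok(mon, date.getMonth() + 1) && // cron months are 1-12
    ok(dow, date.getDay())
  );
}

// "0 0 12 * * ?" fires at 12:00:00 on every day of every month.
console.log(cronMatches("0 0 12 * * ?", new Date(2024, 0, 15, 12, 0, 0))); // true
console.log(cronMatches("0 0 12 * * ?", new Date(2024, 0, 15, 9, 30, 0))); // false
```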
Configure the source application database. Configure the connector to extract data from the data source, pass the extracted data to subsequent components for processing, and load the processed data into the target database. The specific steps are as follows:
a. Click + next to the Scheduler component in the canvas. Search for Database on the component selection panel and click the Database connector. Click Query to add the connector to the canvas; the configuration panel pops up on the right.
b. Configure the connection: Select MySQL as the data source type (select an option based on the actual use case), specify the database to be queried, and enter the database address, port number, username, and password.
c. Configure the query operation: The operation modes include Simple mode - single table query, Simple mode - multi-table query, and SQL mode.
After you select the single table query mode, the Database connector will read all tables in the specified database. You can select the data source table to be synced.
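As a rough illustration of how the three operation modes differ, the sketch below shows what each mode boils down to when the extraction statement is produced. The config object, its field names, and the generated SQL are hypothetical, not the product's actual configuration schema.

```javascript
// Hypothetical sketch: what each query mode of the Database connector
// roughly amounts to. Field names here are illustrative only.
function buildQuery(config) {
  switch (config.mode) {
    case "single-table": // Simple mode - single table query
      return `SELECT * FROM ${config.table}`;
    case "multi-table": // Simple mode - multi-table query
      return `SELECT * FROM ${config.tables.join(", ")} WHERE ${config.joinCondition}`;
    case "sql": // SQL mode: you write the statement yourself
      return config.sql;
  }
}

console.log(buildQuery({ mode: "single-table", table: "orders" }));
// SELECT * FROM orders
console.log(
  buildQuery({
    mode: "multi-table",
    tables: ["orders", "customers"],
    joinCondition: "orders.customer_id = customers.id",
  })
);
```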
Configure the target database and table for loading the extracted data.
a. In the designer canvas, click + next to the source Database connector. On the component selection panel, search for and select the Database connector and then select Batch insert or Batch merge.
b. Configure the Database Batch insert panel as follows: the data written to the target table is taken from the component's input, so to insert the upstream output as is, you can enter return msg.payload here.
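A batch insert essentially turns the records in msg.payload into one multi-row INSERT statement. The sketch below is a hypothetical illustration of that idea; the table name, column names, and value quoting (real SQL uses single quotes and parameter binding) are not the product's actual implementation.

```javascript
// Hypothetical sketch of what a batch insert boils down to: one
// multi-row INSERT statement built from the records in msg.payload.
function buildBatchInsert(table, rows) {
  const columns = Object.keys(rows[0]);
  const values = rows
    .map(r => "(" + columns.map(c => JSON.stringify(r[c])).join(", ") + ")")
    .join(", ");
  return `INSERT INTO ${table} (${columns.join(", ")}) VALUES ${values}`;
}

// Illustrative payload from the upstream Query component.
const payload = [
  { id: 1, status: "paid" },
  { id: 2, status: "shipped" },
];
console.log(buildBatchInsert("orders_target", payload));
// INSERT INTO orders_target (id, status) VALUES (1, "paid"), (2, "shipped")
```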
Filter the data. If you need to filter the data extracted from the source, you can add the Filter component after the source Database connector. The Filter component configuration panel consists of the following parts:
a. Enter the data set to be filtered. If the data set is of the Recordset type, it does not need further processing; the default value is msg.payload, and you can skip this step.
b. Configure the fields to be output after filtering. If you don't need the data of a field, you can deselect the field, and the component will filter it out.
c. Set the filters. Step b filters columns, while this step filters rows. You can configure fields and conditions such as ==, !=, >=, <=, >, <, StartWith, EndWith, Contain, In, NotIn, IsNull, or IsNotNull to filter data; for example, you can filter out canceled orders.
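The row-filtering step can be pictured as applying one of these operators to a field of every record and keeping only the matches. The sketch below implements a subset of the listed operators; the operator names follow the panel, but the implementation and the sample order data are illustrative only.

```javascript
// Sketch of row filtering with a subset of the operators listed above.
const ops = {
  "==": (a, b) => a === b,
  "!=": (a, b) => a !== b,
  ">=": (a, b) => a >= b,
  "<=": (a, b) => a <= b,
  "StartWith": (a, b) => String(a).startsWith(b),
  "Contain": (a, b) => String(a).includes(b),
  "In": (a, b) => b.includes(a),
  "IsNull": a => a === null || a === undefined,
};

function filterRows(rows, field, op, value) {
  return rows.filter(r => ops[op](r[field], value));
}

// Example: filter out canceled orders.
const orders = [
  { id: 1, status: "paid" },
  { id: 2, status: "canceled" },
  { id: 3, status: "shipped" },
];
const kept = filterRows(orders, "status", "!=", "canceled");
console.log(kept.map(r => r.id)); // [1, 3]
```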
Map the data. Source table and object structures generally do not correspond one-to-one to the target table and object structures. In this case, you can map and convert the data based on the correspondence between the extracted fields and the fields to be loaded.
a. Add the Mapper component between the source and target database connectors and configure the logic for mapping and conversion between the source and target object fields.
b. Drag and drop the input and output information (source and target fields) on the GUI to map them. If you need to process the source data, you can add processing logic for the relevant fields in the logic mapping section.
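Conceptually, the Mapper produces each target field from a source field, optionally through a conversion function. The sketch below illustrates that idea; the mapping table, field names, and conversions are hypothetical, not the product's actual mapping format.

```javascript
// Illustrative sketch of field mapping and conversion: each target
// field is taken from a source field, optionally transformed.
const mapping = [
  { source: "order_no", target: "orderId" },
  { source: "amount_cents", target: "amount", convert: v => v / 100 },
  { source: "created", target: "createdAt", convert: v => new Date(v).toISOString() },
];

function mapRecord(record) {
  const out = {};
  for (const { source, target, convert } of mapping) {
    out[target] = convert ? convert(record[source]) : record[source];
  }
  return out;
}

console.log(
  mapRecord({ order_no: "A-1001", amount_cents: 2599, created: "2024-01-15T00:00:00Z" })
);
// { orderId: 'A-1001', amount: 25.99, createdAt: '2024-01-15T00:00:00.000Z' }
```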