


- rds_pg_url_airflow_database: the connection URL of the backend database for Airflow.
- rds_pg_url_airflow_demo_database: the connection URL of the demo database used by Airflow.

Note: The default database port for RDS PostgreSQL is 1921, and the password of the demo database account is N1cetest.

Log on to the ECS instance and run the setup script:
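A connection URL of this form can be unpacked with the Python standard library. In the sketch below, the host, user, and database name are hypothetical placeholders (substitute your own rds_pg_url_airflow_database value), but the 1921 port matches the RDS PostgreSQL default noted above:

```python
from urllib.parse import urlsplit

# Hypothetical connection URL for the Airflow backend database;
# replace it with the rds_pg_url_airflow_database value from your deployment.
url = "postgresql://airflow:N1cetest@pgm-example.pg.rds.aliyuncs.com:1921/airflow"

parts = urlsplit(url)
print(parts.username)              # database account name
print(parts.hostname)              # RDS instance endpoint
print(parts.port)                  # 1921, the RDS PostgreSQL default port
print(parts.path.lstrip("/"))      # database name
```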
ssh root@ECS_EIP
cd ~
wget https://raw.githubusercontent.com/alibabacloud-howto/opensource_with_apsaradb/main/apache-airflow/setup.sh
sh setup.sh
cd ~/airflow
mkdir ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env
Edit the docker-compose.yaml file to set RDS PostgreSQL as the Airflow backend database, using the connection string of rds_pg_url_airflow_database from the Deploy Resources step:

cd ~/airflow
vim docker-compose.yaml
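A minimal sketch of the edit, assuming the layout of the official Airflow 2.x docker-compose.yaml (which configures the backend through environment variables in the x-airflow-common block); every value in angle brackets is a placeholder to be filled from your rds_pg_url_airflow_database output:

```yaml
# Excerpt: point the x-airflow-common environment at RDS PostgreSQL
# instead of the bundled postgres container. All <...> values are
# placeholders from your deployment outputs.
x-airflow-common:
  environment:
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://<user>:<password>@<rds_host>:1921/<airflow_db>
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://<user>:<password>@<rds_host>:1921/<airflow_db>
```

Once the external backend is in use, the bundled postgres service and its depends_on entries are no longer needed and may be removed, depending on your compose file version.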
docker-compose up airflow-init
docker-compose up
Visit http://ECS_EIP:8080 to open the Airflow web console.
Note: The default username and password are both airflow.
Next, install the PostgreSQL client on the ECS instance to load the demo data:
ssh root@ECS_EIP
cd ~
wget http://mirror.centos.org/centos/8/AppStream/x86_64/os/Packages/compat-openssl10-1.0.2o-3.el8.x86_64.rpm
rpm -i compat-openssl10-1.0.2o-3.el8.x86_64.rpm
wget http://docs-aliyun.cn-hangzhou.oss.aliyun-inc.com/assets/attach/181125/cn_zh/1598426198114/adbpg_client_package.el7.x86_64.tar.gz
tar -xzvf adbpg_client_package.el7.x86_64.tar.gz
Download the SQL files for the Northwind demo databases:

- northwind_ddl.sql is for both the source and target databases.
- northwind_data_source.sql is for the source database.
- northwind_data_target.sql is for the target database.

cd ~/airflow
wget https://raw.githubusercontent.com/alibabacloud-howto/opensource_with_apsaradb/main/apache-airflow/northwind_ddl.sql
wget https://raw.githubusercontent.com/alibabacloud-howto/opensource_with_apsaradb/main/apache-airflow/northwind_data_source.sql
wget https://raw.githubusercontent.com/alibabacloud-howto/opensource_with_apsaradb/main/apache-airflow/northwind_data_target.sql
Connect to the database northwind_source, create the tables with northwind_ddl.sql, and load the sample data with northwind_data_source.sql. Replace rds_pg_url_airflow_demo_database with the RDS PostgreSQL connection string; the demo database account has username demo and password N1cetest.

cd ~/adbpg_client_package/bin
./psql -h<rds_pg_url_airflow_demo_database> -p1921 -Udemo northwind_source
\i ~/airflow/northwind_ddl.sql
\i ~/airflow/northwind_data_source.sql
select tablename from pg_tables where schemaname='public';
select count(*) from products;
select count(*) from orders;
Then connect to the database northwind_target and load the target data file in the same way:

./psql -h<rds_pg_url_airflow_demo_database> -p1921 -Udemo northwind_target
\i ~/airflow/northwind_ddl.sql
\i ~/airflow/northwind_data_target.sql
select tablename from pg_tables where schemaname='public';
select count(*) from products;
select count(*) from orders;
Download the migration DAG into the dags directory:

cd ~/airflow/dags
wget https://raw.githubusercontent.com/alibabacloud-howto/opensource_with_apsaradb/main/apache-airflow/northwind_migration.py
The DAG fetches the maximum product_id and order_id in the database northwind_source, then updates the same product and order tables in the database northwind_target with the rows whose ids are greater than those maximums. The job is scheduled to run every minute starting from today's date (update the start date accordingly when you run this demo). If the task runs successfully, the DAG task is shown on the web console.
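The DAG's core step, copying only the rows whose ids exceed the target's current maximum, can be sketched independently of Airflow. The following uses in-memory sqlite3 databases as stand-ins for northwind_source and northwind_target (the actual DAG in northwind_migration.py talks to PostgreSQL; table and column names follow the description above):

```python
import sqlite3

def migrate_new_rows(src, dst, table, id_col):
    """Copy rows from src to dst whose id exceeds dst's current maximum id."""
    max_id = dst.execute(f"SELECT COALESCE(MAX({id_col}), 0) FROM {table}").fetchone()[0]
    rows = src.execute(f"SELECT * FROM {table} WHERE {id_col} > ?", (max_id,)).fetchall()
    if rows:
        placeholders = ",".join("?" * len(rows[0]))
        dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
        dst.commit()
    return len(rows)

# Stand-ins for the source and target demo databases
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE products (product_id INTEGER, product_name TEXT)")
source.executemany("INSERT INTO products VALUES (?, ?)",
                   [(1, "Chai"), (2, "Chang"), (3, "Aniseed Syrup")])
target.execute("INSERT INTO products VALUES (1, 'Chai')")
target.commit()

copied = migrate_new_rows(source, target, "products", "product_id")
print(copied)  # 2: only the rows with product_id > 1 are copied
```

Re-running migrate_new_rows immediately afterwards copies nothing, which is what lets the real job run safely every minute.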