Migrating Apache Flume Flows to Apache NiFi: Any Relational Database To/From Anywhere
Article 8 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_42.html
Article 7 - This article
Article 6 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_9.html
Article 5 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_35.html
Article 4 - https://www.datainmotion.dev/2019/10/migrating-apache-flume-flows-to-apache_8.html
This is a simple use case: using NiFi as a gateway between relational databases and any other source or sink. We can do a lot more than that in NiFi. We can SELECT, UPDATE, INSERT, DELETE and run any DML. All with No Code. We can also read metadata from an RDBMS and build dynamic ELT systems from it.
It is extremely easy to do this in NiFi.
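As a quick sketch, these are the kinds of statements NiFi can issue for you via processors like ExecuteSQL and PutSQL. The table matches the iot DDL later in this article, but the predicate values are made-up examples:

-- Illustrative DML NiFi can run with no code; the values are examples only.
SELECT uuid, cputemp, memory FROM iot WHERE cpu > 50;
UPDATE iot SET diskusage = '2048 MB' WHERE uuid = 'uuid_example_1';
DELETE FROM iot WHERE host = 'retiredhost';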
Instead of using Flume, Let's Use Apache NiFi to Move Any Tabular Data To and From Databases
From A Relational Database (via JDBC Driver) to Anywhere. In our case, we will pull from an RDBMS and post to Kudu.
Step 1: QueryDatabaseTableRecord (Create Connection Pool, Pick DB Type, Table Name, Record Writer)
Step 2: PutKudu (Set Kudu Masters, Table Name, Record Reader)
Done!
Query Database
Connect to Kudu
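Under the covers, QueryDatabaseTableRecord generates and tracks its own SQL. Assuming you set Maximum-value Columns to systemtime (an assumption for this example; note it is stored as a string in this schema, so values must sort lexically), the effective query on each run is roughly:

-- Sketch of the query QueryDatabaseTableRecord effectively issues;
-- the maximum value below stands in for the value NiFi keeps in state.
SELECT * FROM iot
WHERE systemtime > '2019-10-01 12:00:00';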
Let's Write JSON Records That Get Converted to Kudu Records or RDBMS/MySQL/JDBC Records
Schema For The Data
Read All The Records From Our JDBC Database
Let's Create an Apache Kudu table to Put Database Records To
Let's Examine the MySQL Table We Want to Read/Write To and From
Let's Check the MariaDB Table
MySQL Table Information
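To examine the table yourself before wiring up the flow, plain MariaDB/MySQL statements are enough (the iot table name comes from the DDL below):

DESCRIBE iot;
SHOW CREATE TABLE iot;
SELECT COUNT(*) FROM iot;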
From Anywhere (Say a Device) to A Relational Database (via JDBC Driver). In our case, we will insert into an RDBMS from Kafka.
Step 1: Acquire or modify data, for example with ConsumeKafkaRecord_2_0
Step 2: PutDatabaseRecord (Set Record Reader, Statement Type of INSERT or UPDATE, Connection Pool, Table Name)
Done!
Put Database Records in Any JDBC/RDBMS
Setup Your Connection Pool to SELECT, UPDATE, INSERT or DELETE
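For reference, with Statement Type set to INSERT, PutDatabaseRecord builds a parameterized statement from your record schema and the target table, roughly like the sketch below (column names taken from the iot DDL that follows):

-- Sketch of the parameterized INSERT PutDatabaseRecord prepares;
-- one record's field values are bound to the ? placeholders.
INSERT INTO iot (uuid, ipaddress, top1pct, top1, cputemp, gputemp,
                 gputempf, cputempf, runtime, host, filename, imageinput,
                 hostname, macaddress, end, te, systemtime, cpu,
                 diskusage, memory, id)
VALUES (?, ?, ?, ?, ?, ?, ?,
        ?, ?, ?, ?, ?, ?, ?,
        ?, ?, ?, ?, ?, ?, ?);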
SQL DDL
Create MariaDB/MySQL Table
CREATE TABLE iot (
  uuid VARCHAR(255) NOT NULL PRIMARY KEY,
  ipaddress VARCHAR(255),
  top1pct BIGINT,
  top1 VARCHAR(255),
  cputemp VARCHAR(255),
  gputemp VARCHAR(255),
  gputempf VARCHAR(255),
  cputempf VARCHAR(255),
  runtime VARCHAR(255),
  host VARCHAR(255),
  filename VARCHAR(255),
  imageinput VARCHAR(255),
  hostname VARCHAR(255),
  macaddress VARCHAR(255),
  end VARCHAR(255),
  te VARCHAR(255),
  systemtime VARCHAR(255),
  cpu BIGINT,
  diskusage VARCHAR(255),
  memory BIGINT,
  id VARCHAR(255)
);
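To smoke-test the table before pointing NiFi at it, you can insert a row by hand (every value below is a made-up example; cputemp is a VARCHAR in this schema, so the reading is stored as text):

-- Hand-rolled test row and a quick check; all values are illustrative.
INSERT INTO iot (uuid, ipaddress, cputemp, cpu, memory, id)
VALUES ('uuid_test_1', '192.168.1.50', '45C', 37, 2048, 'id_test_1');
SELECT uuid, ipaddress, cputemp FROM iot WHERE uuid = 'uuid_test_1';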
Create Kudu Table
CREATE TABLE iot (
  uuid STRING,
  ipaddress STRING,
  top1pct BIGINT,
  top1 STRING,
  cputemp STRING,
  gputemp STRING,
  gputempf STRING,
  cputempf STRING,
  runtime STRING,
  host STRING,
  filename STRING,
  imageinput STRING,
  hostname STRING,
  macaddress STRING,
  `end` STRING,
  te STRING,
  systemtime STRING,
  cpu BIGINT,
  diskusage STRING,
  memory BIGINT,
  id STRING,
  PRIMARY KEY (uuid)
)
PARTITION BY HASH PARTITIONS 16
STORED AS KUDU
TBLPROPERTIES ('kudu.num_tablet_replicas' = '1');
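Once the Kudu table exists, you can sanity-check the flow from Impala. The UPSERT values below are made-up examples; unlisted columns are simply left NULL for a new row:

-- Illustrative checks against the Kudu table via Impala.
UPSERT INTO iot (uuid, ipaddress, cpu, memory, id)
VALUES ('uuid_example_1', '192.168.1.100', 42, 1024, 'id_example_1');
SELECT uuid, cputemp, memory FROM iot ORDER BY systemtime DESC LIMIT 10;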
References
- https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ListDatabaseTables/index.html
- https://www.marklogic.com/blog/apache-nifi-ingest-relational-data-to-marklogic/
- https://www.batchiq.com/database-extract-with-nifi.html
- https://blog.pythian.com/database-migration-using-apache-nifi/
- https://www.progress.com/blogs/tutorial-access-data-via-jdbc-with-apache-nifi
- https://marklogic.github.io/nifi/get-data-from-a-relational-database
- https://community.cloudera.com/t5/Community-Articles/ETL-With-Lookups-with-Apache-HBase-and-Apache-NiFi/ta-p/248243
- https://community.cloudera.com/t5/Community-Articles/Ingesting-RDBMS-Data-As-New-Tables-Arrive-Automagically-into/ta-p/246214
- https://community.cloudera.com/t5/Community-Articles/QADCDC-Our-how-to-ingest-some-database-tables-to-Hadoop-Very/ta-p/245229
- https://community.cloudera.com/t5/Community-Articles/Exporting-and-Importing-Data-from-MongoDB-in-the-Cloud-with/ta-p/249470
- https://community.cloudera.com/t5/Community-Articles/Easy-Hadoop-Backup/ta-p/245843
- https://community.cloudera.com/t5/Community-Articles/Su-Su-Sussudio-Sudoers-Log-Parsing-with-Apache-NiFi/ta-p/249461
- https://community.cloudera.com/t5/Community-Articles/Ingesting-SQL-Server-Tables-into-Hive-via-Apache-NiFi-and/ta-p/244665
- https://community.cloudera.com/t5/Community-Articles/Reading-Kerberos-KDC-Logs-and-Parsing-Them-for-Events-via/ta-p/249411
- https://community.cloudera.com/t5/Support-Questions/How-can-I-ingest-500-tables-automatically-in-Apache-Nifi/td-p/198397
- https://community.cloudera.com/t5/Community-Articles/Incrementally-Streaming-RDBMS-Data-to-Your-Hadoop-DataLake/ta-p/247927
- https://community.cloudera.com/t5/Community-Articles/Apache-NiFi-Processor-Building-a-SQL-DDL-Schema-From-A-JSON/ta-p/247989