DataFlow Processors Cheat Sheet
Processors Available in Cloudera DataFlow Designer
I put together this quick reference list of some of the processors and ReadyFlows available in the April 2023 release of Cloudera DataFlow Designer. For details, see the official documentation:
- https://docs.cloudera.com/dataflow/cloud/release-notes/topics/cdf-supported-component-versions.html
- https://docs.cloudera.com/dataflow/cloud/release-notes/topics/cdf-whats-new-latest.html
- https://docs.cloudera.com/dataflow/cloud/support-matrix/topics/cdf-supported-nifi-processors.html
- https://docs.cloudera.com/dataflow/cloud/howto-using-readyflows.html
Available ReadyFlows
Using a ReadyFlow to build your data flow allows you to get started with CDF quickly and easily. A ReadyFlow is a flow definition template optimized to work with a specific CDP source and destination. Instead of spending your time building the data flow in NiFi, you can focus on deploying your flow and defining the right KPIs for easy monitoring.
The ReadyFlow Gallery is where you can find out-of-the-box flow definitions. To use a ReadyFlow, add it to the Catalog and then use it to create a Flow Deployment.
ADLS to ADLS Avro
This ReadyFlow consumes JSON, CSV or Avro data from a source ADLS container and transforms the data into Avro files before writing it to another ADLS container.
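What "transforms the data into Avro files" boils down to is record-level format conversion. Here is a minimal Python sketch of that conversion using fastavro, with a made-up two-field schema; the actual flow derives its schema from the configured record reader and writer:

```python
# Illustrative JSON-to-Avro conversion; schema and field names are invented.
import json
from fastavro import parse_schema, writer

schema = parse_schema({
    "type": "record",
    "name": "Event",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "message", "type": "string"},
    ],
})

# Pretend these lines came from a JSON file in the source container.
json_lines = ['{"id": 1, "message": "hello"}', '{"id": 2, "message": "world"}']
records = [json.loads(line) for line in json_lines]

with open("events.avro", "wb") as out:
    writer(out, schema, records)  # writes an Avro object container file
```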
Airtable to S3/ADLS
You can use the Airtable to S3/ADLS ReadyFlow to consume objects from an Airtable table, filter them, and write them as JSON, CSV or Avro files to a destination in Amazon S3 or Azure Data Lake Storage (ADLS).
Azure Event Hub to ADLS
You can use the Azure Event Hub to ADLS ReadyFlow to ingest JSON, CSV or Avro files from an Azure Event Hub namespace, optionally parsing the schema using CDP Schema Registry or direct schema input. The flow then filters records based on a user-provided SQL query and writes them to a target Azure Data Lake Storage (ADLS) location in the specified output data format.
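If you want to smoke-test a deployment of this flow, you can push a few JSON events into the source Event Hub with the azure-eventhub SDK. A minimal sketch, with placeholder connection details and an invented payload:

```python
# Illustrative producer; connection string, hub name and payload are placeholders.
import json
from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    conn_str="<EVENT_HUB_NAMESPACE_CONNECTION_STRING>",
    eventhub_name="<EVENT_HUB_NAME>",
)

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps({"id": 1, "status": "ok"})))
    producer.send_batch(batch)
```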
Box to S3/ADLS
You can use the Box to S3/ADLS ReadyFlow to move data from a source Box location to a destination Amazon S3 bucket or Azure Data Lake Storage (ADLS) location.
Confluent Cloud to S3/ADLS
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic in Confluent Cloud, parses the schema by looking up the schema name in the Confluent Schema Registry, and filters records based on a user-provided SQL query. The filtered events are then written to the destination S3 or ADLS location.
Confluent Cloud to Snowflake
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic in Confluent Cloud and filters records based on a user-provided SQL query before writing them to a Snowflake table.
Dropbox to S3/ADLS
You can use the Dropbox to S3/ADLS ReadyFlow to ingest data from Dropbox and write it to a destination in Amazon S3 or Azure Data Lake Storage (ADLS).
Google Drive to S3/ADLS
You can use the Google Drive to S3/ADLS ReadyFlow to ingest data from a Google Drive location to a destination in Amazon S3 or Azure Data Lake Storage (ADLS).
Hello World
This ReadyFlow consumes change data from the Wikipedia API and converts JSON events to Avro, filtering and merging them before writing a file to local disk.
HubSpot to S3/ADLS
This ReadyFlow retrieves objects from a Private HubSpot App, converting them to the specified output data format and writing them to the target S3 or ADLS destination.
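For context, a private HubSpot app issues an access token that authorizes REST calls against HubSpot's CRM API. A rough sketch of the kind of request involved, with a placeholder token and the CRM v3 contacts endpoint as the example object type:

```python
# Illustrative only: list a page of contact objects with a private-app token.
import requests

resp = requests.get(
    "https://api.hubapi.com/crm/v3/objects/contacts",
    headers={"Authorization": "Bearer <PRIVATE_APP_TOKEN>"},
    params={"limit": 10},
    timeout=10,
)
resp.raise_for_status()
for obj in resp.json().get("results", []):
    print(obj["id"], obj.get("properties", {}))
```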
JDBC to JDBC
This ReadyFlow moves data between database tables, filtering records with a user-provided SQL query.
JDBC to S3/ADLS
This ReadyFlow consumes data from a source database table and filters events based on a user-provided SQL query before writing them to a destination Amazon S3 or Azure Data Lake Storage (ADLS) location in the specified output data format.
Kafka Filter to Kafka
This ReadyFlow consumes JSON, CSV, or Avro data from a source Kafka topic and parses the schema by looking up the schema name in the CDP Schema Registry. You can filter events by specifying a SQL query in the Filter Rule parameter.
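The Filter Rule is an SQL statement evaluated against each incoming record; NiFi's record-level filtering queries a virtual table named FLOWFILE, so a rule might look like `SELECT * FROM FLOWFILE WHERE amount > 10`. To give the filter something to match, you could publish a few sample records to the source topic; a minimal sketch using kafka-python with placeholder broker and topic names:

```python
# Illustrative producer; broker address, topic and fields are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("source-topic", {"event_type": "purchase", "amount": 42})
producer.flush()
```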
Kafka to ADLS Avro
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic and merges the events into Avro files before writing the data to ADLS. The flow writes out a file whenever the merged content reaches 100 MB or five minutes have passed, whichever comes first.
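Conceptually, the merge step bins incoming events and flushes a bin as soon as either threshold is crossed. A standard-library-only sketch of that rollover rule; the constants mirror the description above, not the flow's actual parameter names:

```python
# Conceptual sketch of size-or-age bin rollover; not NiFi configuration.
import time

MAX_BIN_BYTES = 100 * 1024 * 1024  # 100 MB
MAX_BIN_AGE_S = 5 * 60             # five minutes

class Bin:
    def __init__(self) -> None:
        self.size = 0                   # bytes accumulated so far
        self.opened = time.monotonic()  # when the bin was started

    def add(self, record_bytes: int) -> None:
        self.size += record_bytes

    def should_flush(self) -> bool:
        too_big = self.size >= MAX_BIN_BYTES
        too_old = time.monotonic() - self.opened >= MAX_BIN_AGE_S
        return too_big or too_old
```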
Kafka to Cloudera Operational Database
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic, parses the schema by looking up the schema name in the CDP Schema Registry, and ingests the data into an HBase table in COD.
Kafka to Iceberg
This ReadyFlow consumes JSON, CSV, or Avro data from a source Kafka topic, parses the schema by looking up the schema name in the CDP Schema Registry, and ingests data into an Iceberg table in Hive.
Kafka to Kafka
This ReadyFlow consumes JSON, CSV, or Avro data from a source Kafka topic, parses the schema by looking up the schema name in the CDP Schema Registry, and writes the events to a destination Kafka topic.
Kafka to Kudu
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic, parses the schema by looking up the schema name in the CDP Schema Registry, and ingests the data into a Kudu table.
Kafka to S3 Avro
This ReadyFlow consumes JSON, CSV or Avro data from a source Kafka topic and merges the events into Avro files before writing the data to S3. The flow writes out a file whenever the merged content reaches 100 MB or five minutes have passed, whichever comes first.
ListenHTTP filter to Kafka
This ReadyFlow listens to a JSON, CSV or Avro data stream on a specified port and parses the schema by looking up the schema name in the CDP Schema Registry. You can filter events by specifying a SQL query. The filtered events are then converted to the specified output data format and written to the destination CDP Kafka topic.
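After deployment the flow exposes an HTTP endpoint you can post test events to. A minimal sketch with requests; host, port and path are placeholders ("contentListener" is NiFi's default ListenHTTP base path, but your flow's configured path may differ):

```python
# Illustrative test request; endpoint details are placeholders.
import requests

resp = requests.post(
    "https://<flow-endpoint>:<port>/contentListener",
    json={"id": 1, "message": "hello"},
    timeout=10,
)
resp.raise_for_status()
print(resp.status_code)
```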
ListenSyslog filter to S3/ADLS
This ReadyFlow listens to a Syslog data stream on a specified port. You can filter events by specifying a SQL query. The filtered events are then converted to the specified output data format and written to the target S3 or ADLS destination.
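A quick way to aim a test message at the listener is the Python standard library's SysLogHandler. Host and port below are placeholders; switch socktype to socket.SOCK_STREAM if the flow listens on TCP rather than UDP:

```python
# Illustrative syslog test message; endpoint details are placeholders.
import logging
import logging.handlers
import socket

handler = logging.handlers.SysLogHandler(
    address=("<flow-endpoint>", 514),
    socktype=socket.SOCK_DGRAM,
)
logger = logging.getLogger("readyflow-test")
logger.addHandler(handler)
logger.warning("test event from readyflow-test")
```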
ListenTCP filter to S3/ADLS
This ReadyFlow listens to a JSON, CSV or Avro data stream and parses the data based on a specified Avro-formatted schema. You can filter events by specifying a SQL query. The filtered events are then converted to the specified output data format and written to the target S3 or ADLS destination.
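To exercise the listener you can stream newline-delimited records over a raw TCP socket. A minimal sketch with a placeholder host and port; the records here match a made-up two-field schema, whereas the real flow parses against the Avro-formatted schema you supply:

```python
# Illustrative TCP test stream; endpoint and record fields are placeholders.
import json
import socket

records = [{"id": 1, "message": "hello"}, {"id": 2, "message": "world"}]

with socket.create_connection(("<flow-endpoint>", 9999)) as sock:
    for rec in records:
        sock.sendall((json.dumps(rec) + "\n").encode("utf-8"))
```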
MQTT Filter to Kafka
This ReadyFlow consumes JSON, CSV or Avro data from a source MQTT topic. You can filter events by specifying a SQL query. The filtered events are then converted to the specified output data format and written to the destination Kafka topic.
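Publishing a test event to the source topic is a one-liner with paho-mqtt's one-shot helper; broker, topic and payload below are placeholders:

```python
# Illustrative MQTT publish; broker, topic and payload are placeholders.
import json
import paho.mqtt.publish as publish

publish.single(
    topic="sensors/telemetry",
    payload=json.dumps({"sensor": "t1", "value": 21.5}),
    hostname="<mqtt-broker>",
    port=1883,
)
```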
Non-CDP ADLS to CDP ADLS
This ReadyFlow moves data from a non-CDP-managed source ADLS location to a CDP-managed destination ADLS location.
Non-CDP S3 to CDP S3
This ReadyFlow moves data from a non-CDP-managed source S3 location to a CDP-managed destination S3 location.
S3 to S3 Avro
This ReadyFlow consumes JSON, CSV or Avro data from a source S3 bucket and transforms the data into Avro files before writing it to another S3 bucket.
S3 to S3 Avro with S3 Notifications
This ReadyFlow consumes JSON, CSV or Avro data from a source S3 bucket and transforms the data into Avro files before writing it to another S3 bucket. The ReadyFlow is configured to react to notifications about new files that arrive in the source S3 bucket.
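On the AWS side, the notifications in question are S3 event notifications on the source bucket. A hedged boto3 sketch of routing ObjectCreated events to an SQS queue; bucket name and queue ARN are placeholders, and the ReadyFlow documentation defines the exact notification target it expects:

```python
# Illustrative only: send ObjectCreated notifications to an SQS queue.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_notification_configuration(
    Bucket="my-source-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:new-files",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```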
Salesforce filter to S3/ADLS
You can use the Salesforce filter to S3/ADLS ReadyFlow to consume objects from a Salesforce database table, filter them, and write the data as JSON, CSV or Avro files to a destination in Amazon S3 or Azure Data Lake Storage (ADLS).
Shopify to S3/ADLS
This ReadyFlow consumes objects from a Custom Shopify App, converts them to the specified output data format, and writes them to a CDP-managed destination S3 or ADLS location.