
Bots From Extension: datanetwork

Data Network - Read, write, and poll data from internal Kafka topics

This extension provides 5 bots.





Bot *dn:list-groups

Bot Position In Pipeline: Source

List consumer groups in the datanetwork.

This bot expects a Full CFXQL.

The bot applies the query to data that is already loaded from a previous bot or from a source.
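
For illustration, a source step using this bot might look like the sketch below. The inline CFXQL filter and the 'group' column name are assumptions used only to show the shape of a Full CFXQL query; the bot's actual output columns may differ.

    *dn:list-groups group contains 'event'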







Bot *dn:list-topics

Bot Position In Pipeline: Source

List topics in the datanetwork. Only datanetwork topics are shown.

This bot expects a Full CFXQL.

The bot applies the query to data that is already loaded from a previous bot or from a source.
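
As with *dn:list-groups, a Full CFXQL query can narrow the returned rows. In the sketch below the 'topic' column name is a guess used purely for illustration.

    *dn:list-topics topic contains 'alerts'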







Bot @dn:list-topics-partitions

Bot Position In Pipeline: Source

List topics and their partitions for a consumer group in the datanetwork. Only partitions for datanetwork topics are shown.

This bot expects a Restricted CFXQL.

Each parameter may be specified using the '=' operator and combined with the AND logical operation.
The following parameters are expected by this bot:

name (Text): Kafka topic name
group (Text): Consumer group name
offset_reset (Text, default: earliest): Stream offset reset position. Valid values are 'earliest' or 'latest'
topic_configurations (Text, default: no): Collect topic configuration details such as retention.ms, max.message.bytes and flush.ms
context (Text, default: dn): Prefix the topics with 'dn' or 'external' context
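
A sketch of the restricted parameter style described above, using placeholder topic and group names; the exact pipeline invocation syntax may differ in your environment.

    @dn:list-topics-partitions name = 'incoming-events' and group = 'partition-audit' and topic_configurations = 'yes'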







Bot @dn:read-stream

Bot Position In Pipeline: Source

Read data from the specified stream in the data network. This bot implements an infinite loop for receiving data.

This bot expects a Restricted CFXQL.

Each parameter may be specified using the '=' operator and combined with the AND logical operation.
The following parameters are expected by this bot:

name* (Text): Stream name
group* (Text): Data receiver (consumer) group name
offset_reset (Text, default: earliest): Stream offset reset position. Valid values are 'earliest' or 'latest'
batch_size (Text, default: 100): Maximum rows to read in each batch
batch_wait (Text, default: 5): Maximum number of seconds to wait once one or more events are received
lazy_commit (Text, default: no): Commit the messages only after the pipeline has completed processing them
max_poll_interval_ms (Text, default: 300000): Milliseconds to wait for the next pipeline steps to complete. If this is exceeded, the bot will fail. Default is 5 minutes
retention_ms (Text): Maximum time in milliseconds to retain a log before discarding old log segments to free up space. Default is 7 days
commit_retries (Text, default: 5): If lazy_commit is set to 'yes', the number of times a commit will be attempted before failing
commit_retry_interval (Text, default: 5): If lazy_commit is set to 'yes', the number of seconds to wait between retries
context (Text, default: dn): Prefix the topics with 'dn' or 'external' context
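
A minimal sketch of a read-stream source step, using placeholder stream and group names; the exact pipeline invocation syntax may differ in your environment.

    @dn:read-stream name = 'incoming-events' and group = 'event-processors' and batch_size = '500' and lazy_commit = 'yes'

With lazy_commit set to 'yes', commit_retries and commit_retry_interval govern how the commit is retried after the downstream pipeline steps finish processing each batch.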







Bot @dn:write-stream

Bot Position In Pipeline: Sink

Write data to the specified stream in the data network.

This bot expects a Restricted CFXQL.

Each parameter may be specified using the '=' operator and combined with the AND logical operation.
The following parameters are expected by this bot:

name* (Text): Kafka topic name
partition (Text): The topic partition ID to write to. If not provided, the configured built-in partitioner is used
partition_key (Text): The key to be applied to the messages; all messages with the same key will be sent to the same partition. Only used when the partition parameter is not provided. If both partition_key and partition_key_col are provided, the key present in partition_key_col will be used
partition_key_col (Text): Column name that contains the key to be applied to the messages; all messages with the same key will be sent to the same partition. Only used when the partition parameter is not provided. If both partition_key and partition_key_col are provided, the key present in partition_key_col will be used
retention_ms (Text): Maximum time in milliseconds to retain a log before discarding old log segments to free up space. Default is 7 days
context (Text, default: dn): Prefix the topics with 'dn' or 'external' context
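
Because this is a Sink bot, it writes rows produced by earlier steps in the pipeline. The sketch below assumes a typical chained pipeline ('-->' passing rows from a source bot) and uses placeholder topic, group, and column names.

    @dn:read-stream name = 'incoming-events' and group = 'event-processors'
        --> @dn:write-stream name = 'processed-events' and partition_key_col = 'device_id'

Here partition_key_col sends all rows that share the same 'device_id' value to the same partition of the 'processed-events' topic.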