Apache Kafka
To ingest data from your Apache Kafka topic into a table within Upsolver, you must first create a connection that provides the appropriate credentials to access your topic.
Syntax
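A sketch of the statement, assembled from the connection options described below and assuming Upsolver's CREATE KAFKA CONNECTION form; options in square brackets are optional:

    CREATE KAFKA CONNECTION <connection_identifier>
        HOST[S] = ('<hostname>:<port>' [, ...])
        [ CONSUMER_PROPERTIES = '<key=value properties>' ]
        [ VERSION = { CURRENT | LEGACY } ]
        [ REQUIRE_STATIC_IP = { TRUE | FALSE } ]
        [ SSL = { TRUE | FALSE } ]
        [ TOPIC_DISPLAY_FILTER[S] = ('<topic_name>' [, ...]) ]
        [ SASL_USERNAME = '<username>' ]
        [ SASL_PASSWORD = '<password>' ]
        [ COMMENT = '<description>' ];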
Connection options
HOST[S]
Type: text | list
A single host or a list of Kafka hosts in the format hostname:port.
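For example, a single host as text or several hosts as a list (the broker addresses are hypothetical):

    HOSTS = 'kafka-broker-1.example.com:9092'
    -- or, for multiple brokers:
    HOSTS = ('kafka-broker-1.example.com:9092', 'kafka-broker-2.example.com:9092')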
CONSUMER_PROPERTIES (editable)
Type: text_area
(Optional) Extra properties to configure for the consumer.
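For example, standard Kafka consumer properties can be passed as key=value lines; the values below are illustrative only:

    CONSUMER_PROPERTIES = 'max.partition.fetch.bytes=1048576
        session.timeout.ms=45000'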
VERSION
Values: { CURRENT | LEGACY }
Default: CURRENT
(Optional) Use LEGACY for Kafka versions earlier than 0.10.
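For example, when connecting to a broker older than 0.10:

    VERSION = LEGACY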
REQUIRE_STATIC_IP (editable)
Type: Boolean
Default: true
(Optional) An Upsolver cluster can be configured with the number of elastic IPs it should allocate and use. If the cluster running the job has at least one elastic IP and REQUIRE_STATIC_IP is enabled, the job runs on a server that has an elastic IP associated with it.
SSL (editable)
Type: Boolean
Default: false
(Optional) If enabled, SSL is used to connect. Please contact Upsolver to ensure your CA certificate is supported.
TOPIC_DISPLAY_FILTER[S] (editable)
Type: text | list
(Optional) A single topic or the list of topics to show. If left empty, all topics are visible.
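For example, to show only two topics (the topic names are hypothetical):

    TOPIC_DISPLAY_FILTERS = ('orders', 'clickstream')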
SASL_USERNAME
Type: text
(Optional) Configure the SASL username to be used to authenticate with Kafka.
SASL_PASSWORD
Type: text
(Optional) Configure the SASL password to be used to authenticate with Kafka.
When this option is set together with SASL_USERNAME, the corresponding SASL configuration is added to the consumer/producer properties of the client that is created.
For other authentication settings, you may configure the CONSUMER_PROPERTIES directly.
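As a sketch, the SASL credentials can be set through the dedicated options, while other mechanisms can be configured through CONSUMER_PROPERTIES using standard Kafka client properties. The credentials and mechanism below are hypothetical and depend on how your brokers are configured:

    SASL_USERNAME = 'my_user'
    SASL_PASSWORD = 'my_password'

    -- or, configuring authentication directly through consumer properties:
    CONSUMER_PROPERTIES = 'security.protocol=SASL_SSL
        sasl.mechanism=SCRAM-SHA-256
        sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="my_user" password="my_password";'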
COMMENT (editable)
Type: text
(Optional) A description or comment regarding this connection.
Minimum example
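A minimal sketch, using only the required hosts option with a hypothetical broker address:

    CREATE KAFKA CONNECTION my_kafka_connection
        HOSTS = ('kafka-broker-1.example.com:9092');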
Full example
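A fuller sketch using the options described above; the host names, topics, credentials, and consumer properties are hypothetical:

    CREATE KAFKA CONNECTION my_kafka_connection
        HOSTS = ('kafka-broker-1.example.com:9092', 'kafka-broker-2.example.com:9092')
        CONSUMER_PROPERTIES = 'security.protocol=SASL_SSL
            sasl.mechanism=PLAIN'
        VERSION = CURRENT
        REQUIRE_STATIC_IP = true
        SSL = true
        SASL_USERNAME = 'my_user'
        SASL_PASSWORD = 'my_password'
        TOPIC_DISPLAY_FILTERS = ('orders', 'clickstream')
        COMMENT = 'Kafka connection for the orders and clickstream topics';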