Confluent setup guide

Follow these steps to use Kafka hosted on Confluent Cloud as your source.

Step 1 - Connect to Confluent Cloud

Select an existing Confluent Cloud connection, or create a new one. You can find instructions for how to create your Confluent Kafka cluster in this quick start guide.

Create a new Confluent connection

Bootstrap servers

Enter a single bootstrap server host in the format hostname:port, or a comma-separated list of bootstrap hosts in the format hostname1:port,hostname2:port.

Ensure the host address is accessible to Upsolver.
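To illustrate the expected shape of the bootstrap server list, here is a small sketch (not part of Upsolver; the function name and the example host are hypothetical) that splits a comma-separated list into host/port pairs:

```python
# Illustrative helper: validate a bootstrap server string of the
# form "host1:port,host2:port" and split it into (host, port) pairs.
def parse_bootstrap_servers(servers: str) -> list[tuple[str, int]]:
    pairs = []
    for entry in servers.split(","):
        # rpartition tolerates extra colons in the host portion
        host, _, port = entry.strip().rpartition(":")
        if not host or not port.isdigit():
            raise ValueError(f"invalid bootstrap entry: {entry!r}")
        pairs.append((host, int(port)))
    return pairs

# Example with a placeholder Confluent Cloud host:
# parse_bootstrap_servers("pkc-xxxx.us-east-1.aws.confluent.cloud:9092")
```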

Consumer properties

Configure the appropriate consumer properties for your Kafka cluster. You can find all of the available properties in the Kafka Consumer Configuration reference.

To securely access your Confluent Kafka cluster, you must configure your Confluent resource Access Key and Secret Key in the username and password properties, respectively. You can learn more about resource keys in Use API Keys to Control Access in Confluent Cloud.

For a standard connection, use the following format:

security.protocol = SASL_SSL
sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
ssl.endpoint.identification.algorithm = https
sasl.mechanism = PLAIN
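For comparison, the same connection settings can be expressed as a Python dictionary in the librdkafka-style shape accepted by the confluent-kafka client (a sketch, not Upsolver's internal configuration; API_KEY, SECRET, and the bootstrap host are placeholders):

```python
# Sketch: the consumer properties above as a config dict in the shape
# used by the confluent-kafka Python client. Note that librdkafka uses
# sasl.username / sasl.password instead of a JAAS config string.
consumer_config = {
    "bootstrap.servers": "pkc-xxxx.region.provider.confluent.cloud:9092",  # placeholder host
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "API_KEY",   # Confluent resource access key
    "sasl.password": "SECRET",    # Confluent resource secret key
}
# With the confluent-kafka package installed, this dict (plus a
# "group.id" entry) could be passed to confluent_kafka.Consumer(...).
```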

For Upsolver to connect to your Confluent cluster using SSL, follow the steps in Encrypt and Authenticate with TLS.

Step 2 - Select a topic to ingest

After the connection has been successfully established, Upsolver automatically discovers the available topics and displays them in the drop-down. Select the topic you'd like to ingest. By default, Upsolver detects the format of your data and parses events into the appropriate structure.

When ingesting Avro data, Upsolver can use the Confluent Schema Registry to decode field names and data types. If you use the Confluent Schema Registry, select the Avro Schema Registry option from the content type drop-down.

In the Schema Registry Url text box, enter the full URL to your Confluent Schema Registry using the following format: https://API_KEY:SECRET_KEY@<schema-registry-endpoint>/schemas/ids/{id}

Note that the API access and secret keys must be passed as part of the URL. Additionally, to support schema evolution, include {id}, as shown in the example. When it's included, Upsolver will automatically insert the schema ID from the Avro header of the incoming event to fetch the appropriate schema from the Schema Registry.
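Confluent's Avro wire format prefixes each event with a magic byte (0x00) followed by the schema ID as a 4-byte big-endian integer. The sketch below (illustrative only; the URL template host is a placeholder, not a real endpoint) shows how a schema ID could be read from an event and substituted for {id}:

```python
import struct

# Confluent Avro wire format: 1 magic byte (0x00), then the schema ID
# as a 4-byte big-endian integer, then the Avro-encoded payload.
def schema_id_from_event(payload: bytes) -> int:
    magic, schema_id = struct.unpack(">bI", payload[:5])
    if magic != 0:
        raise ValueError("not in Confluent Avro wire format")
    return schema_id

# Hypothetical URL template in the documented shape; host is a placeholder.
URL_TEMPLATE = "https://API_KEY:SECRET_KEY@psrc-xxxx.region.confluent.cloud/schemas/ids/{id}"

event = b"\x00\x00\x00\x00\x2a" + b"..."  # header encodes schema ID 42
url = URL_TEMPLATE.replace("{id}", str(schema_id_from_event(event)))
```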

Step 3 - Check that events are read successfully

As soon as you select a topic, Upsolver will attempt to load a sample of the events.

If Upsolver did not load any sample events, try the following:

  1. Verify that your Confluent topic contains events.

  2. Select a Content type that matches the content type of your topic.
