Confluent Cloud

Follow these steps to use Kafka hosted on Confluent Cloud as your source.

Step 1 - Connect to Confluent Cloud

Create a new connection

Click Create a new connection if it is not already selected. Instructions for creating your Confluent Kafka cluster are in this quick start guide.

In Bootstrap servers, enter a single bootstrap server in the format hostname:port, or a comma-separated list of bootstrap servers in the format hostname1:port,hostname2:port.

Ensure the host address is accessible to Upsolver.
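Before pasting the value into the form, it can help to confirm each entry follows the hostname:port convention. A minimal sketch of such a check (the helper name and the example host are illustrative, not part of Upsolver):

```python
# Hypothetical helper to sanity-check a comma-separated bootstrap server list.
def validate_bootstrap_servers(servers: str) -> list[str]:
    """Split a comma-separated list and verify each entry is hostname:port."""
    entries = [s.strip() for s in servers.split(",") if s.strip()]
    for entry in entries:
        host, sep, port = entry.rpartition(":")
        if not sep or not host or not port.isdigit():
            raise ValueError(f"expected hostname:port, got {entry!r}")
    return entries

# Example: a single Confluent Cloud-style bootstrap host (placeholder name).
print(validate_bootstrap_servers("pkc-12345.us-east-1.aws.confluent.cloud:9092"))
```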

Configure the appropriate consumer properties for your Kafka cluster. You can find all of the available properties in the Kafka Consumer Configuration reference.

To securely access your Confluent Kafka cluster, you must configure your Confluent resource Access Key and Secret Key in the username and password properties, respectively. You can learn more about resource keys in Using API Keys to Control Access in Confluent Cloud.

For a standard connection, use the following format:

security.protocol = SASL_SSL
sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
ssl.endpoint.identification.algorithm = https
sasl.mechanism = PLAIN
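If you template these properties per environment, they can also be generated programmatically. A minimal Python sketch, assuming only the standard properties shown above (the function name and the API_KEY/SECRET values are placeholders):

```python
# Render the standard Confluent Cloud SASL_SSL consumer properties.
# API_KEY and SECRET are placeholders for your resource API key pair.
def render_confluent_properties(api_key: str, secret: str) -> str:
    jaas = (
        "org.apache.kafka.common.security.plain.PlainLoginModule "
        f'required username="{api_key}" password="{secret}";'
    )
    props = {
        "security.protocol": "SASL_SSL",
        "sasl.jaas.config": jaas,
        "ssl.endpoint.identification.algorithm": "https",
        "sasl.mechanism": "PLAIN",
    }
    return "\n".join(f"{key} = {value}" for key, value in props.items())

print(render_confluent_properties("API_KEY", "SECRET"))
```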

To enable Upsolver to connect to your Confluent cluster using SSL, follow the steps in Encrypt and Authenticate with TLS.

In the Name your connection field, enter a name for this connection. Note that this connection will be available to other users in your organization.

Use an existing connection

By default, if you have already created a connection, Upsolver selects Use an existing connection, and your Confluent Cloud connection is populated in the list.

For organizations with multiple connections, select the source connection you want to use.

Step 2 - Select a topic to ingest

After the connection has been successfully established, Upsolver automatically discovers the available topics and displays them in the Select a topic to ingest list. Choose the topic you'd like to ingest. By default, Upsolver automatically detects the format of your data and parses events into the appropriate structure.

When using Avro format, Upsolver allows you to configure Confluent Schema Registry to decode the field names and data types. If you use the Confluent Schema Registry, select the Avro Schema Registry option from the content type list. In the Schema Registry Url text box, enter the full URL to your Confluent Schema Registry using the following format:

https://access_key:secret_key@sr-aws.confluent.cloud/schemas/ids/{id}

Note that the API access and secret keys must be passed as part of the URL. Additionally, to support schema evolution, include {id}, as shown in the example. When it's included, Upsolver will automatically insert the schema ID from the Avro header of the incoming event to fetch the appropriate schema from the Schema Registry. Optionally, enter your Username and Password.
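To illustrate the expected shape of that URL, here is a small sketch that assembles it from its parts. The helper name is hypothetical, and the host is the illustrative one from the example above; note the literal {id} placeholder is kept in the path so Upsolver can substitute the schema ID itself:

```python
# Build a Schema Registry URL in the format shown above, keeping the
# literal "{id}" placeholder for Upsolver to fill in per event.
# access_key, secret_key, and host are placeholders for your own values.
def schema_registry_url(access_key: str, secret_key: str, host: str) -> str:
    return f"https://{access_key}:{secret_key}@{host}/schemas/ids/{{id}}"

url = schema_registry_url("ACCESS_KEY", "SECRET_KEY", "sr-aws.confluent.cloud")
print(url)  # https://ACCESS_KEY:SECRET_KEY@sr-aws.confluent.cloud/schemas/ids/{id}
```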

Step 3 - Check that events are read successfully

As soon as you select a topic, Upsolver will attempt to load a sample of the events.

If Upsolver did not load any sample events, try the following:

  1. Verify that your topic has recent events. Upsolver attempts to read the most recently available events from the topic and will not seek back in time to fetch older events.

  2. Verify the content type you selected to ensure it matches the events in your topic.
