Kafka setup guide

Follow these steps to use Kafka as your source.

Step 1 - Connect to Kafka

Select an existing Kafka connection, or create a new one.

Create a new Kafka connection

Enter a single host in the format hostname:port, or a comma-separated list of Kafka hosts in the format hostname1:port,hostname2:port.
Ensure the host address is accessible to Upsolver.
When using Upsolver's sandbox cloud, Kafka needs public access. If you have already integrated Upsolver into your environment and you want to avoid exposing the Kafka cluster and save AWS transfer costs, you can peer the VPC where the Kafka cluster runs to the VPC running Upsolver.
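As a quick sanity check before creating the connection, you can verify that each entry in the host list follows the hostname:port format. A minimal sketch in Python (the helper name is hypothetical, not part of Upsolver):

```python
import re

def validate_kafka_hosts(hosts: str) -> list[str]:
    """Split a comma-separated host list and check each entry is hostname:port."""
    entries = [h.strip() for h in hosts.split(",") if h.strip()]
    pattern = re.compile(r"^[\w.-]+:\d{1,5}$")
    bad = [h for h in entries if not pattern.match(h)]
    if bad:
        raise ValueError(f"Invalid host entries: {bad}")
    return entries

# Both a single host and a comma-separated list are accepted:
validate_kafka_hosts("broker1.example.com:9092")
validate_kafka_hosts("broker1.example.com:9092, broker2.example.com:9092")
```

This only checks the format; reachability from Upsolver still depends on the network setup (public access or VPC peering) described above.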
Consumer properties
Additional connection options may need to be configured to provide the correct credentials for reading from your Kafka cluster.
For a standard connection, use the following format:
bootstrap.servers = HOST:PORT
security.protocol = SASL_SSL
sasl.jaas.config = org.apache.kafka.common.security.plain.PlainLoginModule required username="API_KEY" password="SECRET";
ssl.endpoint.identification.algorithm = https
sasl.mechanism = PLAIN
  • The bootstrap.servers value is the Kafka host(s) you entered above.
  • The username and password values are the API key and its corresponding secret, configured on Kafka to allow access to the cluster.
To learn more about consumer properties, see Consumer Configuration.
To connect Upsolver to your Kafka cluster over SSL, follow the steps in Configure SSL for your Kafka connection.
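The properties above map directly onto a Kafka client configuration, which can be useful for testing the credentials independently of Upsolver. A minimal sketch assuming the confluent-kafka Python client, where HOST:PORT, API_KEY, and SECRET are placeholders for your own values (note that librdkafka-based clients take the username and password as separate keys rather than a JAAS string):

```python
# Consumer configuration mirroring the SASL_SSL properties above.
# HOST:PORT, API_KEY, and SECRET are placeholders, not real values.
conf = {
    "bootstrap.servers": "HOST:PORT",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "API_KEY",
    "sasl.password": "SECRET",
    "ssl.endpoint.identification.algorithm": "https",
}

# With confluent-kafka installed, the dict can be passed straight to a consumer:
# from confluent_kafka import Consumer
# consumer = Consumer({**conf, "group.id": "connectivity-test",
#                      "auto.offset.reset": "earliest"})
```

If this consumer can poll messages from your network, the same credentials should work for the Upsolver connection.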

Step 2 - Select a topic to ingest

When the connection is established, select a topic for ingestion.

Step 3 - Check that events are read successfully

As soon as you select a topic, Upsolver will attempt to load a sample of the events.
If Upsolver did not load any sample events, try the following:
  1. Verify that Kafka has events.
  2. Select a Content type that matches the content type of your topic.
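A mismatched content type is a common reason sample events fail to load: the raw bytes arrive, but they cannot be parsed under the selected format. A minimal sketch illustrating the failure mode, using JSON parsing in Python (the example payloads are made up):

```python
import json

# Raw events as they might arrive from the topic (bytes on the wire).
json_event = b'{"user_id": 42, "action": "click"}'
csv_event = b"user_id,action\n42,click"

# Matching content type (JSON): the sample parses cleanly.
event = json.loads(json_event)
assert event["action"] == "click"

# Mismatched content type (a CSV event read as JSON) raises a parse error,
# which is why no sample events appear:
try:
    json.loads(csv_event)
except json.JSONDecodeError:
    print("sample not parsed - check the Content type setting")
```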