Connecting to a cluster (advanced)

 

This article describes how to connect to an Apache Kafka cluster and lists common configurations, such as connecting to Confluent Cloud and using Kerberos.

Kerberos: If you are having trouble connecting to a cluster with Kerberos, see both the relevant section in this article and our separate article about SASL/Kerberos.

Basic configuration, Schema Registry, SSL, SASL, and Confluent Cloud

The article starts by covering the mandatory fields, followed by the Schema Registry configuration, connecting to a cluster with SSL (custom and officially issued certificates), and SASL configurations.

At the end of the article, you will find a step-by-step guide that covers connecting to Confluent Cloud.

 

Adding a cluster's connection data

To add new connection data of a cluster, go to the Cluster Overview page in KaDeck and click "Add new cluster". This opens a dialog where you can enter the relevant connection data.

 

Configuration Description

The dialog offers various configuration options for encrypted connections via SSL, authorization via Kerberos, and various security mechanisms for Schema Registry. In the following, we will look at the various configuration options.

Mandatory data

Name

The connection data for your cluster requires a name that you can choose freely. The name allows you to keep track of your cluster if you have several clusters.

Bootstrap Servers

Each broker in your cluster is reachable via an IP address (or hostname) and a port. Although it is sufficient to specify the address of a single broker, it makes sense to specify multiple addresses for reasons of reliability. Please note that you must always specify the IP address followed by a colon and the port. Multiple entries are separated by commas. Example: 192.168.178.01:9092,192.168.178.02:9092
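For reference, this field maps to the standard Kafka client property `bootstrap.servers`. A minimal sketch using the example addresses above:

```properties
# Comma-separated list of broker addresses (host:port).
# Listing more than one broker lets the client fail over
# if the first address is unreachable.
bootstrap.servers=192.168.178.01:9092,192.168.178.02:9092
```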

Schema Registry

Enter the IP address followed by a colon and the port if you want to use a Schema Registry.

Example: 192.168.178.03:8082

If your Schema Registry requires authentication, you can choose between Basic and Bearer authentication by selecting the Schema Registry tab at the bottom of the dialog in the Security & Authentication section. Please note that this tab only becomes active once you have entered the IP and port of the Schema Registry.
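In Kafka client terms, the Schema Registry address and Basic authentication correspond to the following Confluent serializer properties; the username and password shown here are placeholders:

```properties
# Schema Registry endpoint.
schema.registry.url=http://192.168.178.03:8082
# Basic authentication: credentials are read from basic.auth.user.info.
basic.auth.credentials.source=USER_INFO
# Format: <username-or-key>:<password-or-secret>
basic.auth.user.info=myUser:myPassword
```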


Schema Registry with BASIC Authentication (Confluent Cloud).

Connection with SSL

Some clusters require the use of an SSL connection. To enable SSL, select SSL from the Security Protocol drop-down menu. If your cluster has an SSL certificate issued by an official certification authority, there is nothing else to do here.


Leave the values empty if your cluster uses an SSL certificate issued by an official certification authority. 

Note: To increase the security of the connection and avoid a man-in-the-middle attack, you should set the SSL Endpoint Identification Algorithm to "https" on the KaDeck Settings page.
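Expressed as Kafka client properties, an SSL connection with an officially signed certificate needs little more than the protocol switch; the hostname-verification setting from the note above maps to `ssl.endpoint.identification.algorithm`:

```properties
# Encrypt the connection with TLS/SSL.
security.protocol=SSL
# Verify that the broker certificate matches its hostname
# (protects against man-in-the-middle attacks).
ssl.endpoint.identification.algorithm=https
```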

Custom SSL Certificate

Select SSL (or SASL_SSL if you want to use SSL in combination with SASL) and click on the SSL tab. To use your own SSL certificate, you must enter the path to your own trust store that contains this certificate in the Truststore Location input field.

Example: C:/kafka/my_truststore.jks

If your Truststore is protected with a password, please enter the corresponding password in the input field Truststore Password.
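As Kafka client properties, the truststore fields look like this; the path and password are placeholders:

```properties
security.protocol=SSL
# Truststore containing the custom CA or self-signed certificate.
ssl.truststore.location=C:/kafka/my_truststore.jks
# Only needed if the truststore is password-protected.
ssl.truststore.password=truststore-secret
```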

Two-Way Authentication

If your Kafka cluster requires a client certificate, i.e., you want to use two-way authentication, you must enter the path to your Keystore under Keystore Location.

Example: C:/kafka/my_keystore.jks

If your Keystore is protected by a password, enter the password in the Keystore Password input field.

Optionally, your private key in your Keystore can be protected by another password. You can enter this password in the Private Key Password input field.
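The two-way (mutual TLS) authentication fields correspond to the keystore properties of the Kafka client; the paths and passwords below are placeholders:

```properties
# Client certificate used for two-way (mutual) TLS authentication.
ssl.keystore.location=C:/kafka/my_keystore.jks
ssl.keystore.password=keystore-secret
# Only needed if the private key has its own password.
ssl.key.password=key-secret
```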

Connection with SASL

Regardless of whether you use SSL, you can also connect using SASL. To do this, select SASL_PLAINTEXT (or SASL_SSL if you also want to use SSL) from the Security Protocol drop-down menu and click on the SASL tab.


SASL configuration of a Confluent Cloud cluster.

For the settings of SASL and the different mechanisms, you can find a more detailed article here.
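As an illustration, a SASL_SSL connection using the SCRAM-SHA-256 mechanism would map to Kafka client properties like these; the username and password are placeholders:

```properties
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="myUser" \
  password="myPassword";
```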

 

Connecting with Kerberos

The values relevant to Kerberos, such as the Principal, KeyTab, or Service Name, are all specified in the Sasl Jaas Config input field.

A sample configuration may look like this:

com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab="C:/Path/toAuthFiles/some-keytab-file.keytab" principal="account@SOMEREALM.COMPANY.COM" serviceName="kafka";
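Put into context, the complete Kerberos-related client configuration would look roughly like this; the paths, principal, and realm are the placeholder values from the sample above:

```properties
security.protocol=SASL_SSL
# GSSAPI is the SASL mechanism used for Kerberos.
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="C:/Path/toAuthFiles/some-keytab-file.keytab" \
  principal="account@SOMEREALM.COMPANY.COM" \
  serviceName="kafka";
```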

Further information can be found in our dedicated article about SASL settings in KaDeck.

 

Connecting to Confluent Cloud

To connect to Confluent Cloud, you must configure SASL_SSL connection data in addition to the bootstrap servers and the optional Schema Registry.

Where can I find my connection data?

Go to your cluster in the Confluent Cloud and click on the "Tools & client configuration" tab. Then select the Clients tab. Click on Java to expand the card containing all configuration data.

Step by step guide to connecting to Confluent Cloud

  1. Give your cluster a name.
    Example: My Confluent Cluster
  2. Enter the bootstrap server as described above. Confluent Cloud uses a load balancer, so you only need to enter one address.
    Example: pkc-412.us-central1.gcp.confluent.cloud:9092
  3. If you want to use a schema registry, enter the corresponding address in the Schema Registry input field. Otherwise, you can skip this point.

    Example: https://psrc-4ad.us-central1.gcp.confluent.cloud

    Please note that no port is required here.

    If you use a Schema Registry, you must configure Basic Authentication in the Schema Registry tab in the Security & Authentication area. Click on the Schema Registry tab and select "BASIC" as the authentication method.

    In the User Info field, enter your Schema Registry Key followed by a colon and your Secret. You can get a key and a secret on the Confluent Cloud website in the Schemas section.

    Example: S4OGSAKAI97ZU3:Dk290asdIDHOPAWnmc9/twASDPOdajsjda/dai
  4. Select SASL_SSL from the Security Protocol drop-down menu and click on the SASL tab.
    Enter the SASL Jaas Config in the corresponding input field.

    Example: org.apache.kafka.common.security.plain.PlainLoginModule required username="{{ CLUSTER_API_KEY }}" password="{{ CLUSTER_API_SECRET }}";
    Or: org.apache.kafka.common.security.plain.PlainLoginModule required username="HASIX76NIXO7XNU" password="NAsjaSIHsuzIjtyzB2S1BZq+GgF89FasIAODHOU8/VIqTku";
  5. Select PLAIN as the SASL Mechanism.
  6. Click on Confirm. Your connection data is now saved and the dialog is closed. You can now click the Connect button on the card of your cluster in KaDeck to connect to the cluster.
  7. You can optionally set the SSL Endpoint Identification Algorithm to https on the Settings page of KaDeck.
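Taken together, the steps above correspond to a Kafka client configuration of roughly this shape; the key and secret placeholders stand for the values from your Confluent Cloud account:

```properties
bootstrap.servers=pkc-412.us-central1.gcp.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="{{ CLUSTER_API_KEY }}" \
  password="{{ CLUSTER_API_SECRET }}";
ssl.endpoint.identification.algorithm=https
# Optional: Schema Registry with Basic authentication.
schema.registry.url=https://psrc-4ad.us-central1.gcp.confluent.cloud
basic.auth.credentials.source=USER_INFO
basic.auth.user.info={{ SR_API_KEY }}:{{ SR_API_SECRET }}
```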