Basic configuration, Schema Registry, SSL, SASL, and Confluent Cloud
At the end of this article, you will find a step-by-step guide that covers connecting to Confluent Cloud.
Creating a connection to a cluster
To create a new Apache Kafka connection, go to the Connections Overview page in KaDeck and click on Add connection. Select Apache Kafka as the connection type. This opens a dialog where you can enter the relevant connection data.
The dialog offers various configuration options for encrypted connections via SSL, authentication via Kerberos, and various security mechanisms for Schema Registry. In the following sections, we look at these configuration options in detail.
The connection data for your cluster requires a name that you can choose freely. The name helps you keep track of your clusters if you manage several of them.
Each broker in your cluster has a specific IP address. Although specifying the IP address of a single broker is sufficient, it makes sense to specify multiple IP addresses for reliability. Note that you must always specify the IP address followed by a colon and the port; multiple entries are separated by commas. Example: 192.168.178.01:9092,192.168.178.02:9092
If you want to use a Schema Registry, enter its IP address followed by a colon and the port.
If you need authentication for your Schema Registry, you can choose between Basic and Bearer authentication by selecting the Schema Registry tab at the bottom of the dialog in the Security & Authentication section. Please note that this tab is only active once you have entered the IP and port for the Schema Registry.
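For reference, Basic authentication for a Schema Registry corresponds to the following client configuration properties (a sketch; the URL, API key, and secret are placeholders, and KaDeck sets these values for you through the dialog):

```properties
# Schema Registry endpoint (placeholder host and port)
schema.registry.url=https://192.168.178.10:8081
# Basic authentication: read the credentials from basic.auth.user.info
basic.auth.credentials.source=USER_INFO
# Placeholder key and secret, separated by a colon
basic.auth.user.info=myApiKey:myApiSecret
```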
Schema Registry with BASIC Authentication (Confluent Cloud).
Connection with SSL
Some clusters require the use of an SSL connection. To enable SSL, select SSL from the Security Protocol drop-down menu. If your cluster has an SSL certificate issued by an official certification authority, there is nothing else to do here.
Leave the values empty if your cluster uses an SSL certificate issued by an official certification authority.
Note: To increase the security of the connection and avoid a man-in-the-middle attack, you should set the SSL Endpoint Identification Algorithm to "https" on the KaDeck Settings page.
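Expressed as Kafka client properties, this setup corresponds to the following fragment (a sketch; KaDeck applies these settings for you through the dialog and the Settings page):

```properties
# Encrypted connection to the brokers
security.protocol=SSL
# Verify that the broker hostname matches its certificate (see note above)
ssl.endpoint.identification.algorithm=https
```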
Custom SSL Certificate
Select SSL (or SASL_SSL if you want to use SSL in combination with SASL) and click on the SSL tab. To use your own SSL certificate, enter the path to the trust store containing this certificate in the Truststore Location input field.
If your Truststore is protected with a password, please enter the corresponding password in the input field Truststore Password.
If your Kafka cluster requires a client certificate, i.e., if you want to use two-way (mutual) authentication, you must enter the path to your Keystore under Keystore Location.
If your Keystore is protected by a password, enter the password in the Keystore Password input field.
Optionally, the private key inside your Keystore can be protected by an additional password. You can enter this password in the Private Key Password input field.
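Taken together, the fields above map to the standard Kafka SSL client properties shown below (a sketch; all paths and passwords are placeholders):

```properties
security.protocol=SSL
# Trust store containing your custom (e.g. self-signed or internal CA) certificate
ssl.truststore.location=C:/Path/toAuthFiles/truststore.jks
ssl.truststore.password=myTruststorePassword
# Key store with the client certificate, required for two-way (mutual) authentication
ssl.keystore.location=C:/Path/toAuthFiles/keystore.jks
ssl.keystore.password=myKeystorePassword
# Optional additional password protecting the private key inside the key store
ssl.key.password=myPrivateKeyPassword
```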
Connection with SASL
Regardless of whether you use SSL, you can also connect using SASL. To do so, select SASL_PLAINTEXT (or SASL_SSL if you also want to use SSL) from the Security Protocol drop-down menu and click on the SASL tab.
SASL configuration of a Confluent Cloud cluster.
For the settings of SASL and the different mechanisms, you can find a more detailed article here.
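As an illustration, a SASL/PLAIN configuration of the kind used by Confluent Cloud typically looks like the following client properties (a sketch; the username and password are placeholder API credentials):

```properties
# SASL over an encrypted connection (use SASL_PLAINTEXT to skip SSL)
security.protocol=SASL_SSL
# PLAIN is the mechanism used with Confluent Cloud API keys
sasl.mechanism=PLAIN
# Placeholder credentials
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="myApiKey" \
  password="myApiSecret";
```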
Connecting with Kerberos
The values relevant to Kerberos, such as the Principal, KeyTab, or Service Name, are all specified in the Sasl Jaas Config input field.
A sample configuration may look like this:
com.sun.security.auth.module.Krb5LoginModule required useKeyTab=true storeKey=true keyTab="C:/Path/toAuthFiles/some-keytab-file.keytab" principal="account@SOMEREALM.COMPANY.COM" serviceName="kafka";
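For context, the JAAS string above is one part of a complete Kerberos (GSSAPI) client configuration, which as Kafka client properties would typically look like this (a sketch; the security protocol and service name are typical values, not prescribed by KaDeck):

```properties
# Kerberos authentication over an encrypted connection
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
# Kerberos principal name under which the brokers run
sasl.kerberos.service.name=kafka
# The JAAS configuration from the sample above
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="C:/Path/toAuthFiles/some-keytab-file.keytab" \
  principal="account@SOMEREALM.COMPANY.COM" \
  serviceName="kafka";
```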
Further information can be found in our dedicated article about SASL settings in KaDeck.