Producing events is one part of message exchange in Kafka, the other being consuming events. You can use ReadyAPI to simulate an event producer and post events to topics on your Kafka broker.
Let's consider an example where you want to check that your payment system works correctly. The system reads events from the
payments topic and, if the payment is successful, publishes confirmation events to another topic.
So, to make sure the system works, you can use ReadyAPI to simulate a producer and publish payment events.
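As a sketch of what such a payment event might look like, here is a hypothetical payload you could enter in the test step's Data field. The field names and values are illustrative assumptions, not part of ReadyAPI's documentation:

```python
import json

# Hypothetical payment event; the field names and values are
# illustrative assumptions, not prescribed by ReadyAPI or Kafka.
payment_event = {
    "paymentId": "pay-1024",
    "amount": 49.95,
    "currency": "USD",
    "status": "SUCCESSFUL",
}

# ReadyAPI sends the contents of the Data field as JSON text,
# so the event ultimately travels over the wire as a string like this:
payload = json.dumps(payment_event)
print(payload)
```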
|Tip:||If you want to simulate a complex scenario, you will usually also want to simulate an event consumer to check how the event was processed. In our example, we could use ReadyAPI to check that the payment system published a message to the corresponding topic.
1. Add a Kafka API to the project
To start testing Kafka APIs, first add one to the project. To learn how to do it, see Adding a Kafka API to the Project.
2. Create a test step
To produce a Kafka event in ReadyAPI, you use the Event-Driven test step in Publish mode:
Open a test case and add the Event-Driven test step.
In the dialog that appears, select a Publish operation and click Add Test Step:
– or –
Right-click a Publish operation in the APIs node in the Navigator and select Add to Test Case.
In the dialog that appears, select a test case and click Add Test Step:
3. Send the event
To publish the event to the topic, enter the message in the Data field and select Connect. ReadyAPI will send the event and then disconnect.
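For comparison, the publish-and-disconnect flow that ReadyAPI performs could be reproduced in code. The sketch below builds the message bytes the way the test step does (JSON text plus an optional key), and the commented-out part shows how the third-party kafka-python package could send it; the broker address and topic name are assumptions:

```python
import json

def build_record(value: dict, key: str) -> tuple[bytes, bytes]:
    """Serialize an event the way the Publish test step does:
    the Data field contents as JSON bytes, plus an optional Kafka key."""
    return json.dumps(value).encode("utf-8"), key.encode("utf-8")

value_bytes, key_bytes = build_record(
    {"paymentId": "pay-1024", "status": "SUCCESSFUL"}, "pay-1024"
)

# With the third-party kafka-python package, sending and disconnecting
# could look like this (requires a running broker; address is an assumption):
#
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send("payments", value=value_bytes, key=key_bytes)
#   producer.flush()   # make sure the event has actually been sent
#   producer.close()   # ReadyAPI likewise disconnects after sending
```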
Test step settings
Test step toolbar
The test step toolbar contains settings that allow you to modify the connection and authorization settings for the test step:
To open the connection settings, select Connection Settings in the test step toolbar. The dialog includes the settings used to connect to the Confluent schema registry (available only when the JSON via Avro message format is selected), as well as other parameters that your Kafka provider might require.
The message editor is where you enter the actual message that the test step will send to the topic.
Message format – The format in which the message will be sent. Currently, two formats are available:
JSON – The message will be sent as regular JSON.
JSON via Avro (Confluent) – The message will be serialized using a Confluent schema registry. You can set up a connection to the schema registry in the Connection Settings.
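To make the Avro option more concrete: messages serialized through a Confluent schema registry are framed in the registry's documented wire format, a zero "magic" byte followed by a 4-byte big-endian schema ID and then the Avro-encoded body. The sketch below builds such a frame; the schema ID and payload bytes are made-up placeholders:

```python
import struct

def confluent_frame(schema_id: int, avro_payload: bytes) -> bytes:
    """Confluent schema-registry wire format: a zero 'magic' byte,
    a 4-byte big-endian schema ID, then the Avro-encoded body."""
    return struct.pack(">bI", 0, schema_id) + avro_payload

# Schema ID 42 and the payload bytes are placeholders for illustration.
frame = confluent_frame(42, b"\x02hi")
print(frame[:5].hex())
```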
Metadata – The parameters that will be passed with the message. To add a parameter, click the add button.
There are three types of parameters:
Header – Kafka headers that will be sent with the message.
Path – Used to update path parameter values in the channel name.
Kafka – Other parameters specific to Kafka. Currently, two Kafka parameters are supported: Key and Partition.
Tip: You can use property expansions to insert data into parameter values.
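The Key parameter matters because Kafka routes keyed messages deterministically: the same key always lands on the same partition (unless you override the Partition parameter explicitly). The sketch below illustrates that idea with a deliberately simplified hash; Kafka's real default partitioner uses a murmur2-based hash, not this one:

```python
def pick_partition(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default partitioner.
    Kafka actually uses a murmur2-based hash; the point here is
    only that the same key always maps to the same partition."""
    # Summing bytes is NOT Kafka's real hash; it merely shows determinism.
    return sum(key) % num_partitions

p1 = pick_partition(b"pay-1024", 6)
p2 = pick_partition(b"pay-1024", 6)
print(p1 == p2)  # the same key always selects the same partition
```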
Data – The message itself in JSON format.
You can insert data from other test steps or parameters into the message by using Get Data. To learn more about the feature, see Get Data Dialog.
You can also automatically format the JSON you enter in the field by clicking the Beautify button.
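The effect of such reformatting is the same as pretty-printing JSON programmatically, as this small sketch shows (the sample payload is a made-up placeholder):

```python
import json

# A compact one-line payload, as you might paste it into the Data field.
raw = '{"paymentId":"pay-1024","amount":49.95}'

# Re-indenting the JSON produces the kind of readable layout
# that reformatting the Data field gives you.
pretty = json.dumps(json.loads(raw), indent=2)
print(pretty)
```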