Simulating Producers

Applies to ReadyAPI 3.51, last modified on March 21, 2024

Producing events is one part of message exchange in Kafka, the other being consuming events. You can use ReadyAPI to simulate an event producer and post events to topics on your Kafka broker.

Use case

Let's consider an example where you want to check that your payment system works correctly. The system reads events from the payments topic and, if the payment is successful, publishes events to the payment-confirmation topic.

So, to make sure the system works, you can use ReadyAPI to simulate a producer and publish payment events.

Tip: In more complex scenarios, you will usually also want to simulate an event consumer to check how the event was processed. In our example, we could use ReadyAPI to check that the payment system published a message to the payment-confirmation topic.
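If it helps to see what this scenario looks like outside ReadyAPI, below is a minimal sketch of the verification half using the plain Kafka Java client: it polls the payment-confirmation topic and prints whatever the payment system published there. The broker address, consumer group, and timeout are illustrative assumptions; only the topic name comes from the example above.

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    public class PaymentConfirmationCheck {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");              // placeholder broker
            props.put("group.id", "payment-check");                        // placeholder consumer group
            props.put("auto.offset.reset", "earliest");
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // The topic the payment system is expected to publish to.
                consumer.subscribe(Collections.singletonList("payment-confirmation"));

                // Poll briefly and print whatever confirmations have arrived.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("Confirmation received: " + record.value());
                }
            }
        }
    }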

Produce events

1. Add a Kafka API to the project

To start testing Kafka APIs, you should first add one to the project. To learn how to do it, see Adding a Kafka API to the Project.

2. Create a test step

To produce a Kafka event in ReadyAPI, you use the API Connection test step based on a Publish operation:

  1. Open a test case and add the API Connection test step.

  2. In the dialog that appears, select a Publish operation and click Add Test Step.

– or –

  1. Right-click a Publish operation in the APIs node in the Navigator and select Add to Test Case.

  2. In the dialog that appears, select a test case and click Add Test Step.

3. Send the event

To publish the event to the topic, enter the message in the Data field and select Connect. ReadyAPI will send the event and then disconnect.

Simulating Kafka producer: Publish the event

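For comparison, the publish-and-disconnect behavior of this step roughly corresponds to the following sketch with the plain Kafka Java client. The broker address, message key, and JSON payload are illustrative assumptions; the payments topic comes from the use case above.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class PublishPaymentEvent {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");           // placeholder broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            // Illustrative payload; in ReadyAPI, this is what you enter in the Data field.
            String event = "{\"paymentId\": \"12345\", \"amount\": 49.99, \"status\": \"SUCCESS\"}";

            // try-with-resources mirrors the step's behavior: connect, send, then disconnect.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("payments", "12345", event));
                producer.flush();
            }
        }
    }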

In the sections below, you can read more about the connection settings and the message editor. Kafka authentication settings are described in Authentication in Kafka.

Test step settings

Test step toolbar

The test step toolbar contains commands that let you modify the connection and authorization settings of the test step:

API Connection test step: Publish toolbar


Connection settings

To open the connection settings, select Connection Settings in the test step toolbar.

  • Confluent Settings – Used to connect to the Confluent schema registry. Only available when the JSON via Avro (Schema Registry) or JSON via Protobuf (Schema Registry) message format is selected. A sketch of the matching Confluent client properties follows this list.

    • Schema Registry URL – The URL of the schema registry.

    • Registry Authorization – The authorization profile that will be used to authenticate to the schema registry. To learn more about schema registry authentication, see Authentication in Kafka.

  • Publish Settings – Other parameters that your Kafka provider might require.

  • Reset to Default – Resets the connection settings to their defaults. The defaults depend on the selected environment:

    • If no environment is selected, Confluent Settings and Publish Settings are cleared.

    • If an environment is selected, the default settings are taken from the environment settings.
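For reference, the Confluent Settings above roughly map onto the configuration of Confluent's Java Avro serializer, sketched below. The broker address, registry URL, credentials, and schema are placeholders, and the basic-auth properties are needed only if your registry requires them; in ReadyAPI you configure all of this through the settings dialog instead.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;

    import java.util.Properties;

    public class SchemaRegistryPublish {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");                    // placeholder broker
            props.put("key.serializer", StringSerializer.class.getName());
            // Confluent's Avro serializer looks the schema up in the registry.
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "https://my-registry.example.com"); // Schema Registry URL
            // Registry Authorization (only if the registry requires basic auth):
            props.put("basic.auth.credentials.source", "USER_INFO");
            props.put("basic.auth.user.info", "apiKey:apiSecret");               // placeholder credentials

            // Invented schema for illustration; the real one lives in your registry.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Payment\",\"fields\":["
                + "{\"name\":\"paymentId\",\"type\":\"string\"},"
                + "{\"name\":\"status\",\"type\":\"string\"}]}");
            GenericRecord payment = new GenericData.Record(schema);
            payment.put("paymentId", "12345");
            payment.put("status", "SUCCESS");

            try (KafkaProducer<String, Object> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("payments", "12345", payment));
                producer.flush();
            }
        }
    }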

Property list

You can also change the test step behavior by using the step properties on the API Connection Test Step Properties panel in the Navigator.

  • Name – The test step's name.

  • Description – Text that describes the test step.

Message editor

The message editor is where you enter the actual message that the test step will send to the topic.

API Connection test step: Message editor


  • Message format – The format in which the message will be sent. The following formats are available:

    • JSON – The message will be sent as a regular JSON.

    • JSON via Avro (Schema Registry) – The message will be serialized using an Avro schema stored in the Confluent schema registry. You can set up a connection to the schema registry in the Connection Settings.

    • JSON via Protobuf (Schema Registry) – The message will be serialized using a Protobuf schema stored in the Confluent schema registry. You can set up a connection to the schema registry in the Connection Settings.

    • Custom – The message will be serialized using an Avro or Protobuf schema stored in a file.

  • Metadata – The parameters that will be passed with the message. To add a parameter, click the add button.

    There are three types of parameters:

    • Header – Kafka headers that will be sent with the message.

    • Path – Used to update path parameter values in the channel name.

      Path parameter


    • Kafka – Other parameters specific to Kafka. Currently, two Kafka parameters are supported: Key and Partition (see the sketch after this list).

    Tip: You can use property expansions to insert data into parameter values.
  • Data – The message itself in the JSON format.

    You can insert data from other test steps or parameters into the message by using Get Data. To learn more about the feature, see Get Data Dialog.

    You can also automatically format the JSON you enter in the field by clicking the Beautify button.
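As a point of reference, the Header, Key, and Partition parameters described above correspond to fields of a Kafka producer record; the sketch below shows them with the plain Java client. The topic, partition number, key, header, and payload values are illustrative placeholders.

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.nio.charset.StandardCharsets;
    import java.util.Properties;

    public class PublishWithMetadata {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");    // placeholder broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            String data = "{\"paymentId\": \"12345\", \"status\": \"SUCCESS\"}";  // the Data field

            // Kafka parameters: Partition (0) and Key ("payment-12345") are set on the record itself.
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("payments", 0, "payment-12345", data);

            // Header parameters travel with the message as Kafka record headers.
            record.headers().add("trace-id", "abc-123".getBytes(StandardCharsets.UTF_8));

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(record);
                producer.flush();
            }
        }
    }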


See Also

Kafka Testing
API Connection Test Step
