Apache Kafka is a popular open source project for processing event-driven data streams based on a transaction log.
The KaTe Confluent Parallel Consumer adapter extends the existing, battle-tested KaTe Kafka Adapter for high-throughput scenarios on SAP Integration Suite or Edge Integration Cell (EIC).
The adapter inherits all functions of the KaTe Kafka adapter and complements them with parallel processing capabilities for high throughput.
Basic functions
The adapter allows you to publish Iflow messages as Kafka records and subscribe to Kafka records as Iflow messages.
Kafka keys, headers, and data sections are translated into Iflow message parts and vice versa. Topics, partitions, and technical reliability settings (ack mode, consumer commits) are customizable per use case.
Message processing is observable and debuggable at channel and message log level without third-party tooling, fully integrated into SAP Cloud Integration.
Work securely with company & compliance standards
Our product supports all commonly used security standards (PLAINTEXT, SASL_PLAINTEXT, SASL_SSL with SASL/PLAIN or SASL/SCRAM mechanisms, and OAuth2) as well as client certificate authentication for broker and schema registry connections.
Support for high performance, resilience & flow control
The adapter allows parallelization at key level instead of only at partition level, which enables 10–500× higher throughput on single records (and even more on batched records) without losing the functional order of related records.
Stay in control of what you want to process from a topic and of how you put load on your connected backends.
The product supports different methods to allow, filter, and control message throughput on topics, such as filtering on headers, keys, or payload content and grouping records into optimized batches.
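The key-level parallelism described above can be sketched outside any SAP or Kafka API. The following is a minimal illustration of the idea, not the adapter's implementation: records sharing a key are processed strictly in order, while different keys run concurrently.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def process_by_key(records, handler, max_workers=8):
    """Process (key, value) records concurrently across keys while
    preserving the original order of records within each key."""
    # Group records per key, keeping their arrival order.
    per_key = defaultdict(list)
    for key, value in records:
        per_key[key].append(value)

    results = {}

    def run_key(key):
        # All records of one key run sequentially on one worker,
        # so per-key order is preserved.
        results[key] = [handler(v) for v in per_key[key]]

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for key in per_key:
            pool.submit(run_key, key)
        # Leaving the 'with' block waits for all workers to finish.
    return results
```

With many distinct keys, this unlocks far more concurrency than one worker per partition, which is the core of the throughput gain claimed above.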
Full control of operations in one tool
Automated or manual error resolution
One common challenge with ordered streaming systems like Kafka is that, when order matters, a single error can bring processing to a total halt. Our product lets you handle such cases manually as well as automatically, e.g. by forwarding the record to an error destination or by documenting and ignoring certain cases.
This covers serialization errors, data transformation errors, and any downstream processing error (processing through an Iflow).
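The forward-or-ignore policy can be sketched generically (a simplified illustration with hypothetical names, not the adapter's configuration): the stream keeps moving even when individual records fail.

```python
def consume_with_error_policy(records, handler, on_error="forward",
                              error_sink=None):
    """Process records one by one; on failure either forward the failed
    record to an error destination (a dead-letter list here) or document
    and skip it, so the stream never halts."""
    error_sink = error_sink if error_sink is not None else []
    processed = []
    for record in records:
        try:
            processed.append(handler(record))
        except Exception as exc:
            if on_error == "forward":
                # Send the failed record plus error details downstream.
                error_sink.append((record, str(exc)))
            else:
                # "ignore": document the error and continue processing.
                print(f"skipping {record!r}: {exc}")
    return processed, error_sink
```

In the adapter the error destination would typically be an error or dead-letter topic rather than an in-memory list.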
Avro, Protocol Buffers, JSON & schema registries
Our product supports typical binary data formats like Avro and Protobuf, together with Kafka-typical optimizations like a schema registry.
The adapter can translate Avro and Protocol Buffers to XML and vice versa. The necessary translations of Avro/Protobuf schemas are generated with an included schema generation tool.
Transformation from binary (Avro/Protocol Buffers) and JSON to XML and vice versa is supported.
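The record-to-XML direction can be illustrated in plain Python: once an Avro/Protobuf/JSON record has been decoded into a dictionary, mapping it onto an XML tree is mechanical (field names become element names). This is a simplified sketch, not the adapter's generated schema mapping.

```python
import xml.etree.ElementTree as ET

def record_to_xml(record: dict, root_tag: str = "Record") -> str:
    """Map a decoded record (e.g. from Avro/Protobuf/JSON) to XML.
    Nested dicts become nested elements; scalars become text nodes."""
    def build(parent, obj):
        for name, value in obj.items():
            child = ET.SubElement(parent, name)
            if isinstance(value, dict):
                build(child, value)
            else:
                child.text = str(value)

    root = ET.Element(root_tag)
    build(root, record)
    return ET.tostring(root, encoding="unicode")
```

A real mapping additionally has to honor the registered schema (unions, arrays, logical types), which is what the included schema generation tool takes care of.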
SAP roadmap compatible
Our Kafka Adapter is available for SAP Process Orchestration & Integration Suite or EIC (Edge integration cell).
This allows you to simply import any developed interface: it will work without migration effort, adjustments, or refactoring, which is often required with other adapters or vendors (including SAP).
Choose your own timing and roadmap for a possible migration plan. Your investment in integrations on SAP PO is easily transferable to Integration Suite without unknown risks or commercial questions.
Difference to standard adapter for Cloud integration
Summary
| Feature | SAP Standard Kafka Adapter | Confluent Parallel Consumer Adapter |
|---|---|---|
| Parallelism model | Limited by number of Kafka partitions | Configurable worker pool enabling high parallelism within a single consumer |
| Ordering guarantees | Ordering guaranteed only per partition | Ordering guaranteed per message key (optional), with parallel execution across keys |
| Throughput scaling | Requires increasing partition count to scale | Scales without increasing partitions, enabling 10–500× higher throughput |
| Iflow execution | Blocking calls can stall message processing partly or fully | Non-blocking, asynchronous processing |
| Automated error handling | Not supported; some errors are not even visible (deserialization) | Deserialization, data transformation, and Iflow execution errors can be handled automatically per error category: log and ignore, or forward to error/dead-letter topics |
| Offset / group name control | Not supported | Any start offset can be chosen; arbitrary offsets can be reprocessed; group names are manually configurable |
| Message acknowledgment | Offset commits at partition level | Fine-grained, message-level acknowledgment and processing control |
| Advanced monitoring | Not supported | Rich monitoring experience (consumer lag, 1:1 detection of which Kafka record produced which Iflow message) out of the box |
| Filtering | Not supported; only manual in the Iflow | Filtering per header or body to decrease unnecessary load |
| Avro/Protobuf and JSON with schema registry | Not supported | Supported, incl. data transformation and schema generation to XML |
Further Information:
SAP Appcenter listing:
KaTe Kafka Adapter for SAP PO in the SAP App Center
Our Blog on SAP Community
Hook your SAP landscape to Azure Event Hubs with KaTe Kafka Adapter & SAP PO
All of our adapters are subscription licensed, which simplifies use at the project level or in limited use cases.
All our adapters are available for a 30-day trial period.
All our adapters are developed in close cooperation with SAP and our verified SAP experts.
The Confluent Parallel Consumer Adapter for SAP Integration Suite is a powerful extension of the KaTe Kafka adapter for connecting SAP systems with Apache Kafka under extremely high throughput requirements. Kafka and SAP can be connected both on-premises and on the SAP Business Technology Platform (BTP). The adapter enables seamless processing of event-driven data streams and bi-directional message exchange between SAP Cloud Integration Iflows and Kafka topics at unprecedented scale. As the successor of the Kafka Adapter for SAP Process Orchestration, it provides 1:1 compatibility for migrating existing SAP-Kafka integrations from PO.
The adapter allows you to publish SAP Cloud Integration Iflow messages as Kafka records and subscribe to Kafka records as Iflow messages. Kafka keys, headers, and data are translated into Iflow messages and vice versa.
Unlike typical Kafka consumers, the adapter's ordered processing is not limited to partitions: parallelism can be scaled up to record key level, or ordering can be disabled entirely, which enables extremely high throughput.
In addition, it fully supports popular data formats like Avro, Protocol Buffers, and JSON, including Schema Registry integration and automatic schema generation for seamless conversion to and from XML. Topics, partitions, and reliability settings (e.g., acknowledgement mode, consumer commits) are highly configurable. Additionally, message processing is fully integrated into SAP Cloud Integration, making it fully traceable and debuggable without third-party tools.
The Confluent Parallel Consumer Adapter for SAP Integration Suite complies with modern Kafka security standards, including PLAINTEXT, SASL_PLAINTEXT, SASL_SSL with SASL/PLAIN or SASL/SCRAM mechanisms, OAuth2, and client certificate authentication. This ensures secure data transmission aligned with corporate security policies.
The adapter is designed for high-performance streaming, offering batch processing, message filtering, and massive parallelization at record key level. Messages can be filtered based on headers, keys, or data content and grouped into optimized batches to improve throughput.
Parallel processing at key level, or even without ordering, enables extremely high throughput (10–500×) compared to standard Kafka consumers.
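The filter-then-batch step can be sketched generically (illustrative helper names, not the adapter's configuration): dropping unwanted records by header before batching reduces the load placed on connected backends.

```python
def filter_and_batch(records, header_filter=None, batch_size=100):
    """Drop records whose headers fail the filter, then group the
    remainder into fixed-size batches to reduce per-message overhead
    downstream."""
    kept = [r for r in records
            if header_filter is None
            or header_filter(r.get("headers", {}))]
    # Slice the surviving records into batches of at most batch_size.
    return [kept[i:i + batch_size]
            for i in range(0, len(kept), batch_size)]
```

In the adapter, each resulting batch would then be handed to a single Iflow execution instead of one execution per record.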
For reliable message processing, the adapter provides both automated and manual error handling. Common issues like serialization errors, data transformation errors, or Iflow processing errors can be forwarded to an error target or deduplicated, ensuring uninterrupted data flows.
Unlike the SAP standard Kafka adapter, the KaTe Kafka Adapter offers:
Processing from any stream position with custom group names
Full support for Avro, Protocol Buffers, JSON, and Schema Registries with automatic schema generation
Batch processing within a single Iflow execution
Advanced manual and automated error handling
Filtering and deduplication for optimized data quality and throughput.
KaTe GmbH is your trusted partner for SAP technology consulting and SAP migration projects.
We specialize in ensuring a smooth and secure transformation of existing SAP PO (Process Orchestration) landscapes to the modern SAP Integration Suite, including the SAP Edge Integration Cell for hybrid integration scenarios. With tailored concepts, proven best practices, and deep expertise, we make sure your SAP integration remains sustainable, high-performing, and future-ready.
SAP PO to SAP Integration Suite Migration – including analysis, architecture design, and technical implementation
SAP Edge Integration Cell Implementation and Optimization – enabling flexible and secure hybrid integrations
SAP BTP (Business Technology Platform) Consulting – leveraging modern services and tools for innovation and process automation
Technology-Driven Strategy Consulting – from cloud integration to API management
Contact us today for a personalized consultation and discover how we can modernize your SAP integration and system landscape.