
KaTe Confluent Parallel Consumer Adapter for SAP Integration Suite

KaTe GmbH has developed an adapter for Apache Kafka by Confluent, built for maximum message throughput on SAP Integration Suite.

SAP integrations are great — until throughput becomes the bottleneck!

The Confluent Parallel Consumer Adapter for SAP Integration Suite (incl. Edge Integration Cell) enables massively parallel Kafka consumption for high-throughput scenarios.

Key highlights:
– Parallel processing (down to record-key level, optionally without ordering)
– Avro / Protobuf / JSON + Schema Registry support, incl. transformations and XML schema generation
– Fully traceable & debuggable in the iFlow context
– Modern Kafka security options (e.g., SASL, OAuth2, client certificates)

For more details: Confluent Parallel Consumer Adapter for SAP Integration Suite

You can find even more news on our social media channels:


Confluent Parallel Consumer Adapter for SAP Integration Suite


Apache Kafka is a popular open source project for processing event-driven data streams based on a transaction log.

The KaTe Confluent Parallel Consumer Adapter extends our proven Kafka Adapter for high-throughput-optimized scenarios on SAP Integration Suite or Edge Integration Cell (EIC).

The adapter inherits all functions of the KaTe Kafka adapter and adds parallel processing capabilities for high throughput.

Basic functions

The adapter allows you to publish iflow messages as Kafka records and subscribe to Kafka records as iflow messages.

Kafka key, headers and data sections are translated into iflow message parts and vice versa. Topics, partitions and technical reliability settings (ack mode / consumer commits) are customizable per use case.

Message processing is observable and debuggable at channel and message log level, fully integrated into SAP Cloud Integration, without third-party tooling.
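As an illustration of that translation, here is a minimal sketch in Python. The header names (`kafka.key`, `kafka.header.*`) are made up for the example and are not the adapter's actual property names:

```python
# Sketch: mapping a Kafka record's parts onto an iflow-message-like
# structure. Kafka headers and metadata become message headers, the
# record value becomes the message body. Names are illustrative only.

def record_to_iflow_message(record: dict) -> dict:
    """Translate a Kafka record (topic, offset, key, headers, value)
    into a dict with 'headers' and 'body', mirroring how an iflow
    message carries properties and a payload."""
    msg_headers = {f"kafka.header.{k}": v
                   for k, v in record.get("headers", {}).items()}
    msg_headers["kafka.key"] = record["key"]
    msg_headers["kafka.topic"] = record["topic"]
    msg_headers["kafka.offset"] = record["offset"]
    return {"headers": msg_headers, "body": record["value"]}

msg = record_to_iflow_message({
    "topic": "orders", "offset": 42, "key": "customer-1",
    "headers": {"source": "shop"}, "value": b'{"order": 7}',
})
```

The reverse direction (iflow message to Kafka record) follows the same mapping in mirror image.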

Work securely with company & compliance standards

Our product supports all commonly used security standards: PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM and OAuth2, as well as client certificate authentication for broker and schema registry connections.

Support for high performance, resilience & flow control

The adapter allows parallelization at record-key level instead of only at partition level, which enables 10-500x higher throughput on single records (and even more on batched records) without losing the functional order of related records.

Stay in control of what you process from a topic and how much load you put on your connected backends.

The product supports different methods to filter and control message throughput on topics:

  • Filtering: Kafka records can be filtered based on record headers, keys and data content before triggering an integration
  • Batching: a set of n Kafka records can be batched as a single iflow message to optimize throughput
  • Parallelization: both the mode (no ordering, ordering at key level) and the number of threads processing records in parallel can be adjusted
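The core idea of key-level parallelization can be sketched in a few lines of Python: records that share a key are processed strictly in order, while different keys run concurrently. This is an illustrative sketch of the concept, not the adapter's implementation:

```python
# Sketch: key-level ordered parallelism. Records with the same key stay
# in order; distinct keys are processed concurrently by a worker pool.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor
import threading

def process_in_parallel(records, handler, max_workers=8):
    """records: iterable of (key, value) in arrival order.
    handler(key, value) is called in order per key; keys run in parallel."""
    per_key = defaultdict(list)
    for key, value in records:          # group, preserving arrival order
        per_key[key].append(value)

    def run_key(key):
        for value in per_key[key]:      # strict order within one key
            handler(key, value)

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        list(pool.map(run_key, per_key))

# Usage: collect results thread-safely to observe per-key ordering
results = defaultdict(list)
lock = threading.Lock()

def handler(key, value):
    with lock:
        results[key].append(value)

process_in_parallel([("a", 1), ("b", 10), ("a", 2), ("b", 20)], handler)
```

With only partition-level parallelism, all four records on one partition would be processed by a single thread; here the two keys proceed independently while each key's records keep their order.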

Full control of operations in one tool

  • Process records from start or end of stream or from any arbitrary position of the stream.
  • Process and re-process records
  • Deduplicate if necessary by offset or functional headers
  • Group names are freely definable and not set by us as vendor
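Deduplication by offset or by a functional header can be sketched as follows. The header name `doc-id` is a hypothetical example of a functional header, not a name the adapter prescribes:

```python
# Sketch: drop records whose offset (or functional header value, e.g. a
# business document id) has already been seen in this run.

def deduplicate(records, by="offset"):
    """records: list of dicts with 'offset' and 'headers'.
    by='offset' dedupes on the Kafka offset, anything else on the
    illustrative 'doc-id' functional header."""
    seen = set()
    unique = []
    for rec in records:
        marker = rec["offset"] if by == "offset" else rec["headers"].get("doc-id")
        if marker in seen:
            continue                    # duplicate: skip it
        seen.add(marker)
        unique.append(rec)
    return unique
```

In practice the seen-set would be bounded (e.g. per consumer group and time window) rather than unbounded as in this sketch.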

Automated or manual error resolution

One common "challenge" with ordered streaming systems like Kafka is that errors can bring processing to a total halt when order is important. Our product gives you manual as well as automated ways to handle errors, e.g. by forwarding a record to an error destination or by documenting & ignoring certain cases.

This covers serialization errors, data transformation errors and any downstream processing error (processing through an iflow).
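The "forward or document & ignore" idea can be sketched like this; the policy names and the error-destination list are illustrative, not the adapter's configuration options:

```python
# Sketch: keep the stream moving on errors by either forwarding a failing
# record to an error destination or documenting and dropping it.

def consume_with_error_policy(records, handler, policy="forward"):
    """policy='forward' collects failed records with the error reason;
    policy='ignore' documents the failure and continues."""
    error_destination, processed = [], []
    for rec in records:
        try:
            processed.append(handler(rec))
        except Exception as exc:
            if policy == "forward":
                error_destination.append((rec, str(exc)))
            # policy='ignore': the error would be logged here and the
            # record skipped, so later records are not blocked
    return processed, error_destination
```

The key point is that a single failing record no longer halts all subsequent records, which is exactly the problem ordered streams otherwise suffer from.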

Avro, Protocol Buffers, Json & schema registries

Our product supports typical binary data formats like Avro and Protobuf, together with Kafka-typical optimizations like a schema registry.

The adapter can translate Avro and Protocol Buffers to XML and vice versa. Necessary translations of Avro/Protobuf schemas are generated with an included schema generation tool.

Transformation from binary (Avro/Protocol Buffers) and JSON to XML and vice versa is supported.
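As a minimal sketch of the JSON-to-XML direction using only the Python standard library (real Avro/Protobuf handling additionally involves the generated schemas mentioned above; element names here are simply derived from JSON keys):

```python
# Sketch: turn a flat JSON payload into an XML document, deriving element
# names from the JSON keys. Illustrative only; not the adapter's mapping.
import json
import xml.etree.ElementTree as ET

def json_to_xml(payload: bytes, root_tag: str = "record") -> bytes:
    root = ET.Element(root_tag)
    for key, value in json.loads(payload).items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root)

xml = json_to_xml(b'{"orderId": 7, "customer": "ACME"}')
# b'<record><orderId>7</orderId><customer>ACME</customer></record>'
```

Nested structures, attributes and namespaces would need a schema-driven mapping, which is what the included schema generation tool addresses.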

SAP roadmap compatible

Our Kafka Adapter is available for SAP Process Orchestration & Integration Suite or EIC (Edge integration cell).

This allows you to simply import any developed interface; it will work without the migration effort, adjustments or refactoring that is often required with other adapters or vendors (including SAP).

Choose your own timing and roadmap for a possible migration. Your investment in integrations on SAP PO is easily transferable to Integration Suite without unknown risks or commercial questions.

Difference to standard adapter for Cloud integration

  • 10-500x higher throughput than the standard adapter, since parallelization and batching work on functional levels instead of being tied to partitions
  • The standard adapter can't use arbitrary stream positions or group names for processing / reprocessing, a very important feature for actually operating with Kafka
  • Our product supports all data formats (Avro, Protocol Buffers and schema registries, automated schema generation)
  • Our adapter allows batching into one iflow execution for high throughput, unlike the standard
  • The KaTe Parallel Consumer adapter allows manual as well as automated error resolution, which is not the case in the standard version
  • Filtering & deduplication let you select exactly what you need on high-throughput topics, unlike the standard
  • With the standard adapter there are no simple mechanisms to quickly inspect consumer lag or search records in your message processing log; this must be done with third-party tools (which may be hard to access for an SAP integration team) or manual developments

Summary

Feature | SAP Standard Kafka Adapter | Confluent Parallel Consumer Adapter
Parallelism model | Limited by number of Kafka partitions | Configurable worker pool enabling high parallelism within a single consumer
Ordering guarantees | Ordering guaranteed only per partition | Ordering guaranteed per message key (optional), with parallel execution across keys
Throughput scaling | Requires increasing partition count to scale | Scales without increasing partitions, enabling 10-500x higher throughput
Iflow execution | Blocking calls can stall message processing partly or fully | Non-blocking, asynchronous processing
Automated error handling | Not supported; some errors are not even visible (deserialization) | Errors from deserialization, data transformation or iflow execution can be handled automatically per error category: log and ignore, or forward to error / dead-letter topics
Offset / group name control | Not supported | Any start offset possible, reprocessing of arbitrary offsets possible, group names manually configurable
Message acknowledgment | Offset commits at partition level | Fine-grained, message-level acknowledgment and processing control
Advanced monitoring | Not supported | Rich monitoring experience (consumer lag, 1:1 detection of which Kafka record produced which iflow message) out of the box
Filtering | Not supported, only manually in the iflow | Filtering per header or body to decrease unnecessary load
Avro/Protobuf and JSON with schema registry | Not supported | Supported, incl. data transformation and schema generation to XML

Further Information:

SAP Appcenter listing:

KaTe Kafka Adapter for SAP PO in the SAP Appcenter

Our Blog on SAP Community

Hook your SAP landscape to Azure Event Hubs with KaTe Kafka Adapter & SAP PO

Pay what you use

All of our adapters are subscription licensed, which simplifies use at the project level or in limited use cases.

30 days trial

All our adapters are available for a 30-day trial period.

Verified SAP Partner

All our adapters are developed in close cooperation with SAP and our verified SAP experts.

FAQ

feel free to
ask away!

The Confluent Parallel Consumer Adapter for SAP Integration Suite is a powerful extension of the KaTe Kafka adapter for connecting SAP systems with Apache Kafka under extremely high throughput requirements. Kafka and SAP can be connected both on-premises and in the SAP Business Technology Platform (BTP). It enables seamless processing of event-driven data streams and bi-directional message exchange between SAP Cloud Integration iflows and Kafka topics at unprecedented scale. As the successor of the Kafka Adapter for SAP Process Orchestration, it provides 1:1 compatibility for migrating existing SAP-Kafka integrations from PO.

The adapter allows you to publish SAP Cloud Integration Iflow messages as Kafka records and subscribe to Kafka records as Iflow messages. Kafka keys, headers, and data are translated into Iflow messages and vice versa.

Unlike typical Kafka consumers, the adapter's ordered processing is not limited to partitions: it can scale down to record-key level, or ordering can be dropped entirely, which enables extremely high throughput.


In addition, it fully supports popular data formats like Avro, Protocol Buffers, and JSON, including Schema Registry integration and automatic schema generation for seamless conversion to and from XML. Topics, partitions, and reliability settings (e.g., acknowledgement mode, consumer commits) are highly configurable. Additionally, message processing is fully integrated into SAP Cloud Integration, making it fully traceable and debuggable without third-party tools.

The Confluent Parallel Consumer Adapter for SAP Integration Suite complies with modern Kafka security standards, including PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM, OAuth2 and client certificate authentication. This ensures secure data transmission aligned with corporate security policies.

The adapter is designed for high-performance streaming, offering batch processing, message filtering, and massive parallelization on record key level. Messages can be filtered based on headers, keys, or data content and grouped into optimized batches to improve throughput.

Parallel processing options at key level, or even with no ordering, enable extremely high throughput (10-500x) compared to standard Kafka consumers.

For reliable message processing, the adapter provides both automated and manual error handling. Common issues like serialization errors, data transformation errors, or Iflow processing errors can be forwarded to an error target or deduplicated, ensuring uninterrupted data flows.

Unlike the SAP standard Kafka adapter, the KaTe Kafka Adapter offers:

  • Extremely high throughput (10-500x compared to the SAP standard Kafka adapter) through parallelization options
  • Processing from any stream position with custom group names

  • Full support for Avro, Protocol Buffers, JSON, and Schema Registries with automatic schema generation

  • Batch processing within a single Iflow execution

  • Advanced manual and automated error handling

  • Filtering and deduplication for optimized data quality and throughput.

SAP Technology and Migration Consulting –
Future-Proof Integration with KaTe GmbH


KaTe GmbH is your trusted partner for
SAP technology consulting and SAP migration projects.

We specialize in ensuring a smooth and secure transformation of existing SAP PO (Process Orchestration) landscapes to the modern SAP Integration Suite, including the SAP Edge Integration Cell for hybrid integration scenarios. With tailored concepts, proven best practices, and deep expertise, we make sure your SAP integration remains sustainable, high-performing, and future-ready.

Our Core Expertise in SAP Technology Consulting
  • SAP PO to SAP Integration Suite Migration – including analysis, architecture design, and technical implementation

  • SAP Edge Integration Cell Implementation and Optimization – enabling flexible and secure hybrid integrations

  • SAP BTP (Business Technology Platform) Consulting – leveraging modern services and tools for innovation and process automation

  • Technology-Driven Strategy Consulting – from cloud integration to API management

Contact us today for a personalized consultation and discover how we can modernize your SAP integration and system landscape.

Contact us - now!

Get in touch with us by calling or emailing us - we'll be happy to help!


Kafka Adapter for SAP Integration Suite


Apache Kafka is a popular open source project for processing event-driven data streams based on a transaction log.

Our adapter enables you to connect your SAP system landscape to Kafka on premise or on SAP BTP in a simple way that is proven in real-world use cases. The product is the logical successor of our Kafka adapter for SAP Process Orchestration.

Basic functions

The adapter allows you to publish iflow messages as Kafka records and subscribe to Kafka records as iflow messages.

Kafka key, headers and data sections are translated into iflow message parts and vice versa. Topics, partitions and technical reliability settings (ack mode / consumer commits) are customizable per use case.

Message processing is observable and debuggable at channel and message log level, fully integrated into SAP Cloud Integration, without third-party tooling.

Work securely with company & compliance standards

Our product supports all commonly used security standards: PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM and OAuth2, as well as client certificate authentication for broker and schema registry connections.

Support for high performance, resilience & flow control

Stay in control of what you process from a topic and how much load you put on your connected backends.

The product supports different methods to filter and control message throughput on topics:

  • Filtering: Kafka records can be filtered based on record headers, keys and data content before triggering an integration
  • Batching: a set of n Kafka records can be batched as a single iflow message to optimize throughput
  • Polling and record count per time unit can be adjusted to control the exact number of messages per period of time
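The record-count-per-time-unit idea amounts to a budget that refills every window. A minimal sketch, with illustrative numbers rather than the adapter's actual settings:

```python
# Sketch: release at most `max_records` per `window_seconds`, the idea
# behind controlling message volume per period of time.
import time

class RecordBudget:
    """Allow at most max_records per rolling window of window_seconds."""
    def __init__(self, max_records: int, window_seconds: float):
        self.max_records = max_records
        self.window = window_seconds
        self.window_start = time.monotonic()
        self.count = 0

    def try_take(self) -> bool:
        now = time.monotonic()
        if now - self.window_start >= self.window:
            # new window: reset the budget
            self.window_start, self.count = now, 0
        if self.count < self.max_records:
            self.count += 1
            return True
        return False                    # budget exhausted for this window

budget = RecordBudget(max_records=2, window_seconds=60)
taken = [budget.try_take() for _ in range(3)]  # [True, True, False]
```

A consumer would simply skip polling (or pause the partition) while `try_take()` returns False, shielding the backend from load spikes.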

Full control of operations in one tool

  • Process records from start or end of stream or from any arbitrary position of the stream.
  • Process and re-process records
  • Deduplicate if necessary by offset or functional headers
  • Group names are freely definable and not set by us as vendor

Automated or manual error resolution

One common "challenge" with ordered streaming systems like Kafka is that errors can bring processing to a total halt when order is important. Our product gives you manual as well as automated ways to handle errors, e.g. by forwarding a record to an error destination or by documenting & ignoring certain cases.

This covers serialization errors, data transformation errors and any downstream processing error (processing through an iflow).

Avro, Protocol Buffers, Json & schema registries

Our product supports typical binary data formats like Avro and Protobuf, together with Kafka-typical optimizations like a schema registry.

The adapter can translate Avro and Protocol Buffers to XML and vice versa. Necessary translations of Avro/Protobuf schemas are generated with an included schema generation tool.

Transformation from binary (Avro/Protocol Buffers) and JSON to XML and vice versa is supported.

SAP roadmap compatible

Our Kafka Adapter is available for SAP Process Orchestration & Integration Suite.

This allows you to simply import any developed interface; it will work without the migration effort, adjustments or refactoring that is often required with other adapters or vendors (including SAP).

Choose your own timing and roadmap for a possible migration. Your investment in integrations on SAP PO is easily transferable to Integration Suite without unknown risks or commercial questions.

Difference to standard adapter for Cloud integration

  • The standard adapter can't use arbitrary stream positions or group names for processing / reprocessing, a very important feature for actually operating with Kafka
  • Our product supports all data formats (Avro, Protocol Buffers and schema registries, automated schema generation)
  • Our adapter allows batching into one iflow execution for high throughput, unlike the standard
  • The KaTe Kafka adapter allows manual as well as automated error resolution, which is not the case in the standard version
  • Filtering & deduplication let you select exactly what you need on high-throughput topics, unlike the standard
  • With the standard adapter there are no simple mechanisms to quickly inspect consumer lag or search records in your message processing log; this must be done with third-party tools (which may be hard to access for an SAP integration team) or manual developments

Further Information:

SAP Appcenter listing:

KaTe Kafka Adapter for SAP PO in the SAP Appcenter

Our Blog on SAP Community

Hook your SAP landscape to Azure Event Hubs with KaTe Kafka Adapter & SAP PO

Pay what you use

All of our adapters are subscription licensed, which simplifies use at the project level or in limited use cases.

30 days trial

All our adapters are available for a 30-day trial period.

Verified SAP Partner

All our adapters are developed in close cooperation with SAP and our verified SAP experts.

FAQ

feel free to
ask away!

The Kafka Adapter for SAP Integration Suite is a powerful solution to connect SAP systems with Apache Kafka – both on-premises and in the SAP Business Technology Platform (BTP). It enables seamless processing of event-driven data streams and bi-directional message exchange between SAP Cloud Integration Iflows and Kafka Topics. As the successor of the Kafka Adapter for SAP Process Orchestration, it provides modern features for reliable SAP-Kafka integration.

The adapter allows you to publish SAP Cloud Integration Iflow messages as Kafka records and subscribe to Kafka records as Iflow messages. Kafka keys, headers, and data are translated into Iflow messages and vice versa.
It fully supports popular data formats like Avro, Protocol Buffers, and JSON, including Schema Registry integration and automatic schema generation for seamless conversion to and from XML. Topics, partitions, and reliability settings (e.g., acknowledgement mode, consumer commits) are highly configurable. Additionally, message processing is fully integrated into SAP Cloud Integration, making it fully traceable and debuggable without third-party tools.

The Kafka Adapter for SAP Integration Suite complies with modern Kafka security standards, including PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM, and client certificate authentication. This ensures secure data transmission aligned with corporate security policies.

The adapter is designed for high-performance streaming, offering batch processing, message filtering, and flow control. Messages can be filtered based on headers, keys, or data content and grouped into optimized batches to improve throughput.
For reliable message processing, the adapter provides both automated and manual error handling. Common issues like serialization errors, data transformation errors, or Iflow processing errors can be forwarded to an error target or deduplicated, ensuring uninterrupted data flows.

Unlike the SAP standard Kafka adapter, the KaTe Kafka Adapter offers:

  • Processing from any stream position with custom group names

  • Full support for Avro, Protocol Buffers, JSON, and Schema Registries with automatic schema generation

  • Batch processing within a single Iflow execution

  • Advanced manual and automated error handling

  • Filtering and deduplication for optimized data quality and throughput.

SAP Technology and Migration Consulting –
Future-Proof Integration with KaTe GmbH


KaTe GmbH is your trusted partner for
SAP technology consulting and SAP migration projects.

We specialize in ensuring a smooth and secure transformation of existing SAP PO (Process Orchestration) landscapes to the modern SAP Integration Suite, including the SAP Edge Integration Cell for hybrid integration scenarios. With tailored concepts, proven best practices, and deep expertise, we make sure your SAP integration remains sustainable, high-performing, and future-ready.

Our Core Expertise in SAP Technology Consulting
  • SAP PO to SAP Integration Suite Migration – including analysis, architecture design, and technical implementation

  • SAP Edge Integration Cell Implementation and Optimization – enabling flexible and secure hybrid integrations

  • SAP BTP (Business Technology Platform) Consulting – leveraging modern services and tools for innovation and process automation

  • Technology-Driven Strategy Consulting – from cloud integration to API management

Contact us today for a personalized consultation and discover how we can modernize your SAP integration and system landscape.

Contact us - now!

Get in touch with us by calling or emailing us - we'll be happy to help!


Kafka Adapter for SAP PO


Apache Kafka is a popular open source project for processing event-driven data streams based on a transaction log.

Our product enables you to connect your SAP system landscape to Kafka on premise and on SAP BTP in a simple way that is proven in real-world use cases.

Basic functions

The adapter allows you to publish PO messages as Kafka records and subscribe to Kafka records as PO messages.

Kafka key, headers and data sections are translated into PO message parts and vice versa. Topics, partitions and technical reliability settings (ack mode / consumer commits) are customizable per use case.

Message processing is observable and debuggable at channel and message log level, fully integrated into SAP PO/NetWeaver, without third-party tooling.

Work securely with company & compliance standards

Our product supports all commonly used security standards: PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM, OAUTHBEARER and Kerberos, as well as client certificate authentication.

Support for high performance, resilience & flow control

Stay in control of what you process from a topic and how much load you put on your connected backends.

The product supports different methods to filter and control message throughput on topics:

  • Filtering: Kafka records can be filtered based on record headers, keys and data content before triggering an integration
  • Batching: a set of n Kafka records can be batched as a single PO message to optimize throughput
  • Polling and record count per time unit can be adjusted to control the exact number of messages per period of time

Full control of operations in one tool

  • Process records from start or end of stream or from any arbitrary position of the stream.
  • Process and re-process records
  • Deduplicate if necessary by offset or functional headers
  • Group names are freely definable and not set by us as vendor
  • Consumer lag of each channel is displayed per sender channel without third-party tooling
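Consumer lag itself is a simple quantity: the distance between each partition's latest offset and the consumer's committed offset, summed over partitions. A sketch for illustration:

```python
# Sketch: computing consumer lag from per-partition offsets. The offset
# values below are made-up sample data, not from a real broker.

def consumer_lag(end_offsets: dict, committed: dict) -> int:
    """end_offsets and committed map partition number -> offset.
    A partition with no commit yet counts from offset 0."""
    return sum(end_offsets[p] - committed.get(p, 0) for p in end_offsets)

lag = consumer_lag(end_offsets={0: 100, 1: 50},
                   committed={0: 90, 1: 50})  # 10
```

Showing this number per sender channel is what lets an integration team spot a backlog without reaching for separate Kafka tooling.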

Automated or manual error resolution

One common "challenge" with ordered streaming systems like Kafka is that errors can bring processing to a total halt when order is important. Our product gives you manual as well as automated ways to handle errors, e.g. by forwarding a record to an error destination or by documenting & ignoring certain cases.

This covers serialization errors, data transformation errors and any downstream processing error (processing through PO).

Ordering on key or header level

Our product allows high performance while maintaining record order to backend systems. SAP PO uses EOIO (Exactly Once In Order) for record ordering, while Kafka topics order records within a partition.

Our adapter allows further fine-grained ordering within the partition at key/header level of a record. This ensures that, for example, each customer record gets its own order sequence when processed through SAP PO. If one record has an error, only that particular id is affected and processing of all other partition contents continues.

Avro, Protocol Buffers, Json & schema registries

Our product supports typical binary data formats like Avro and Protobuf, together with Kafka-typical optimizations like a schema registry.

The adapter can translate Avro and Protocol Buffers to XML and vice versa. Necessary translations of Avro/Protobuf schemas are generated with an included schema generation tool.

Transformation from binary (Avro/Protocol Buffers) and JSON to XML and vice versa is supported.

SAP roadmap compatible

Our Kafka Adapter is available for SAP Process Orchestration & Integration Suite.

This allows you to simply import any developed interface; it will work without the migration effort, adjustments or refactoring that is often required with other adapters or vendors (including SAP).

Choose your own timing and roadmap for a possible migration. Your investment in integrations on SAP PO is easily transferable to Integration Suite without unknown risks or commercial questions.

Further Information:

SAP Appcenter listing:

KaTe Kafka Adapter for SAP PO in the SAP Appcenter

Our Blog on SAP Community

Hook your SAP landscape to Azure Event Hubs with KaTe Kafka Adapter & SAP PO

Pay what you use

All of our adapters are subscription licensed, which simplifies use at the project level or in limited use cases.

30 days trial

All our adapters are available for a 30-day trial period.

Verified SAP Partner

All our adapters are developed in close cooperation with SAP and our verified SAP experts.

FAQ

feel free to
ask away!

The Kafka Adapter for SAP Process Orchestration (SAP PO) enables seamless integration of your SAP system landscape with Apache Kafka, both on-premise and within the SAP Business Technology Platform (BTP). It streamlines the exchange of PO messages with Kafka records, including bidirectional conversion of Kafka keys, headers, and data into SAP PO messages.
This allows businesses to integrate event-driven data streams efficiently into their SAP processes, ensuring real-time data processing and modern connectivity.

Yes, our Kafka Adapter for SAP PO is designed with top-level security in mind. It supports all major Kafka security standards, including PLAIN, SASL_PLAINTEXT, SASL_SSL, SASL/PLAIN, SASL/SCRAM, OAUTHBEARER, Kerberos, and client certificate authentication.
This ensures compliance with enterprise security policies and guarantees secure data exchange between systems.

The Kafka Adapter provides advanced features such as batch processing, message filtering by header, key, and payload content, and the ability to adjust polling intervals and message volume per time unit.
These capabilities allow for fine-grained control of message throughput, improving performance and efficiency without overloading your backend systems.

Yes, our adapter fully supports binary data formats such as Avro and Protocol Buffers (Protobuf), as well as JSON. It enables seamless conversion of these formats to XML and vice versa.
With an integrated schema registry and schema generation tool, Avro and Protobuf schemas can be easily transformed, making the adapter ideal for complex integration scenarios.

Our Kafka Adapter is fully SAP roadmap-compliant, designed to work with both SAP Process Orchestration (SAP PO) and the SAP Integration Suite.
This means existing interfaces can be migrated without major redevelopment, allowing companies to plan their migration strategy flexibly, with minimal risk and no unexpected costs.

SAP Technology and Migration Consulting –
Future-Proof Integration with KaTe GmbH


KaTe GmbH is your trusted partner for
SAP technology consulting and SAP migration projects.

We specialize in ensuring a smooth and secure transformation of existing SAP PO (Process Orchestration) landscapes to the modern SAP Integration Suite, including the SAP Edge Integration Cell for hybrid integration scenarios. With tailored concepts, proven best practices, and deep expertise, we make sure your SAP integration remains sustainable, high-performing, and future-ready.

Our Core Expertise in SAP Technology Consulting
  • SAP PO to SAP Integration Suite Migration – including analysis, architecture design, and technical implementation

  • SAP Edge Integration Cell Implementation and Optimization – enabling flexible and secure hybrid integrations

  • SAP BTP (Business Technology Platform) Consulting – leveraging modern services and tools for innovation and process automation

  • Technology-Driven Strategy Consulting – from cloud integration to API management

Contact us today for a personalized consultation and discover how we can modernize your SAP integration and system landscape.

Contact us - now!

Get in touch with us by calling or emailing us - we'll be happy to help!
