Originally developed at LinkedIn, Apache Kafka is one of the most mature platforms for event streaming. Kafka is used for high-performance data pipelines, streaming analytics, data integration, and ...
IBM's $11B Confluent acquisition completes its hybrid cloud stack, with Kafka streaming joining Red Hat and HashiCorp for enterprise AI infrastructure.
Australian insurance group IAG started using Apache Kafka around 2017, as the rise of streaming data made a strong case for it within the company. The first opportunity to use it was in support of a mobile ...
The Register on MSN
IBM drops $11B on Confluent to feed next-gen AI ambitions
Big Blue’s latest mega-buy hands it a real-time data-streaming powerhouse built on Kafka. IBM has cracked open its wallet again, agreeing to shell out $11 billion for Confluent in a bid to glue ...
IBM is reportedly closing in on an $11 billion acquisition of Confluent, a move that would mark one of the company’s largest software deals in years and signal a renewed push to strengthen its ...
Confluent connects data sources and cleans up data. It built its service on Apache Kafka, an open-source distributed event streaming platform, sparing its customers the hassle of buying and managing ...
IBM wants to buy Confluent for a double-digit billion sum. Big Blue aims to reliably link analytics applications and AI agents with data streams.
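The reports above describe Kafka as a distributed event-streaming platform that links producers of data with analytics applications and AI agents. The core idea can be sketched in a few lines: a topic is a set of append-only partition logs, and each consumer tracks its own read offset, so many consumers can independently replay the same stream. This is a hypothetical in-memory illustration of that model, not the Apache Kafka client API; the class and method names are invented for the example.

```python
# Hypothetical sketch of Kafka's core abstraction (not the real client API):
# a topic holds append-only partition logs; consumers track per-partition offsets.

class Topic:
    def __init__(self, partitions=3):
        # Each partition is an ordered, append-only list of records.
        self.partitions = [[] for _ in range(partitions)]

    def produce(self, key, value):
        # Records with the same key land in the same partition, which is how
        # Kafka's default partitioning preserves per-key ordering.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

class Consumer:
    def __init__(self, topic):
        self.topic = topic
        # One offset per partition; reading never deletes records, so other
        # consumers (or a replay from offset 0) see the same history.
        self.offsets = [0] * len(topic.partitions)

    def poll(self):
        batch = []
        for p, log in enumerate(self.topic.partitions):
            while self.offsets[p] < len(log):
                batch.append(log[self.offsets[p]])
                self.offsets[p] += 1
        return batch

events = Topic()
events.produce("claim-42", "opened")
events.produce("claim-42", "approved")
events.produce("claim-7", "opened")

c = Consumer(events)
print(c.poll())   # all three records; per-key order is preserved
print(c.poll())   # nothing new since the last poll
```

The decoupling shown here, where producers append and consumers pull at their own pace, is what the coverage means by "reliably link analytics applications and AI agents with data streams": slow or restarted consumers simply resume from their stored offset rather than losing events.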