Unified Event-Driven Architecture across AWS and Azure: Leveraging Kafka + Confluent for Real-Time Enterprise Intelligence
DOI: https://doi.org/10.63282/3050-9416.IJAIBDCMS-V6I4P119

Keywords: Event-Driven Architecture, Multi-Cloud, Kafka, Confluent, Cluster Linking, Stream Governance, Data Contracts, Disaster Recovery

Abstract
Modern enterprise intelligence demands a real-time, unified data plane that abstracts cloud-specific messaging and security complexities. This paper details the implementation of a Unified Event-Driven Architecture (UEDA) using Apache Kafka and Confluent Cluster Linking to bridge AWS and Azure environments. We analyze how this architecture overcomes multi-cloud fragmentation by enforcing centralized data contracts via the Schema Registry, aligning security policies through environment-scoped, centrally administered role-based access control (RBAC), and achieving high availability through offset-preserving cluster replication. Furthermore, we provide prescriptive guidance on cost optimization, focusing on strategies to minimize cloud egress charges, and analyze the critical tuning levers required to meet stringent Recovery Time Objective (RTO) and Recovery Point Objective (RPO) targets. The UEDA establishes a resilient, scalable foundation for complex cross-cloud streaming applications and for future integration with edge computing and AI/ML initiatives.
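To illustrate how offset-preserving replication supports the RTO/RPO targets discussed above, the sketch below shows a consumer group failing over from an AWS-hosted cluster to its Azure-hosted mirror. Because Cluster Linking replicates committed consumer offsets along with topic data, the same group.id can resume on the secondary cluster without reprocessing already-committed records. This is a minimal sketch using the standard Apache Kafka Java client; the bootstrap endpoints, topic name, group.id, and failover trigger are hypothetical placeholders rather than values taken from the paper.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class FailoverAwareConsumer {

    // Hypothetical endpoints for the primary (AWS) cluster and its Azure mirror.
    private static final String AWS_BOOTSTRAP   = "pkc-aws.us-east-1.example.cloud:9092";
    private static final String AZURE_BOOTSTRAP = "pkc-azure.eastus.example.cloud:9092";

    public static void main(String[] args) {
        // Start against the primary; switch to the mirror when operations declare failover.
        String bootstrap = failoverDeclared() ? AZURE_BOOTSTRAP : AWS_BOOTSTRAP;

        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrap);
        // The same group.id is reused on both clusters: the cluster link mirrors the
        // committed offsets, so the group resumes where it left off after failover.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "enterprise-intelligence-app");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Commit offsets explicitly so replicated offsets reflect only processed records.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        // If no committed offset exists (e.g. a brand-new topic), start from the earliest record.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders.events"));   // hypothetical mirrored topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    process(record);                         // application-specific handling
                }
                consumer.commitSync();                       // offsets carried over by the cluster link
            }
        }
    }

    private static boolean failoverDeclared() {
        // Placeholder: in practice this would consult a health check or runbook signal.
        return Boolean.parseBoolean(System.getenv().getOrDefault("FAILOVER_TO_AZURE", "false"));
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.printf("offset=%d key=%s value=%s%n", record.offset(), record.key(), record.value());
    }
}
```

In this sketch, only the bootstrap endpoint changes on failover; topics, schemas, and consumer offsets are already present on the mirror, which is what keeps the achievable RTO low and bounds the RPO by the link's replication lag.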