Streams of events are changing the integration paradigm. Companies need their systems to run without glitches.


Companies today often need their data to be constantly in motion to maintain quality of service and keep online applications running smoothly. This is one reason event-driven integration is on the rise in IT: backed by a quality streaming platform, it enables efficient exchange of information between systems. How exactly does this concept help companies? In the banking environment, we are seeing increasing demand for the digitisation of every agenda and service. With it, the number of systems, and the volume of data exchanged between them, keeps growing. At the same time, clients expect their products and services to be available at all times, with immediate responses to their requests.


Events carry information

Alongside long-standing trends in integration such as decentralization, containerization, the use of API management and the drive to create smaller and more flexible solutions, the event stream, or event streaming, is another topic that needs attention. It is an architectural and technological concept that helps address the growing demands in the integration sphere, especially in the area of event-driven integration. Many Czech banks are already using event streams as a critical integration tool.

In this integration scheme, an event is defined as a description of a current state of affairs (a fact) - for example, a customer address or an account balance. However, it also includes notifications of a change in state, such as a change of customer address or notification of exceeding the allowed limit, as well as various instructions for action within a given system.
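To make the distinction concrete, the three kinds of events might be modelled like this in a minimal Java sketch; the event names and fields are illustrative assumptions, not taken from any particular system:

```java
// Illustrative event payloads for the three kinds of events described above;
// all names and fields are hypothetical (Java 16+ records).

// A fact: a description of the current state of something.
record AccountBalance(String accountId, double balance) {}

// A state-change notification: something has just changed.
record LimitExceeded(String accountId, double limit, double balance) {}

// An instruction for action within a given system.
record SendAlert(String customerId, String message) {}
```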

How event streams are handled

One way to manage and process events efficiently is by engaging a data streaming platform. The predominant technology in this regard has become Apache Kafka, an open-source distributed streaming platform that can be used to create applications and feeds for real-time streaming data.
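For illustration, publishing an event to Kafka takes only a few lines with the standard Java client; the broker address, topic name and payload below are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Keying by customer ID keeps all events for one customer in the
            // same partition, preserving their order.
            producer.send(new ProducerRecord<>("customer-events", "customer-42",
                    "{\"type\":\"AddressChanged\",\"newCity\":\"Prague\"}"));
        } // closing the producer flushes any pending sends
    }
}
```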

Apache Kafka was created by developers at LinkedIn, who needed a tool for fast, secure and scalable transfer of large amounts of data across the globe. The integration tools and practices of the time failed to adequately meet these requirements. Today, Kafka is used in many industries, in companies ranging from mid-sized firms to global corporations.

"When you use this streaming platform as a backbone technology for integration, you can implement service and microservice systems that exchange events in real time while creating new events, for example in response to what the end user is currently doing in the application," explains Petr Dlouhý, who manages integration competence at Trask.

"In practice, for example, an event triggered by adding an item to the cart in an e-shop can lead to the creation of updated cart contents, a new corresponding price and an updated stock level. Kafka also allows events to be handled in failover mode 24/7. Events are stored securely with their data in the Kafka cluster and are available to many applications and services," he adds.
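A rough sketch of that cart scenario with the Java client might look as follows. The topic names and payloads are purely illustrative, and a real service would parse the incoming event and compute the new state rather than pass the raw value through:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class CartEventHandler {
    public static void main(String[] args) {
        Properties cProps = new Properties();
        cProps.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        cProps.put("group.id", "cart-processor");          // illustrative group
        cProps.put("key.deserializer", StringDeserializer.class.getName());
        cProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties pProps = new Properties();
        pProps.put("bootstrap.servers", "localhost:9092");
        pProps.put("key.serializer", StringSerializer.class.getName());
        pProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
            consumer.subscribe(List.of("cart-item-added")); // illustrative topic
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                    // One incoming event fans out into several follow-up events,
                    // each consumed independently by a different service.
                    producer.send(new ProducerRecord<>("cart-updated", rec.key(), rec.value()));
                    producer.send(new ProducerRecord<>("price-recalculated", rec.key(), rec.value()));
                    producer.send(new ProducerRecord<>("stock-reserved", rec.key(), rec.value()));
                }
            }
        }
    }
}
```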

That is key these days: everything has become faster, and clients now expect services to run without outages, to scale, and to handle their requests right away.

Meeting this requirement is possible because all communication between systems in Kafka is asynchronous. This means the systems are loosely coupled and independent of one another. The benefits of this approach include the ability to easily update and scale the systems and to operate them in many replicas around the world, handling an ever-increasing number of client requests.
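To show what this asynchrony looks like in code, here is a minimal sketch: send() returns immediately, and the producing service never waits for the consuming systems, only learning of the broker acknowledgement later through a callback. Broker address, topic and payload are again placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class AsyncSend {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // send() is non-blocking: the caller continues immediately and the
            // broker acknowledgement arrives later via this callback.
            producer.send(new ProducerRecord<>("customer-events", "customer-42", "..."),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace(); // hand off to retry/alerting
                        } else {
                            System.out.printf("stored at %s-%d@%d%n",
                                    metadata.topic(), metadata.partition(), metadata.offset());
                        }
                    });
        }
    }
}
```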

However, it is important to begin the whole project with a data analysis, the selection of the integration patterns Kafka allows, and the proper definition of transmission channels (topics) in Kafka together with their schemas. It is also essential to handle the errors that may occur when producing and reading messages - as always, proper design is crucial for success.
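As one example of such design work, topics and their parameters can be created explicitly through Kafka's admin client; the partition and replication settings below are illustrative assumptions, not recommendations:

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicSetup {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Partition count bounds consumer parallelism; replication factor
            // determines how many broker failures the topic can survive.
            NewTopic topic = new NewTopic("customer-events", 6, (short) 3);
            admin.createTopics(List.of(topic)).all().get();
        }
    }
}
```

Produce-side errors can then be caught in the send callback shown earlier, while read-side errors are typically handled around poll() and deserialization.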

When does an event-driven architecture come into play?  

Using Kafka makes sense wherever there is a need to process huge amounts of data in real time and without delay. Typically, event-driven integration thrives in environments that use microservice architecture, which it complements well in terms of scalability, fault tolerance, and agility.

Typical examples we encounter and help our clients address are log processing (application and audit), user activity monitoring and online evaluation of client offerings, or "classic" application integration. Especially in the last case, it is necessary to design the right approach to event distribution and consumption, as event-streaming for existing systems can be more complex to implement than other types of integration.  

The reward for proper implementation is huge potential for the future: the ability to respond more quickly to new client requirements (often without the need for downtime), to scale effectively as the client base grows, and to ensure stable, consistently available services for customers.
