Raiffeisenbank now runs more efficient logging thanks to Trask, resulting in a faster, more secure environment


If you're a big bank, you need a complete overview of what is happening in your environment at all times. You also need solid analytics to guide your next steps in the right direction. And that's exactly what we delivered to Raiffeisenbank.

Key Business Benefits and Takeaways

  • The bank now uses one location for logs instead of having data fragmented across multiple locations.
  • Individual data points can also be visualized as graphs, adding further convenience.
  • The environment scaled from 3 servers to 20 without any problem, adapting to the increasing workload.
  • Storage amounts to 200+ TB of online data and 2 PB of offline data (which can be pulled on demand within minutes).
  • The bank's environment is now more secure, with no performance issues and a low risk of outages.
  • All improvements were made in line with a demanding compliance policy.

Enterprises today face challenges posed by a plethora of scattered systems, servers, and pods within an organization. Without a central point of control and audit, it becomes increasingly difficult to gain a comprehensive understanding of the system's health and performance. This lack of visibility hampers efficient alerting and analysis, making it challenging to identify and respond to issues in a timely manner. Organizations must also consolidate log data from various sources.

Raiffeisenbank, a member of the Austrian Raiffeisen Group, is an important banking institution that provides a wide range of banking services to private and corporate clientele in Czechia. Logs scattered across numerous places in the bank posed significant challenges. Parsing these log files was time-consuming and inefficient, delaying the identification of critical information. This inhibited proactive monitoring, as the delay in log parsing prevented real-time analysis of system activities and increased the risk of issues going unnoticed, leading to potential downtime, security breaches, or performance bottlenecks.

[.infobox][.infobox-heading]Main pain points Raiffeisenbank faced[.infobox-heading]- An incomplete view of the IT infrastructure made it challenging to detect and respond to incidents and problems.
- Data from various systems and applications were stored in separate silos.
- Operational logging ran on several servers, making analytics difficult and tedious.
- Resolving issues was time-consuming, error-prone, and inefficient.
- Due to the infrastructure's characteristics, there was a high risk of failing to meet regulatory compliance requirements and to address security vulnerabilities promptly.[.infobox]

Our approach was to leverage central logging for integration

We built analytics for reporting, central logging, operational supervision, and business monitoring. We implemented central logging (including log transfer), consolidated log data from various sources, and enabled fully fledged tracing capabilities.

  1. OpenSearch, ELK (Elasticsearch, Logstash, Kibana, Filebeat, Fluent Bit, Fluentd, Kafka), Azure analytics
  2. Business and regulatory reporting (PSD2)
  3. The interconnection of event systems and analytics platforms
  4. Central logging and auditing, SIEM integrations

Raiffeisenbank needed a central logging platform for its environment. And we developed it with the full complexity of the company's systems in mind.

Martin Citron, Director of Integration Department at Trask
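To make the consolidation step concrete, here is a minimal sketch of how logs from heterogeneous sources can be mapped onto one common schema before shipping. The field names (`timestamp`, `host`, `service`, `message`, `source`) are illustrative assumptions, not the bank's actual schema, and real deployments would do this with agents such as Filebeat or Fluent Bit rather than hand-written parsers.

```python
import json

def normalize_syslog(line: str) -> dict:
    """Parse a classic syslog-style line into the common schema."""
    # e.g. "2024-01-15T10:00:00Z host1 sshd: Failed password for root"
    ts, host, rest = line.split(" ", 2)
    service, message = rest.split(": ", 1)
    return {"timestamp": ts, "host": host, "service": service,
            "message": message, "source": "syslog"}

def normalize_app_json(line: str) -> dict:
    """Map an application's JSON log onto the same schema."""
    record = json.loads(line)
    return {"timestamp": record["time"], "host": record["hostname"],
            "service": record["app"], "message": record["msg"],
            "source": "app-json"}

events = [
    normalize_syslog("2024-01-15T10:00:00Z host1 sshd: Failed password for root"),
    normalize_app_json('{"time": "2024-01-15T10:00:05Z", "hostname": "pod-7",'
                       ' "app": "payments", "msg": "transfer accepted"}'),
]
# Both records now share one schema and can be indexed, searched,
# and correlated together regardless of their original format.
for e in events:
    print(e["timestamp"], e["source"], e["service"], "-", e["message"])
```

Once every source emits the same shape of record, downstream tooling (dashboards, alert rules, SIEM correlation) only has to understand one format.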

[.infobox][.infobox-heading]Central Logging Principles[.infobox-heading]Central logging plays a pivotal role in modern IT and cybersecurity strategies. It involves centralizing log data from various systems and applications within an organization, offering a unified platform for efficient monitoring, analysis, and response. By adhering to these principles, businesses can enhance their security posture, streamline compliance efforts, and improve operational efficiency.

[.infobox-heading]How does it work?[.infobox-heading]Central logging technology collects, stores, and manages log data from various IT components through agents or APIs. It normalizes and correlates this data for easy access via a single interface. SIEM solutions can be integrated with central log management, aiding in processing, analyzing, and using this data for threat detection, issue resolution, and informed decision-making.

[.infobox-heading]Main benefits[.infobox-heading]- Enhanced security
- Regulatory Compliance
- Improved Troubleshooting
- Operational Efficiency
- Data Analysis & Insights[.infobox]
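As an illustration of the correlation step a SIEM performs over centralized logs, the sketch below flags a burst of failed logins from a single source address. The rule, threshold, and window are assumptions chosen for the example; production rules are far richer.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative rule: alert when one IP produces >= 3 failed logins
# within a 60-second window.
WINDOW = timedelta(seconds=60)
THRESHOLD = 3

def failed_login_alerts(events):
    """events: list of (iso_timestamp, source_ip) tuples for failed logins."""
    by_ip = defaultdict(list)
    alerts = []
    for ts, ip in sorted(events):
        t = datetime.fromisoformat(ts)
        # Keep only this IP's failures that fall inside the sliding window.
        recent = [x for x in by_ip[ip] if t - x <= WINDOW]
        recent.append(t)
        by_ip[ip] = recent
        if len(recent) >= THRESHOLD:
            alerts.append((ip, ts))
    return alerts

logins = [
    ("2024-01-15T10:00:00", "10.0.0.5"),
    ("2024-01-15T10:00:20", "10.0.0.5"),
    ("2024-01-15T10:00:40", "10.0.0.5"),  # third failure inside the window
    ("2024-01-15T10:05:00", "10.0.0.9"),  # isolated failure, no alert
]
print(failed_login_alerts(logins))  # [('10.0.0.5', '2024-01-15T10:00:40')]
```

This kind of rule only works when logins from every server land in one place; with logs scattered across silos, the three failures would never meet.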

See the architecture showing the complexity of the solution

The bank’s solution is implemented using industry best practices. In general, it contains three layers: ingestion, log shipping and processing, and finally, log storing and analytics. We leverage Apache Kafka for log shipping because of its proven ability to process large amounts of data quickly.
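The three layers can be sketched end to end in a few lines. Here an in-memory deque stands in for the Kafka topic and a dict-of-lists stands in for the search index, so this shows only the shape of the pipeline, not the production stack.

```python
from collections import deque, defaultdict

# Layer 1: ingestion — agents tag raw lines with their origin.
def ingest(source: str, line: str) -> dict:
    return {"source": source, "message": line}

# Layer 2: shipping — in production a Kafka topic; here, a deque.
log_bus = deque()

def ship(event: dict) -> None:
    log_bus.append(event)

# Layer 3: storing and analytics — in production, a search/analytics
# engine such as OpenSearch or Elasticsearch; here, a dict index.
index = defaultdict(list)

def consume_and_store() -> None:
    while log_bus:
        event = log_bus.popleft()
        index[event["source"]].append(event["message"])

ship(ingest("core-banking", "payment batch completed"))
ship(ingest("api-gateway", "HTTP 502 from upstream"))
ship(ingest("api-gateway", "HTTP 200"))
consume_and_store()

# The analytics layer can now answer per-source questions from one place.
print(len(index["api-gateway"]))  # 2
```

Decoupling the layers with a broker is what lets the environment scale: producers and consumers can be added or scaled independently without touching each other.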

Lessons Learned: Automate but do not forget to customize

Like every project, this one taught us something. In this case, it was all about "the exception that proves the rule." Since every logging system is different, it is extremely difficult to standardize everything. It is therefore essential to leave room for customization in the pipeline and to be prepared to isolate a source when the environment is too diverse. At the same time, GitOps practices proved beneficial in increasing automation and, thus, overall efficiency.

Do you want to advance your business to the next level?

If you need something similar or want to grow your business, don't hesitate to contact us; we are here to bring you the best possible solution.
