Security / High Availability

How Distributed Systems Found Their Place – Brick by Brick.

What was previously only visible in fragments was brought together in a central data warehouse: alerts, statuses, correlations – cross-system, traceable, and analyzable. The data was fleeting, the requirements dynamic – so we relied on a modular architecture, robust ETL pipelines, and generic reporting. The result: a consistent view of the entire system – live and retrospective.

Key data at a glance

Tasks

Development, Outsourcing, Support

Roles

Architect, Requirements Engineer, Project Lead, Developer, Scrum Master, Test Engineer, DevOps Expert

Products

Desktop App, Web App, Client-Server Development, Database

Challenge

Our client worked with a system made up of multiple subsystems – each complex on its own, even more so when combined. What was missing: the overview.
What they needed was a central cockpit that answers:

  • How are the subsystems interconnected?
  • How healthy is each one?
  • How quickly do alerts move through the system?
  • What data is generated where – and what can be inferred from it?

And not just in real time, but also retrospectively – as a basis for reports, analyses, and new use cases.

The catch: the subsystems weren’t originally designed to share their data – especially not with a central system. And many of the "interesting" data points were only visible for a short time before disappearing.


The technical challenges: fleeting data, constantly changing requirements, and systems that didn’t want to talk to each other.

Our task: capture data permanently, structure it meaningfully – and make it accessible for anything to come.

Success

A data warehouse was developed that provides access to the collected data via a web interface. It offers the desired insight into the system – down to the detail level of subsystems, if needed.

A key aspect was storing all relevant data in the warehouse so it remains available for later analysis.
For the data transfer, dedicated, configurable software components (ETL pipelines) were developed to feed data – usually from log files – into the DWH.
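The ETL idea described above can be sketched in a few lines. This is a minimal, hypothetical example – the log format, field names, and severity levels are assumptions for illustration, not the client's actual schema:

```python
import re
from dataclasses import dataclass

@dataclass
class AlertRecord:
    """One structured warehouse row extracted from a raw log line."""
    timestamp: str
    subsystem: str
    severity: str
    message: str

# Assumed log-line shape: "<timestamp> [<subsystem>] <SEVERITY>: <message>"
LINE_PATTERN = re.compile(
    r"^(?P<timestamp>\S+) \[(?P<subsystem>\w+)\] (?P<severity>\w+): (?P<message>.*)$"
)

def extract(lines):
    """Parse raw log lines into AlertRecords, skipping unparseable lines."""
    for line in lines:
        match = LINE_PATTERN.match(line)
        if match:
            yield AlertRecord(**match.groupdict())

def transform(records, keep_severities=("WARN", "ERROR")):
    """Configurable filter step: keep only the severities of interest."""
    return [r for r in records if r.severity in keep_severities]

sample = [
    "2024-01-05T10:00:00 [auth] ERROR: token validation failed",
    "2024-01-05T10:00:01 [auth] INFO: heartbeat ok",
    "malformed line without structure",
]
loaded = transform(extract(sample))  # rows ready for loading into the DWH
```

Because the fleeting source data disappears quickly, such a pipeline would run continuously and persist every parsed record, tolerating malformed input instead of failing on it.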

At the core was the data-processing layer, where logic linked data across subsystems, enabling system-wide overviews beyond individual subsystem boundaries.
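One way to picture this cross-subsystem linking – and the "how quickly do alerts move through the system" question – is grouping events by a shared correlation ID. The field names and the correlation mechanism here are assumptions for illustration:

```python
from collections import defaultdict

def link_events(events):
    """Group events from all subsystems by correlation ID, ordered by time."""
    chains = defaultdict(list)
    for event in events:
        chains[event["correlation_id"]].append(event)
    for chain in chains.values():
        chain.sort(key=lambda e: e["ts"])
    return dict(chains)

def propagation_time(chain):
    """Seconds from the first to the last event in a linked chain."""
    return chain[-1]["ts"] - chain[0]["ts"]

events = [
    {"correlation_id": "A1", "subsystem": "sensor",  "ts": 100},
    {"correlation_id": "A1", "subsystem": "gateway", "ts": 103},
    {"correlation_id": "A1", "subsystem": "cockpit", "ts": 110},
    {"correlation_id": "B2", "subsystem": "sensor",  "ts": 200},
]
chains = link_events(events)
```

With the chains in place, a report can show both the end-to-end path of an alert and how long it took to traverse the subsystems.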

To avoid having to custom-code every report in the web UI, a generic approach was used: reports with filters could be assembled quickly from reusable building blocks. This reduced web development effort – only the report's purpose needed to be defined.
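The generic-reporting principle can be sketched as a single engine interpreting declarative report definitions (columns plus filters), so defining a new report requires no new web code. All names below are illustrative assumptions:

```python
def run_report(rows, definition):
    """Generic engine: apply the report's filters, then project its columns."""
    selected = [
        row for row in rows
        if all(row.get(field) == value
               for field, value in definition["filters"].items())
    ]
    return [{col: row[col] for col in definition["columns"]} for row in selected]

# Warehouse rows (illustrative)
rows = [
    {"subsystem": "auth",    "severity": "ERROR", "count": 4},
    {"subsystem": "auth",    "severity": "INFO",  "count": 12},
    {"subsystem": "billing", "severity": "ERROR", "count": 1},
]

# A "new report" is just data, not code: errors per subsystem
error_report = {
    "columns": ["subsystem", "count"],
    "filters": {"severity": "ERROR"},
}
result = run_report(rows, error_report)
```

Adding another report means writing another definition like `error_report`; the engine and the web UI stay unchanged.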

Of course, the data warehouse also supports user-specific data restrictions – not just in terms of features, but also in terms of which datasets are accessible, addressing privacy and data protection concerns.
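The dataset-level restriction mentioned above amounts to row-level filtering by user permissions. The permission model sketched here (per-user subsystem allow-lists) is an assumption for illustration, not the client's actual access scheme:

```python
# Hypothetical permission model: each user may see a set of subsystems.
PERMISSIONS = {
    "alice": {"auth", "billing"},
    "bob": {"auth"},
}

def visible_rows(user, rows):
    """Return only the rows whose subsystem the user is allowed to see."""
    allowed = PERMISSIONS.get(user, set())
    return [row for row in rows if row["subsystem"] in allowed]

rows = [
    {"subsystem": "auth",    "alert": "token expired"},
    {"subsystem": "billing", "alert": "invoice failed"},
]
```

Applying this filter at the query layer means every report automatically respects the restriction, addressing privacy concerns without per-report access logic.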

Approach

Agile, Not Guesswork
Since the full scope was unclear at the beginning, we used an agile development process. The benefit: the client, as Product Owner, could actively steer – adjusting priorities, adding requirements, and shaping the direction continuously.

Development followed structured sprints. Functioning features emerged step by step – from data source to web interface. Vertically integrated, fully operational, and always extensible.

Tech Stack

Methods & Paradigms

Server Technology

Database Technologies

Languages & Frameworks

Communication Technologies

Web Development Technologies

Communication / Protocols