Client Story

Streamlining Cloud Processing with Argo

Summary

We helped a customer test and deploy a new workflow management solution for cloud data processing, which resulted in an easier-to-manage, more scalable approach to their key daily data processing requirements.

Client need

One of our clients in the financial sector runs Kubernetes on Google Cloud to carry out all their recurring end-of-day data reconciliation and analysis jobs. The processing is complex, and consists of many smaller processing stages - each with their own dependencies - that must be run in a strict sequence.

Many of these stages depend on external data feeds which, unfortunately, are not as reliable as one might hope. When a stage fails, processing of its dependent stages must be paused until the errors are resolved and the failed stage completes successfully.

This seemingly simple workflow concept is surprisingly difficult to architect, deploy and manage in a user-friendly way.

What we did

We worked with our client to deploy and trial Argo Workflows – an open-source, Kubernetes-native workflow engine. Argo allows workflows to be specified as a Directed Acyclic Graph (DAG) – put simply, each stage is defined along with its dependencies. Argo then calculates the execution order and manages the workflow, running stages in parallel where possible. It also provides a web UI for visualising, monitoring and controlling workflows.
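To illustrate the DAG idea (this is a minimal Python sketch of dependency-driven scheduling, not Argo itself, and the stage names are hypothetical): each stage declares what it depends on, and the scheduler works out which stages are ready to run together.

```python
from graphlib import TopologicalSorter

# Hypothetical end-of-day stages (stage -> the stages it depends on).
stages = {
    "fetch-feeds": set(),
    "validate": {"fetch-feeds"},
    "reconcile-trades": {"validate"},
    "reconcile-positions": {"validate"},
    "report": {"reconcile-trades", "reconcile-positions"},
}

ts = TopologicalSorter(stages)
ts.prepare()

batches = []  # each batch can run in parallel, as Argo does with pods
while ts.is_active():
    ready = sorted(ts.get_ready())  # stages whose dependencies have completed
    batches.append(ready)
    for stage in ready:
        ts.done(stage)

for batch in batches:
    print("run in parallel:", batch)
# run in parallel: ['fetch-feeds']
# run in parallel: ['validate']
# run in parallel: ['reconcile-positions', 'reconcile-trades']
# run in parallel: ['report']
```

In a real Argo Workflow the same structure is written declaratively as a DAG template in YAML, and Argo handles scheduling, retries and pausing on failure automatically.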

Client benefit

Our client was impressed with Argo Workflows and, after the trial, decided to deploy it to their production environment. They now use Argo for all their data processing, which has dramatically simplified its management and monitoring. Running independent stages in parallel has also reduced the overall processing time considerably.