Our UDP Architecture
One of our Fortune 500 clients, facing an internal capacity limitation, sought a team of certified data professionals and software engineers with the ambition to swiftly establish a Unified Data Platform (UDP). The goal was to augment the client's team and develop out-of-the-box data source and sink connectors for key data systems. Revamping pre-built connectors while building new ones would fulfill the client's vision of a unified platform for all users: one that assists in seamlessly manipulating batch data, assessing data lineage, and visualizing information through interactive portals for successful job execution.
In response, Team Confiz embarked on the journey to enhance the client's pre-existing UDP framework using Apache Beam and Apache Spark. As part of modernizing the data architecture, the team developed and revamped a wide set of ingress and egress connectors to support data flow from source to destination. To deliver an end-to-end solution, we also introduced monitoring, alerting, and auditing capabilities that adhere to data quality, metadata and lineage, and governance protocols. The metadata and lineage work focused on enabling data analysts to visually understand data provenance, giving them key insights that are imperative to big data job execution.
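The ingress/egress connector pattern described above can be sketched in a few lines. The class and function names below are illustrative only, not the client's actual framework API; a real implementation would sit on top of Apache Beam or Spark I/O primitives rather than plain Python classes.

```python
from abc import ABC, abstractmethod
from typing import Iterable, Iterator, List


class Source(ABC):
    """An ingress connector: yields records from an upstream system."""

    @abstractmethod
    def read(self) -> Iterator[dict]: ...


class Sink(ABC):
    """An egress connector: persists records to a downstream system."""

    @abstractmethod
    def write(self, records: Iterable[dict]) -> None: ...


class InMemorySource(Source):
    """Toy source backed by a list, standing in for a database or file system."""

    def __init__(self, records: List[dict]):
        self._records = records

    def read(self) -> Iterator[dict]:
        yield from self._records


class InMemorySink(Sink):
    """Toy sink that collects records, standing in for a warehouse or queue."""

    def __init__(self):
        self.written: List[dict] = []

    def write(self, records: Iterable[dict]) -> None:
        self.written.extend(records)


def run_pipeline(source: Source, sink: Sink) -> int:
    """Move every record from source to sink; return the record count."""
    records = list(source.read())
    sink.write(records)
    return len(records)


source = InMemorySource([{"id": 1}, {"id": 2}])
sink = InMemorySink()
count = run_pipeline(source, sink)
```

Keeping sources and sinks behind a common interface is what lets a platform like this mix and match connectors: any new system only needs a `read` or `write` implementation to plug into existing pipelines.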
Avro was used as the data serialization format, allowing users to read directly from complex data files and migrate data into any product-supported data type. UI dashboards provided holistic insights through an interactive portal. In addition, UDP regression suite certifications were exercised to validate each feature as reliable before it could be published. Automating connectors and use cases where applicable minimized manual testing before each release, saving costs.
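For context, an Avro record type is declared as a JSON schema. The sketch below shows what such a schema looks like for a hypothetical "order" record; the field names are examples only, not the client's actual schemas, and in practice a library such as fastavro would serialize records against it.

```python
import json

# Illustrative Avro schema for a hypothetical "Order" record.
ORDER_SCHEMA = {
    "type": "record",
    "name": "Order",
    "namespace": "udp.examples",
    "fields": [
        {"name": "order_id", "type": "string"},
        {"name": "amount", "type": "double"},
        # A union with "null" makes a field optional, which eases
        # migrating data between systems with differing column sets.
        {"name": "coupon", "type": ["null", "string"], "default": None},
    ],
}

# The schema itself is just a JSON document that travels with the data.
schema_json = json.dumps(ORDER_SCHEMA, indent=2)
```

Because the schema is embedded alongside the data, downstream consumers can decode records without prior coordination, which is what makes Avro a good fit for moving data between heterogeneous sources and sinks.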
Find out how we helped our client achieve their goals and surpass challenges through disruptive technology.