Tag: Cloud

Solution Spotlight – Data Mesh Foundations

Posted November 24, 2022

A Data Mesh is an emerging technology and practice for managing large amounts of data distributed across multiple accounts and platforms. It is a decentralized approach to data management in which data remains within the business domains that produce it (producers) while being made available to qualified users in other locations (consumers), without moving data out of producer accounts. A step forward in the adoption of modern data architecture, a Data Mesh is built to ingest, transform, access, and manage analytical data at scale, with the aim of improving business outcomes.
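The full solution covers the design in depth; as a minimal illustrative sketch, one common way to realize this producer/consumer pattern on AWS is cross-account sharing with AWS Lake Formation, which lets a consumer account query a producer's table in place rather than copying it. The account IDs, database, and table names below are hypothetical placeholders, not values from the solution.

```python
# Sketch: share a Glue table from a producer account with a consumer
# account via Lake Formation, so data never leaves the producer account.
import boto3

lakeformation = boto3.client("lakeformation")

PRODUCER_ACCOUNT_ID = "111111111111"  # hypothetical producer account
CONSUMER_ACCOUNT_ID = "222222222222"  # hypothetical consumer account

lakeformation.grant_permissions(
    CatalogId=PRODUCER_ACCOUNT_ID,
    Principal={"DataLakePrincipalIdentifier": CONSUMER_ACCOUNT_ID},
    Resource={
        "Table": {
            "CatalogId": PRODUCER_ACCOUNT_ID,
            "DatabaseName": "trading_domain",  # hypothetical domain database
            "Name": "daily_positions",         # hypothetical domain table
        }
    },
    Permissions=["SELECT", "DESCRIBE"],
)
```

The consumer account can then query the shared table in place (for example, through Athena) while the producer domain retains ownership and stewardship of the data.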

Module Spotlight – Experiment Broker

Posted November 2, 2022

Vertical Relevance's Experiment Broker provides the infrastructure to implement automated resiliency experiments via code, enabling standardized resiliency testing at scale. The Experiment Broker is a resiliency module that orchestrates experiments with state machines; input is driven by a code pipeline that kicks off the state machine, though executions can also be started manually. Coupled with a deep review and design of targeted resiliency tests, it can help ensure your AWS cloud application will meet business requirements in all circumstances.
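As an illustrative sketch of the orchestration described above, the snippet below starts a resiliency-experiment state machine via AWS Step Functions the way a pipeline stage or a manual operator might; the state machine ARN and experiment payload are hypothetical placeholders.

```python
# Sketch: kick off an experiment state machine with a JSON input payload.
import json
import boto3

sfn = boto3.client("stepfunctions")

experiment_input = {
    "experiment": "ec2-instance-termination",             # hypothetical experiment
    "target_tag": {"Key": "chaos-ready", "Value": "true"},
    "region": "us-east-1",
}

response = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:111111111111"
                    ":stateMachine:experiment-broker",    # hypothetical ARN
    input=json.dumps(experiment_input),
)
print("Started execution:", response["executionArn"])
```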

Use Case: Lakehouse and Data Governance

Posted August 16, 2022

In this use case, learn how a leading financial services company obtained a data platform capable of scaling to accommodate the various steps of the data lifecycle, with tracking of every step involved, including cost allocation, parameter capture, and the metadata required to integrate the client's third-party services.

Use Case: Financial Risk Data Analytics Pipeline and Lakehouse

Posted August 10, 2022

In this use case, learn how a leading financial services company obtained a carefully planned, scalable, and maintainable testing framework that dramatically reduced testing time for their mission-critical application and enabled them to continuously test the application's releasability.

Solution Spotlight – Data Pipeline Foundations

Posted July 14, 2022

The Data Pipeline Foundations provide guidance on the fundamental components of a data pipeline, such as ingestion and data transformation. For data ingestion, we leaned heavily on the concept of data consolidation to structure our ingestion paths. For transforming your data, use our step-by-step approach to architect your data for end-user consumption. By following the strategies provided, your organization can create a pipeline that meets its data goals.
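To make the consolidation idea concrete, here is a minimal sketch that lands raw files from two hypothetical source systems, normalizes them to a common shape, and writes one consolidated dataset for the transformation stage. The paths and column handling are placeholders, not the solution's actual pipeline code.

```python
# Sketch: consolidate raw extracts from multiple sources into one dataset.
import pandas as pd

SOURCES = {
    "orders_system_a": "s3://raw-zone/system_a/orders.csv",  # hypothetical path
    "orders_system_b": "s3://raw-zone/system_b/orders.csv",  # hypothetical path
}

frames = []
for source_name, path in SOURCES.items():
    df = pd.read_csv(path)                                # needs s3fs for S3 paths
    df.columns = [c.strip().lower() for c in df.columns]  # normalize headers
    df["source_system"] = source_name                     # preserve lineage
    frames.append(df)

consolidated = pd.concat(frames, ignore_index=True)
consolidated.to_parquet(
    "s3://curated-zone/orders/",                          # hypothetical curated zone
    partition_cols=["source_system"],
)
```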

Control Broker Eval Engine 

Posted March 16, 2022

This is the latest example of how Vertical Relevance is a leader in the Policy as Code (PaC) space. This post outlines how to operationalize PaC with a serverless Evaluation Engine as part of the broader Control Broker solution. Get in touch with us to learn more about the benefits of operationalizing the automated enforcement of security policies.
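The post details the actual engine; as a heavily simplified sketch, a serverless evaluation step can be pictured as a Lambda-style handler that checks a submitted configuration against a rule and returns a verdict. The event shape and rule below are hypothetical stand-ins, not the Control Broker's real policy language.

```python
# Sketch: a toy Policy-as-Code check returning an allow/deny verdict.
def handler(event, context):
    # Assume the event carries the resource configuration under review.
    buckets = event.get("s3_buckets", [])

    violations = [
        b["name"] for b in buckets
        if not b.get("encryption_enabled", False)  # rule: buckets must be encrypted
    ]

    return {
        "decision": "deny" if violations else "allow",
        "violations": violations,
    }

# Example: a non-compliant bucket is denied.
print(handler({"s3_buckets": [{"name": "logs", "encryption_enabled": False}]}, None))
# -> {'decision': 'deny', 'violations': ['logs']}
```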

Solution Spotlight – Identity Foundations

Posted March 10, 2022

While there are many components involved in securing the cloud, a carefully architected IAM strategy is paramount. A solid IAM strategy allows engineers to develop quickly, gives key stakeholders a comprehensive picture of the actions each IAM principal can perform, and results in a more secure cloud environment overall. Security without a reasonable user experience can lead to workarounds and dysfunction; by implementing this solution, both key stakeholders and engineers can be satisfied with the result.
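One way to build that picture of what principals can do, shown purely as a sketch, is the IAM policy simulator; the role ARN and actions below are hypothetical placeholders.

```python
# Sketch: ask the IAM policy simulator what a principal may do.
import boto3

iam = boto3.client("iam")

results = iam.simulate_principal_policy(
    PolicySourceArn="arn:aws:iam::111111111111:role/data-engineer",  # hypothetical role
    ActionNames=["s3:GetObject", "s3:DeleteBucket"],
)

for evaluation in results["EvaluationResults"]:
    print(evaluation["EvalActionName"], "->", evaluation["EvalDecision"])
# e.g. s3:GetObject -> allowed, s3:DeleteBucket -> implicitDeny
```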

Best Practice Guide – Network Foundations

Posted February 9, 2022

Best Practices for Designing and Automating Scalable Networks on AWS. Vertical Relevance highly recommends implementing the following foundational best practices to create a sustainable AWS network that will support an enterprise-level organization.
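As a small taste of the automation theme, shown purely as a sketch with hypothetical CIDR ranges and tags, the snippet below creates a VPC and carves per-Availability-Zone subnets so workloads can span failures.

```python
# Sketch: automate a basic network building block with boto3.
import boto3

ec2 = boto3.client("ec2")

vpc = ec2.create_vpc(
    CidrBlock="10.20.0.0/16",  # hypothetical, sized for enterprise growth
    TagSpecifications=[{
        "ResourceType": "vpc",
        "Tags": [{"Key": "Name", "Value": "core-network"}],  # hypothetical tag
    }],
)
vpc_id = vpc["Vpc"]["VpcId"]

# One subnet per Availability Zone for fault isolation.
for i, az in enumerate(["us-east-1a", "us-east-1b"]):
    ec2.create_subnet(
        VpcId=vpc_id,
        CidrBlock=f"10.20.{i}.0/24",
        AvailabilityZone=az,
    )
```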