Solution Spotlight – Pipeline Foundations

The Pipeline Foundation solution enables development teams to request a pipeline that comes with all of the necessary components, integrations, and configurations built in.

Self-Service Pipelines

A pipeline is the backbone of a development team’s software release process. Pipelines provide automation, consistency, predictability, and visibility to your software release process. It’s widely accepted that if you’re going to release software, you should be using a pipeline.  

A deployment pipeline consists of a variety of stages and actions that orchestrate testing and deployments as a software release moves from commit to production. This may seem straightforward, but the more application teams build deployment pipelines, the more variations you will see, to the point where it becomes a management and governance nightmare. Thus a framework approach should be used that gives pipelines a general structure while enabling application teams to customize them to their needs. 

The Pipeline Foundation solution enables development teams to request a pipeline that comes with all of the necessary components, integrations, and configurations already in place. Development teams are able to request pipelines through self-service while compliance and governance are incorporated automatically.

Pipeline Foundations Blueprint 


Skeletons are designed for specific functions and are the building blocks of an organization’s pipeline foundation. Customized pipelines are then built on top of Skeletons for specific use cases (programming language, deployment platform, etc.). 

Skeletons can be used to define required tests and steps, which enables an organization to ensure that particular validations and controls run in every pipeline. They have very little customization and rely heavily on the development team’s code repository, which keeps them generic and able to support multiple use cases.


  1. CloudFormation – Skeletons are often defined using CloudFormation 
  2. CodeCommit – Stores skeleton code, Skeleton buildspecs, etc. 
  3. CodePipeline – Used to define the structure of the pipeline and orchestrate the progression from stage to stage 
  4. CodeBuild – Used to execute the actual tasks (launching the environments, running the tests, etc.). This is where the tailoring and customization is done 
  5. Service Catalog – Skeleton CloudFormation templates are stored in Service Catalog as products where they can be launched by users or other Service Catalog products 
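As a sketch of how these pieces fit together, a Skeleton template might be registered as a Service Catalog product roughly as follows (the product name, owner, and S3 URL are illustrative, not from the solution):

```yaml
# Illustrative only: registers a pipeline Skeleton CloudFormation template
# as a Service Catalog product so users or other products can launch it.
Resources:
  ContainerSkeletonProduct:
    Type: AWS::ServiceCatalog::CloudFormationProduct
    Properties:
      Name: containerized-app-pipeline-skeleton   # hypothetical name
      Owner: devops-team
      ProvisioningArtifactParameters:
        - Name: v1
          Info:
            # Skeleton template stored in S3 (example bucket)
            LoadTemplateFromURL: https://s3.amazonaws.com/example-bucket/skeleton.yaml
```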

How it works

An organization might have various Application and Model Skeletons. 

  • Containerized App Pipeline Skeleton – Defines stages for a containerized application’s building, testing, and deployment process 
  • Serverless App Pipeline Skeleton – Defines stages for a serverless application’s building, testing, and deployment process 
  • Model Pipeline Skeleton – Defines stages for moving a ML model through validation, training, and then deployment 

A Containerized App Pipeline Skeleton consists of different stages for deploying a containerized application. 

Figure – 01

  1. Source: Pulls code from the application’s Git repository containing the application code, infrastructure code, configuration files, etc. 
  2. CustomBuild: Executes the steps defined in the repository’s build.yml to build the application, run unit tests, static analysis, and then deploy to the container repository 
  3. RequiredBuild: Executes the organization’s required SAST security tests 
  4. CustomTest: Executes the steps defined in the repository’s test.yml to deploy into the acceptance environment and run integration and functional tests 
  5. RequiredTest: Executes the organization’s required DAST security tests 
  6. CustomStaging: Executes the steps defined in the repository’s staging.yml to deploy into the staging environment and run long-running and exploratory tests 
  7. RequiredStaging: Executes the organization’s required IAST security tests 
  8. RequiredApproval: Executes the organization’s required approval gate 
  9. CustomProduction: Executes the steps defined in the repository’s production.yml to deploy into the production environment, often with a blue-green model 
  10. RequiredProduction: Executes the organization’s required post-production deployment steps 
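The alternating Custom/Required pattern above could be expressed in a Skeleton's CloudFormation template along these lines (abbreviated to the first three stages; the role, artifact bucket, and CodeBuild project references are assumed to be defined elsewhere in the template, and all names are illustrative):

```yaml
# Illustrative sketch of a Skeleton's CodePipeline definition.
Resources:
  SkeletonPipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt PipelineRole.Arn            # assumed IAM role resource
      ArtifactStore: { Type: S3, Location: !Ref ArtifactBucket }
      Stages:
        - Name: Source
          Actions:
            - Name: AppRepo
              ActionTypeId: { Category: Source, Owner: AWS, Provider: CodeCommit, Version: "1" }
              Configuration: { RepositoryName: !Ref AppRepoName, BranchName: main }
              OutputArtifacts: [{ Name: SourceOutput }]
        - Name: CustomBuild
          Actions:
            - Name: TeamBuild
              ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: "1" }
              # This CodeBuild project runs the team's build.yml from the repo
              Configuration: { ProjectName: !Ref CustomBuildProject }
              InputArtifacts: [{ Name: SourceOutput }]
        - Name: RequiredBuild
          Actions:
            - Name: SASTScan
              ActionTypeId: { Category: Build, Owner: AWS, Provider: CodeBuild, Version: "1" }
              # This CodeBuild project is fixed by the organization
              Configuration: { ProjectName: !Ref RequiredSastProject }
              InputArtifacts: [{ Name: SourceOutput }]
```

The customization lives entirely in the repository-driven CodeBuild projects, which is what lets the pipeline structure itself stay generic.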


Blueprint/Skeleton: Contains a CloudFormation template for a function-specific CI/CD pipeline, which becomes a Service Catalog product to be used by other users or products 


As the complexity of pipelines increases and organizations want to incorporate more automation, there are situations where multi-step, out-of-band setup steps must be performed before a pipeline can be deployed. For these requirements, an Orchestrator is used. Orchestrators are deployed and can be triggered by new pipeline instantiations.


  1. Step Function – Orchestration logic is defined using a Step Function 
  2. CloudFormation – Orchestrator Step Functions are defined using CloudFormation 
  3. Service Catalog – Orchestrator Step Functions are stored as products in Service Catalog and may be launched by users or other Service Catalog products 

How it works

An application that deploys to a Kubernetes cluster could have various out-of-band steps: 

  • Creating a Kubernetes cluster namespace 
  • Setting up secrets and parameters 
  • Registering the application with an internal compliance service 

Figure – 02

If the Orchestrator Step Function workflow fails for any reason, the entire pipeline product deployment halts.
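A minimal sketch of such an orchestrator, chaining the out-of-band steps above as a Step Function defined in CloudFormation (the Lambda function resources are assumed to exist elsewhere in the template, and all names are illustrative):

```yaml
# Illustrative orchestrator: each Task invokes a Lambda; if any step fails,
# the state machine execution fails and the pipeline deployment halts.
Resources:
  OrchestratorStateMachine:
    Type: AWS::StepFunctions::StateMachine
    Properties:
      RoleArn: !GetAtt OrchestratorRole.Arn     # assumed IAM role resource
      DefinitionString: !Sub |
        {
          "StartAt": "CreateNamespace",
          "States": {
            "CreateNamespace": {
              "Type": "Task",
              "Resource": "${CreateNamespaceFunction.Arn}",
              "Next": "SetupSecrets"
            },
            "SetupSecrets": {
              "Type": "Task",
              "Resource": "${SetupSecretsFunction.Arn}",
              "Next": "RegisterCompliance"
            },
            "RegisterCompliance": {
              "Type": "Task",
              "Resource": "${RegisterComplianceFunction.Arn}",
              "End": true
            }
          }
        }
```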


Blueprint/Orchestrator: Contains Step Function Service Catalog products that help automate out-of-band tasks that cannot be handled within a typical CI/CD pipeline.

Pipeline Products

Pipeline products provide development teams a ‘one-click’ mechanism to get a fully provisioned pipeline. Pipeline products take care of the setup and configuration of the stages, various environments, infrastructure, and coordination of third-party services.


  1. CloudFormation – Pipeline products are defined using CloudFormation 
    1. Custom Resources – A Custom Resource within the CloudFormation template is used to trigger the orchestrator Step Function 
    2. DependsOn – The CodePipeline resource within the CloudFormation template contains a DependsOn statement that ensures the Step Function executes successfully before proceeding with the deployment 
  2. Service Catalog – Pipeline products are stored as products in Service Catalog portfolios and may be launched by users
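The Custom Resource plus DependsOn pattern can be sketched as follows (the Lambda that starts the Step Function and waits for it to finish is assumed, and all names are illustrative):

```yaml
# Illustrative sketch: the custom resource invokes a Lambda that runs the
# orchestrator Step Function; CloudFormation only creates the pipeline
# after the custom resource (and therefore the workflow) succeeds.
Resources:
  OrchestratorTrigger:
    Type: Custom::Orchestrator
    Properties:
      ServiceToken: !GetAtt TriggerOrchestratorFunction.Arn   # assumed Lambda
      StateMachineArn: !Ref OrchestratorStateMachine
  AppPipeline:
    Type: AWS::CodePipeline::Pipeline
    DependsOn: OrchestratorTrigger   # pipeline waits on the orchestrator
    Properties:
      # stage definitions as in the Skeleton, omitted here
      RoleArn: !GetAtt PipelineRole.Arn
      ArtifactStore: { Type: S3, Location: !Ref ArtifactBucket }
      Stages: []
```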

How it works

Pipeline products are aggregates of skeletons and orchestrators along with various infrastructure and environment products. 

Figure – 03

  1. A user requests a pipeline product through Service Catalog 
  2. A CloudFormation custom resource triggers the Orchestrator Step Function to set up all the necessary prerequisites for the application pipeline 
  3. Infrastructure and Environment Service Catalog products are launched 
  4. The Skeleton Pipeline Service Catalog products are launched
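A pipeline product can launch the other products it aggregates directly from its own template, for example (product and parameter names are illustrative, and `AppName` is an assumed template parameter):

```yaml
# Illustrative sketch: a pipeline product launching an environment product
# and then the Skeleton product as Service Catalog provisioned products.
Resources:
  AcceptanceEnvironment:
    Type: AWS::ServiceCatalog::CloudFormationProvisionedProduct
    Properties:
      ProductName: ecs-environment                   # hypothetical product
      ProvisioningArtifactName: v1
      ProvisionedProductName: !Sub "${AppName}-acceptance"
  SkeletonPipeline:
    Type: AWS::ServiceCatalog::CloudFormationProvisionedProduct
    DependsOn: AcceptanceEnvironment                 # environment first
    Properties:
      ProductName: containerized-app-pipeline-skeleton
      ProvisioningArtifactName: v1
      ProvisionedProductName: !Sub "${AppName}-pipeline"
```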


Blueprint/Products: Contains pipeline products that combine other products, skeletons, and a Lambda function that calls the orchestrator in order to fully configure the pipeline.

Pipeline Factories

Pipeline Products can be put through their own deployment pipeline where testing and analysis can be done to ensure they align with organizational policies. When the products clear the pipeline stages, they are deployed to Service Catalog. 


  1. CodeCommit – Stores pipeline code 
  2. CodePipeline – Used to define the structure of the pipeline 
  3. CodeBuild – Used to execute the tests against the pipeline templates 

How it works

Skeletons, Orchestrators, and Pipeline products are all stored in Git repositories and have pipelines that test and validate that they comply with organization policies. 

Figure – 04

  1. Commit: Pulls the CloudFormation template from its version control repository 
  2. Proactive Security Check: cfn-lint validates CloudFormation YAML/JSON templates against the CloudFormation spec and performs additional checks, including valid values for resource properties and best practices 
  3. Proactive Integration Testing: Taskcat deploys the CloudFormation template in multiple AWS Regions and generates a report with a pass/fail grade for each region 
  4. Reactive Security Check (Optional): The CloudFormation template is executed in an isolated AWS account, and AWS Config rules check the built resources for compliance 
  5. Deployment to Service Catalog: The Service Catalog Product is deployed, a Service Catalog Portfolio is created with the Product, and access is granted to defined IAM users. The pipeline product is now ready for development teams to consume 
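The validation stages above could be driven by a CodeBuild buildspec along these lines (a sketch only; the template path and taskcat configuration in `.taskcat.yml` are assumed):

```yaml
# Illustrative buildspec for the factory's cfn-lint and taskcat stages.
version: 0.2
phases:
  install:
    commands:
      - pip install cfn-lint taskcat
  build:
    commands:
      # Proactive security check: lint templates against the CloudFormation spec
      - cfn-lint templates/*.yaml
      # Proactive integration testing: deploy to the regions in .taskcat.yml
      # and produce a per-region pass/fail report
      - taskcat test run
```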


Blueprint/Deploy contains the necessary scripts and CloudFormation templates to automate the deployment and delivery of these self-service Pipeline products.


  • Pipelines are continually developed with their own Software Delivery Life Cycle (SDLC), which improves developer experience 
  • Pipelines are pre-built with connections to all services they require 
  • Pipelines are provided via self-service 
  • Pipelines are codified to adhere to corporate standards and continually tested as changes occur 

End Result

The end result is a self-service method for development teams to request pipelines that are delivered in a structured manner that allows for flexibility inside defined constraints. Skeletons, Orchestrators, and Pipeline products are managed by a DevOps team and have their own backlog of features that development teams request. Development teams are able to work with autonomy while the necessary governance controls are applied. 

Figure – 05

  1. A user requests a pipeline by launching the Service Catalog pipeline product 
  2. The pipeline product launches other products that are needed for the pipeline 
  3. The orchestrator is called to perform any out-of-band activities 
  4. The fully provisioned and configured pipeline is provided back to the user for use

Interested in learning more?

If you are looking to provide automation, consistency, predictability, and visibility to your software release process, contact us today.

Posted August 4, 2021 by Brian Jakovich and Greg Hoggard

Posted in: Solution Spotlights


About the Authors

Brian Jakovich, Managing Director AWS  
Brian is the Managing Director of the AWS Practice at Vertical Relevance and has over 10 years of AWS industry experience. He is focused on helping Financial Services customers transform digitally while leveraging the AWS Cloud.
Greg Hoggard, Senior Cloud Consultant  
Greg is a Senior Cloud Consultant at Vertical Relevance. He helps large enterprises adopt trending technologies by combining the mix of old and new in a harmonious way through operation orchestration, automation, and monitoring. Greg holds 3 AWS certifications including AWS Certified Solutions Architect – Professional.

About Solution Spotlights

The Solution Spotlight series aims to deep dive into the technical components of Vertical Relevance’s Solutions. These solutions cover the prescriptive guidance that we provide our financial services customers for effectively building and deploying applications on AWS. Whether clients are just starting out with their cloud journey or looking to improve efficiency with the cloud, the Solution Spotlight series will provide insights based on the best practices we’ve developed through a combination of 20+ years of Financial Services business experience and 10+ years of AWS experience.


About Vertical Relevance

Vertical Relevance was founded to help business leaders drive value through the design and delivery of effective transformation programs across people, processes, and systems. Our mission is to help Financial Services firms at any stage of their journey to develop solutions for success and growth.
