
Cascade

What It Does

Cloud-native benefits intelligence SaaS platform (Aquaduct Cascade) delivering regulatory intelligence, vendor sentiment analysis, compliance alerts, and plan-design monitoring for benefits professionals through dashboards and API feeds.

Architecture

CloudFront routes traffic into an Application Load Balancer fronting a Next.js 15 application deployed on ECS Fargate. The platform combines server-rendered dashboard experiences with API endpoints backed by PostgreSQL on RDS, while EventBridge Scheduler triggers isolated collector tasks that ingest regulatory, compliance, M&A, and vendor signals into the intelligence model.

Components

Edge Delivery Layer: CloudFront / Application Load Balancer

Terminates public traffic and accelerates dashboard delivery

Web Application: Next.js 15 / React / ECS Fargate

Server-rendered dashboard and API surface for benefits intelligence workflows

Intelligence Store: PostgreSQL RDS / Prisma

Persists normalized regulatory, compliance, and vendor datasets

Collector Runtime: EventBridge Scheduler / ECS Fargate Tasks

Schedules and executes ingestion tasks for RSS, compliance, and sentiment sources

Delivery Operations: Terraform / CloudWatch / runbooks

Manages blue/green releases, telemetry, and production recovery procedures

Key Decisions

Next.js over plain React SPA

Server-side rendering keeps dashboard and content pages fast while preserving API-route flexibility for authenticated workflows and operational endpoints.

ECS Fargate plus ALB over static hosting

The product needs SSR, authenticated application flows, health checks, and blue/green delivery controls that fit containerized compute behind AWS load balancing.

Prisma ORM over raw SQL

Type-safe database access with auto-generated migrations. Schema changes propagate to TypeScript types automatically.

EventBridge-triggered collectors over manual research

Scheduled collectors surface regulatory changes, M&A activity, and vendor signals within hours instead of waiting on manual review cycles.

Impact & Architecture Signal

Built and deployed a production SaaS platform on AWS with Next.js SSR on ECS Fargate, PostgreSQL RDS, CloudFront, Application Load Balancer routing, and EventBridge-scheduled collector tasks. Operational maturity includes 41 ADRs, 34 postmortems, and 46+ runbooks supporting blue/green releases, incident learning loops, and reliable data collection.

Tech Stack

Next.js 15 React TypeScript Prisma PostgreSQL AWS ECS Fargate AWS RDS AWS EventBridge Scheduler AWS CloudFront AWS Application Load Balancer Docker CloudWatch Terraform RSS/Atom REST API

Capabilities

data-engineering architecture automation saas cloud-platform

Evidence — Case Studies (11)

Multi-Employer Pension System Stabilization & $600K Revenue Recovery

architecture leadership pension problem-solving agile key result
$600K+ change order revenue
Situation

Inherited an unstable pension system for a multi-employer pension plan with critical operational issues. The platform had accumulated technical debt and the client relationship was at risk due to recurring system failures and delayed deliverables.

Task

Stabilize system operations, restore client confidence, and deliver pending change orders worth $600K+ while simultaneously managing ongoing pension administration needs.

Action

Led a structured stabilization initiative by triaging critical defects, implementing modular fixes, and establishing a prioritized backlog. Decomposed complex pension domain into structured system requirements. Designed automation for retroactive coverage processing with downstream life event automation. Directed technical stabilization across development and QA tracks with SQL-heavy backend analysis and automation design, delivering incremental improvements while maintaining operational stability.

Result

Stabilized the pension system, delivered $600K+ in change order revenue, and restored the client relationship. Established repeatable patterns for managing unstable legacy platforms through structured decomposition and agile delivery.

Public Pension Fund Domain Architecture & System Decomposition

architecture data-modeling data-engineering pension agile communication key result
1500+ structured requirements
200K-300K participant lives
3.5 year implementation
Situation

A public pension fund for unionized workers required a comprehensive multi-employer pension implementation covering eligibility, benefit calculations, disability, death benefits, payment methods, and participant statements, serving 200K-300K participant lives. The pension rules were complex and poorly documented, requiring deep domain expertise to decompose into implementable system architecture.

Task

Decompose the full scope of the pension fund domain into implementable system architecture, ensuring complete coverage of all pension modules across a multi-year implementation.

Action

Decomposed the fund's complex pension domain into implementable system architecture across 1500+ structured requirements spanning eligibility logic, benefit calculations, disability processing, death benefits, payment methods, and participant statements. Mapped pension rules, calculations, and business logic to system modules enabling parallel development streams. Drove first scaled agile program for the value stream, earning SAFe certification through hands-on release train engineering.

Result

Delivered comprehensive pension domain architecture across 1500+ structured requirements, enabling parallel development across multiple teams. The 3.5-year implementation was navigated through the SAG-AFTRA strike and COVID, bringing the client relationship from red to yellow status.

Enterprise Benefits Data Integration Architecture — 70+ Concurrent Pipelines

architecture data-engineering integrations sql benefits key result
~70 concurrent integrations
15 enterprise clients
1M+ participants
20+ team members
Situation

The benefits platform served 15 enterprise clients requiring data integration pipelines to 10-15 vendors each — insurance carriers, payroll systems, DCFSA administrators. At operational steady state, approximately 70 integrations ran concurrently across SQL/SSIS pipelines handling EDI 834, flat files, position-based formats, feedback loops, and change detection.

Task

Design and maintain the data integration architecture serving the full client portfolio — pipeline design, format engineering, and vendor feed reliability across all 15 enterprise accounts.

Action

Led integration pipeline design across 20+ analysts and developers, establishing SQL/SSIS data export pipelines for vendor integrations after participant elections. Engineered SQL/SSIS integration pipelines for EDI 834, flat files, position-based formats (e.g., a retirement provider with ~5,000 positions per row), multi-row files, feedback loops, and change detection logic. Designed data transformation architectures mapping internal platform data to vendor-specific output formats across insurance carriers, payroll providers, and benefits vendors. Maintained COBRA, ACA, and PHI compliance across all data integration touchpoints.
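Position-based formats like the one described above are decoded with a fixed offset map. A minimal TypeScript sketch of the idea, using hypothetical field names and offsets (the real vendor layouts ran to thousands of positions per row):

```typescript
// Hypothetical fixed-width layout: each field is defined by a start
// position (1-based, as vendor specs usually are) and a length.
interface FieldSpec {
  name: string;
  start: number; // 1-based start position
  length: number;
}

const layout: FieldSpec[] = [
  { name: "ssn", start: 1, length: 9 },
  { name: "lastName", start: 10, length: 20 },
  { name: "planCode", start: 30, length: 4 },
];

// Extract each field by slicing the record at its fixed offsets,
// trimming the space padding vendors use to fill unused positions.
function parseFixedWidth(record: string, specs: FieldSpec[]): Record<string, string> {
  const out: Record<string, string> = {};
  for (const f of specs) {
    out[f.name] = record.slice(f.start - 1, f.start - 1 + f.length).trim();
  }
  return out;
}

const row = "123456789" + "Smith".padEnd(20) + "401K";
console.log(parseFixedWidth(row, layout));
// ssn "123456789", lastName "Smith", planCode "401K"
```

In practice the layout table is data, not code, so the same parser serves every position-based vendor feed.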

Result

Architected and maintained a data integration platform serving 1M+ benefit plan participants with ~70 concurrent pipelines across 15 enterprise clients. Self-initiated sprint cycles drove continuous pipeline improvement without a formal agile mandate.

25+ Vendor Integration Architecture

integrations architecture data-engineering system-design key result
25+ vendor integrations
Situation

The benefits platform required connectivity with 25+ external vendors including insurance carriers, retirement providers, DCFSA administrators, and payroll systems. Each vendor had unique file formats, transmission protocols, and business rules. Integration failures directly impacted employee benefits enrollment and payroll processing.

Task

Design, build, and maintain the complete vendor integration architecture ensuring reliable, accurate, and timely data exchange across all 25+ vendor connections.

Action

Architected a standardized integration framework supporting electronic data interchange (834, 820), flat file, and API-based vendor feeds. Designed data transformation pipelines for each vendor's unique format requirements. Implemented closed-loop file processing with error handling and reconciliation. Managed vendor relationships for technical specifications and issue resolution. Built monitoring and alerting for feed failures.
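X12 feeds such as the 834 are segment-delimited text, so the first stage of any such framework is splitting segments and elements. A hedged TypeScript sketch (delimiters are illustrative; real interchanges declare theirs in the ISA header, and the sample segments are hypothetical):

```typescript
// X12 transactions are strings of segments terminated by "~", with
// elements separated by "*". The first element is the segment ID.
interface Segment {
  id: string;
  elements: string[];
}

function splitSegments(x12: string, segTerm = "~", elemSep = "*"): Segment[] {
  return x12
    .split(segTerm)
    .map((s) => s.trim())
    .filter((s) => s.length > 0)
    .map((raw) => {
      const [id, ...elements] = raw.split(elemSep);
      return { id, elements };
    });
}

// Two segments from a hypothetical 834 member loop: INS (member
// level detail) and NM1 (member name).
const sample = "INS*Y*18*030*XN*A*E**FT~NM1*IL*1*DOE*JOHN~";
const segments = splitSegments(sample);
console.log(segments.map((s) => s.id)); // segment IDs: INS, NM1
```

Downstream stages then map segment/element positions to typed enrollment records, which is where the vendor-specific rules live.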

Result

Delivered and maintained 25+ production vendor integrations with high reliability. The standardized framework reduced new vendor onboarding time and provided consistent error handling across all feeds. Integration architecture supported accurate benefits processing for 1M+ participants.

Data Model Integrity Initiative

data-modeling data-engineering pension problem-solving strong
Situation

The firm's pension administration systems had accumulated data quality issues across client implementations. Data validations were inconsistent, and the data model lacked the rigor needed for accurate pension calculations and regulatory reporting. This created risk for actuarial valuations, payroll processing, and participant communications.

Task

Improve data model integrity across the pension platform by strengthening data validations and enhancing system capability for data updates, ensuring accurate pension calculations and compliant reporting.

Action

Analyzed existing data validation gaps across the pension data model. Designed and implemented enhanced validation rules for pension-critical data elements. Collaborated with the DB admin team to standardize data integrity checks. Performed parallel testing for payroll and actuarial valuations to verify improvements. Documented data model conventions for the team.
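Validation rules of this kind are naturally expressed as declarative predicates over a record. A minimal TypeScript sketch with hypothetical pension-critical checks (the real rule set covered far more elements):

```typescript
interface ParticipantRecord {
  ssn: string;
  birthDate: string; // ISO date
  hireDate: string; // ISO date
}

interface ValidationRule {
  name: string;
  check: (r: ParticipantRecord) => boolean;
}

// Illustrative rules only; production validations also cover vesting,
// service credit, beneficiary data, and payment elements.
const rules: ValidationRule[] = [
  { name: "ssn-format", check: (r) => /^\d{9}$/.test(r.ssn) },
  {
    name: "hired-after-birth",
    check: (r) => new Date(r.hireDate) > new Date(r.birthDate),
  },
];

// Return the names of every rule the record fails.
function validate(r: ParticipantRecord): string[] {
  return rules.filter((rule) => !rule.check(r)).map((rule) => rule.name);
}

console.log(validate({ ssn: "12345678", birthDate: "1990-01-01", hireDate: "1980-01-01" }));
// fails both rules: ssn-format, hired-after-birth
```

Keeping rules as data makes coverage auditable, which is what the validation-gap analysis above measured.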

Result

Contributed to measurable improvements in data validation coverage and pension system capability. Enhanced data integrity supports more accurate pension calculations and reduces risk in actuarial valuations and payroll processing.

AI-Powered Multi-Agent Orchestration System

architecture automation system-design python key result
Situation

Managing complex multi-agent AI workflows required coordinating multiple specialized agents through structured execution plans with human governance loops. Existing orchestration tools lacked the structured handoff mechanisms, evidence-based review gates, and split-pane architecture needed for reliable autonomous operation with human oversight.

Task

Design and build an autonomous workflow orchestration system that coordinates multiple AI agents through structured execution control plans with STOP gates, evidence requirements, and human-in-the-loop governance.

Action

Architected a model-agnostic orchestration framework with SQLite-backed state management for deterministic orchestration. Implemented tmux split-pane multi-agent coordination with dedicated watcher processes. Built Slack integration for human oversight and cross-agent messaging. Designed the ECP (Execution Control Protocol) pattern for bounded, reviewable work packages. Created file-based and DB-backed watchers for event-driven agent activation.
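The gate mechanics of an ECP can be pictured as a small state machine. An in-memory TypeScript sketch with hypothetical state names (the production system persists this state in SQLite):

```typescript
// Hypothetical work-package lifecycle: an agent runs until a STOP
// gate, where a human reviewer must approve evidence before the
// package can advance.
type PackageState = "queued" | "running" | "awaiting_review" | "approved" | "rejected";

interface WorkPackage {
  id: string;
  state: PackageState;
  evidence: string[];
}

// Allowed transitions; anything else is an orchestration error.
const transitions: Record<PackageState, PackageState[]> = {
  queued: ["running"],
  running: ["awaiting_review"],
  awaiting_review: ["approved", "rejected"],
  approved: [],
  rejected: ["queued"], // rework loops back into the queue
};

function advance(pkg: WorkPackage, next: PackageState): WorkPackage {
  if (!transitions[pkg.state].includes(next)) {
    throw new Error(`illegal transition ${pkg.state} -> ${next}`);
  }
  // STOP gate: approval requires recorded evidence.
  if (next === "approved" && pkg.evidence.length === 0) {
    throw new Error("STOP gate: no evidence attached");
  }
  return { ...pkg, state: next };
}

let pkg: WorkPackage = { id: "ecp-001", state: "queued", evidence: [] };
pkg = advance(pkg, "running");
pkg.evidence.push("test run log");
pkg = advance(advance(pkg, "awaiting_review"), "approved");
console.log(pkg.state); // approved
```

Making illegal transitions throw, rather than silently no-op, is what keeps multi-agent execution deterministic and reviewable.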

Result

Deployed a production autonomous orchestration system coordinating 7+ specialized agents across Claude, Codex, and Cursor. The system demonstrates end-to-end architecture capability spanning state management, inter-process communication, event-driven activation, and human governance integration. Powers development workflows across multiple projects.

10+ Data Conversion Projects

data-engineering management benefits batch-processing strong
10+ data conversions
Situation

The benefits administrator regularly onboarded new enterprise clients requiring migration of their existing benefits data from prior administrators. Each conversion involved unique source formats, complex mapping rules, data quality challenges, and tight go-live deadlines tied to open enrollment windows. Conversion failures directly impacted employee benefits access.

Task

Lead data conversion projects for client implementations and migrations, ensuring accurate, complete data migration from source systems to the benefits platform within enrollment-driven deadlines.

Action

Developed standardized conversion methodologies covering data extraction, mapping, transformation, validation, and reconciliation. Led 10+ conversion projects spanning mutual funds, annuities, and insurance products. Built reusable transformation templates that reduced conversion cycle times. Implemented multi-pass validation processes to catch data quality issues before go-live. Coordinated with client teams and prior administrators for source data extraction and verification.

Result

Successfully completed 10+ data conversion projects with accurate, timely migrations. Standardized conversion methodology reduced cycle times for subsequent projects and established best practices for the conversion team.

SQL-Driven Data Reconciliation & Conflict Resolution

sql conflict-resolution decision-making data-engineering ownership strong
Situation

During data conversion projects at the benefits administrator, source data from prior administrators frequently contained inconsistencies, missing records, and conflicting participant information. Client teams and prior administrators often disagreed on data accuracy, creating tension and threatening go-live timelines tied to open enrollment windows.

Task

Resolve data conflicts between source systems and target platform through systematic SQL-based analysis, while managing relationships between client teams and prior administrators to reach agreement on data accuracy and completeness.

Action

Designed SQL-based reconciliation queries to systematically identify and categorize data discrepancies across participant records, coverage elections, and contribution histories. Built automated validation scripts comparing source and target data at the transaction level. Facilitated conflict resolution sessions between client HR teams and prior administrators using data-driven evidence to establish ground truth. Made decisive calls on ambiguous data scenarios by assessing risk and impact. Documented resolution patterns for reuse across subsequent conversion projects.
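The reconciliation pattern described above was SQL in practice; the same full-outer-join logic, sketched here in TypeScript with hypothetical record shapes to show the three discrepancy categories:

```typescript
interface ParticipantRow {
  participantId: string;
  coverageCode: string;
}

interface Discrepancy {
  participantId: string;
  kind: "missing_in_target" | "missing_in_source" | "value_mismatch";
}

// Categorize discrepancies the way a FULL OUTER JOIN on participant
// ID would: rows only in source, rows only in target, and rows in
// both with conflicting values.
function reconcile(source: ParticipantRow[], target: ParticipantRow[]): Discrepancy[] {
  const srcById = new Map(source.map((r): [string, ParticipantRow] => [r.participantId, r]));
  const tgtById = new Map(target.map((r): [string, ParticipantRow] => [r.participantId, r]));
  const out: Discrepancy[] = [];
  for (const [id, s] of srcById) {
    const t = tgtById.get(id);
    if (!t) out.push({ participantId: id, kind: "missing_in_target" });
    else if (t.coverageCode !== s.coverageCode)
      out.push({ participantId: id, kind: "value_mismatch" });
  }
  for (const id of tgtById.keys()) {
    if (!srcById.has(id)) out.push({ participantId: id, kind: "missing_in_source" });
  }
  return out;
}

const discrepancies = reconcile(
  [{ participantId: "P1", coverageCode: "EE" }],
  [{ participantId: "P1", coverageCode: "FAM" }],
);
console.log(discrepancies[0].kind); // value_mismatch
```

Categorized output like this is what turns a data dispute into an objective evidence trail.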

Result

Resolved data conflicts across 10+ conversion projects with documented evidence trails. SQL-based reconciliation approach reduced data quality disputes by providing objective evidence. Established a repeatable conflict resolution framework that accelerated subsequent conversion timelines.

Digital Evidence Migration Platform

architecture cloud-engineering data-engineering python azure documentation key result
6+ law enforcement agencies migrated
3 years Azure experience
Situation

Joined a law enforcement technology company's digital evidence migration team to migrate agencies from legacy on-premises evidence systems to a cloud evidence platform. Each agency had proprietary vendor formats requiring specialized data transformation.

Task

Engineer the migration framework, tooling, and operational standards — not just execute individual customer migrations.

Action

Built config-driven CLI migration tool abstracting vendor complexity. Engineered high-concurrency Azure Databricks loader scripts. Created Evidence.com API integration layer. Established JSON-based auditing framework with gap analysis. Developed Databricks SDK clients with mocking frameworks. Overhauled team documentation using Diataxis Framework — standardized lifecycle from discovery through post-migration analysis. Created templates for XSLT, runbooks, dry run configs, vendor mappings.
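The gap-analysis step of such an audit reduces to set arithmetic over evidence identifiers. A minimal sketch with hypothetical field names (the real framework emitted full JSON audit reports):

```typescript
interface GapReport {
  migrated: number;
  missing: string[]; // IDs present at the source but absent in the target
  unexpected: string[]; // IDs in the target with no source counterpart
}

// Compare source and target ID sets to quantify migration completeness.
function gapAnalysis(sourceIds: string[], targetIds: string[]): GapReport {
  const target = new Set(targetIds);
  const source = new Set(sourceIds);
  return {
    migrated: sourceIds.filter((id) => target.has(id)).length,
    missing: sourceIds.filter((id) => !target.has(id)),
    unexpected: targetIds.filter((id) => !source.has(id)),
  };
}

const report = gapAnalysis(["ev-1", "ev-2", "ev-3"], ["ev-1", "ev-3", "ev-9"]);
console.log(JSON.stringify(report));
// {"migrated":2,"missing":["ev-2"],"unexpected":["ev-9"]}
```

For evidence migrations the "missing" list is the one that matters: every entry must be explained before a migration can be signed off.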

Result

Migrated 6+ law enforcement agencies (Baltimore PD, NM DPS, Albany DA, Lakewood WA, Laguna Beach CA, Anaheim CA). Established reusable migration framework used by entire team. Reduced onboarding time for new engineers.

Enterprise Cloud Migration Architecture

architecture cloud-engineering aws data-engineering migration-architecture key result
30+ customer migrations
30+ per-customer baseline audits
Situation

The firm's entire data exchange infrastructure for 30+ institutional financial services customers ran on legacy hosting with traditional file transmission workflows.

Task

Design and execute a customer-by-customer migration to AWS Transfer Family — not a lift-and-shift but a re-architecture with per-customer baseline audits and environment bifurcation.

Action

Produced per-customer baseline audits with JSON schemas, baseline documentation, and HTML reports for 30+ customers. Designed environment bifurcation strategy enabling single codebase deployment to both legacy and AWS. Created cutover runbooks and master cutover tracker. Built QA audit framework with multi-round validation. Coordinated parallel cutovers across team with different solution patterns (external solutions, daily solutions).
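Environment bifurcation of this kind typically hinges on one configuration switch resolved at startup, so a single codebase can serve both hosting targets. A hedged sketch with hypothetical endpoint names:

```typescript
type Environment = "legacy" | "aws";

interface TransferTarget {
  host: string;
  protocol: "sftp";
}

// Single codebase, two deployment targets: the customer's transfer
// endpoint is resolved from the environment flag, so the same
// pipeline code runs against legacy hosting or AWS Transfer Family.
// Hostnames here are illustrative placeholders.
function resolveTarget(env: Environment, customer: string): TransferTarget {
  const host =
    env === "aws"
      ? `${customer}.transfer.example.com` // hypothetical AWS Transfer Family endpoint
      : "legacy-ftp.example.com";
  return { host, protocol: "sftp" };
}

console.log(resolveTarget("aws", "acme").host); // acme.transfer.example.com
```

Keeping the bifurcation at the configuration edge, rather than branching throughout the pipeline code, is what made customer-by-customer cutover feasible.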

Result

Systematically migrated data exchange infrastructure for 30+ customers to AWS with a complete per-customer audit trail, supported by the Mercury database with schema design, validation checklists, and operational runbooks.

RazorBridge ETL Platform

architecture data-engineering react typescript data-modeling key result
70+ business objects (DTOs)
47 model methods
52 legacy templates analyzed
3 unified applications
Situation

The firm's legacy import system used 52 Razor CSHTML templates to process customer data files. Templates were hand-coded, inconsistent, and difficult to maintain. Template creation required deep knowledge of the Razor engine internals.

Task

Conceive and build a next-generation ETL platform to replace the legacy import system with a visual, data-driven approach.

Action

Analyzed 52 legacy templates to identify the 80% reusable standard and 20% configurable fragments. Cataloged 70+ business objects (DTOs) mapped to 47 model methods. Built RazorBridge — a React/Express/TypeScript application with visual field mapping UI that generates production-faithful Razor templates. Created companion LogBridge (monitoring) and DataForge (transformation) applications sharing a unified architecture (Vite, proxy routing, stub auth, route contracts). Drove development through full ECP cycles with human-in-the-loop governance.
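The core idea (mapping configuration in, template code out) can be sketched in TypeScript; the field names, transform set, and emitted Razor-style syntax below are illustrative, not the production template standard:

```typescript
// A visual mapping row: which source file column feeds which DTO
// property, with an optional transform applied in the template.
interface FieldMapping {
  sourceColumn: string;
  dtoProperty: string;
  transform?: "trim" | "upper";
}

// Emit one Razor-style assignment per mapping. In the 80/20 split
// described above, a standard scaffold wraps these generated lines;
// only the mappings vary per customer.
function emitAssignments(dtoVar: string, mappings: FieldMapping[]): string {
  return mappings
    .map((m) => {
      let expr = `row["${m.sourceColumn}"]`;
      if (m.transform === "trim") expr += ".Trim()";
      if (m.transform === "upper") expr += ".ToUpper()";
      return `${dtoVar}.${m.dtoProperty} = ${expr};`;
    })
    .join("\n");
}

console.log(
  emitAssignments("participant", [
    { sourceColumn: "SSN", dtoProperty: "Ssn", transform: "trim" },
    { sourceColumn: "STATE", dtoProperty: "State", transform: "upper" },
  ]),
);
```

Because the mapping rows are plain data, the same configuration can drive both the visual UI and the template generator, keeping the two in sync.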

Result

Cataloged 70+ DTOs, mapped 47 model methods, and replaced manual template coding with visual generation. Delivered a unified 3-application ETL platform with shared architecture.