GCP Architect Data
Cynet Systems - Hartford, CT
Job Description:

Pay Range: $73/hr - $78/hr

We are seeking a GCP Lead / Architect with strong Data Engineering expertise to design and deliver secure, scalable, and cost-optimized data platforms on Google Cloud Platform (GCP). The ideal candidate will lead architecture design, data engineering initiatives, and DevOps automation while enabling advanced analytics and AI/ML capabilities.

Key Responsibilities:

Architecture and Solution Design:
- Lead end-to-end architecture for data platforms on GCP, including networking, security, compute, storage, and analytics components.
- Define High-Level Design (HLD) and Low-Level Design (LLD) documents along with architecture standards and reference patterns.
- Drive architecture decisions balancing performance, reliability, scalability, security, and cost optimization.
- Conduct design reviews, technical audits, and architecture governance.

Data Engineering and Warehousing:
- Design and implement scalable pipelines for structured, semi-structured, and unstructured data using batch and streaming patterns.
- Develop and maintain ETL/ELT workflows, data models, and transformation pipelines.
- Architect and optimize BigQuery-based data warehouse/lakehouse solutions.
- Lead data warehouse design, including dimensional modeling, SCD strategies, conformed dimensions, and data quality rules.
- Implement partitioning, clustering, and query performance tuning.

GCP Platform Engineering (Hands-On):
- Implement and enforce security controls using IAM, service accounts, and organization policies.
- Manage and support workloads on GKE and Compute Engine with scalability and monitoring.
- Use Google Cloud Storage (GCS) for governed storage and lifecycle management.
- Use Dataproc for Spark/Hadoop-based processing.

DevOps / CI-CD / Automation:
- Build and manage CI/CD pipelines using Git and GitHub Actions.
- Establish DevOps best practices, including Git branching strategy, environment promotion, and version management.
- Implement automation standards and infrastructure governance.
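The warehousing responsibilities above mention SCD strategies. As a minimal illustration of what a Type 2 slowly changing dimension update does (in BigQuery this would typically be a MERGE statement), the logic can be sketched in plain Python; the table shape and the column names `is_current`, `valid_from`, and `valid_to` are illustrative assumptions, not taken from the posting:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, tracked, today=None):
    """Apply a Type 2 SCD update: expire changed rows and append new
    versions, preserving full history.

    dim_rows -- current dimension table, a list of dicts carrying the
                bookkeeping columns is_current / valid_from / valid_to
    incoming -- latest source snapshot (list of dicts)
    key      -- business key column name
    tracked  -- attribute columns whose changes trigger a new version
    """
    today = today or date.today().isoformat()
    # Index the currently-active version of each business key.
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for src in incoming:
        old = current.get(src[key])
        if old and all(old[c] == src[c] for c in tracked):
            continue  # no change in tracked attributes; keep old row
        if old:
            # A tracked attribute changed: close out the old version.
            old["is_current"] = False
            old["valid_to"] = today
        # Append the new (or first) version as the current row.
        dim_rows.append({**src, "is_current": True,
                         "valid_from": today, "valid_to": None})
    return dim_rows
```

For example, if a hypothetical customer dimension holds one current row with `tier="gold"` and the incoming snapshot shows `tier="platinum"`, the call expires the gold row (setting `valid_to`) and appends a new current platinum row, so the history of both tiers is queryable.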
AI/ML Enablement:
- Collaborate with ML and Data Science teams to operationalize ML services using Vertex AI.
- Support ML training and inference integration with data platforms.
- Ensure compliance with AI governance, security, and audit requirements.

Required Skills / Must Have:
- 15+ years of experience as a GCP Lead / Architect with a strong data engineering background.
- 6+ years of experience with GCP services, including IAM, VPC, GCS, BigQuery, Vertex AI, GKE, Compute Engine, Dataproc, and GitHub Actions.
- Strong expertise in data warehouse design and distributed systems.
- Experience creating HLDs, LLDs, and architecture standards.
- Strong experience building BigQuery-based data warehouse/lakehouse solutions.
- Experience leading data modeling and DWH design.
- Experience with DevOps, CI/CD, and automation.
- Experience building GCP-native data pipelines.
- AI/ML enablement experience using Vertex AI.

Technical Skills:
- Data Engineering: advanced SQL; Python/PySpark; pipeline design; performance tuning; data quality controls.
- Data Warehousing: dimensional modeling; distributed processing; BigQuery optimization.
- DevOps: Git workflows; GitHub Actions; CI/CD pipeline automation; environment management.

Good to Have Skills:
- Infrastructure as Code (Terraform / GCP Deployment Manager).
- MLOps experience, including model lifecycle, CI/CD for ML, experiment tracking, deployment automation, and monitoring.
- Healthcare domain exposure.
- Experience with Medicare STAR ratings.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- GCP certifications preferred: Professional Cloud Architect, Professional Data Engineer.
Created: 2026-03-12