GCP Engineer
The Judge Group - Phoenix, AZ
Job Description
Role: GCP Engineer
Location: Phoenix, AZ (Onsite)
Employment Type: Full-time

Job description:
We are seeking an experienced GCP Data Engineer with strong expertise in BigQuery, PySpark, and Python to design, build, and optimize scalable data pipelines and analytics solutions on the Google Cloud Platform. The ideal candidate will have a deep understanding of distributed data processing, performance optimization, and cloud-native architecture.

Key Responsibilities:
- Design and implement end-to-end data pipelines and ETL processes using PySpark, BigQuery, and other GCP-native services.
- Develop scalable, high-performance data solutions to support analytics, reporting, and machine learning workloads.
- Optimize BigQuery queries and data models for performance, scalability, and cost efficiency.
- Work closely with cross-functional teams to integrate data from multiple sources and ensure data quality and reliability.
- Apply best practices for cloud architecture, security, and operational excellence on the Google Cloud Platform.

Required Skills & Qualifications:
- 8+ years of experience in data engineering or related roles.
- Strong hands-on experience with Google Cloud Platform, specifically BigQuery, Dataflow, Dataproc, and Cloud Storage.
- Proficiency in PySpark and Python for large-scale data processing.
- Solid understanding of data modeling, ETL orchestration, and distributed systems.
- Ability to troubleshoot performance issues and optimize cloud-based data workflows.
- Strong problem-solving skills and the ability to work in collaborative, agile environments.
Created: 2026-03-04