Data Engineer - GCP
Euclid Innovations - Charlotte, NC
Job Description
Key Responsibilities
• Design and build scalable ETL/data pipelines using Spark and Python
• Develop data workflows to ingest, transform, and move large datasets
• Implement data routing logic to direct data to:
  o GCP (BigQuery, Dataflow, Dataproc)
  o On-prem platforms (DPC)
• Ensure data quality, validation, and reconciliation across systems
• Collaborate with data science and platform teams to support predictive model pipelines
• Optimize performance and scalability for high-volume data processing

Required Skills
• Strong hands-on experience with Apache Spark / PySpark for large-scale data processing
• Proficiency in Python for data engineering (ETL pipelines)
• Experience designing and developing data pipelines / data engineering workflows
• Solid background in ETL, data ingestion, transformation, and data movement
• Experience working with big data technologies and handling large datasets (batch/streaming)
• Experience with cloud platforms: GCP (Google Cloud Platform)
  o BigQuery, Dataflow, Dataproc, GCS (Google Cloud Storage)
• Experience with data migration / data integration projects
• Understanding of data pipeline architecture and distributed systems
Created: 2026-05-09