StaffAttract

Senior Data Engineer

Optomi - Tempe, AZ

Job Description

Optomi is partnering with a leading organization in the financial services industry to hire a Senior Data Engineer in Tempe, AZ. This role is ideal for a highly skilled data engineering professional with deep expertise in Azure cloud technologies and Databricks Lakehouse architecture.

The Senior Data Engineer will play a key role in designing, building, and optimizing enterprise-scale data pipelines while helping to modernize cloud-based data ecosystems. This individual will work with cutting-edge technologies including Delta Lake, Unity Catalog, PySpark, and CI/CD automation to deliver secure, scalable, and high-performing data solutions.

This is an opportunity to join a collaborative, innovation-driven environment focused on delivering modern data capabilities and enterprise data governance.

Qualifications

  • 10+ years of experience in data engineering, ETL development, or data warehousing
  • Strong expertise with Azure cloud services and the Databricks Lakehouse platform
  • Advanced proficiency in SQL, Python, and PySpark
  • Hands-on experience building ETL/ELT frameworks and large-scale data pipelines
  • Experience working with Delta Lake and Unity Catalog
  • Knowledge of CI/CD automation and DevOps practices for data engineering workflows
  • Strong data analysis and problem-solving skills
  • Experience optimizing distributed data processing systems and cloud-based architectures
  • Excellent communication and collaboration skills within cross-functional teams
  • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred

Responsibilities

  • Design, develop, and optimize scalable ETL/ELT data pipelines (a minimal sketch of such a pipeline follows this list)
  • Build and maintain cloud-based data solutions within Azure and Databricks Lakehouse environments
  • Develop data transformation workflows using SQL, Python, PySpark, and Spark SQL
  • Improve performance, scalability, and reliability of large-scale data processing systems
  • Implement and manage Delta Lake architecture and Unity Catalog for data governance and secure data sharing
  • Automate deployments and data engineering workflows through CI/CD pipelines and DevOps best practices
  • Collaborate cross-functionally with data architects, analysts, and business stakeholders to deliver enterprise data solutions
  • Perform data analysis, troubleshooting, and optimization to ensure high-quality data delivery
  • Support modernization initiatives for enterprise data platforms and cloud infrastructure
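The responsibilities above center on building PySpark-based ETL/ELT pipelines that land data in Delta Lake on Azure. As a minimal sketch of what such a pipeline might look like (not this employer's actual codebase), the following Python example reads raw CSV files from cloud storage, applies basic cleansing, and writes a partitioned Delta table. All storage paths, column names, and the app name are hypothetical, and the Delta session settings shown here come preconfigured on a real Databricks cluster.

# Minimal ETL/ELT sketch in PySpark, assuming the delta-spark package
# is on the classpath (it is preinstalled on Databricks).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-sketch")  # hypothetical app name
    # Enable Delta Lake; on Databricks these settings are preset.
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Extract: read raw CSV landed in ADLS Gen2 (hypothetical container/path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://landing@example.dfs.core.windows.net/orders/")
)

# Transform: deduplicate, drop invalid rows, derive a partition column.
clean = (
    raw.dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write a Delta table partitioned by date for downstream queries.
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@example.dfs.core.windows.net/orders_delta/")
)

In a Unity Catalog environment, the final step would more typically be clean.write.saveAsTable("catalog.schema.orders") with a three-level table name, so the table is registered and governed through the catalog rather than written to a bare storage path.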

Created: 2026-05-13
