StaffAttract

Sr. Data Engineer - Analytics

The Scoular Company - Omaha, NE

Job Description

The Senior Data Engineer will be a key contributor in designing, building, and optimizing data solutions that power our analytics ecosystem. This role focuses on hands-on development, ensuring data integrity, performance, and scalability across our platforms. The Senior Data Engineer will collaborate closely with leads, analysts, and business stakeholders to deliver high-quality data products and solutions.

Responsibilities

  • Design, develop, and maintain scalable data pipelines and integration solutions by analyzing business requirements and system dependencies to deliver efficient, reliable, and secure data solutions.
  • Build and optimize ETL/ELT workflows using data pipeline and PySpark frameworks, following medallion architecture principles.
  • Implement best practices for data modeling, performance tuning, and scalability across data platforms.
  • Work with modern file formats (e.g., Parquet, Delta) to ensure efficient storage and processing.
  • Ensure data quality, security, and compliance across all solutions.
  • Monitor and maintain data systems, proactively identifying and resolving performance issues.
  • Collaborate with cross-functional teams to deliver integrated solutions for complex business problems.
  • Stay current with emerging technologies and recommend improvements to enhance the data ecosystem.
  • Contribute to documentation and knowledge sharing within the team.

Qualifications

  • 5+ years of experience in data engineering and analytics, with strong expertise in ETL/ELT processes, including the medallion framework.
  • Proficiency in PySpark and Azure Data Services (Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage Gen2, Azure SQL Databases).
  • Experience working with Parquet and Delta file formats.
  • Familiarity with architectures involving warehouses, lakehouses, and operational data stores.
  • Knowledge of CI/CD pipelines using Azure DevOps and Agile development methodologies.
  • Bachelor's degree in Computer Science, MIS, or a related field (or equivalent experience).
  • Excellent problem-solving skills and the ability to work independently or in a team environment.
  • Strong communication skills to collaborate effectively across technical and business teams.

Preferred Experience

  • Working knowledge of Microsoft Fabric and Power BI (development and optimization).
  • Experience with API connectivity.
  • Exposure to Databricks, Snowflake, or similar platforms.
  • Understanding of data governance, lineage, and audit logging.
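For candidates unfamiliar with the term, the medallion architecture mentioned above organizes data into progressive layers: bronze (raw landed data), silver (cleaned and validated), and gold (business-level aggregates). A minimal plain-Python sketch of that flow, with illustrative record and field names (a production pipeline for this role would use PySpark with Delta tables instead):

```python
# Sketch of medallion-style layering (bronze -> silver -> gold).
# Plain Python for illustration only; field names are hypothetical.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagged with their layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: validate and standardize -- drop rows missing an amount."""
    cleaned = []
    for row in bronze_rows:
        if row.get("amount") is None:
            continue  # a real pipeline would quarantine bad records
        cleaned.append({"region": row["region"].strip().upper(),
                        "amount": float(row["amount"])})
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate -- total amount per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [{"region": " ne ", "amount": "10.5"},
       {"region": "NE", "amount": "4.5"},
       {"region": "ia", "amount": None}]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # {'NE': 15.0}
```

Each layer only reads from the one before it, which is what makes the pattern easy to audit and to tune independently.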

Created: 2026-03-04

Designed, Developed and Maintained by: NextGen TechEdge Solutions Pvt. Ltd.