Data Quality Engineer
Kaav Inc. - Seattle, WA
Job Description
Who we are
We are a yoga-inspired technical apparel company up to big things. The practice and philosophy of yoga informs our overall purpose: to elevate the world through the power of practice. We are proud to be a growing global company with locations all around the world, from Vancouver to Shanghai and places in between. We owe our success to our innovative product, our emphasis on our stores, our commitment to our people, and the incredible connections we get to make in every community we are in.

About this team
The Data Quality Engineer will be part of the cross-region data engineering team, leveraging and enhancing test automation frameworks for batch and real-time data ingestion from multiple sources into Snowflake and BigQuery, as well as for reporting data out of these platforms. The ideal candidate will work with cross-functional teams to analyze data with minimal supervision, perform end-to-end testing, and design and implement automation scripts for executing regression test suites. The Data Quality Engineer will also continuously identify opportunities to automate data validation across Snowflake and BigQuery environments.

Key Responsibilities
- Understand business requirements and analyze data.
- Plan and execute tests manually using SQL scripts.
- Identify and execute regression scenarios using automation scripts.
- Help build and enhance the test automation frameworks.
- Coordinate with cross-functional teams to complete functional and technical end-to-end testing.
- Collaborate with cross-functional teams - business stakeholders, engineers, program management, project management, etc. - to produce the best testing solutions possible.
- Contribute to story sizing and provide estimates for test planning, execution, delivery, and documentation.
- Understand, leverage, and apply best practices effectively.
- Ensure quality test deliverables with thorough test coverage.
- Strive for continuous improvement of QA practices.

Qualifications
- Bachelor's degree in computer science or a related field.
- 4+ years of experience as a Data Quality Engineer or in a similar role.
- Experience in ETL/DWH and BI report testing.
- Proficiency in SQL.
- Ability to review Snowflake stored procedures and data orchestration.
- Experience building test automation frameworks and creating automation scripts.
- Excellent problem-solving skills, combined with the ability to present findings and insights clearly and compellingly in both verbal and written form.
- Knowledge of data quality essentials, data management fundamentals, and ETL concepts.
- Experience with at least one cloud platform: Azure, AWS, or GCP.

Desired Qualifications
- Experience with the Snowflake or BigQuery platform.
- Experience with data quality tools such as Collibra Data Quality.
- Experience with the Azure cloud.
- Expertise in testing both batch and real-time data using SQL queries.
- Familiarity with a DevOps tool such as Jenkins, Azure DevOps, or GitLab.
- Experience working with large-volume data.
- Retail experience strongly desired.
- Working experience with a reporting layer.
- Able to work as an individual contributor as well as a team player.
- Communicates with honesty and kindness, and creates the space for others to do the same.
- Fosters connection by putting people first and building trusting relationships.

Day in the life
- Conducting profiling, anomaly detection, and validation checks for structured, semi-structured, and unstructured data across BigQuery, Bigtable, Snowflake, and Azure data platforms.
- Writing and optimizing SQL for data validation, deduplication, and transformation within Google BigQuery and Snowflake environments.
- Implementing automated data validation, schema enforcement, and quality checks within CI/CD pipelines using Azure Data Factory, Databricks, and DevOps best practices.
- Proactively identifying data inconsistencies, managing outages, and leading customer escalations and crisis resolution using tools like JIRA.
- Ensuring compliance with data policies, lineage tracking, and metadata management for regulatory and business needs.
- Supporting real-time and batch clickstream data migrations while implementing data quality controls in cloud storage solutions.
- Setting up cloud-based monitoring for cost efficiency, data freshness, and integrity alerts in GCP and Azure environments.
- Working closely with the data engineering team to define quality requirements, improve data pipeline efficiency, and enhance observability.
- Keeping up with emerging technologies in cloud-based data engineering, automation, and data observability tools.

Required Skills: Oracle
Background Check: No
Drug Screen: No
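To give a concrete sense of the automated data-validation work described above, here is a minimal, hypothetical sketch. It uses SQLite as a stand-in for a warehouse connection; all table names and helper functions are invented for illustration, not taken from any actual framework used by the team.

```python
import sqlite3

def validate_row_counts(conn, source_table, target_table):
    """Compare row counts between a source and a target table."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

def validate_no_nulls(conn, table, column):
    """Check that a required column contains no NULL values."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return {"null_count": nulls, "passed": nulls == 0}

# Demo with an in-memory database standing in for a warehouse connection.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders_src (id INTEGER, amount REAL);
    CREATE TABLE orders_tgt (id INTEGER, amount REAL);
    INSERT INTO orders_src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO orders_tgt VALUES (1, 10.0), (2, 20.0);
""")
print(validate_row_counts(conn, "orders_src", "orders_tgt"))
print(validate_no_nulls(conn, "orders_tgt", "id"))
```

In practice, checks like these would run against Snowflake or BigQuery connections and be wired into a CI/CD pipeline as part of a regression suite, rather than executed ad hoc.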
Created: 2026-03-04