Lead Data Engineer: Big data (Hadoop), AWS & automated deployments
Kore1 - Independence, OH
Job Description
This is a hybrid role based in Independence, OH, with one day per week onsite required. Candidates willing to relocate are encouraged to apply, and relocation assistance is available. Please note, we are unable to transfer or sponsor visas at this time.

KORE1, a nationwide provider of staffing and recruiting solutions, has an immediate opening for a Lead Data Engineer: Big data (Hadoop), AWS & automated deployments.

The Data Tech Lead is a member of the Insurance Data Engineering Team and will play a critical role on the team, developing data solutions to solve business and technical problems.

What You'll Do
- Lead the development and delivery of technical solutions
- Architect and implement solutions using big data platforms (Hadoop, Hive, HBase, Impala, Sqoop) and cloud technologies (AWS: EC2, S3, Lambda, Glue, Redshift, API Gateway)
- Understand complex business problems to ensure projects leverage the appropriate technology and analytical tools in delivering a comprehensive solution
- Understand business goals and needs while using big data and cloud technology to ingest, process, and analyze large amounts of data
- Understand how to transform findings into actionable business opportunities
- Understand the changing business needs of the organization and its projects, and recommend viable strategies for the future
- Author and/or review architecture/design and other technical documents, ensuring high-quality deliverables and systems development across tech stacks and application teams

What You'll Need
- Bachelor's degree in Computer Science, Mathematics, Statistics, or a related field (required)
- Be prepared to discuss your project experience and your architecture/design methodologies in interviews
- 5+ years of data engineering/architecture experience, with 3-5+ years in a technical lead or architect role
- Hands-on experience with big data platforms: Hadoop, Hive, Impala, HBase, Sqoop, NoSQL databases
- Experience developing data pipelines, ETL/ELT processes, and data transformations
- Proficiency with AWS cloud services: EC2, S3, Lambda, Glue, Redshift, API Gateway; experience with cloud-native architectures
- Experience with data modeling, business logic implementation, and optimizing large-scale datasets
- Experience with automated deployments, CI/CD, and source code/configuration management tools (GitHub, Jenkins, Terraform, CloudFormation)
- Strong understanding of data architecture principles and the ability to design scalable solutions
- Excellent communication skills, with the ability to convey technical concepts to business stakeholders and cross-functional teams
- Ability to think strategically, prioritize tasks, and see the 'big picture' while managing day-to-day deliverables
- Strong organizational skills, attention to detail, and the ability to work independently or as a lead on multidisciplinary projects

Compensation depends on experience but is typically between $142K and $189K, plus a 15% bonus. However, the client typically doesn't hire above the 80th percentile, to ensure room for merit increases with tenure; 80% would be $151,360.
Created: 2025-12-19