StaffAttract

Data Engineer (Spark, Airflow, GCP)

Kaav Inc. - Bentonville, AR

Job Description

• Designing and building ETL pipelines using Sqoop, Hive, MapReduce, and Spark in on-prem and cloud environments.
• Functional programming using Python and Scala for complex data transformations and in-memory computations.
• Using Erwin for logical/physical data modeling and dimensional data modeling.
• Designing and developing UNIX/Linux scripts for handling complex file formats and structures.
• Orchestrating workflows and jobs using Airflow and Automic.
• Creating multiple Kafka producers and consumers for data transfer.
• Performing continuous integration and deployment (CI/CD) using tools such as Git and Jenkins to run test cases and build applications, with code coverage via ScalaTest.
• Analyzing data using SQL and BigQuery; monitoring cluster performance, setting up alerts, and documenting designs and workflows.
• Providing production support, troubleshooting, and fixing issues by tracking the status of running applications as part of system administration tasks.

Required Skills: Data Analysis
Basic Qualification:
Additional Skills: Data Engineer
Background Check: No
Drug Screen: No
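The posting mentions functional programming in Python for data transformations. As a hedged, minimal sketch of what that can look like (all record fields and function names here are hypothetical, not taken from the employer's codebase), a map/filter/reduce pipeline might enrich raw records and aggregate them:

```python
from functools import reduce

# Hypothetical raw records, standing in for rows pulled via Sqoop/Hive.
records = [
    {"store": "BNT-01", "sku": "A100", "qty": 3, "price": 9.99},
    {"store": "BNT-01", "sku": "A100", "qty": 1, "price": 9.99},
    {"store": "BNT-02", "sku": "B200", "qty": 2, "price": 4.50},
]

def revenue(rec):
    """Derive a revenue field for one record without mutating the input."""
    return {**rec, "revenue": rec["qty"] * rec["price"]}

def by_store(totals, rec):
    """Fold step: accumulate revenue per store."""
    totals[rec["store"]] = totals.get(rec["store"], 0.0) + rec["revenue"]
    return totals

# map -> filter -> reduce: enrich each record, keep positive quantities,
# then aggregate revenue per store into a dict.
enriched = map(revenue, records)
valid = filter(lambda r: r["qty"] > 0, enriched)
totals = reduce(by_store, valid, {})
```

In production this shape typically moves to Spark (`rdd.map(...).filter(...).reduceByKey(...)` or the DataFrame API), where the same functional style distributes across a cluster.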

Created: 2026-03-04
