StaffAttract


AWS Data Architect

Incedo Inc. - Dallas, TX


Job Description

Job Title: Senior Data Tech Lead/Architect - AWS | Kafka | Glue Streaming | API Consumption
Location: Dallas, Texas (preferred) or Florham Park, New Jersey
Employment Type: Full time

About the Role:
We are seeking a Senior Technical Lead/Architect with strong expertise in AWS-based streaming data pipelines, Apache Kafka (MSK), AWS Glue, Flink, and PySpark to help solution, design, and implement a scalable framework for data ingestion, validation, enrichment, reconciliation processing, event logging, data observability, and operational KPI tracking. You will play a key role in building out event-driven capabilities with control gates in place to measure, track, and improve operational SLAs, and in driving the data quality and reconciliation workflows for a high-impact data platform serving financial applications.

Key Responsibilities:
  • Lead technical solution discovery for new capabilities and functionality.
  • Assist the Product Owner with technical user stories to maintain a healthy feature backlog.
  • Lead the development of real-time data pipelines using AWS DMS, MSK/Kafka, or Glue Streaming for CDC ingestion from multiple SQL Server sources (RDS/on-prem).
  • Build and optimize streaming and batch data pipelines using AWS Glue (PySpark) to validate, transform, and normalize data into Iceberg and DynamoDB.
  • Define and enforce data quality, lineage, and reconciliation logic supporting both streaming and batch use cases.
  • Integrate with S3 Bronze/Silver layers and implement efficient schema evolution and partitioning strategies using Iceberg.
  • Collaborate with architects, analysts, and downstream application teams to design API- and file-based egress layers.
  • Implement monitoring, logging, and event-based alerting using CloudWatch, SNS, and EventBridge.
  • Mentor junior developers and enforce best practices for modular, secure, and scalable data pipeline development.

Required Skills:
  • 6+ years of hands-on, expert-level data engineering experience in cloud-based environments (AWS preferred), including event-driven implementations.
  • Strong experience with Apache Kafka / AWS MSK, including topic design, partitioning, and Kafka Connect/Debezium.
  • Proficiency in AWS Glue (PySpark) for both batch and streaming ETL.
  • Working knowledge of AWS DMS, S3, Lake Formation, DynamoDB, and Iceberg.
  • Solid grasp of schema evolution, CDC patterns, and data reconciliation frameworks.
  • Experience with infrastructure-as-code (CDK/Terraform) and DevOps practices (CI/CD, Git).
  • Knowledge of event-driven architectures and microservices-based data APIs.

Preferred:
  • Familiarity with PostgreSQL, Athena, Step Functions, and Lambda.
  • Familiarity with data lake concepts.
  • Familiarity with the financial or wealth management data domain.
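To make the reconciliation responsibility concrete, here is a minimal, self-contained Python sketch of the kind of source-to-target check such a framework performs. It is purely illustrative: the record shape, field names (`id`, `amount`), and tolerance are assumptions for the example, not details of this employer's platform, where the equivalent logic would typically run at scale in Glue (PySpark) against Iceberg and DynamoDB tables.

```python
from dataclasses import dataclass

# Toy reconciliation check between a source batch and its landed target copy.
# Field names and the tolerance are illustrative assumptions, not a real schema.


@dataclass
class ReconResult:
    missing_ids: set      # records present in source but absent from target
    amount_delta: float   # difference in summed amounts (source - target)
    passed: bool          # True only if nothing is missing and sums agree


def reconcile(source: list, target: list, tolerance: float = 0.01) -> ReconResult:
    """Compare two batches of {'id', 'amount'} records by key and by total."""
    src_ids = {r["id"] for r in source}
    tgt_ids = {r["id"] for r in target}
    missing = src_ids - tgt_ids
    delta = sum(r["amount"] for r in source) - sum(r["amount"] for r in target)
    return ReconResult(missing, delta, not missing and abs(delta) <= tolerance)


source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.0}]
target = [{"id": 1, "amount": 100.0}]
result = reconcile(source, target)
# Record 2 never landed, so the check fails, flagging id 2 and a 50.0 delta.
```

In a production pipeline, a failed `ReconResult` would be the "control gate" the posting mentions: it would emit an alert (e.g. via SNS/EventBridge) and hold downstream promotion of the batch rather than silently passing incomplete data.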

Created: 2026-05-09
