

REMOTE Staff Data Engineer

Insight Global - Brookfield, WI


Job Description

Insight Global is seeking a Staff Data Engineer to join our actuarial consulting client 100% remotely. As a Staff Data Engineer on our client's Data Platform, you will be responsible for designing and implementing robust Data Platform solutions that meet business objectives while ensuring compliance with industry-leading data privacy standards. You will collaborate closely with cross-functional agile teams to drive data architecture decisions, implement best practices, and contribute to the success of our client's projects.

What you will be doing:

• Acts as a subject matter expert and thought leader within the Data Platform domain
• Data Strategy: Serves as a thought leader in data processing design and implementation, defining advanced structures for moving, storing, and maintaining high-quality data
• Team Leadership: Leads projects by managing timelines, coordinating teams, and communicating project statuses; influences organizational direction through effective leadership and strategic collaboration
• Data Governance and Security: Serves as a subject matter expert on governance standards, continuously aligning data practices with evolving industry best practices and requirements
• Project Management and Scope of Work: Contributes to defining the overall vision and strategy for data engineering within the organization, ensuring alignment with organizational goals and long-term objectives
• Results Orientation: Establishes visionary goals, advises on strategic plans, employs advanced monitoring, influences high-level stakeholders, and delivers transformative results
• Data Platform: Expands our Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
• Data Governance: Configures and maintains Unity Catalog to enable enterprise data lineage, data quality, auditability, and data stewardship
• Data Security: Builds out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
• Access Management: Ensures a policy of least privilege is always followed for anything being implemented
• External Data Products: Builds data solutions for clients while upholding the best standards for reliability, quality, and performance
• ETL: Builds solutions within Delta Live Tables and automates transformations
• Medallion Architecture: Builds out performant, enterprise-level medallion architectures
• Streaming and Batch Processing: Builds fit-for-purpose near-real-time streaming and batch solutions
• Large Data Management: Builds out performant and efficient enterprise solutions for internal and external users, covering both structured and unstructured healthcare data
• Platform Engineering: Builds out Infrastructure as Code using Terraform and Asset Bundles
• Costs: Works with the business to build cost-effective and cost-transparent data solutions
• Pipeline/ETL Management: Helps architect, build, and maintain robust and scalable data pipelines, monitoring and optimizing their performance
• Works with migration tools (e.g., Fivetran), AWS technologies, and custom solutions
• Identifies and implements improvements to enhance data processing efficiency
• Designs and implements reliable and resilient event-driven data processing
• Builds out effective pipeline monitoring solutions
• Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based 'big data' technologies
• API Development: Drives our design and implementation of internal APIs for integrating data between different systems and applications
• Integrates with external systems using API-driven processes to ingest data
• Develops APIs built on top of datasets for internal systems to consume data from Databricks
• Integrates with external APIs, including but not limited to Salesforce, financial systems, HR systems, and other external systems
• Data Modeling: Leads design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data
• Assembles large, complex data sets that meet functional and non-functional business requirements
• Develops and maintains data models, ensuring they align with business objectives and data privacy regulations
• Collaboration: Partners internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
• Works effectively with Data, Development, Analyst, Data Science, and Business team members to gather requirements and to propose and build solutions
• Communicates complex technical concepts to non-technical stakeholders and provides guidance on best practices
• Ensures that technology execution aligns with business strategy and provides efficient, secure solutions and systems
• Gathers requirements and builds out project plans to implement those requirements, with forecasted implementation effort
• Processes and Tools: Identifies, designs, and implements internal process improvements, such as automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
• Builds analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
• Creates data tools for clinical, analytics, and data science team members that assist them in building and optimizing our product into an innovative industry leader
• Leads investigation of new tooling, develops implementation plans, and deploys necessary tooling

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.

If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy.

Skills and Requirements

• 15+ years of relevant experience in the design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
• Expert-level experience working in Databricks and AWS
• Expert-level experience working in both relational and non-relational databases, such as SQL Server, PostgreSQL, DynamoDB, and DocumentDB
• Experience building and managing solutions on AWS
• Expert in building out data models and data warehouses and in designing data lakes for enterprise and product use cases
• Familiarity with designing and building APIs, ETL, and data ingestion processes, and with utilizing tools to support enterprise solutions
• Experience in performance tuning, query optimization, security, monitoring, and release management
• Experience working with and managing large, disparate, identified and de-identified data sets from multiple data sources
• Experience building and deploying IaC using Terraform, Asset Bundles, and GitHub
• Experience collaborating with Data Science teams and building AI-based solutions to drive efficiencies and business value
• Bachelor's or master's degree in computer science, data engineering, or a related field
• Experience managing and standardizing clinical data from structured and unstructured sources
• Health and life insurance business experience
• Knowledge of healthcare standards, including FHIR, C-CDA, and traditional HL7
• Knowledge of clinical standards/ontologies, including ICD-10, SNOMED, NDC, LOINC, and RxNorm
• Associate- or Professional-level solution architecture certification in Azure and/or AWS
• Experience in Snowflake
• Experience in Spark
• Experience with Salesforce integration

Created: 2026-04-04
