Data Migration Architect
Bramcolm, LLC - Boston, MA
Job Description
Founded in 2003, Bramcolm, LLC has been at the forefront of IT solutions for over two decades, consistently delivering cutting-edge services tailored to the evolving needs of businesses. Based in Indianapolis, IN, our boutique firm has built a reputation for excellence in the IT services and consulting industry.

At Bramcolm, we are committed to leveraging advanced technologies such as AI, machine learning, and cloud computing to deliver efficient, scalable, and user-friendly solutions. Our collaborative approach involves working closely with clients to understand their unique needs and tailor our services accordingly. We value creativity, agility, and excellence, fostering a culture that encourages continuous learning and growth.

Position Summary
The Data Migration Architect will lead the end-to-end planning, design, and execution of data migration efforts for our clients, with a strong emphasis on the Salesforce, Snowflake, and Informatica platforms. This role blends hands-on technical work with leadership across cross-functional teams and clients, including government and public health agencies. The Architect will be responsible for developing data migration strategies, managing data quality, ensuring governance and compliance, and enabling user adoption for long-term data success.

Key Responsibilities
- Develop and refine the Data Migration Plan for each release cycle.
- Define the approach for ETL of data from source systems into Salesforce, leveraging Informatica and Snowflake.
- Inventory source systems and identify structured data categories for migration (e.g., business entities, permits, regulatory codes).
- Collaborate with stakeholders to document source-to-target mappings and transformation rules.
- Plan and coordinate cutover events, including object sequencing, dry runs, pilot rollouts, and iterative loads.
- Ensure environments (development, test, UAT, production) are ready for migration activities.

Technical Leadership & Execution
- Design and build secure, scalable ETL pipelines to extract, cleanse, transform, and load data into Salesforce (see the sketch after this list).
- Collaborate with the Integration team to define bulk data extraction logic from Snowflake pipelines.
- Support iterative test loads to validate ETL logic and ensure smooth execution in production.
- Address development, troubleshooting, and DevOps needs from the data migration team.
- Ensure all pipelines and processes adhere to data governance, privacy, and regulatory standards.
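To make the stack concrete for applicants, here is a minimal sketch of the extract-transform-load flow the bullets above describe, written in Python against the snowflake-connector-python and simple-salesforce libraries. All credentials, table names, and Salesforce object names (permits, Permit__c) are hypothetical placeholders, and a real pipeline of this kind would typically be built in Informatica rather than hand-rolled Python.

```python
# Minimal ETL sketch: pull one structured data category (permits) out of
# Snowflake, standardize it, and bulk-load it into Salesforce. Every
# credential, table name, and object name here is a hypothetical placeholder.
import snowflake.connector
from simple_salesforce import Salesforce

# Extract: read the legacy records from a Snowflake table.
conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",
    warehouse="MIGRATION_WH", database="LEGACY", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT permit_id, business_name, issue_date FROM permits")
    rows = cur.fetchall()
finally:
    conn.close()

# Transform: map source columns to Salesforce fields, standardize the
# business name, and drop rows without a usable primary key.
records = [
    {
        "Permit_Id__c": permit_id,
        "Business_Name__c": business_name.strip().title(),
        "Issue_Date__c": issue_date.isoformat() if issue_date else None,
    }
    for permit_id, business_name, issue_date in rows
    if permit_id is not None and business_name is not None
]

# Load: push the cleansed records through the Salesforce Bulk API.
sf = Salesforce(username="etl@example.com", password="***",
                security_token="***")
results = sf.bulk.Permit__c.insert(records, batch_size=10000)
loaded = sum(1 for r in results if r["success"])
print(f"{loaded} of {len(records)} records loaded")
```

A production pipeline would add what the sketch omits: secrets management, error handling and retries, logging, and the iterative dry-run loads described above.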
Data Quality, Cleansing & Validation
- Conduct data profiling and quality assessments to identify and resolve issues in legacy data.
- Develop processes and tools for deduplication, standardization, and correction of source data.
- Execute comprehensive validation, including record counts, completeness checks, and data quality scoring (see the sketch after this list).
- Document and deliver structured data migration validation results to key stakeholders.
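As an illustration of the validation bullets above, here is a minimal sketch of a post-load validation pass in Python with pandas. The file paths, the permit_id natural key, and the required-field list are hypothetical placeholders; real acceptance criteria and thresholds would be agreed with stakeholders for each release.

```python
# Minimal validation sketch: record-count reconciliation, completeness
# checks, duplicate detection, and a naive quality score for one migrated
# object. File paths and field names are hypothetical placeholders.
import pandas as pd

source = pd.read_csv("legacy_permits.csv")      # extract from the source system
loaded = pd.read_csv("salesforce_permits.csv")  # report of what landed in Salesforce

# Record counts: every distinct source key should arrive exactly once.
count_ok = len(loaded) == source["permit_id"].nunique()

# Duplicates: rows sharing a natural key that should be unique.
dupes = loaded[loaded.duplicated("permit_id", keep=False)]

# Completeness: fraction of non-null values in each required field.
required = ["permit_id", "business_name", "issue_date"]
completeness = loaded[required].notna().mean()

# A naive aggregate score: mean completeness, penalized for duplicates.
score = completeness.mean() * (1 - len(dupes) / max(len(loaded), 1))

print(f"record counts match: {count_ok}")
print(f"duplicate rows: {len(dupes)}")
print(f"quality score: {score:.1%}")
print(completeness.to_string())
```

The structured output (counts, duplicates, per-field completeness) is the kind of validation result this role documents and delivers to stakeholders.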
Collaboration & User Enablement
- Work with Local Public Health (LPH) departments and the Department of Public Health (DPH) to gather and cleanse legacy data.
- Lead and mentor analysts in guiding users through data cleansing and preparation.
- Collaborate with the organizational change management (OCM) workstream to train and upskill LPH users for ongoing data submission post go-live.
- Create reusable templates and tools for ongoing structured data imports by end users.
- Partner with external system vendors to facilitate testing and go-live readiness.
- Ensure compliance with Commonwealth and DPH data governance, privacy, and security standards.
- Identify and mitigate risks associated with data migration activities across the lifecycle.

Required Qualifications
- 7–10+ years of experience in data migration, ETL development, or data architecture
- Hands-on experience with Informatica ETL and Snowflake pipelines
- Experience with Salesforce data loading, setup, and automation logic
- Proficiency with GitHub and GitHub Actions for CI/CD and DevOps workflows
- Proven leadership in cross-functional environments with integration and Salesforce teams
- Strong written and verbal communication, documentation, and planning skills
- Demonstrated experience guiding users through data cleansing, validation, and transformation

Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Experience with public health, government, or compliance-sensitive data projects
- Background in iterative data load testing, pilot rollouts, and production cutovers
- Knowledge of data governance best practices and risk management techniques

Location & Requirements
- Location: Boston, MA (hybrid work model)
- Must be legally authorized to work in the United States
- Must pass criminal background, CORI, and CJIS checks

Created: 2025-09-17