Data Migration Architect
About Bramcolm, LLC
Founded in 2003, Bramcolm, LLC has spent more than two decades delivering IT services tailored to the evolving needs of businesses. Based in Indianapolis, IN, our boutique firm has built a reputation for excellence in the IT services and consulting industry.
At Bramcolm, we are committed to leveraging advanced technologies such as AI, machine learning, and cloud computing to deliver efficient, scalable, and user-friendly solutions. Our collaborative approach involves working closely with clients to understand their unique needs and tailor our services accordingly. We value creativity, agility, and excellence, fostering a culture that encourages continuous learning and growth.
Position Summary
The Data Migration Architect will lead the end-to-end planning, design, and execution of data migration efforts for our clients, with a strong emphasis on Salesforce, Snowflake, and Informatica platforms. This role blends hands-on technical work with leadership across cross-functional teams and clients, including government and public health agencies. The Architect will be responsible for developing data migration strategies, managing data quality, ensuring governance and compliance, and enabling user adoption for long-term data success.
Key Responsibilities
Data Migration Planning & Strategy
- Develop and refine the Data Migration Plan for each release cycle.
- Define the extract, transform, and load (ETL) approach for moving data from source systems into Salesforce, leveraging Informatica and Snowflake.
- Inventory systems and identify structured data categories for migration (e.g., business entities, permits, regulatory codes).
- Collaborate with stakeholders to document source-to-target mappings and transformation rules (a minimal sketch follows this list).
- Plan and coordinate cutover events, including object sequencing, dry runs, pilot rollouts, and iterative loads.
- Ensure environments (development, test, UAT, production) are ready for migration activities.
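For illustration, here is a minimal sketch of how a documented source-to-target mapping with transformation rules might be expressed in Python. Every field name (PERMIT_NO, Permit_Number__c, and so on) is a hypothetical placeholder, not actual project schema.

```python
# Minimal sketch of a source-to-target mapping document expressed as code.
# All field names below (PERMIT_NO, Permit_Number__c, ...) are hypothetical
# illustrations, not the actual project schema.
from datetime import datetime

# Each entry: source column -> (Salesforce target field, transformation rule)
PERMIT_MAPPING = {
    "PERMIT_NO": ("Permit_Number__c", str.strip),
    "ISSUE_DT":  ("Issue_Date__c",
                  lambda v: datetime.strptime(v, "%m/%d/%Y").date().isoformat()),
    "BIZ_NAME":  ("Business_Name__c", lambda v: v.strip().title()),
    "REG_CODE":  ("Regulatory_Code__c", str.upper),
}

def transform_row(source_row: dict) -> dict:
    """Apply the documented transformation rules to one source record."""
    target = {}
    for src_col, (tgt_field, rule) in PERMIT_MAPPING.items():
        raw = source_row.get(src_col)
        target[tgt_field] = rule(raw) if raw is not None else None
    return target

print(transform_row({"PERMIT_NO": " P-1042 ", "ISSUE_DT": "03/15/2019",
                     "BIZ_NAME": "acme foods", "REG_CODE": "fd-12"}))
```

Keeping the transformation rules alongside the mapping makes the mapping document itself executable and testable during dry runs.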
Technical Leadership & Execution
- Design and build secure, scalable ETL pipelines to extract, cleanse, transform, and load data into Salesforce (see the sketch after this list).
- Collaborate with the Integration team to define bulk data extraction logic from Snowflake pipelines.
- Support iterative test loads to validate ETL logic and ensure smooth execution in production.
- Address development, troubleshooting, and DevOps needs from the data migration team.
- Ensure all pipelines and processes adhere to data governance, privacy, and regulatory standards.
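By way of illustration only (the production pipelines would live in Informatica), here is a minimal Python sketch of a bulk extract from Snowflake followed by a bulk load into Salesforce, assuming the snowflake-connector-python and simple-salesforce packages. Credentials, table names, and the Permit__c object are placeholders.

```python
# Illustrative bulk extract-and-load sketch, not the Informatica implementation.
# Assumes snowflake-connector-python and simple-salesforce; all names are placeholders.
import snowflake.connector
from simple_salesforce import Salesforce

def extract_permits(conn) -> list[dict]:
    """Pull rows from a hypothetical Snowflake staging table as dicts."""
    cur = conn.cursor()
    try:
        cur.execute("SELECT permit_no, business_name, issue_date "
                    "FROM staging.clean_permits")
        cols = [c[0] for c in cur.description]  # Snowflake upper-cases names
        return [dict(zip(cols, row)) for row in cur.fetchall()]
    finally:
        cur.close()

def load_permits(sf: Salesforce, rows: list[dict]) -> None:
    """Bulk-insert records into a hypothetical Permit__c custom object."""
    records = [{"Permit_Number__c": r["PERMIT_NO"],
                "Business_Name__c": r["BUSINESS_NAME"],
                "Issue_Date__c": str(r["ISSUE_DATE"])} for r in rows]
    results = sf.bulk.Permit__c.insert(records, batch_size=5000)
    failed = [r for r in results if not r["success"]]
    if failed:
        raise RuntimeError(f"{len(failed)} of {len(records)} records failed to load")

conn = snowflake.connector.connect(user="...", password="...", account="...",
                                   warehouse="MIGRATION_WH", database="DPH_STAGE")
sf = Salesforce(username="...", password="...", security_token="...")
load_permits(sf, extract_permits(conn))
```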
Data Quality, Cleansing & Validation
- Conduct data profiling and quality assessments to identify and resolve issues in legacy data.
- Develop processes and tools for deduplication, standardization, and correction of source data.
- Execute comprehensive validation, including record counts, completeness checks, and data quality scoring (illustrated after this list).
- Document and deliver structured data migration validation results to key stakeholders.
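As a sketch of what that validation could look like in practice, here is a small pandas-based comparison of a source extract against the loaded target. The column names and the naive averaging rule for the quality score are illustrative assumptions, not a prescribed methodology.

```python
# Minimal validation sketch: record counts, completeness checks, and a simple
# per-field quality score. Column names are hypothetical.
import pandas as pd

def validate_load(source: pd.DataFrame, target: pd.DataFrame,
                  required: list[str]) -> dict:
    """Compare a source extract with what actually landed in the target."""
    report = {
        "source_count": len(source),
        "target_count": len(target),
        "count_match": len(source) == len(target),
        # Completeness: share of non-null values per required field.
        "completeness": {col: round(target[col].notna().mean(), 3)
                         for col in required},
    }
    # Naive quality score: average completeness across required fields.
    report["quality_score"] = round(
        sum(report["completeness"].values()) / len(required), 3)
    return report

src = pd.DataFrame({"permit_no": ["P-1", "P-2", "P-3"]})
tgt = pd.DataFrame({"permit_no": ["P-1", "P-2", "P-3"],
                    "business_name": ["Acme", None, "Beta"]})
print(validate_load(src, tgt, required=["permit_no", "business_name"]))
```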
Collaboration & User Enablement
- Work with Local Public Health (LPH) departments and the Department of Public Health (DPH) to gather and cleanse legacy data.
- Lead and mentor analysts in guiding users through data cleansing and preparation.
- Collaborate with the Organizational Change Management (OCM) workstream to train and upskill LPH users for ongoing data submission post go-live.
- Create reusable templates and tools for ongoing structured data imports by end users (see the sketch after this list).
- Partner with external system vendors to facilitate testing and go-live readiness.
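One hypothetical shape such a reusable template tool could take: a small checker that an end user runs against a CSV file before submitting it. The expected headers and file name below are placeholders.

```python
# Sketch of a reusable import-template checker that an LPH user could run
# against a CSV before submitting it. Header names are hypothetical.
import csv

TEMPLATE_COLUMNS = ["permit_no", "business_name", "issue_date"]  # expected headers

def check_submission(path: str) -> list[str]:
    """Return a list of human-readable problems; an empty list means the file passes."""
    problems = []
    with open(path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        missing = [c for c in TEMPLATE_COLUMNS if c not in (reader.fieldnames or [])]
        if missing:
            return [f"missing required columns: {', '.join(missing)}"]
        for lineno, row in enumerate(reader, start=2):
            for col in TEMPLATE_COLUMNS:
                if not (row[col] or "").strip():
                    problems.append(f"line {lineno}: empty value in '{col}'")
    return problems

issues = check_submission("permits_submission.csv")
print("\n".join(issues) if issues else "File passes template checks.")
```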
Governance, Risk & Compliance
- Ensure compliance with Commonwealth and DPH data governance, privacy, and security standards.
- Identify and mitigate risks associated with data migration activities across the lifecycle.
Required Qualifications
- 7–10+ years of experience in data migration, ETL development, or data architecture
- Hands-on experience with Informatica ETL and Snowflake pipelines
- Experience with Salesforce data loading, setup, and automation logic
- Proficiency with GitHub and GitHub Actions for CI/CD and DevOps workflows
- Proven leadership in cross-functional environments with integration and Salesforce teams
- Strong written and verbal communication, documentation, and planning skills
- Demonstrated experience guiding users through data cleansing, validation, and transformation
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
- Experience with public health, government, or compliance-sensitive data projects
- Background in iterative data load testing, pilot rollouts, and production cutovers
- Knowledge of data governance best practices and risk management techniques
Location & Requirements
- Location: Boston, MA (Hybrid work model)
- Must be legally authorized to work in the United States
- Must pass criminal background, CORI (Criminal Offender Record Information), and CJIS (Criminal Justice Information Services) checks