Data Migration Architect
About Bramcolm, LLC
Founded in 2003, Bramcolm, LLC has been at the forefront of IT solutions for two decades, consistently delivering cutting-edge services tailored to meet the evolving needs of businesses. Based in Indianapolis, IN, our boutique firm has built a reputation for excellence in the IT services and consulting industry.
At Bramcolm, we are committed to leveraging advanced technologies such as AI, machine learning, and cloud computing to deliver efficient, scalable, and user-friendly solutions. Our collaborative approach involves working closely with clients to understand their unique needs and tailor our services accordingly. We value creativity, agility, and excellence, fostering a culture that encourages continuous learning and growth.
Position Summary
The Data Migration Architect will lead the planning, design, and execution of data migration initiatives for our clients, with a focus on Informatica, Snowflake, and Salesforce platforms. This role is responsible for source-to-target mapping, requirements gathering, and ensuring the successful migration of bulk data. The Data Migration Architect will collaborate with cross-functional teams, drive technical planning sessions, and provide expertise in ETL development, DevOps, and data transformation logic.
Key Responsibilities
Data Migration Leadership & Planning
- Lead source-to-target mapping, requirements gathering, and documentation
- Drive planning sessions around data migration with client and internal teams
- Plan and coordinate data migration cutover, including object sequencing, timing, incremental loads, and pilot/cohort rollouts
- Own creation and updates to solution design documentation for the data migration plan
Technical Execution & Collaboration
- Lead the development of Informatica/Snowflake solutions to extract, transform, and load bulk data for Salesforce
- Collaborate with the Integration team to establish bulk data extraction logic for Snowflake data pipelines
- Collaborate with the Salesforce team to compile and rehearse Salesforce setup data loads, and determine which automation logic to replicate in data transformation logic
- Feed technical user stories to the Data Migration pod to ensure Informatica Developers have a sufficient backlog for all development sprints
- Address development, troubleshooting, or DevOps questions from Informatica Data Migration developers
Data Quality & User Enablement
- Lead a team of analysts, and participate directly, in guiding end users (DPH and LPH) through data cleansing and de-duplication activities
Required Qualifications
- 7–10 years of experience in data migration, ETL development, or data architecture
- Hands-on experience with Informatica ETL development and Snowflake data pipelines
- Experience with Salesforce data loads and automation logic
- Familiarity with GitHub and GitHub Actions for DevOps and CI/CD processes
- Demonstrated ability to lead cross-functional teams and collaborate with integration and Salesforce teams
- Strong documentation, planning, and communication skills
- Experience guiding end users through data cleansing and de-duplication
Preferred Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field
- Experience in public health or government data migration projects
- Experience with large-scale data migration cutovers, including incremental loads and pilot rollouts
- Knowledge of data quality best practices and data transformation logic
Location & Requirements
- Location: Boston, MA (Hybrid work model)
- Must be legally authorized to work in the United States
- Must pass a criminal background check
- Must pass CORI check for CJIS Certification