A Note on Assignment Type:
This position is currently listed as "Onsite"; however, the Assignment under this request will provisionally be "Hybrid", working 7.25 hours per calendar day, at some time between 8:00 AM and 5:00 PM (excluding breaks), Monday to Friday inclusive, unless otherwise identified. These conditions are subject to change as the OPS reviews its current situation. For the duration of the assignment, working arrangements will be at the discretion of the Hiring Manager, based on the requirements of the Project to which you are assigned.
Scope
Assignment Deliverables
As a member of the integrations team, the ETL Developer will be responsible for integrating the new Dynamics 365 solution, the data warehouse, and various internal (OPS) and external systems. This team will comprise both internal staff and vendor resources.
A high-level list of deliverables for the ETL Developer includes:
• ETL Solution Design and Development:
o Design, develop, and implement robust ETL (Extract, Transform, Load) processes for data migration and ongoing integrations between diverse source systems (e.g., internal legacy systems, external vendor platforms) and Microsoft Dynamics 365 Customer Engagement (CE) and Finance & Operations (F&O).
o Develop and optimize data transformation logic to ensure data quality, consistency, and adherence to business rules and D365 data models.
o Utilize and recommend appropriate ETL tools and technologies (e.g., Azure Data Factory, SSIS, other cloud-based ETL services) to build efficient and scalable data pipelines.
o Implement data cleansing, validation, and error handling mechanisms within ETL processes (an illustrative sketch of this type of logic follows the deliverables list below).
• Data Migration Planning and Execution:
o Lead and execute all phases of data migration activities from legacy systems to D365 CE and F&O, including data profiling, mapping, cleansing, transformation, and loading.
o Develop and maintain data migration strategies, cutover plans, and rollback procedures.
o Collaborate with data owners and business users to ensure data accuracy and completeness during migration.
• Testing and Quality Assurance:
o Design, develop, and execute comprehensive test plans, cases, scripts, and test data (e.g., manufactured, obfuscated) based on functional and technical specifications to validate ETL solutions and data integrity.
o Create and maintain a full test plan, testing procedures, and an associated library of reusable test cases and scripts, ensuring full traceability from requirements to test outcomes.
o Perform both manual and automated testing to validate system and integration functionality, data accuracy, performance, and scalability. This includes unit testing, integration testing, system testing, and performance testing for ETL processes.
• Collaboration and CI/CD Integration:
o Actively collaborate with stakeholders across business units, development teams, and external vendors to understand integration requirements and ensure proper data flow.
o Ensure proper integration of ETL processes and tests into the continuous integration/continuous delivery (CI/CD) pipeline to support automated deployments and efficient release cycles.
• Support and Documentation:
o Provide analytical, development, and testing support for ETL processes and data integrations throughout the project lifecycle.
o Develop and maintain detailed technical documentation for all ETL processes, data mappings, data dictionaries, and integration architectures.
o Assist in troubleshooting and resolving data integration issues, providing timely support and solutions.
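For illustration only (this is not itself a deliverable), the following minimal sketch shows the kind of data cleansing, validation, and error-handling logic referred to in the ETL Solution Design deliverables above. It assumes a pandas-based transformation step; the table, column, and file names are hypothetical rather than project specifics.

# Illustrative only: a minimal validation/error-handling step of the kind an ETL
# pipeline might apply before loading records into a D365 staging entity.
# All table, column, and file names below are hypothetical.
import pandas as pd

REQUIRED_COLUMNS = ["account_number", "account_name", "country_code"]
VALID_COUNTRY_CODES = {"CA", "US", "GB"}  # assumed reference data

def validate_accounts(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split extracted rows into loadable rows and rejected rows with reasons."""
    errors = pd.Series("", index=df.index)

    # Rule 1: mandatory fields must be populated.
    for col in REQUIRED_COLUMNS:
        missing = df[col].isna() | (df[col].astype(str).str.strip() == "")
        errors[missing] += f"missing {col}; "

    # Rule 2: country codes must match the reference list.
    bad_country = ~df["country_code"].isin(VALID_COUNTRY_CODES)
    errors[bad_country] += "invalid country_code; "

    # Rule 3: duplicate business keys are rejected rather than silently merged.
    dup = df.duplicated(subset="account_number", keep="first")
    errors[dup] += "duplicate account_number; "

    rejected = df[errors != ""].assign(reject_reason=errors[errors != ""])
    accepted = df[errors == ""]
    return accepted, rejected

if __name__ == "__main__":
    extracted = pd.DataFrame(
        {
            "account_number": ["A-001", "A-002", "A-002", None],
            "account_name": ["Acme Ltd.", "Beta Inc.", "Beta Inc.", "Gamma Co."],
            "country_code": ["CA", "US", "US", "ZZ"],
        }
    )
    good, bad = validate_accounts(extracted)
    good.to_csv("accounts_staging_load.csv", index=False)  # would feed the load step
    bad.to_csv("accounts_rejected.csv", index=False)  # routed to error handling/review

In an actual pipeline, checks of this kind would typically run inside Azure Data Factory, SSIS, or an equivalent orchestration tool, with rejected rows routed to an error store for review rather than to a local file.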
A Note on the VOR Master Service Agreement:
The VOR Master Service Agreement, which expires on April 5, 2026, leaves some Contracts with funding unassigned for fiscal 2026-27. If the current Statement of Work expires on March 31, 2026, the remaining funds can be used to exercise an option to extend the SOW beyond March 31, 2026, subject to business case approvals. Such extensions will be allowable only if the Master Service Agreement is extended beyond April 5, 2026, and will be upon the same terms, conditions, and covenants contained in the SOW.
The start date is subject to change based on security clearances and contract signing timelines.
Experience and Skillset Requirements
Mandatory Requirements
Nice to Have Requirements
The ideal candidate for this ETL Developer role will possess a strong blend of technical expertise in data integration, a deep understanding of Microsoft's data ecosystem, and excellent collaborative abilities.
ETL Tool Proficiency:
Database and Data Warehousing Expertise:
Microsoft Dynamics 365 Data Acumen:
Data Quality and Governance:
Programming/Scripting (Desirable):
Version Control and CI/CD:
Analytical and Problem-Solving Skills:
Communication and Collaboration:
Strong verbal and written communication skills to articulate technical concepts clearly to both technical and non-technical stakeholders.
Ability to collaborate effectively within a multidisciplinary team (internal and vendor staff), including data architects, D365 functional consultants, and business users.
Demonstrated ability to document technical designs, data mappings, and ETL processes thoroughly.
Resumes Evaluation/Criteria:
Criteria 1: Technical Skills - 40 Points
• Hands-on experience with Microsoft Dynamics 365 cloud environments (both Customer Engagement and Finance & Operations), specifically concerning data extraction, loading, and integration points.
• Extensive experience (8+ years) working across various data platforms, database technologies, and integration patterns, including relational databases (SQL Server, Azure SQL), data lakes (Azure Data Lake Storage), and cloud data warehouses (Azure Synapse Analytics).
• Proven experience with middleware, integration platforms, and APIs, particularly those used for connecting diverse systems to Dynamics 365 (e.g., Azure Data Factory, Logic Apps, API Management, or other enterprise integration tools).
• Deep understanding and practical application of performance optimization techniques for ETL processes, large-scale data migrations, and data synchronization in cloud environments.
• Demonstrated experience with structured methodologies for the design, development, and implementation of data integration solutions, including requirements gathering, data mapping, and detailed technical design documentation.
• Strong background in data analysis and system design within large-scale, complex enterprise environments, focusing on data flow, data quality, and system interoperability.
Criteria 2: Broader Technical Acumen & Methodological Proficiency - 30 Points
• Demonstrated experience integrating diverse enterprise systems beyond D365, leveraging various integration patterns, middleware technologies (e.g., Azure Integration Services, Logic Apps), and communication protocols (e.g., REST, SOAP, SFTP).
• Proven experience in managing and optimizing the performance of large-scale data migrations and continuous data synchronization processes across heterogeneous systems.
• Extensive experience with structured methodologies for the entire data integration lifecycle, from detailed requirements gathering and data mapping to solution design, development, testing, and deployment.
• Strong background in data analysis, data quality management, and troubleshooting complex data discrepancies within large, integrated system landscapes.
• Familiarity with modern software development practices, including version control (e.g., Git, Azure DevOps Repos) and supporting Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated ETL deployments.
Criteria 3: Interpersonal Skills - 30 Points
Exceptional Communication and Collaboration:
• Articulate and concise communication skills, both verbal and written, capable of conveying complex technical information about data integration, ETL processes, and data quality issues to diverse audiences, including technical teams, D365 functional consultants, and non-technical business stakeholders.
• Proven ability to actively participate in and lead technical discussions, offering informed solution recommendations, explaining design choices, and effectively documenting work for clarity and future reference.
• Strong negotiation and influencing skills to align stakeholders on data integration strategies, resolve data mapping discrepancies, and gain buy-in for proposed ETL solutions, ensuring project objectives are met.
• Demonstrated ability to work effectively within a multidisciplinary team environment (comprising internal staff, vendors, and cross-functional departments), fostering a collaborative atmosphere and successfully integrating individual contributions into a cohesive project outcome.
Knowledge Transfer
What needs to be KT
ETL Solution Design Documentation:
• Detailed ETL Process Flows/Pipelines (including end-to-end data flow from source to D365/data warehouse).
• Comprehensive Data Mapping Specifications (source-to-target, transformations, data types, cleansing logic).
• Integration Architecture Diagrams (showing connections between D365, data warehouse, and other systems).
• Documentation of Data Governance & Quality Rules (validation, error handling, reconciliation).
ETL Development Artefacts:
• Fully commented and version-controlled ETL code and scripts (Azure Data Factory pipelines, SSIS packages, SQL scripts, custom code).
• Deployment and Configuration Guides (step-by-step instructions for environments, including environment-specific settings).
• Performance Optimization and Monitoring Artefacts (tuning strategies, monitoring queries/dashboards).
Testing and Validation Assets:
• ETL Test Plans and Test Cases (strategy, unit, integration, data validation test cases).
• Sample Test Data and Data Validation Scripts (examples of test data, scripts/queries for integrity validation; see the illustrative sketch after this list).
Operational Runbooks & Troubleshooting Guides:
• Daily Operations Runbook (procedures for routine monitoring, scheduling, common tasks).
• Troubleshooting Guides (identifying/resolving job failures, discrepancies, performance bottlenecks, escalation paths).
Project-Specific Documentation:
• Key Data Integration Decision Logs (rationale for significant design choices).
• Technical Presentations & Walkthroughs (materials used for deep-dives, potentially recorded sessions).
• Relevant technical aspects of Status and Progress Reports (detailing technical progress, challenges, and resolutions).
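For illustration only, the sketch below indicates the style of reusable data validation script that could accompany the Sample Test Data and Data Validation Scripts item above. It uses in-memory SQLite stand-ins so it can run on its own; in practice the two connections would point at the actual source system and the D365 / data warehouse staging tables, and the table names shown are hypothetical.

# Illustrative only: a generic source-to-target reconciliation check of the kind
# that might be handed over as a reusable data validation script.
import sqlite3

def reconcile_counts(src_conn, tgt_conn, src_query: str, tgt_query: str) -> dict:
    """Compare row counts returned by two DB-API connections."""
    src_count = src_conn.execute(src_query).fetchone()[0]
    tgt_count = tgt_conn.execute(tgt_query).fetchone()[0]
    return {
        "source_rows": src_count,
        "target_rows": tgt_count,
        "difference": src_count - tgt_count,
        "match": src_count == tgt_count,
    }

if __name__ == "__main__":
    # In-memory stand-ins for a legacy source table and a migrated target table.
    src = sqlite3.connect(":memory:")
    tgt = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE legacy_accounts (id INTEGER, name TEXT)")
    tgt.execute("CREATE TABLE stg_accounts (id INTEGER, name TEXT)")
    src.executemany("INSERT INTO legacy_accounts VALUES (?, ?)",
                    [(1, "Acme"), (2, "Beta"), (3, "Gamma")])
    tgt.executemany("INSERT INTO stg_accounts VALUES (?, ?)",
                    [(1, "Acme"), (2, "Beta")])

    result = reconcile_counts(
        src, tgt,
        "SELECT COUNT(*) FROM legacy_accounts",
        "SELECT COUNT(*) FROM stg_accounts",
    )
    print(result)  # e.g. {'source_rows': 3, 'target_rows': 2, 'difference': 1, 'match': False}

A fuller hand-over version would typically extend the same pattern to column-level checksums and key-by-key comparisons, and would log results rather than print them.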
To whom
When
Maximum number of submissions: 2 (two)
Must haves:
At least 8 years of hands-on experience in enterprise-level data integration and ETL (Extract, Transform, Load) development, with a significant focus on integrating with Microsoft Dynamics 365 (Customer Engagement and/or Finance & Operations) and related Azure data services.