A Note on the VOR Master Service Agreement:
The VOR Master Service Agreement, which expires on April 5, 2026, leaves some Contracts with funding unassigned for fiscal 2026-27. If the current statement of work (SOW) expires on March 31, 2026, the remaining funds can be used to exercise an option to extend the SOW beyond March 31, 2026, subject to business case approvals. Such extensions will be allowable only if the Master Service Agreement is extended beyond April 5, 2026, and will be upon the same terms, conditions, and covenants contained in the SOW.
The start date is subject to change based on security clearances and contract signing timelines.
Experience and Skillset Requirements
Mandatory Requirements
- At least 8 years of hands-on experience in enterprise-level data integration and ETL (Extract, Transform, Load) development, with a significant focus on integrating with Microsoft Dynamics 365 (Customer Engagement and/or Finance & Operations) and related Azure data services.
Desired Skills and Experience
The ideal candidate for this ETL Developer role will possess a strong blend of technical expertise in data integration, a deep understanding of Microsoft's data ecosystem, and excellent collaborative abilities.
- Mandatory: Proven hands-on experience with Microsoft's primary ETL tool for enterprise data integration, Azure Data Factory (ADF). This includes designing and implementing pipelines, data flows, activities, datasets, linked services, and integration runtimes (a minimal invocation sketch follows this group).
- Highly Desirable: Experience with SQL Server Integration Services (SSIS) for existing on-premises integrations or migration scenarios.
- Familiarity with other relevant data integration tools and concepts (e.g., Change Data Capture - CDC, data streaming) is a plus.
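By way of illustration only, the following is a minimal sketch of programmatically starting an ADF pipeline run through the Azure management REST API. The subscription, resource group, factory, and pipeline names are hypothetical placeholders; real projects would typically rely on ADF triggers or the Azure SDK rather than a hand-rolled script.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run via the
# management REST API. All names below (subscription, resource group,
# factory, pipeline) are hypothetical placeholders.
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

SUBSCRIPTION = "<subscription-id>"       # placeholder
RESOURCE_GROUP = "rg-data-integration"   # placeholder
FACTORY = "adf-d365-integration"         # placeholder
PIPELINE = "pl_load_d365_accounts"       # placeholder

def trigger_pipeline_run() -> str:
    """Start a pipeline run and return its run ID."""
    token = DefaultAzureCredential().get_token(
        "https://management.azure.com/.default"
    ).token
    url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.DataFactory"
        f"/factories/{FACTORY}/pipelines/{PIPELINE}/createRun"
        "?api-version=2018-06-01"
    )
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json={})
    resp.raise_for_status()
    return resp.json()["runId"]

if __name__ == "__main__":
    print("Started run:", trigger_pipeline_run())
```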
- Database and Data Warehousing Expertise:
- Strong SQL proficiency: Ability to write complex SQL queries, stored procedures, functions, and views for data extraction, transformation, and loading across various database platforms (e.g., SQL Server, Azure SQL Database); see the upsert sketch after this group.
- Solid understanding of data warehousing concepts (e.g., dimensional modeling, star/snowflake schemas, data marts) and experience designing and implementing data warehouse solutions.
- Experience with Azure data services relevant to data warehousing and analytics (e.g., Azure Synapse Analytics, Azure Data Lake Storage).
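As one illustration of the SQL and dimensional-modeling skills described above, the sketch below performs an idempotent upsert into a hypothetical star-schema dimension table using a T-SQL MERGE executed through pyodbc. All server, table, and column names are assumptions made for the example.

```python
# Minimal sketch: upsert into a star-schema dimension table using a
# T-SQL MERGE, run through pyodbc. Table and column names are hypothetical.
import pyodbc  # pip install pyodbc

MERGE_SQL = """
MERGE dbo.DimCustomer AS tgt
USING staging.Customer AS src
    ON tgt.CustomerKey = src.CustomerKey
WHEN MATCHED AND (tgt.Name <> src.Name OR tgt.Region <> src.Region) THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.Region = src.Region,
               tgt.UpdatedAt = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerKey, Name, Region, UpdatedAt)
    VALUES (src.CustomerKey, src.Name, src.Region, SYSUTCDATETIME());
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=dw-server.database.windows.net;DATABASE=dw;"  # placeholders
    "UID=<user>;PWD=<password>;"                          # placeholders
)
with conn:
    conn.execute(MERGE_SQL)  # commits on clean exit from the context manager
```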
- Microsoft Dynamics 365 Data Acumen:
- Fundamental understanding of Dynamics 365 data models for both Customer Engagement (CRM) and Finance & Operations (ERP). This includes knowledge of key entities, relationships, and common data patterns within D365.
- Ability to extract data from D365 APIs and OData feeds, and to load data effectively into D365 (e.g., using the Data Management Framework (DMF), KingswaySoft, or custom integrations); see the OData sketch immediately below.
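The following minimal sketch illustrates OData extraction from the Dynamics 365 Web API using client-credential authentication via MSAL. The tenant ID, app registration, secret, and organization URL are hypothetical placeholders.

```python
# Minimal sketch: pull accounts from the Dynamics 365 Web API (OData) using
# client-credential auth via MSAL. Tenant, app, and org URLs are placeholders.
import msal      # pip install msal
import requests

TENANT = "<tenant-id>"                    # placeholder
CLIENT_ID = "<app-registration-id>"       # placeholder
CLIENT_SECRET = "<secret>"                # placeholder
ORG = "https://contoso.crm.dynamics.com"  # placeholder org URL

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{ORG}/.default"])["access_token"]

def fetch_accounts():
    """Yield account records, following OData @odata.nextLink paging."""
    url = f"{ORG}/api/data/v9.2/accounts?$select=name,accountnumber"
    headers = {"Authorization": f"Bearer {token}", "Accept": "application/json"}
    while url:
        page = requests.get(url, headers=headers).json()
        yield from page["value"]
        url = page.get("@odata.nextLink")  # None once the last page is reached

for record in fetch_accounts():
    print(record["name"])
```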
- Data Quality and Governance:
- Experience implementing data cleansing, validation, error handling, and reconciliation processes within ETL pipelines to ensure high data quality (see the reconciliation sketch after this group).
- Understanding of data governance principles and best practices for managing data integrity and consistency.
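A simple example of the kind of reconciliation check meant here: the sketch below compares post-load row counts between a hypothetical staging source and its warehouse target. Connection strings and table names are assumptions made for the example.

```python
# Minimal sketch: a post-load row-count reconciliation between a staging
# source and its warehouse target. Connection strings and table names are
# hypothetical placeholders, not project values.
import pyodbc  # pip install pyodbc

SRC_CONN = ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=src-server;DATABASE=staging;Trusted_Connection=yes;")  # placeholder
TGT_CONN = ("DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=dw-server;DATABASE=dw;Trusted_Connection=yes;")        # placeholder

def row_count(conn_str: str, table: str) -> int:
    """Return the row count for a table; `table` must come from trusted config."""
    with pyodbc.connect(conn_str) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

source_rows = row_count(SRC_CONN, "staging.Customer")
target_rows = row_count(TGT_CONN, "dbo.DimCustomer")

if source_rows != target_rows:
    # In a real pipeline this would log the mismatch, raise an alert,
    # and fail the run rather than silently continuing.
    raise RuntimeError(
        f"Reconciliation failed: source={source_rows}, target={target_rows}"
    )
print(f"Reconciliation passed: {source_rows} rows in both systems.")
```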
- Programming/Scripting (Desirable):
- Proficiency in scripting languages such as Python, PowerShell, or C# for custom data transformations, automation of ETL tasks, and interacting with APIs.
- Version Control and CI/CD:
- Experience with version control systems (e.g., Git, Azure DevOps Repos) for managing ETL code and configurations.
- Familiarity with Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated deployment of ETL solutions (see the validation sketch after this group).
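As a small illustration of an automated CI/CD quality gate, the sketch below validates exported ADF pipeline JSON files before deployment. The repository layout (a pipelines/ folder of JSON exports) is an assumption for the example, not a fixed ADF convention.

```python
# Minimal sketch: a pre-deployment CI check that every exported ADF pipeline
# definition in the repo is valid JSON and carries a name. The "pipelines/"
# folder layout is an assumption for this example.
import json
import pathlib
import sys

errors = []
for path in pathlib.Path("pipelines").glob("*.json"):
    try:
        definition = json.loads(path.read_text())
    except json.JSONDecodeError as exc:
        errors.append(f"{path}: invalid JSON ({exc})")
        continue
    if "name" not in definition:
        errors.append(f"{path}: missing 'name' property")

if errors:
    print("\n".join(errors))
    sys.exit(1)  # fail the CI stage so the deployment does not proceed
print("All pipeline definitions passed validation.")
```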
- Analytical and Problem-Solving Skills:
- Excellent analytical and problem-solving skills with a keen eye for detail to identify data discrepancies, troubleshoot complex integration issues, and optimize ETL performance.
- Ability to translate business requirements into technical data integration solutions.
- Communication and Collaboration:
- Strong verbal and written communication skills to articulate technical concepts clearly to both technical and non-technical stakeholders.
- Ability to collaborate effectively within a multidisciplinary team (internal and vendor staff), including data architects, D365 functional consultants, and business users.
- Demonstrated ability to document technical designs, data mappings, and ETL processes thoroughly.
Resume Evaluation Criteria:
Criteria 1: Technical Skills - 40 Points
- Hands-on experience with Microsoft Dynamics 365 cloud environments (both Customer Engagement and Finance & Operations), specifically concerning data extraction, loading, and integration points.
- Extensive experience (8+ years) working across various data platforms, database technologies, and integration patterns, including relational databases (SQL Server, Azure SQL), data lakes (Azure Data Lake Storage), and cloud data warehouses (Azure Synapse Analytics).
- Proven experience with middleware, integration platforms, and APIs, particularly those used for connecting diverse systems to Dynamics 365 (e.g., Azure Data Factory, Logic Apps, API Management, or other enterprise integration tools).
- Deep understanding and practical application of performance optimization techniques for ETL processes, large-scale data migrations, and data synchronization in cloud environments.
- Demonstrated experience with structured methodologies for the design, development, and implementation of data integration solutions, including requirements gathering, data mapping, and detailed technical design documentation.
- Strong background in data analysis and system design within large-scale, complex enterprise environments, focusing on data flow, data quality, and system interoperability.
Criteria 2: Broader Technical Acumen & Methodological Proficiency - 30 Points
- Demonstrated experience integrating diverse enterprise systems beyond D365, leveraging various integration patterns, middleware technologies (e.g., Azure Integration Services, Logic Apps), and communication protocols (e.g., REST, SOAP, SFTP).
- Proven experience in managing and optimizing the performance of large-scale data migrations and continuous data synchronization processes across heterogeneous systems.
- Extensive experience with structured methodologies for the entire data integration lifecycle, from detailed requirements gathering and data mapping to solution design, development, testing, and deployment.
- Strong background in data analysis, data quality management, and troubleshooting complex data discrepancies within large, integrated system landscapes.
- Familiarity with modern software development practices, including version control (e.g., Git, Azure DevOps Repos) and supporting Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated ETL deployments.
Criteria 3: Interpersonal Skills - 30 Points
Exceptional Communication and Collaboration:
- Articulate and concise communication skills, both verbal and written, capable of conveying complex technical information about data integration, ETL processes, and data quality issues to diverse audiences, including technical teams, D365 functional consultants, and non-technical business stakeholders.
- Proven ability to actively participate in and lead technical discussions, offering informed solution recommendations, explaining design choices, and effectively documenting work for clarity and future reference.
- Strong negotiation and influencing skills to align stakeholders on data integration strategies, resolve data mapping discrepancies, and gain buy-in for proposed ETL solutions, ensuring project objectives are met.
- Demonstrated ability to work effectively within a multidisciplinary team environment (comprising internal staff, vendors, and cross-functional departments), fostering a collaborative atmosphere and successfully integrating individual contributions into a cohesive project outcome.
Knowledge Transfer
What needs to be transferred
ETL Solution Design Documentation:
- Detailed ETL Process Flows/Pipelines (including end-to-end data flow from source to D365/data warehouse).
- Comprehensive Data Mapping Specifications (source-to-target, transformations, data types, cleansing logic).
- Integration Architecture Diagrams (showing connections between D365, data warehouse, and other systems).
- Documentation of Data Governance & Quality Rules (validation, error handling, reconciliation).
ETL Development Artefacts:
- Fully commented and version-controlled ETL code and scripts (Azure Data Factory pipelines, SSIS packages, SQL scripts, custom code).
- Deployment and Configuration Guides (step-by-step instructions for environments, including environment-specific settings).
- Performance Optimization and Monitoring Artefacts (tuning strategies, monitoring queries/dashboards).
Testing and Validation Assets:
- ETL Test Plans and Test Cases (strategy, unit, integration, data validation test cases).
- Sample Test Data and Data Validation Scripts (examples of test data, scripts/queries for integrity validation).
Operational Runbooks & Troubleshooting Guides:
- Daily Operations Runbook (procedures for routine monitoring, scheduling, common tasks).
- Troubleshooting Guides (identifying/resolving job failures, discrepancies, performance bottlenecks, escalation paths).
Project-Specific Documentation:
- Key Data Integration Decision Logs (rationale for significant design choices).
- Technical Presentations & Walkthroughs (materials used for deep-dives, potentially recorded sessions).
- Relevant technical aspects of Status and Progress Reports (detailing technical progress, challenges, and resolutions).
To whom
- Project Manager / Team members
When
- 1:1 meetings, team meetings, and dedicated knowledge transfer sessions (including hands-on walkthroughs).
- Comprehensive documentation stored on the SharePoint site (or other designated knowledge management platform).
- Code repositories (e.g., Azure DevOps Repos) with clear commenting.
- Throughout the duration of the project lifecycle, and formally at project milestones/completion.