Software Developer - ETL - Senior
Summary:
Join our team as a Senior ETL Developer to lead data integration and migration efforts for a Dynamics 365-based solution within the Ontario Public Service. You’ll design and implement scalable ETL pipelines using Azure Data Factory and SSIS, ensuring high data quality, performance, and compliance with integration standards. Responsibilities include mapping and transforming data from legacy systems to D365, supporting system testing, and documenting technical artifacts. Ideal candidates bring 8+ years of enterprise-level experience integrating Dynamics 365 and Azure services, strong SQL/data warehousing expertise, and excellent collaboration skills. This hybrid role (3 days onsite) includes robust knowledge transfer responsibilities.
 
Description

A Note on Assignment Type:

 

This position is currently listed as "Onsite"; however, the Assignment under this request will provisionally be "Hybrid", working 7.25 hours per calendar day, sometime between 8:00 AM and 5:00 PM (excluding breaks), Monday to Friday inclusive, unless otherwise identified. These conditions are subject to change as the OPS reassesses its current situation. For the duration of the assignment, you will work according to the Hiring Manager's requirements for the project you are assigned to.

 

Scope

 

  • The Office of the Public Guardian and Trustee (OPGT) requires a Senior Software Developer – ETL to perform activities for interfacing the new Dynamics 365 solution and data warehouse with internal (OPS) and external systems as a member of the integrations team made up of internal and vendor staff.

 

Assignment Deliverables

 

As a member of the integrations team, the ETL Developer will be responsible for integrating the new Dynamics 365 solution, the data warehouse, and various internal (OPS) and external systems. This team will comprise both internal staff and vendor resources.

 

A high-level list of deliverables for the ETL Developer includes:

  • ETL Solution Design and Development:
    • Design, develop, and implement robust ETL (Extract, Transform, Load) processes for data migration and ongoing integrations between diverse source systems (e.g., internal legacy systems, external vendor platforms) and Microsoft Dynamics 365 Customer Engagement (CE) and Finance & Operations (F&O).
    • Develop and optimize data transformation logic to ensure data quality, consistency, and adherence to business rules and D365 data models.
    • Utilize and recommend appropriate ETL tools and technologies (e.g., Azure Data Factory, SSIS, other cloud-based ETL services) to build efficient and scalable data pipelines.
    • Implement data cleansing, validation, and error handling mechanisms within ETL processes.
  • Data Migration Planning and Execution:
    • Lead and execute all phases of data migration activities from legacy systems to D365 CE and F&O, including data profiling, mapping, cleansing, transformation, and loading.
    • Develop and maintain data migration strategies, cutover plans, and rollback procedures.
    • Collaborate with data owners and business users to ensure data accuracy and completeness during migration.
  • Testing and Quality Assurance:
    • Design, develop, and execute comprehensive test plans, cases, scripts, and test data (e.g., manufactured, obfuscated) based on functional and technical specifications to validate ETL solutions and data integrity.
    • Create and maintain a full test plan, testing procedures, and an associated library of reusable test cases and scripts, ensuring full traceability from requirements to test outcomes.
    • Perform both manual and automated testing to validate system and integration functionality, data accuracy, performance, and scalability. This includes unit testing, integration testing, system testing, and performance testing for ETL processes.
  • Collaboration and CI/CD Integration:
    • Actively collaborate with stakeholders across business units, development teams, and external vendors to understand integration requirements and ensure proper data flow.
    • Ensure proper integration of ETL processes and tests into the continuous integration/continuous delivery (CI/CD) pipeline to support automated deployments and efficient release cycles.
  • Support and Documentation:
    • Provide analytical, development, and testing support for ETL processes and data integrations throughout the project lifecycle.
    • Develop and maintain detailed technical documentation for all ETL processes, data mappings, data dictionaries, and integration architectures.
    • Assist in troubleshooting and resolving data integration issues, providing timely support and solutions.
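As an illustration of the cleansing, validation, and error-handling mechanisms listed among the deliverables above, a minimal validate-and-route ETL step might look like the following sketch. The record fields and rules here are hypothetical examples, not drawn from the project:

```python
# Illustrative sketch of a validate-and-route ETL step: records that fail
# basic data-quality rules are diverted to an error list for later
# reconciliation instead of being loaded into the target system.
# Field names ("email", "last_name") and rules are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    valid: list = field(default_factory=list)
    errors: list = field(default_factory=list)

def validate_contacts(records):
    """Apply simple data-quality rules before loading into the target."""
    result = ValidationResult()
    for rec in records:
        problems = []
        if not rec.get("email") or "@" not in rec["email"]:
            problems.append("invalid email")
        if not rec.get("last_name", "").strip():
            problems.append("missing last name")
        if problems:
            result.errors.append({"record": rec, "problems": problems})
        else:
            # Example transformation: normalize casing for consistency.
            rec["last_name"] = rec["last_name"].strip().title()
            result.valid.append(rec)
    return result

rows = [
    {"email": "a@example.com", "last_name": "smith"},
    {"email": "bad-address", "last_name": "Jones"},
]
out = validate_contacts(rows)
```

In a real pipeline, the rejected records would typically be written to an error table or queue with their failure reasons, preserving an audit trail for reconciliation.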
Skills
Experience and Skill Set Requirements

A Note on the VOR Master Service Agreement:

 

The VOR Master Service Agreement, which expires on April 5, 2026, leaves some contracts with funding unassigned for fiscal 2026-27. If the current statement of work expires on March 31, 2026, the remaining funds can be used to exercise an option to extend the SOW beyond March 31, 2026, based on business case approvals. Such extensions will be allowable only if the Master Service Agreement is extended beyond April 5, 2026, and will be upon the same terms, conditions, and covenants contained in the SOW.

 

The start date is subject to change based on security clearances and contract signing timelines.

 


 

Mandatory Requirements

 

  • At least 8 years of hands-on experience in enterprise-level data integration and ETL (Extract, Transform, Load) development, with a significant focus on integrating with Microsoft Dynamics 365 (Customer Engagement and/or Finance & Operations) and related Azure data services. 

 

Desired Skills and Experience

 

The ideal candidate for this ETL Developer role will possess a strong blend of technical expertise in data integration, a deep understanding of Microsoft's data ecosystem, and excellent collaborative abilities. 

  • ETL Tool Proficiency:
    • Mandatory: Proven hands-on experience with Microsoft's primary ETL tools for enterprise data integration, specifically Azure Data Factory (ADF). This includes designing and implementing pipelines, data flows, activities, datasets, linked services, and integration runtimes.
    • Highly Desirable: Experience with SQL Server Integration Services (SSIS) for existing on-premise integrations or migration scenarios.
    • Familiarity with other relevant data integration tools and concepts (e.g., Change Data Capture - CDC, data streaming) is a plus.
  • Database and Data Warehousing Expertise:
    • Strong SQL proficiency: ability to write complex SQL queries, stored procedures, functions, and views for data extraction, transformation, and loading across various database platforms (e.g., SQL Server, Azure SQL Database).
    • Solid understanding of data warehousing concepts (e.g., dimensional modeling, star/snowflake schemas, data marts) and experience designing and implementing data warehouse solutions.
    • Experience with Azure data services relevant to data warehousing and analytics (e.g., Azure Synapse Analytics, Azure Data Lake Storage).
  • Microsoft Dynamics 365 Data Acumen:
    • Fundamental understanding of Dynamics 365 data models for both Customer Engagement (CRM) and Finance & Operations (ERP), including knowledge of key entities, relationships, and common data patterns within D365.
    • Ability to extract data from D365 APIs and OData feeds, and to load data effectively into D365 (e.g., using the Data Management Framework - DMF, KingswaySoft, or custom integrations).
  • Data Quality and Governance:
    • Experience implementing data cleansing, validation, error handling, and reconciliation processes within ETL pipelines to ensure high data quality.
    • Understanding of data governance principles and best practices for managing data integrity and consistency.
  • Programming/Scripting (Desirable):
    • Proficiency in scripting languages such as Python, PowerShell, or C# for custom data transformations, automation of ETL tasks, and interacting with APIs.
  • Version Control and CI/CD:
    • Experience with version control systems (e.g., Git, Azure DevOps Repos) for managing ETL code and configurations.
    • Familiarity with Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated deployment of ETL solutions.
  • Analytical and Problem-Solving Skills:
    • Excellent analytical and problem-solving skills with a keen eye for detail to identify data discrepancies, troubleshoot complex integration issues, and optimize ETL performance.
    • Ability to translate business requirements into technical data integration solutions.
  • Communication and Collaboration:
    • Strong verbal and written communication skills to articulate technical concepts clearly to both technical and non-technical stakeholders.
    • Ability to collaborate effectively within a multidisciplinary team (internal and vendor staff), including data architects, D365 functional consultants, and business users.
    • Demonstrated ability to document technical designs, data mappings, and ETL processes thoroughly.
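By way of illustration, the reconciliation processes described under "Data Quality and Governance" often amount to comparing row counts and checksums between source and target after a load. The sketch below shows the idea in minimal form; the table shape, column names, and keys are hypothetical, not drawn from this posting:

```python
# Illustrative source-to-target reconciliation: compare row counts and an
# order-insensitive checksum over selected columns to flag discrepancies
# after an ETL load. Column names ("id", "amount") are hypothetical.
import hashlib

def table_fingerprint(rows, key_columns):
    """Return (row_count, checksum) for rows over the given columns."""
    digest = hashlib.sha256()
    # Sort so the checksum does not depend on row order.
    for row in sorted(rows, key=lambda r: tuple(str(r[c]) for c in key_columns)):
        digest.update("|".join(str(row[c]) for c in key_columns).encode())
    return len(rows), digest.hexdigest()

def reconcile(source_rows, target_rows, key_columns):
    src_count, src_sum = table_fingerprint(source_rows, key_columns)
    tgt_count, tgt_sum = table_fingerprint(target_rows, key_columns)
    return {
        "counts_match": src_count == tgt_count,
        "checksums_match": src_sum == tgt_sum,
    }

source = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
target = [{"id": 2, "amount": 20}, {"id": 1, "amount": 10}]
report = reconcile(source, target, ["id", "amount"])
```

In practice the fingerprints would be computed by queries against the source system and the D365/data-warehouse target rather than in memory, but the comparison logic is the same.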

 

Resumes Evaluation/Criteria:

 

Criteria 1: Technical Skills  - 40 Points

 

  • Hands-on experience with Microsoft Dynamics 365 cloud environments (both Customer Engagement and Finance & Operations), specifically concerning data extraction, loading, and integration points.  
  • Extensive experience (8+ years) working across various data platforms, database technologies, and integration patterns, including relational databases (SQL Server, Azure SQL), data lakes (Azure Data Lake Storage), and cloud data warehouses (Azure Synapse Analytics).  
  • Proven experience with middleware, integration platforms, and APIs, particularly those used for connecting diverse systems to Dynamics 365 (e.g., Azure Data Factory, Logic Apps, API Management, or other enterprise integration tools).  
  • Deep understanding and practical application of performance optimization techniques for ETL processes, large-scale data migrations, and data synchronization in cloud environments.  
  • Demonstrated experience with structured methodologies for the design, development, and implementation of data integration solutions, including requirements gathering, data mapping, and detailed technical design documentation.  
  • Strong background in data analysis and system design within large-scale, complex enterprise environments, focusing on data flow, data quality, and system interoperability. 

 

Criteria 2: Broader Technical Acumen & Methodological Proficiency  - 30 Points

 

  • Demonstrated experience integrating diverse enterprise systems beyond D365, leveraging various integration patterns, middleware technologies (e.g., Azure Integration Services, Logic Apps), and communication protocols (e.g., REST, SOAP, SFTP). 
  • Proven experience in managing and optimizing the performance of large-scale data migrations and continuous data synchronization processes across heterogeneous systems. 
  • Extensive experience with structured methodologies for the entire data integration lifecycle, from detailed requirements gathering and data mapping to solution design, development, testing, and deployment. 
  • Strong background in data analysis, data quality management, and troubleshooting complex data discrepancies within large, integrated system landscapes. 
  • Familiarity with modern software development practices, including version control (e.g., Git, Azure DevOps Repos) and supporting Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated ETL deployments. 

 

Criteria 3: Interpersonal Skills - 30 Points

 

Exceptional Communication and Collaboration:  

  • Articulate and concise communication skills, both verbal and written, capable of conveying complex technical information about data integration, ETL processes, and data quality issues to diverse audiences, including technical teams, D365 functional consultants, and non-technical business stakeholders. 
  • Proven ability to actively participate in and lead technical discussions, offering informed solution recommendations, explaining design choices, and effectively documenting work for clarity and future reference. 
  • Strong negotiation and influencing skills to align stakeholders on data integration strategies, resolve data mapping discrepancies, and gain buy-in for proposed ETL solutions, ensuring project objectives are met. 
  • Demonstrated ability to work effectively within a multidisciplinary team environment (comprising internal staff, vendors, and cross-functional departments), fostering a collaborative atmosphere and successfully integrating individual contributions into a cohesive project outcome. 

 

Knowledge Transfer

What needs to be transferred

 ETL Solution Design Documentation: 

  • Detailed ETL Process Flows/Pipelines (including end-to-end data flow from source to D365/data warehouse). 
  • Comprehensive Data Mapping Specifications (source-to-target, transformations, data types, cleansing logic). 
  • Integration Architecture Diagrams (showing connections between D365, data warehouse, and other systems). 
  • Documentation of Data Governance & Quality Rules (validation, error handling, reconciliation). 

ETL Development Artefacts: 

  • Fully commented and version-controlled ETL code and scripts (Azure Data Factory pipelines, SSIS packages, SQL scripts, custom code). 
  • Deployment and Configuration Guides (step-by-step instructions for environments, including environment-specific settings). 
  • Performance Optimization and Monitoring Artefacts (tuning strategies, monitoring queries/dashboards). 

Testing and Validation Assets: 

  • ETL Test Plans and Test Cases (strategy, unit, integration, data validation test cases). 
  • Sample Test Data and Data Validation Scripts (examples of test data, scripts/queries for integrity validation). 

Operational Runbooks & Troubleshooting Guides: 

  • Daily Operations Runbook (procedures for routine monitoring, scheduling, common tasks). 
  • Troubleshooting Guides (identifying/resolving job failures, discrepancies, performance bottlenecks, escalation paths). 

Project-Specific Documentation: 

  • Key Data Integration Decision Logs (rationale for significant design choices). 
  • Technical Presentations & Walkthroughs (materials used for deep-dives, potentially recorded sessions). 
  • Relevant technical aspects of Status and Progress Reports (detailing technical progress, challenges, and resolutions). 

 

To whom

  • Project Manager / Team members 

 

When

  • 1:1 meetings, team meetings, dedicated knowledge transfer sessions (including hands-on walkthroughs), comprehensive documentation stored on SharePoint site (or other designated knowledge management platform), code repositories (e.g., Azure DevOps Repos) with clear commenting, throughout the duration of the project lifecycle and formally at project milestones/completion. 
Supplier Comments

This position is currently listed as "Onsite"; however, the Assignment under this request will provisionally be "Hybrid", working 7.25 hours per calendar day, sometime between 8:00 AM and 5:00 PM (excluding breaks), Monday to Friday inclusive, unless otherwise identified. These conditions are subject to change as the OPS reassesses its current situation. For the duration of the assignment, you will work according to the Hiring Manager's requirements for the project you are assigned to.

 

Maximum number of submissions: 1 (one)

 

Must haves:

 

  • At least 8 years of hands-on experience in enterprise-level data integration and ETL (Extract, Transform, Load) development, with a significant focus on integrating with Microsoft Dynamics 365 (Customer Engagement and/or Finance & Operations) and related Azure data services.