Data Science Developer - Senior

Data Science Developer - Deliverables

 

The candidates will perform Data Science Developer work by creating, enhancing, maintaining, supporting, and sustaining existing pipelines, analytics models, and data products for the Ministry of Transportation and the Ministry of Labour, Immigration, Training and Skills Development.

 

Specific Deliverables

Expected deliverables include:

  • Creating, enhancing, maintaining, and supporting structures for storage of data in formats suitable for consumption in analytics solutions.

  • Automating data pipelines used to ingest, prepare, transform, and model data for use in analytics products.

  • Creating, enhancing, maintaining, and supporting dashboards and reports.

  • Creating, enhancing, maintaining, and supporting analytics environments, and implementing new technology to improve performance, simplify architecture patterns, and reduce cloud hosting costs.

  • Conducting knowledge transfer sessions and producing documentation for technical staff related to architecting, designing, and implementing continuous improvement enhancements to analytics solutions. These sessions will be held as needed, on a case-by-case basis, and involve walkthroughs of documentation, code, and environment setups.
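The ingest, prepare, transform pattern named in the pipeline deliverable above can be sketched in plain Python. This is only an illustrative outline of the pattern, not the actual implementation, which would use Databricks and Azure Data Factory; the column names and sample data are assumptions invented for the example.

```python
import csv
import io

def ingest(raw_csv: str) -> list[dict]:
    """Ingest: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def prepare(rows: list[dict]) -> list[dict]:
    """Prepare: drop incomplete records and normalise types."""
    return [
        {"region": r["region"].strip().title(), "trips": int(r["trips"])}
        for r in rows
        if r.get("region") and r.get("trips", "").isdigit()
    ]

def transform(rows: list[dict]) -> dict[str, int]:
    """Transform/model: aggregate trips per region for analytics use."""
    totals: dict[str, int] = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0) + r["trips"]
    return totals

# Illustrative raw input; the blank-region row is dropped during preparation.
raw = "region,trips\ncentral,10\ncentral,5\neast,7\n,3\n"
curated = transform(prepare(ingest(raw)))
# curated → {"Central": 15, "East": 7}
```

In a Databricks/Data Factory setting, each stage would typically be a pipeline activity or notebook task writing curated output back to the data lake rather than returning an in-memory dictionary.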

 

Skills
 
Experience and Skill Set Requirements

Data Science Developer - Evaluation Criteria

 

Data Storage and Preparation - 35%

  • The candidate must demonstrate their experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures in real-world implementations.

Data Pipelines - 35%

  • The candidate must demonstrate their experience automating data pipelines using appropriate Microsoft Azure platform technologies (Python, Databricks, and Azure Data Factory).

Data Analytics - 15%

  • The candidate must demonstrate their experience with Power BI reports and dashboards.

Knowledge Transfer - 15%

  • The candidate must demonstrate experience in conducting knowledge transfer sessions and building documentation for technical staff related to architecting, designing, and implementing end-to-end analytics solutions.

 

Supplier Comments

Maximum submissions: 3 (three)

Hybrid: 3 days onsite / 2 days remote.

 

Must Have:

Experience with Azure Storage, Azure Data Lake, Azure Databricks Lakehouse, and Azure Synapse structures.

Experience with Python, Databricks, and Azure Data Factory.

Experience building Power BI reports and dashboards.

 

Nice to have:

Prior experience with high-volume / big data projects.