(Remote) RQ05661 - Data Architect/Modeller - Senior

Toronto, ON, Canada

Job Description


We would like to present a new job opportunity that you may find interesting.

If this opportunity interests you and matches your profile, kindly send the following documents to & by Friday, 7 July at 10 AM EST.

Without the mandatory documents, we cannot submit a candidate.

  • Updated Resume in Word format (Mandatory)

  • Skills Matrix and References (Mandatory)
  • Expected hourly rate (Mandatory)
Job Title: RQ05661 - Data Architect/Modeller - Senior

Client: Ministry of Northern Development, Mines, Natural Resources and Forestry

40 St. Clair, Toronto, Ontario, Remote

Estimated Start Date: 2023-07-31

Estimated End Date: 2024-03-29

Business Days: 220.00

Extension: Probable after the initial mandate

Hours per day or Week: 7.25 hours per day

Security Level: No Clearance Required

If you would like to learn more about this opportunity, please check it out on our career site. If not, please feel free to send over any names or forward this email to anyone who may be interested.


Must haves:
  • Experience in a Databricks Data Scientist role
  • 5+ years of expertise in Azure Databricks
  • Python and R development experience
Description

Responsibilities:
  • Work with the project IT (Information Technology) team to refine data engineering, processing, modelling, and machine learning tasks within the Databricks data science platform. Tasks will include, but are not limited to, designing data pipelines, optimizing data storage and processing, and integrating Databricks with other system components such as Azure Functions, Azure Storage, Azure Data Factory, SQL DB, and Cosmos DB
  • Work with the project business team to analyze the business’s Databricks programming requirements and design scalable, efficient processing and pipelines within the Azure storage environment, optimized for the ingestion of large field, optical, and lidar datasets
  • Work with the business to operationalize workflows and data models from RStudio, LAStools, Python, Tableau, artificial intelligence, machine learning, and Internet of Things (IoT) sources (Microsoft Planetary Computer, Google Earth Engine, etc.) into Databricks notebooks using Azure data pipelines
  • Work with the project team (IT and business) to assess resource requirements, estimate costs, and provide accurate financial projections
  • Work with the IT team to monitor the performance and health of Azure Databricks resources, resolve issues proactively, and ensure efficient scaling based on workload demands
  • Utilize modern BI (Business Intelligence) tooling to provide the project team with reports/analytics on resource usage, cost allocation, performance metrics, and meaningful insights
  • Provide expertise in the design, delivery, evaluation, and maintenance of leading-edge user experience strategies and online experiences, via the government's official websites, for the business unit's clients (Forest Industry), stakeholders (internal ministries), and the public (including researchers, federal ministries, and other ministry users)
  • Lead and conduct multiple concurrent research, content design and strategy projects to meet the needs of all OPS (Ontario Public Service) clients involved in forest management planning and resource management activities
  • Develop code/scripts to automate the business workflow, and test and implement solution components using Databricks, standard Azure services, and .NET for scripting and integration work
  • Support product backlog grooming, task identification and effort estimation
  • Consult the tech lead on all IT solution designs and options before presenting them to the rest of the team, to ensure compliance with OPS policies and standards
  • Comply with and follow all OPS IT policies, standards, and processes, e.g. AODA (Accessibility for Ontarians with Disabilities Act) and GO IT Standards
  • Attend scrum calls, provide timely updates, and raise any blockers
  • Provide timely updates to the project manager and project team
  • Provide training and mentoring to project team members, with good communication skills and the ability to effectively transfer knowledge
  • Provide knowledge transfer/training to the business and technical teams
  • Pick up OPS (Ontario Public Service) laptop from a downtown office in Toronto and return the laptop to the same location upon completion of the contract
Knowledge Transfer Requirements
  • Knowledge transfer is expected to occur throughout the duration of the assignment through regular meetings, touchpoints, and working sessions with FRI (Forest Resources Inventory) business and LRC (Land and Resources Cluster) staff
  • All design and development artefacts and source code, including all relevant and complete documentation, must be transferred to the ministry in the designated repositories supplied
  • Dedicated knowledge transfer sessions will be scheduled to ensure knowledge transfer is complete and all documentation is shared, and to allow FRI business and LRC staff to ask clarifying and/or follow-up questions
Mandatory Skills:
  • 5+ years of in-depth technical knowledge, expertise, and experience with Azure Databricks, including but not limited to workspaces, networking, clusters, notebooks, DBFS, etc.
  • Demonstrated development experience in Python, RStudio, financial reporting, business intelligence, and data visualization
  • Experience gathering user requirements to develop a solution for data scientists
  • Basic knowledge of and experience with Azure DevOps
  • Demonstrated success in team-focused delivery, including meeting deadlines, managing competing priorities, and client relationship management
Desirable Skills:
  • Data scientist work experience gained with a forest industry company, or a completed graduate degree in an academic/university environment, is desirable but not mandatory
  • Skills to analyze field data and incorporate it into various predictive models, and to develop associated statistical reporting and confusion matrices
  • Skills, knowledge, and experience with RStudio, GitHub, and other open-source tools, as well as knowledge of Python geospatial libraries, are desirable but not mandatory
Evaluation Criteria

Technical skills 70%
  • Experience analyzing business requirements, designing scalable and efficient Databricks architectures, and coding Python and R scripts as a data scientist
  • Experience working with Azure Databricks to build data models and implement machine learning tasks as a Databricks expert
  • Working experience designing data pipelines; optimizing data storage and processing; working with Spark (parallel processing), Delta Tables and Delta Lake, Cosmos DB, Docker, and object-based and/or pixel-based image analysis; and integrating Databricks with other system components, such as Azure Data Factory and Azure SQL DB
  • Skills and experience processing and analyzing modern geospatial raster and point cloud GPS (Global Positioning System) data products
  • Skilled in data engineering techniques, including data ingestion, transformation, APIs (Application Programming Interfaces), data streaming, and open-source code and tool integration. Experience working with various data sources and formats, and able to implement data pipelines using Azure Data Factory and Databricks within the context of remote sensing and resource management techniques
  • Solid understanding of data science principles and machine learning algorithms as applied in geospatial and resource management scenarios. Able to assist with tasks such as data exploration, feature engineering, model development, and model deployment on the Databricks platform
  • Proficient in optimizing Databricks workloads for performance and scalability. Capable of tuning Spark configurations, optimizing data processing operations, and troubleshooting performance issues
  • Expert-level knowledge of cost control strategy, usage pattern analysis, and cost-saving measures.
  • Hands-on experience generating comprehensive reports and analytics on resource usage, cost allocation, and performance metrics on Databricks and Azure cloud services
  • Knowledgeable about best practices for data governance, security, and compliance on the Databricks platform; should provide guidance on data privacy, access control, and auditing. Knowledge of user interface design principles and best practices
  • Strong problem-solving skills to diagnose and resolve issues during the implementation and maintenance of Databricks solutions. Capable of identifying bottlenecks, troubleshooting errors, and proposing workflow optimization solutions
  • End-to-end responsibility for design, documentation, development, testing, and deployment with Azure Databricks
  • Experience with or capability in remote sensing, sustainable resource management, Ontario's forest and resource industry, and the Forest Resources Inventory (FRI) program would be beneficial
Soft skills 30%
  • Excellent analytical, problem-solving, and decision-making skills
  • Experience with agile methodology
  • Excellent communication skills, both written and verbal
  • Excellent meeting facilitation skills to gather requirements
  • Experience reporting progress on deliverables to team, project leads and management, including proactively raising risks/issues with mitigations
  • Strong stakeholder management skills
  • Ability to integrate with a skilled business unit, and LRC technical team
Powered by JazzHR

S M Software Solutions Inc

Beware of fraud agents! Do not pay money to get a job.

MNCJobz.com will not be responsible for any payment made to a third party. All Terms of Use are applicable.


Job Detail

  • Job Id
    JD2208719
  • Industry
    Not mentioned
  • Total Positions
    1
  • Job Type
    Full Time
  • Salary
    Not mentioned
  • Employment Status
    Permanent
  • Job Location
    Toronto, ON, Canada
  • Education
    Not mentioned