Remote RQ05661 Data Architect/Modeller Senior

Toronto, ON, Canada

Job Description


We have received a new requirement with one of the ministries in Ontario, kindly find the details below:

Note:

This position is currently listed as "Remote"; the consultant will work remotely due to COVID-19 related Work from Home (WFH) direction. Once the I&IT cluster staff are required to return to the office, the resource under this request will be required to work onsite as well. Submissions will not be considered without the mandatory documents below:
  • Kindly provide an updated resume in Word format.

  • Kindly provide the attached Skills Matrix and References after filling them in.
  • Kindly provide your expected hourly rate.
Job Title: RQ05661 - Data Architect/Modeller - Senior

Client: Ministry of Northern Development, Mines, Natural Resources and Forestry

Location: Sault Ste. Marie, Peterborough, or Greater Toronto Area, Ontario (Hybrid)

Estimated Start Date: 2023-07-31

Estimated End Date: 2024-03-29

Business Days: 220.00

Extension: Probable after the initial mandate

Hours per Day: 7.25

Security Level: No Clearance Required

Mandatory requirements:
  • 5+ years of in-depth technical knowledge, expertise, and experience with Azure Databricks, including but not limited to workspaces, networking, clusters, notebooks, DBFS, etc.
  • Demonstrated skills in Python and RStudio, with financial, business intelligence, and data visualization development experience
  • Experience gathering user requirements to develop solutions for data scientists
  • Basic knowledge of and experience with Azure DevOps
  • Demonstrated success in team-focused delivery: meeting deadlines, managing competing priorities, and client relationship management
Kindly share the requested documents by Friday, July 7 at 10:00 AM EST.

Description

Responsibilities:

  • Work with the project IT (Information Technology) team to refine data engineering, processing, modelling, and machine learning tasks within the Databricks data science platform. Tasks will include, but are not limited to, designing data pipelines, optimizing data storage and processing, and integrating Databricks with other components of the system such as Azure Functions, Azure Storage, Azure Data Factory, Azure SQL DB, and Cosmos DB

  • Work with the project business team to analyze the business's Databricks programming requirements, designing scalable and efficient processing and pipelines within the Azure storage environment, optimized for the ingestion of large field, optical, and lidar datasets

  • Work with the business to operationalize workflows spanning RStudio, LAStools, Python, Tableau, Artificial Intelligence, Machine Learning, and Internet of Things (IoT) sources (Microsoft Planetary, Google Earth Engine, etc.), bringing data models into Databricks notebooks using Azure Data Pipelines

  • Work with the project team (IT and business) to assess resource requirements, estimate costs, and supply accurate financial projections

  • Work with the IT team to monitor the performance and health of Azure Databricks resources, resolve issues proactively, and ensure efficient scaling based on workload demands

  • Utilize modernized BI (Business Intelligence) to provide the project team with reports/analytics on resource usage, cost allocation, and performance metrics, along with meaningful insights

  • Contribute to the design, delivery, evaluation, and maintenance of leading-edge user experience strategies and online experiences, via the government's official websites, for the business unit's client (Forest Industry), stakeholders (internal ministries), and the public (including research, Federal ministries, and other ministry users)

  • Lead and conduct multiple concurrent research, content design, and strategy projects to meet the needs of all OPS (Ontario Public Service) clients involved in forest management planning and resource management activities

  • Develop code/scripts to automate the business workflow, and test and implement solution components using Databricks, standard Azure services, and .NET for scripting and integration work

  • Support product backlog grooming, task identification, and effort estimation

  • Consult the Tech Lead on all IT solution designs and options before presenting them to the rest of the team, to ensure compliance with OPS policies and standards
  • Comply with and follow all OPS IT policies, standards, and processes, e.g. AODA (Accessibility for Ontarians with Disabilities Act) and GO IT Standards
  • Attend scrum calls, provide timely updates, and promptly raise any blockers
  • Provide timely updates to the project manager and project team

  • Provide training and mentoring to project team members; good communication skills and the ability to effectively transfer knowledge to team members are required

  • Provide knowledge transfer/training to the business and technical teams

  • Pick up an OPS (Ontario Public Service) laptop from a downtown office in Toronto and return it to the same location upon completion of the contract
  • Data scientist work experience gained with a Forest Industry company, or a graduate degree completed in an academic/university environment, is desirable but not mandatory
  • Skills to analyze field data, incorporate it into various predictive models, and develop associated statistical reporting and confusion matrices
  • Skills, knowledge, and experience with RStudio, GitHub, and other open-source tools, as well as Python geospatial library knowledge, are desirable but not mandatory
Experience and Skill Set Requirements

Technical skills (70%)
  • Experience analyzing business requirements, designing scalable and efficient Databricks architecture, and coding Python and R scripts as a data scientist
  • Experience working with Azure Databricks to build data models and implement machine learning tasks as a Databricks expert
  • Working experience designing data pipelines; optimizing data storage and processing; Spark (parallel processing), Delta Tables and Delta Lake, Cosmos DB, Docker, and object-based and/or pixel image analyses; and integrating Databricks with other components of the system, such as Azure Data Factory and Azure SQL DB
  • Skills and experience working with modern geospatial raster and point cloud GPS (Global Positioning Systems) data product processing and analysis
  • Skilled in data engineering techniques, including data ingestion, transformation, API (Application Programming Interface), data streaming, and open-source code and tool integration. Experience working with various data sources and formats, and able to implement data pipelines using Azure Data Factory and Databricks in the context of remote sensing and resource management techniques
  • Solid understanding of data science principles and machine learning algorithms for application in geospatial and resource management scenarios. Able to aid with tasks such as data exploration, feature engineering, model development, and model deployment on the Databricks platform
  • Proficient in optimizing Databricks workloads for performance and scalability. Capable of tuning Spark configurations, optimizing data processing operations, and troubleshooting performance issues
  • Expert-level knowledge of cost control strategy, usage pattern analysis, and cost-saving measures
  • Hands-on experience generating comprehensive reports and analytics on resource usage, cost allocation, and performance metrics on Databricks and Azure cloud services
  • Knowledgeable about best practices for data governance, security, and compliance on the Databricks platform. Should provide guidance on data privacy, access control, and auditing. Knowledge of user interface design principles and best practices
  • Strong problem-solving skills to diagnose and resolve issues during the implementation and maintenance of Databricks solutions. Capable of identifying bottlenecks, troubleshooting errors, and proposing workflow optimization solutions
  • End-to-end responsibility for design, documentation, development, testing, and deployment with Azure Databricks
  • Experience with or ability in remote sensing, sustainable resource management, Ontario's Forest and Resource Industry, and the forest resources inventory (FRI) program would be beneficial
Soft skills (30%)
  • Excellent analytical, problem solving and decision-making skills
  • Experience with agile methodology
  • Excellent communication skills, both written and verbal
  • Excellent meeting facilitation skills to gather requirements
  • Experience reporting progress on deliverables to team, project leads and management, including proactively raising risks/issues with mitigations
  • Strong stakeholder management skills
  • Ability to integrate with a skilled business unit, and LRC technical team
Powered by JazzHR

S M Software Solutions Inc

Job Detail

  • Job Id: JD2207865
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Toronto, ON, Canada
  • Education: Not mentioned