Location: Edmonton, Alberta T5J 3W7 (primarily remote)
Duration: 12-month contract with possible 12-month extensions
Description:
Project Name: Data Management and Geospatial Services Platforms Continuous Improvement
Duties:
Provides leadership, direction, and advice on the Data Management Platform and Geospatial Services Platform Improvements and Sustainment. Services and project deliverables should evolve as the work progresses, in response to emerging user and business needs, as well as design and technical opportunities. However, the following must be delivered (iteratively) over the course of the project:
Strategic Leadership & Collaboration
Define the strategic vision and multi-year roadmap for the enterprise Data Management Platform, ensuring alignment with business objectives and the organization's data and AI strategy.
Work closely with Service Owners, Service Directors, and senior leadership to prioritize platform capabilities, guide investment decisions, and shape long-term modernization initiatives.
Translate organizational goals into executable platform strategies, influencing enterprise architecture, data governance, and AI enablement across business units.
Drive enterprise adoption of the data platform by enabling self-service capabilities, standardizing data products, and improving cross-functional usability for analytics and AI teams.
Technical Architecture & Engineering Execution
Architect and implement cloud-native data solutions leveraging Azure and Databricks, including Delta Lake, Unity Catalog, and Medallion architectures.
Lead the design and development of scalable ingestion, transformation, and orchestration pipelines using Databricks, Azure Data Factory, APIs, Event Hub, Functions, and other Azure services.
Establish and enforce enterprise-wide data governance, security, and compliance frameworks, including RBAC/ABAC, encryption practices, Key Vault, and secure networking standards.
Implement robust DataOps practices and CI/CD automation for Databricks workflows, notebooks, Unity Catalog assets, and Azure data pipelines.
Optimize performance and cost efficiency across Databricks clusters, ETL workloads, Delta Lake storage, and Azure resources.
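To make the Medallion (bronze/silver/gold) pattern named above concrete, the following is a minimal sketch in plain Python, standing in for the Databricks/Delta Lake implementation the role would actually use; the record fields, validation rules, and aggregate are hypothetical illustrations, not part of the posting:

```python
from collections import defaultdict

# Bronze layer: raw ingested records, landed as-is (duplicates and bad rows included).
bronze = [
    {"id": 1, "region": "north", "reading": 42.0},
    {"id": 1, "region": "north", "reading": 42.0},   # duplicate ingest
    {"id": 2, "region": "south", "reading": None},   # failed sensor read
    {"id": 3, "region": "north", "reading": 37.5},
]

def to_silver(records):
    """Silver layer: validate and deduplicate bronze records by key."""
    seen, silver = set(), []
    for rec in records:
        if rec["reading"] is None:   # drop records that fail validation
            continue
        if rec["id"] in seen:        # drop duplicate ingests
            continue
        seen.add(rec["id"])
        silver.append(rec)
    return silver

def to_gold(records):
    """Gold layer: business-level aggregate (average reading per region)."""
    totals = defaultdict(lambda: [0.0, 0])
    for rec in records:
        totals[rec["region"]][0] += rec["reading"]
        totals[rec["region"]][1] += 1
    return {region: total / n for region, (total, n) in totals.items()}

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # → {'north': 39.75}
```

In a production Databricks setting each layer would be a Delta table and the transformations would run as PySpark jobs or DLT pipelines; the layering and "refine as you promote" discipline shown here carries over unchanged.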
AI Enablement & Automation
Identify opportunities to introduce AI-driven automation into data engineering, such as automated pipeline generation, anomaly detection, intelligent observability, and generative code scaffolding.
Lead proofs of concept and evaluate new Azure, Databricks, and AI capabilities, recommending adoption paths for technologies that enhance scalability, automation, and platform intelligence.
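As one small illustration of the anomaly-detection/observability duties above, here is a hedged sketch that flags anomalous daily pipeline row counts with a simple z-score check; the counts and threshold are made up for the example, and a real platform would likely use a richer model:

```python
import statistics

def flag_anomalies(row_counts, threshold=3.0):
    """Return (day, count) pairs whose z-score exceeds the threshold."""
    mean = statistics.mean(row_counts)
    stdev = statistics.stdev(row_counts)
    return [
        (day, count)
        for day, count in enumerate(row_counts)
        if stdev > 0 and abs(count - mean) / stdev > threshold
    ]

# Hypothetical daily row counts from an ingestion job; day 6 collapses.
counts = [10_120, 9_980, 10_050, 10_210, 9_940, 10_090, 310]
print(flag_anomalies(counts, threshold=2.0))  # → [(6, 310)]
```

A check like this, wired into pipeline monitoring, is the kind of intelligent observability the posting describes: it turns raw run metrics into actionable alerts before downstream consumers notice missing data.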
Team Enablement & Mentorship
Mentor, coach, and develop data engineers and platform team members, ensuring adherence to best practices, strong engineering standards, and continuous learning.
Provide architectural guidance and technical leadership to teams building data products, analytics solutions, AI workloads, and integration patterns using the enterprise data platform.
Qualification/Must Have:
Bachelor's degree in Computer Science or a related field of study.
Professional Licenses/Certification
Certification in The Open Group Architecture Framework (TOGAF).
Work Experience
Current advanced hands-on engineering experience operating and optimizing Azure services and Azure Databricks environments, including cluster configuration, autoscaling strategies, security, Key Vault integration, encryption, CI/CD and SDLC/DataOps practices, monitoring/observability, capacity planning, and ensuring cost-effective, high-performing production workloads.
Current hands-on design, development, and operation of end-to-end enterprise data pipelines using Databricks (PySpark/SQL notebooks, workflows, DLT), Azure Data Factory, APIs, Event Hubs, Functions, or related Azure services, including ingestion, transformation, validation, error handling, performance optimization, and production support.
Current hands-on enterprise and solution architecture experience designing and implementing Azure-based data platform architectures, covering compute models, integration patterns, identity/security models, network design, cost governance, and scalability, while collaborating closely with Cloud Engineering, Security, and Infrastructure teams.
Deep data architecture and Azure Databricks expertise with current production-grade implementation of Medallion architecture, Delta Lake optimization, DLT/job orchestration, schema evolution, data quality frameworks, and expert-level working knowledge of Unity Catalog permissions, governance, lineage, and secure data sharing across teams.
Demonstrated ability to work directly with senior leadership, including Service Owners, Service Directors, Executive Directors, and Architecture Review Boards, actively shaping data platform strategy, validating solution direction, and aligning architectural decisions with organizational priorities and enterprise technology standards.
Extensive systems integration experience with enterprise platforms such as ServiceNow using Talend, including hands-on delivery of API-based ingestion, incremental/CDC pipelines, metadata-driven orchestration, event-driven patterns, secure authentication, and full lifecycle integration aligned with enterprise governance and network standards.
Proven leadership mentoring data engineering and analytics teams by driving architectural best practices, reviewing solution designs, establishing coding and data engineering standards, and enabling teams to rapidly deliver high-quality, scalable data solutions in a complex enterprise environment.
Demonstrated experience engaging directly with business or government clients, effectively explaining complex data concepts, architectural decisions, and platform capabilities in a clear and strategic manner.
Experience designing and implementing enterprise geospatial solutions, including integration of location intelligence into data pipelines, analytical models, and enterprise data platforms.
Experience using GitHub for source code management, version control, and collaboration, including leveraging GitHub Copilot assisted development tools to improve coding efficiency, quality, and maintainability in enterprise data engineering projects.
Experience with AI or machine learning applied to data engineering or analytics, such as using ML for anomaly detection, predictive monitoring, or AI-assisted data transformation and automation.
Experience with business intelligence, visualization, or analytics platforms, including dashboards, reporting, and analytical workflows to support decision-making, separate from core data engineering tasks.
Experience with Esri ArcGIS Enterprise, including spatial data management, map services, spatial analytics, and integration with enterprise systems.
Experience with Service Management processes and tools, including onboarding and integrating enterprise data platforms with ServiceNow Enterprise Service Management (ESM) to support service requests, incident management, change control, and operational workflows.
Experience working with environmental datasets, such as air quality, water resources, land data, or mineral resources, and integrating them into analytical or operational data platforms.
Strong applied experience with data modelling concepts and tools, including dimensional modelling, semantic modelling, entity modelling, ontology design, or canonical models to support analytics, integration, or reporting.
Job Type: Fixed term contract
Contract length: 12 months
Pay: $90.44-$100.33 per hour
Licence/Certification:
* TOGAF Certification (preferred)