Data Lakehouse Engineer

Toronto, ON, Canada

Job Description

Job Summary:
We are seeking a skilled Data Lakehouse Engineer to design, implement, and optimize our modern data lakehouse architecture, enabling seamless data ingestion, transformation, storage, and analytics. You will be responsible for building scalable, high-performance data pipelines, ensuring data quality and governance, and integrating structured and unstructured data to support business intelligence, AI, and machine learning use cases.
A key focus of this role will be enabling a comprehensive, data-driven view of our customers by bringing together data from multiple systems, including CRM, CDP, and transactional sources, into an accessible, trusted foundation. To be successful, you will collaborate closely with data engineers, architects, BI analysts, and data scientists to develop a unified platform that supports both real-time streaming and batch processing workloads. You will also play a key role in ensuring security, privacy, and compliance across the data lakehouse environment.

Duties & Responsibilities:

  • Design, implement, and maintain a scalable data lakehouse using modern data engineering frameworks
  • Collaborate with business stakeholders to translate requirements into data models, and understand how business processes translate into data requirements
  • Design and implement Customer 360 data models and transformation logic using SQL stored procedures
  • Design and build ETL/ELT pipelines across lakehouse storage zones for efficient ingestion, transformation, and optimization
  • Develop real-time streaming data pipelines using industry-standard tools and ensure seamless integration into the lakehouse architecture
  • Demonstrate curiosity to deeply understand complex business challenges and opportunities, collaborate on data-driven solutions, and balance an experimentation mindset with an agile implementation approach
  • Implement secure, automated data ingestion pipelines with end-to-end lineage tracking, aligned to enterprise architecture standards and governed by role-based access controls and encryption
  • Apply data masking, anonymization, and auditing practices to meet privacy and compliance standards (e.g., GDPR, PIPEDA)
  • Maintain a centralized data catalog with complete metadata, source tracking, and transformation business rule documentation, alongside up-to-date records of system architecture, data models, and operational workflows
  • Partner with data scientists, analysts, and BI teams to deliver accessible, well-structured, and high-quality datasets
  • Ensure infrastructure scalability, performance, and resiliency, while maintaining high security standards
  • Actively participate in Porter's Safety Management System (SMS), including reporting hazards and incidents encountered in daily operations; understand, comply with, and promote the Company Safety Policy
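The Customer 360 modelling responsibility above can be illustrated with a toy consolidation query. This is a minimal sketch only: the table names (`crm_contacts`, `orders`) and columns are hypothetical placeholders, not Porter's actual schema, and a real pipeline would run equivalent SQL inside the lakehouse (e.g. as a stored procedure or ELT job) rather than in SQLite.

```python
import sqlite3

# Hypothetical source tables standing in for CRM and transactional systems.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE crm_contacts (customer_id INTEGER PRIMARY KEY, email TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO crm_contacts VALUES (1, 'a@example.com'), (2, 'b@example.com');
INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Join CRM and transactional sources into a single customer-level view,
# the core move behind a Customer 360 model.
rows = cur.execute("""
SELECT c.customer_id,
       c.email,
       COUNT(o.order_id)          AS order_count,
       COALESCE(SUM(o.amount), 0) AS lifetime_value
FROM crm_contacts c
LEFT JOIN orders o ON o.customer_id = c.customer_id
GROUP BY c.customer_id, c.email
ORDER BY c.customer_id
""").fetchall()

for row in rows:
    print(row)
```

The `LEFT JOIN` keeps CRM contacts with no transactions, which matters when the unified view must cover every known customer, not only purchasers.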
Behavioural Competencies:
Concern for Safety: Identifying hazardous or potentially hazardous situations and taking appropriate action to maintain a safe environment for self and others.
Teamwork: Working collaboratively with others to achieve organizational goals.
Passenger/Customer Service: Providing service excellence to internal and/or external customers (passengers).
Initiative: Dealing with situations and issues proactively and persistently, seizing opportunities that arise.
Results Focus: Focusing efforts on achieving high quality results consistent with the organization's standards.
Fostering Communication: Listening and communicating openly, honestly, and respectfully with different audiences, promoting dialogue and building consensus.

Qualifications:
  • Strong experience with data lakehouse storage and query technologies
  • 5+ years of experience in data engineering using SQL, Python, or Scala, with advanced SQL skills in stored procedure development, optimization, and complex transformations
  • Experience with versioned data models and incremental processing
  • 3-5 years of hands-on experience with real-time data streaming technologies
  • Hands-on experience with MACH architecture (Microservices, API-first, Cloud-native, Headless), along with strong proficiency in CRM systems and Customer Data Platforms (CDPs)
  • Proficient in ETL/ELT development using AWS Glue, Matillion, MuleSoft, or similar tools
  • Familiar with data orchestration tools, governance frameworks, cataloging solutions, and security best practices in modern cloud-based environments
  • Strong problem-solving and debugging skills
  • Collaborative mindset and ability to work with cross-functional teams in an Agile environment
  • Excellent communication and documentation skills, with a proven track record of communicating technical concepts to non-technical stakeholders
  • Airline and/or aviation experience strongly preferred
Company Description:
Since 2006, Porter Airlines has been elevating the experience of economy air travel for every passenger, providing genuine hospitality with style, care and charm. Porter's fleet of Embraer E195-E2 and De Havilland Dash 8-400 aircraft serves a North American network from Eastern Canada. Headquartered in Toronto, Porter is an Official 4 Star Airline in the World Airline Star Rating. Follow @porterairlines on Instagram, Facebook and X.




Job Detail

  • Job Id: JD2585460
  • Industry: Not mentioned
  • Total Positions: 1
  • Job Type: Full Time
  • Salary: Not mentioned
  • Employment Status: Permanent
  • Job Location: Toronto, ON, Canada
  • Education: Not mentioned