In this role you will lead the design, development, and optimization of scalable data pipelines and analytics solutions. The ideal candidate will have a strong background in cloud data platforms, data architecture, and team leadership, with a passion for solving complex data challenges in a collaborative environment. You'll collaborate with Product Owners, Data engineers, Analysts, and other stakeholders to understand requirements and deliver solutions in an entrepreneurial culture where teamwork is encouraged, excellence is rewarded, and diversity is valued.
Required Qualifications:
Candidate must be located within commuting distance of Burnaby, BC (Canada), or be willing to relocate to the area.
Candidates authorized to work for any employer in Canada without employer-based visa sponsorship are welcome to apply. Infosys is unable to provide immigration sponsorship for this role at this time.
Bachelor's degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 4 years of Information Technology experience.
Basic Qualifications:
Databricks: Strong experience with PySpark/Scala and Delta Lake.
Snowflake: Expertise in Snowflake architecture, performance tuning, and SQL development.
Cloud Platforms: Proficiency in Azure.
Data Modeling: Experience with dimensional modeling, star/snowflake schemas, and data vault.
ETL/ELT Tools: Hands-on experience with tools like Azure Data Factory.
Programming Languages: Python, SQL, Spark, and optionally Scala.
DevOps & CI/CD: Familiarity with Git, Jenkins, Terraform, or similar tools.
Data Governance & Security: Knowledge of data cataloging, lineage, and access control.
Mandatory Skillsets:
Experience leading the design and implementation of scalable data pipelines using Databricks (Spark) and Snowflake.
Experience optimizing data lake and data warehouse solutions on cloud platforms (Azure).
Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
Implement best practices for data governance, security, and performance tuning.
Mentor and guide junior data engineers and contribute to technical leadership within the team.
Manage CI/CD pipelines and orchestrate workflows using tools like Airflow, Control-M, and Azure Data Factory.
Monitor and troubleshoot data pipelines and ensure data quality and reliability.
Stay current with emerging technologies and recommend improvements to existing systems.
Good to have:
Bachelor's or Master's degree in Computer Science, Engineering, or related field.
Certifications in Databricks, Snowflake, or Azure/AWS.
Experience with real-time data processing (Kafka, Spark Streaming).
Familiarity with BI tools such as Power BI, Tableau, or Looker.
Other Experience:
3-5 years in project development lifecycle activities and maintenance/support.
Experience working in Agile environments.
Ability to translate requirements into technical solutions meeting quality standards.
Collaboration skills in diverse environments to identify and resolve data issues.
Strong problem-solving and analytical abilities.
Experience in global delivery environments.
Commitment to staying current with industry trends in modern data warehousing.
The job entails sitting and working at a computer for extended periods of time. The candidate should be able to communicate by telephone, email, or face to face.
Estimated annual compensation range for a candidate based in the location below:
British Columbia: $81,575 to $111,670