We are seeking an experienced data analytics professional skilled in designing pipelines, performing advanced analytics, and deriving actionable insights from complex datasets. The ideal candidate will lead end-to-end data initiatives, collaborating with business, product, and technical stakeholders to enable data-driven decision-making.
Key Responsibilities
Coordinate and lead data conversations with cross-functional stakeholders and source-system owners.
Explore and analyze large, complex datasets to identify patterns, anomalies, and opportunities.
Design and implement ETL / ELT data pipelines using Airflow, BigQuery, and MapR Hadoop (a minimal pipeline sketch follows this list).
Develop, train, and deploy machine-learning (ML) and time-series forecasting models.
Collaborate with business teams to translate requirements into analytics solutions and insights.
Document requirements, processes, and findings using Confluence and JIRA.
Drive performance optimization in data pipelines (scheduling, staging, and modeling).
Validate and monitor data quality, availability, and accuracy across pipelines.
Present insights and recommendations clearly to technical and non-technical audiences.
Guide junior team members on analytics best practices and project delivery.
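The pipeline sketch referenced above, as a minimal Airflow DAG: it stages raw files from Cloud Storage into a BigQuery staging table, then runs an in-warehouse transform (ELT). The bucket, project, dataset, and table names, the schedule, and the SQL are illustrative assumptions, not details from this posting.

```python
# A minimal ELT sketch (Airflow 2.4+ with the Google provider); all resource
# names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage: load the day's raw CSV exports from Cloud Storage into a
    # BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_to_staging",
        bucket="example-landing-bucket",               # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.staging_sales",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Model: transform staging data into a reporting table inside BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_reporting",
        configuration={
            "query": {
                "query": """
                    SELECT sale_date, region, SUM(amount) AS total_amount
                    FROM `analytics.staging_sales`
                    GROUP BY sale_date, region
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example-project",    # hypothetical project
                    "datasetId": "analytics",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> transform
```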
Required Qualifications
Bachelor's / Master's degree in Data Science, Statistics, Economics, Mathematics, or a related field (PhD preferred).
10 years of experience in data analytics / engineering, including cloud and machine-learning projects.
Expertise in Big Data and the Hadoop ecosystem (MapR).
Advanced proficiency with Google Cloud Platform (GCP), particularly BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
4-6 years of experience in Python, R, and SQL for data manipulation and modeling.
Hands-on experience with Apache Airflow for data orchestration.
Strong understanding of regression, time-series forecasting, and causal inference models (a forecasting sketch follows this list).
Familiarity with anomaly detection and other ML algorithms (an anomaly-detection sketch also follows this list).
Experience with Agile / Scrum, JIRA, and stakeholder management.
Excellent analytical, communication, and presentation skills.
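The forecasting sketch referenced above: a minimal Python example that fits a Holt-Winters (exponential smoothing) model from statsmodels to a synthetic daily series. The data, the weekly seasonal period, and the two-week horizon are illustrative assumptions.

```python
# A minimal time-series forecasting sketch on synthetic data; nothing here
# reflects the actual datasets or models used on the job.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic daily series: linear trend + weekly seasonality + noise.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=180, freq="D")
t = np.arange(180)
y = pd.Series(
    50 + 0.1 * t + 5 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 1, 180),
    index=idx,
)

# Holt-Winters with additive trend and additive weekly seasonality.
model = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=7)
fit = model.fit()

forecast = fit.forecast(14)  # two-week horizon
print(forecast.head())
```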
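And the anomaly-detection sketch, using scikit-learn's IsolationForest on synthetic two-dimensional metrics; the feature values and contamination rate are illustrative assumptions.

```python
# A minimal anomaly-detection sketch; the "metrics" are invented for
# illustration only.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Normal observations: two metrics (e.g. request count, latency).
normal = rng.normal(loc=[100.0, 0.2], scale=[10.0, 0.02], size=(500, 2))
# A handful of outliers far from the normal cluster.
outliers = rng.normal(loc=[160.0, 0.8], scale=[5.0, 0.05], size=(5, 2))
X = np.vstack([normal, outliers])

detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(X)  # -1 flags an anomaly, 1 an inlier

print("flagged rows:", np.where(labels == -1)[0])
```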
Job Type: Full-time
Pay: $50.00 per hour
Expected hours: 40 per week
Experience:
Big Data / Hadoop: 6 years (required)
Python / R for Data Analysis & Modeling: 6 years (required)
Airflow (Data Pipeline Orchestration): 3 years (required)