We need: A senior (10+ years) Data Engineer/Architect to design, implement, and optimize the robust data solutions that power our applications, analytics, and reporting systems, as detailed in the posting below. The role requires strong expertise in both SQL and NoSQL technologies, advanced data processing, and the ability to architect scalable and secure data flows. Banking or Capital Markets experience is a big plus but not required.
We are seeking a highly skilled Senior Data Engineer/Architect to design, implement, and optimize robust data solutions that power our applications, analytics, and reporting systems. This role requires strong expertise in both SQL and NoSQL technologies, advanced data processing, and the ability to architect scalable and secure data flows. The ideal candidate will be hands-on, detail-oriented, and capable of translating complex business needs into efficient, reliable data architectures.
Key Responsibilities:
Design and develop scalable data architectures and processing pipelines to support business and analytical needs.
Write and optimize complex SQL and NoSQL queries for efficient data retrieval and transformation.
Implement and manage data caching solutions using Redis and Elasticsearch for performance optimization.
Lead data mapping and transformation initiatives, ensuring data quality, consistency, and integrity.
Plan and execute data migration and synchronization strategies between systems and environments.
Integrate and manage APIs for data ingestion, transformation, and distribution.
Develop and maintain data analytics and reporting frameworks to deliver actionable insights.
Work with graph databases and vector data models to support advanced analytics and AI/ML use cases.
Create clear, comprehensive diagrams of data architecture, processing workflows, and data flows for documentation and collaboration.
Collaborate with cross-functional teams, including software engineers, product managers, and analysts, to ensure seamless data operations.
Preferred Skills & Technologies:
Hands-on experience with Kafka and/or Solace for event streaming and real-time data processing.
Familiarity with data security best practices, compliance requirements, and governance frameworks.
Experience with cloud data platforms (AWS, Azure, or GCP).
Knowledge of containerization and orchestration tools (Docker, Kubernetes).
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
10+ years of experience in data engineering, data architecture, or related roles.
Proven track record of designing and implementing complex, large-scale data solutions.
Strong analytical skills, attention to detail, and problem-solving ability.
Excellent communication skills and the ability to work collaboratively across teams.