Senior ETL Engineer & Data Modeller / Senior Consultant Specialist (NM+)
HSBC | 1 hour ago | Pune

In this role, you will:

  • Design Pega NBA source datasets for customer profile, interaction history, offers/eligibility, and outcomes—optimised for low-latency decisioning and explainability.
  • Define canonical data models (facts/dimensions) for interactions and responses, including event taxonomy, identifiers, and channel standardisation.
  • Implement history/auditability patterns (SCD Type 2, bitemporal where needed) to support regulatory traceability and “why did we decide X?” replay.
  • Build near-real-time ingestion pipelines (CDC/streaming/micro-batch) with deterministic ordering, idempotency, and deduplication for event feeds.
  • Establish data quality controls (completeness, validity, timeliness, consistency) with automated checks, thresholds, and exception workflows.
  • Set up monitoring & observability for SLAs/SLOs (freshness, lag, volume anomalies), lineage, and alerting integrated with incident management.
  • Partner with stakeholders (Pega decisioning, product, risk/compliance, analytics, engineering) to align eligibility rules, consent, and governance.
  • Drive continuous improvement on performance, cost, and reliability—capacity planning, schema evolution, and controlled releases with rollback plans.
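To illustrate the history/auditability pattern named above, here is a minimal SCD Type 2 sketch in Python: the current version of a record is expired and a new version opened, so any past decision can be replayed against the attributes in force at the time. All names (`CustomerVersion`, `segment`, the high-date sentinel) are hypothetical, not part of the role description.

```python
from dataclasses import dataclass, replace
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # sentinel marking the open-ended "current" row

@dataclass
class CustomerVersion:
    customer_id: str
    segment: str          # example tracked attribute (hypothetical)
    valid_from: date
    valid_to: date = HIGH_DATE

def apply_scd2_change(history, customer_id, new_segment, change_date):
    """Close the current version at change_date and append a new one (SCD Type 2)."""
    out = []
    for row in history:
        if row.customer_id == customer_id and row.valid_to == HIGH_DATE:
            # Expire the current version; its full history is preserved.
            out.append(replace(row, valid_to=change_date))
        else:
            out.append(row)
    out.append(CustomerVersion(customer_id, new_segment, change_date))
    return out
```

Replaying "why did we decide X on date D?" then reduces to selecting the row where `valid_from <= D < valid_to`.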

To be successful in this role, you should meet the following requirements:

  • Strong experience designing decisioning/marketing/next-best-action datasets (customer, interactions, offers, outcomes) for operational use.
  • Proven dimensional modelling skills (star schemas, conformed dimensions) plus event modelling for high-volume interaction data.
  • Hands-on expertise with SCD Type 2 and audit patterns (effective dating, versioning, bitemporal concepts) for traceability.
  • Experience delivering near-real-time data feeds (Kafka/Kinesis/PubSub, CDC tools, micro-batching) with exactly-once/at-least-once handling strategies.
  • Strong data quality engineering background: automated validation frameworks, anomaly detection, reconciliation, and data contracts.
  • Solid understanding of data governance: consent/permissions, PII handling, retention, and access controls in regulated environments.
  • Proficiency in SQL and data engineering tooling (ETL/ELT, orchestration, CI/CD, schema registry, metadata/lineage).
  • Ability to communicate clearly with technical and non-technical teams; comfortable translating business rules into data requirements and SLAs.
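As a sketch of the at-least-once handling the requirements above call for, the snippet below deduplicates an event feed by keeping the latest version per event ID and emitting records in deterministic order. Field names (`event_id`, `version`, `ts`) are assumed for illustration; in practice this logic would sit in a streaming consumer or micro-batch job.

```python
def dedupe_events(events):
    """Idempotent dedup for an at-least-once feed: keep the highest version
    per event_id, then sort deterministically by (ts, event_id)."""
    latest = {}
    for e in events:
        key = e["event_id"]
        # Reprocessing the same input always yields the same output (idempotent).
        if key not in latest or e["version"] > latest[key]["version"]:
            latest[key] = e
    return sorted(latest.values(), key=lambda e: (e["ts"], e["event_id"]))
```

Sorting on a stable composite key gives deterministic ordering even when duplicates arrive out of order across retries.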
