Azure DataOps Engineer (4+ years)
EY | 5 days ago | Bengaluru

Your key responsibilities

  • Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
  • ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premises sources.
  • Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
  • Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
  • Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot failures in pipelines and address data latency or quality issues.
  • Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
  • DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
  • Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
  • Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills and attributes for success

  • Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
  • Solid understanding of ETL/ELT design and implementation principles
  • Strong SQL and PySpark skills for data transformation and validation
  • Exposure to Python for automation and scripting
  • Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
  • Experience in working with Power BI or Tableau for data visualization and reporting support
  • Strong problem-solving skills, attention to detail, and commitment to data quality
  • Excellent communication and documentation skills to interface with technical and business teams
  • Strong knowledge of asset management business operations, especially in data domains such as securities, holdings, benchmarks, and pricing

To qualify for the role, you must have

  • 4–6 years of experience in DataOps or Data Engineering roles
  • Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
  • Experience working with Informatica CDI or similar data integration tools
  • Scripting and automation experience in Python/PySpark
  • Ability to support data pipelines in a rotational on-call or production support environment
  • Comfortable working in a remote/hybrid and cross-functional team setup