Job Location:
Bangalore / Gurgaon
Key responsibilities:
Design, implement, and manage the infrastructure for data pipelines and workflows, including data ingestion, processing, storage, and access.
Build and maintain CI/CD pipelines for data engineering projects, automating testing, deployment, and monitoring.
Champion DevOps principles within the data engineering team, fostering a culture of collaboration, automation, and continuous improvement.
Work closely with data engineers and data scientists to optimize data processes, ensuring scalability, performance, and reliability.
Implement and manage monitoring and alerting systems to proactively identify and address issues in data pipelines.
Contribute to the development and implementation of data security and governance policies.
Stay up to date with the latest technologies and trends in data engineering and DevOps.
Qualifications & skills required:
Experience: 7-10 years
Strong foundation in DevOps principles and practices: CI/CD, IaC, automation, monitoring, and logging.
Experience with cloud platforms: AWS, including services relevant to data engineering (e.g., storage, compute, databases, data pipelines).
Proficiency in scripting and infrastructure-as-code tools: Python, Bash, Terraform, Ansible, etc.
Experience with data engineering tools and technologies: Spark, Hadoop, Kafka, Airflow, etc.
Knowledge of containerization and orchestration: Docker, Kubernetes.
Familiarity with data warehousing and data lake concepts.
Strong problem-solving and troubleshooting skills.
Excellent communication and collaboration skills.