Key Responsibilities:
• Spread the DevOps culture across business units by implementing on-commit deployment and automated testing solutions.
• Develop systems that streamline the release-management process into AWS.
• Build an understanding of product offerings and help improve the customer experience.
• Ensure application monitoring and metrics are captured for all deployed assets (see the CloudWatch sketch after this list).
• Enforce quality and security requirements in release pipelines.
• Identify areas for improvement in the environment and make recommendations.
• Design, develop, and maintain scalable data pipelines and ETL processes using AWS services such as Glue, Lambda, and Redshift (a minimal example follows this list).
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Implement data integration and transformation processes to ensure data quality and consistency.
• Optimize and tune data pipelines for performance and cost-efficiency.
• Monitor and troubleshoot data pipeline issues to ensure data availability and reliability.
• Develop and maintain documentation for data pipelines, processes, and infrastructure.
• Stay up to date with the latest AWS services and best practices in data engineering.
• Leverage Apache Flink for real-time data processing and analytics, ensuring low-latency data handling (see the PyFlink sketch below).
• Employ Apache Kafka for stream processing and for integrating data from various sources into the pipelines (see the producer sketch below).
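
For the monitoring responsibility above, a minimal sketch of publishing a custom application metric to CloudWatch with boto3. The namespace, metric name, and dimension values are hypothetical placeholders, not part of this role's actual stack.

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def record_deploy_latency(service_name: str, latency_ms: float) -> None:
    """Emit one latency datapoint so dashboards and alarms can track deployed assets."""
    cloudwatch.put_metric_data(
        Namespace="ReleaseEngineering",  # hypothetical namespace
        MetricData=[
            {
                "MetricName": "DeployLatency",
                "Dimensions": [{"Name": "Service", "Value": service_name}],
                "Value": latency_ms,
                "Unit": "Milliseconds",
            }
        ],
    )

record_deploy_latency("checkout-api", 412.0)  # hypothetical service name
```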
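For the Glue/Lambda pipeline work, one common wiring is a Lambda handler that starts a Glue ETL job when a new object lands in S3. This is a sketch under that assumption; the job name and argument key are hypothetical.

```python
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # Pull the bucket/key of the newly created object from the S3 event record.
    record = event["Records"][0]["s3"]
    source_path = f"s3://{record['bucket']['name']}/{record['object']['key']}"

    # Start the (hypothetical) Glue job, passing the object path as a job argument.
    run = glue.start_job_run(
        JobName="nightly-orders-etl",            # hypothetical job name
        Arguments={"--source_path": source_path},
    )
    return {"JobRunId": run["JobRunId"]}
```

Keeping the Lambda thin, with transformation logic living in the Glue job itself, makes the trigger cheap to run and easy to test.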
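For the Flink responsibility, a minimal PyFlink DataStream sketch that filters and transforms events in flight. A real pipeline would read from a connector such as Kafka rather than an in-memory collection; the event shapes and job name here are illustrative only.

```python
from pyflink.datastream import StreamExecutionEnvironment

env = StreamExecutionEnvironment.get_execution_environment()

# Hypothetical (event_type, amount) tuples standing in for a live stream.
events = env.from_collection([("orders", 120), ("orders", 87), ("refunds", 15)])

(events
    .filter(lambda e: e[0] == "orders")  # keep only order events
    .map(lambda e: e[1] * 1.1)           # hypothetical enrichment step
    .print())                            # stdout sink, for the sketch only

env.execute("orders-enrichment")         # hypothetical job name
```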
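For the Kafka responsibility, a minimal producer sketch using the confluent-kafka client to feed source records into a topic consumed by downstream pipelines. The broker address, topic name, and record fields are placeholders.

```python
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def on_delivery(err, msg):
    # Surface delivery failures so dropped records are visible, not silent.
    if err is not None:
        print(f"delivery failed: {err}")

record = {"order_id": 42, "amount": 87.5}  # hypothetical record
producer.produce("raw-orders", json.dumps(record).encode("utf-8"),
                 callback=on_delivery)
producer.flush()  # block until outstanding messages are delivered
```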
Preferred Qualifications:
• AWS Certified Developer – Associate or similar certification.
• Experience with microservices architecture.
• Knowledge of security best practices in cloud environments.