Strong understanding of the AWS environment (PaaS, IaaS) and experience working with a hybrid cloud model
At least 2 years of project experience using at least 4 of these components: PySpark or Scala on EMR, Glue, Lambda, Kinesis, Kafka, Redshift, RDS, QuickSight
Overall, at least 6 years of experience on Data Lake / Data Warehouse projects
Good understanding of DevOps and CI/CD concepts
Good understanding of data management concepts, especially data modelling, security, data quality and governance, MDM, and data lineage
Able to understand requirements and create repeatable design patterns for developers
Nice to have: AWS Solutions Architect Associate or any Specialty certification