Hands-on experience in Python and PySpark, with strong knowledge of DataFrames, RDDs, and Spark SQL.
Hands-on experience in developing, testing, and maintaining applications on the AWS cloud.
Strong command of the AWS data analytics technology stack (Glue, S3, Lambda, Lake Formation, Athena).
Design and implement scalable and efficient data transformation/storage solutions using Snowflake.
Experience in data ingestion to Snowflake from different storage formats such as Parquet, Iceberg, JSON, and CSV.
Experience using dbt (Data Build Tool) with Snowflake for ELT pipeline development.
Experience writing advanced SQL and PL/SQL programs.
Experience in AWS data pipeline development.
Hands-on experience building reusable components using Snowflake and AWS tools/technologies.
Should have worked on at least two major project implementations.
Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
Experience using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage.
Knowledge of the Ab Initio ETL tool is a plus.
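As an illustration of the advanced-SQL skill level expected above, the sketch below runs a window-function query (per-group ranking and running totals) using Python's built-in sqlite3 module. The `orders` table and its columns are hypothetical examples, not part of the role description:

```python
import sqlite3

# Hypothetical orders table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (region TEXT, amount REAL);
INSERT INTO orders VALUES
  ('east', 100.0), ('east', 250.0), ('west', 80.0), ('west', 40.0);
""")

# Window functions: rank orders by amount within each region,
# and compute each region's total alongside every row.
rows = conn.execute("""
SELECT region, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
       SUM(amount) OVER (PARTITION BY region) AS region_total
FROM orders
ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
# ('east', 250.0, 1, 350.0)
# ('east', 100.0, 2, 350.0)
# ('west', 80.0, 1, 120.0)
# ('west', 40.0, 2, 120.0)
```

The same partition/rank pattern carries over directly to Spark SQL and Snowflake, which is why window functions are a common screening topic for this stack.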