- Design, develop, and maintain Snowflake data warehouses and data lakes.
- Integrate and manage data from various sources using Azure Data Lake and Azure Data Factory.
- Implement and maintain data pipelines, ensuring efficient data flow and transformation.
- Develop and enforce data quality and metadata management standards.
- Collaborate with data analysts, data scientists, and other stakeholders to understand data requirements and deliver solutions.
- Optimize and tune Snowflake environments for performance and cost efficiency.
- Monitor and troubleshoot data pipelines and workflows to ensure reliability and accuracy.
- Stay current with the latest trends and best practices in Snowflake, Azure, and data management.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 7+ years of post-graduate experience in Data Analytics, Data Engineering, and/or Data Warehousing required.
- 5+ years of experience with scripting languages such as SQL, Python, PySpark, Spark SQL, or Scala.
- 2+ years of experience as a Snowflake Cloud Developer required, including strong expertise in data modeling, schema design, and query optimization.
- 2+ years of experience with Azure Data Lake and Azure Data Factory required.
- 2+ years of experience with metadata management, data marts, and data quality principles required.
- 2+ years of experience with CI/CD pipelines, using GitLab and automated processes for code deployment, required.
- Strong experience with ETL/ELT processes and tools.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.