Required Skills: Snowflake, AWS, Azure, GCP, Git, Apache Airflow, ETL, SQL, Python, Java, Scala
Job Description
Job Overview:
We are looking for a Data Engineer with expertise in Snowflake to join our team. You will design, build, and maintain scalable data pipelines and data architecture, working closely with cross-functional teams to enable data-driven decision-making across the organization.
Key Responsibilities:
- Design, develop, and maintain data pipelines and data warehouses using Snowflake.
- Work with various data sources, ensuring data quality, integrity, and performance.
- Collaborate with data scientists, analysts, and other teams to meet data needs and support analytical objectives.
- Develop and optimize SQL queries for data extraction, transformation, and loading (ETL).
- Implement best practices for data modeling and architecture in Snowflake.
- Monitor and troubleshoot data pipelines to ensure reliable performance and data flow.
- Ensure security, compliance, and governance of data within Snowflake.
- Participate in the creation and improvement of data infrastructure and tools.
Required Skills & Qualifications:
- Proven experience as a Data Engineer, Data Analyst, or in a similar role, with expertise in Snowflake.
- Strong proficiency in SQL and a solid understanding of Snowflake architecture.
- Experience in designing and developing ETL processes.
- Familiarity with data modeling, data warehousing, and cloud-based data platforms.
- Knowledge of programming languages such as Python, Java, or Scala is a plus.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
Preferred Skills:
- Familiarity with other cloud platforms (AWS, Azure, GCP).
- Experience with version control systems such as Git.
- Knowledge of orchestration tools like Apache Airflow.