- Must have 6+ years of experience in GCP
- Must have 5+ years of experience in PySpark
- Must have 5+ years of experience in Scala
- Experience working with Big Data technologies: Spark, Scala, and Python/Java
- Cloud experience: deploying containerized applications and building data pipelines on GCP Dataproc/Azure Databricks using PySpark
- Ownership of code quality and experience writing extensive unit tests
- Excellent written communication
- SQL coding