GCP Data Engineer
  • micro2MEGA, Inc.
Bentonville, AR; Sunnyvale, CA
8-13 Years
Required Skills: GCP, Dataproc, GCS, BigQuery, data engineering, Hadoop, Hive, Spark, Airflow, data lakes, RDBMS, Python, Java, Scala
Job Description
Job Title: GCP Data Engineer – Hybrid (Onsite from Day 1)
Locations: Bentonville, AR and Sunnyvale, CA
Duration: Long-Term Contract
Client: Retail
Passport Number: Mandatory for Submission
LinkedIn Profile: Mandatory
 
Job Description:
We are seeking experienced Senior Data Engineers with strong expertise in Google Cloud Platform (GCP) and Big Data technologies. The role involves designing and maintaining scalable data solutions in an agile environment.
 
Responsibilities:
Develop and optimize big data applications using open-source tools
Design logical and physical data models
Automate workflows with Apache Airflow
Build data pipelines with Hive, Spark, and Kafka
Monitor performance and handle on-call support
Lead stand-ups, mentor junior team members, and manage JIRA backlogs
 
Required Qualifications:
4+ years of hands-on GCP experience (Dataproc, GCS, BigQuery)
10–12 years of total experience in data engineering
6+ years with Hadoop, Hive, Spark, Airflow
5+ years in data modeling (Data Lakes & RDBMS)
Strong skills in Python, Java, and Scala
Familiarity with Perl and shell scripting
Experience with multi-TB/PB data sets
Knowledge of TDD and test automation
Excellent communication and leadership skills
Bachelor’s degree in Computer Science or equivalent
 
Preferred Skills:
Gitflow and version control
Atlassian tools – JIRA, Confluence, Bitbucket
CI/CD tools – Bamboo, Jenkins, or TFS

Please send your updated resume along with your passport number.
