Required Skills: GCP BigQuery, Python, SQL, data warehousing, cloud platforms, Teradata, ETL workflows, data transformation, data modeling, ELT pipelines, data governance
Job Description
Job Title: GCP Big Data Engineer
Work Location: Dallas, TX (Onsite)
Must Have Skills:
⦁ GCP BigQuery
⦁ Python
⦁ SQL
Detailed Job Description:
10+ years of experience in Data Engineering with strong expertise in designing, building, and optimizing large-scale data solutions.
Technical Skills:
⦁ Data Warehousing & Cloud Platforms:
⦁ Proficient in Teradata and Google Cloud Platform (GCP) services, including BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
⦁ Expert in SQL for complex query development and performance tuning.
⦁ Strong experience in Python scripting for automation, ETL workflows, and data transformation.
⦁ Thorough understanding of data modeling, ETL/ELT pipelines, data governance, and performance optimization.
Responsibilities:
⦁ Design and implement data pipelines on GCP.
⦁ Collaborate with stakeholders to gather requirements and deliver insightful dashboards.
⦁ Mentor and guide team members on data engineering best practices.
⦁ Ensure data security, quality, and compliance across all solutions.
Additional Qualifications:
⦁ Strong communication and leadership skills for stakeholder engagement.
⦁ Proven experience in mentoring teams and driving technical excellence.