Required Skills: GCP Data Architect
Job Description
Key Responsibilities:
- Design and architect scalable telemetry data storage and analytics systems using GCP services.
- Define and implement data architecture strategies for real-time and batch data processing.
- Ensure optimal performance, scalability, and cost-efficiency of data storage and processing solutions.
- Develop and enforce data governance policies and best practices.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Provide architectural guidance and mentorship to data engineering teams.
- Set up monitoring, alerting, and automated reporting systems to ensure data quality and system reliability.
- Evaluate and recommend new technologies and tools to enhance data architecture.
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in data architecture, big data technologies, and analytics.
- Strong expertise in Google Cloud Platform (GCP) services, including but not limited to BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Functions.
- Proficiency in event streaming platforms such as Apache Kafka or Azure Event Hubs.
- Experience with data pipeline orchestration tools like Apache Airflow or Google Cloud Composer.
- Strong programming skills in languages such as Python, Java, or Scala.
- Solid understanding of SQL and experience with database technologies.
- Knowledge of monitoring and logging tools such as Prometheus, Grafana, or Google Cloud Monitoring and Logging (formerly Stackdriver).
- Excellent problem-solving skills and the ability to work independently and as part of a team.
- Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders.
Preferred Skills:
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Experience with CI/CD pipelines and DevOps practices.
- Experience with relevant data architecture projects in a large retail company.