Senior Data Engineer with GCP
  • Fuge Technologies Inc
83 Days Ago
NA
NA
Los Angeles-CA, Atlanta-GA, Dallas-TX
8-18 Years
Required Skills: large-scale data pipelines, BigQuery, Dataflow, Pub/Sub, GCS, Hadoop, Hive, data processing frameworks, Pig, HBase, YARN, Python, Scala, SQL, CI/CD pipelines
Job Description
Position: Senior Data Engineer
Experience: 10+ years
Location: Texas, Georgia, California
Employment Type: W2
Key Responsibilities:
  • Design, develop, and maintain scalable batch and real-time data pipelines for large-scale distributed systems.
  • Lead the architecture and development of robust data solutions using Google Cloud Platform (GCP).
  • Work extensively with BigQuery, Dataflow, Pub/Sub, and Google Cloud Storage (GCS) for data processing and storage.
  • Utilize the Hadoop Big Data Ecosystem (HDFS, Hive, Pig, HBase, YARN) for data management and analytics.
  • Write efficient, scalable code using Python and Scala for data engineering and automation.
  • Perform SQL-based data manipulation, transformation, and analysis to support business insights.
  • Build and optimize batch and real-time streaming data pipelines.
  • Ensure CI/CD pipeline integration for seamless deployment and maintenance of data applications.
  • Troubleshoot performance bottlenecks and optimize data systems for efficiency.
  • Mentor and lead a team of engineers, driving innovation in data engineering best practices.
Requirements:
  • 10+ years of experience in designing and implementing large-scale data pipelines.
  • Proficiency in GCP tools like BigQuery, Dataflow, Pub/Sub, and GCS.
  • Strong hands-on experience with Hadoop, Hive, Pig, HBase, and YARN.
  • Expert in Python and Scala for scripting and data engineering.
  • Solid knowledge of SQL for data transformation and query optimization.
  • Experience with real-time and batch data processing frameworks.
  • Familiarity with CI/CD pipelines for continuous deployment and automation.
  • Excellent problem-solving skills to troubleshoot complex data system issues.
  • Leadership and mentoring experience in guiding engineering teams.
