Azure Databricks Engineer
76 Days Ago
55-58 per Hour
Required Skills: Databricks, Pyspark, Azure Cloud Services, Asset Bundle
Job Description
Role: Azure Databricks Engineer
Location: New York, NY (Onsite from day 1)
Must have: Databricks, Pyspark, Azure Cloud Services, Asset Bundle
Technical Skills:
- Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Table Triggers, Delta Live Pipelines, Databricks Runtime etc.)
- Proficiency in Azure Cloud Services.
- Solid understanding of Spark and PySpark for big data processing.
- Experience with relational databases.
- Knowledge of Databricks Asset Bundles and GitLab.
Key Responsibilities:
- Data Pipeline Development:
- Build and maintain scalable ETL/ELT pipelines using Databricks.
- Leverage PySpark/Spark and SQL to transform and process large datasets.
- Integrate data from multiple sources including Azure Blob Storage, ADLS and other relational/non-relational systems.
- Collaboration & Analysis:
- Work closely with multiple teams to prepare data for dashboards and BI tools.
- Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
- Performance & Optimization:
- Optimize Databricks workloads for cost efficiency and performance.
- Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
- Governance & Security:
- Implement and manage data security, access controls and governance standards using Unity Catalog.
- Ensure compliance with organizational and regulatory data policies.
- Deployment:
- Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks and configurations across environments.
- Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
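The Asset Bundle deployment responsibility above centers on a `databricks.yml` file that declares jobs and target environments. A minimal sketch follows; the bundle name, workspace host, and notebook path are placeholders, not values from the posting.

```yaml
# databricks.yml — minimal Asset Bundle sketch (names are hypothetical)
bundle:
  name: category_etl

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-0000000000000000.0.azuredatabricks.net

resources:
  jobs:
    daily_etl:
      name: daily_etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py
```

With a bundle like this, `databricks bundle deploy -t dev` pushes the job and notebook to the `dev` target, which is how the same artifacts get promoted across environments under version control.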
Preferred Experience:
- Familiarity with Databricks Runtimes and advanced configurations.
- Knowledge of streaming frameworks like Spark Streaming.
- Experience in developing real-time data solutions.
Certifications:
TekJobs All Rights Reserved