
Technical Lead Databricks Azure Data Lake

  • Posted on: Feb 04, 2025
  • iSoftTek Solutions Inc
  • Greenfield, Indiana
  • Salary: Not Available
  • Full-time


Job Title: Technical Lead Databricks Azure Data Lake

Job Type: Full-time

Job Location: Greenfield, Indiana, United States

Remote: No

Job Description:

We are looking for a highly experienced Data Engineering Specialist to join our team. The ideal candidate should have a strong background in cloud technologies, DevOps practices, and data engineering to support and enhance our RDAP initiatives.

Requirements

Key Responsibilities:

Databricks Lakehouse Solutions: Design, develop, and maintain Databricks-based solutions on cloud platforms such as Azure and GCP, integrating with services like Azure Synapse.

DevOps & CI/CD: Implement and manage CI/CD pipelines using tools like GitHub while ensuring best practices in test-driven development, code reviews, and branching strategies.
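
As an illustration of the kind of pipeline this responsibility covers, a CI workflow on GitHub might look like the minimal GitHub Actions sketch below; the job names, Python version, and Poetry-based install are assumptions, not details from the posting:

```yaml
name: ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install dependencies and run the test suite (test-driven development
      # as called for above); branch protection and code review are configured
      # in the repository settings rather than in this file.
      - run: pip install poetry && poetry install
      - run: poetry run pytest
```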

Python Development: Build, manage, and optimize Python packages using tools such as setuptools, Poetry, wheels, and artifact registries.
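
A package managed with Poetry along these lines might use a minimal pyproject.toml such as the following sketch (the package name, version, and description are hypothetical):

```toml
[tool.poetry]
name = "example-pipeline"        # hypothetical package name
version = "0.1.0"
description = "Illustrative package layout"

[tool.poetry.dependencies]
python = "^3.11"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

Running `poetry build` against such a file produces an sdist and a wheel, which can then be published to an artifact registry.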

Data Pipelines & Workflows: Develop and optimize workflows in Databricks (PySpark, Databricks Asset Bundles) for data ingestion, processing, and transformation.

Database Management: Work with SQL databases, including Unity Catalog, SQL Server, Hive, and Postgres.

Orchestration: Implement data orchestration solutions using tools like Databricks Workflows, Airflow, and Dagster.
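
At their core, orchestration tools like these run tasks in dependency order. The idea can be sketched with only the Python standard library (the task names below are hypothetical, and this is not Airflow or Databricks Workflows API code):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies;
# orchestrators resolve the same kind of graph before scheduling work.
dag = {
    "ingest": set(),
    "clean": {"ingest"},
    "transform": {"clean"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so that every task appears after
# all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # upstream tasks always come before their dependents
```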

Event-Driven Architecture: Manage event streaming solutions using Kafka, Azure Event Hub, and Google Cloud Pub/Sub.

Change Data Capture (CDC): Implement CDC strategies with tools like Debezium.
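
To illustrate the shape of this work, a simplified Debezium-style change event can be unpacked with nothing but the standard library (the event body and field names here are illustrative, not a complete Debezium schema):

```python
import json

# A simplified Debezium-style change event: "before"/"after" row images
# plus an "op" code. Real events also carry schema and source metadata.
event = json.loads("""
{
  "payload": {
    "op": "u",
    "before": {"id": 42, "status": "pending"},
    "after":  {"id": 42, "status": "shipped"}
  }
}
""")

OPS = {"c": "insert", "u": "update", "d": "delete", "r": "snapshot read"}

payload = event["payload"]
action = OPS[payload["op"]]
# Collect only the columns whose value actually changed.
changed = {
    k: (payload["before"].get(k), v)
    for k, v in payload["after"].items()
    if payload["before"] is None or payload["before"].get(k) != v
}
print(action, changed)  # update {'status': ('pending', 'shipped')}
```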

Data Migration: Design and execute data migration projects for Azure Synapse and Databricks Lakehouse.

Cloud Storage Management: Handle cloud storage solutions like Azure Data Lake Storage and Google Cloud Storage.

Identity & Access Management: Configure and manage Azure Active Directory (AD Groups, Service Principals, Managed Identities) for security and authentication.

---

Primary Skills (Must-Have):

Python Package Development (setuptools, Poetry, wheels, artifact registries)

Databricks & PySpark (Databricks Asset Bundles)

Open File Formats (Delta, Parquet, Iceberg, etc.)

SQL Databases (Unity Catalog, SQL Server, Hive, Postgres)

Orchestration Tools (Databricks Workflows, Airflow, Dagster)

Azure Data Lake Storage

Azure Active Directory (AD Groups, Service Principals, Managed Identities)

---

Secondary Skills (Good to Have):

Kafka, Azure Event Hub, Google Cloud Pub/Sub

Change Data Capture (Debezium)

Google Cloud Storage

---

Soft Skills & Leadership Responsibilities:

Communication Skills:

Ability to articulate complex technical concepts to both technical and non-technical stakeholders.

Strong documentation skills for process guidelines, technical workflows, and reports.

Problem-Solving & Analytical Thinking:

Strong troubleshooting skills and the ability to resolve issues effectively.

Analytical mindset to optimize data workflows and system performance.

Leadership & Collaboration:

Client Interactions: Understand business requirements, contribute to design discussions, and translate them into actionable deliverables.

Team Collaboration: Work closely with cross-functional teams across development, operations, and business units.

Stakeholder Engagement: Build and maintain strong relationships with internal and external stakeholders.

Benefits


Position Details

Posted: Feb 04, 2025

Employment: Full-time

Salary: Not Available

Snaprecruit ID: SD-WOR-b0a061b4f2a160dd17de3ea0ed0ad7f867305026d649b2cac4b6fac28ccfbcf2

City: Greenfield

Job Origin: WORKABLE_ORGANIC_FEED

