Databricks Engineer
Job Role : Databricks Engineer
Location : Maryland
Client : University of Maryland Global Campus
We are seeking a Databricks Engineer to design, build, and operate a Data & AI platform with a strong
foundation in the Medallion Architecture (raw/bronze, curated/silver, and mart/gold layers). This
platform will orchestrate complex data workflows and scalable ELT pipelines to integrate data from
enterprise systems such as PeopleSoft, D2L, and Salesforce, delivering high-quality, governed data
for machine learning, AI/BI, and analytics at scale.
You will play a critical role in engineering the infrastructure and workflows that enable seamless data
flow across the enterprise, ensure operational excellence, and provide the backbone for strategic
decision-making, predictive modeling, and innovation.
Responsibilities:
1. Data & AI Platform Engineering (Databricks-Centric):
- Design, implement, and optimize end-to-end data pipelines on Databricks, following the Medallion Architecture principles.
- Build robust and scalable ETL/ELT pipelines using Apache Spark and Delta Lake to transform raw (bronze) data into trusted curated (silver) and analytics-ready (gold) data layers (a brief sketch follows this list).
- Operationalize Databricks Workflows for orchestration, dependency management, and pipeline automation.
- Apply schema evolution and data versioning to support agile data development.
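For illustration, here is a minimal sketch of the bronze/silver/gold flow described above using PySpark and Delta Lake on Databricks; the table names, columns, and paths (e.g. bronze.enrollments, /mnt/landing/enrollments/) are hypothetical placeholders, not the university's actual data model.

```python
# A minimal, illustrative sketch only; table names, columns, and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already available as `spark` in Databricks notebooks

# Bronze: land raw source data as-is, keeping load metadata for auditability.
raw = (spark.read.format("json").load("/mnt/landing/enrollments/")
       .withColumn("_ingested_at", F.current_timestamp()))
raw.write.format("delta").mode("append").saveAsTable("bronze.enrollments")

# Silver: clean, deduplicate, and enforce basic integrity rules.
silver = (spark.table("bronze.enrollments")
          .filter(F.col("student_id").isNotNull())
          .dropDuplicates(["student_id", "course_id"]))
(silver.write.format("delta").mode("overwrite")
 .option("mergeSchema", "true")  # tolerate additive schema evolution from sources
 .saveAsTable("silver.enrollments"))

# Gold: aggregate into an analytics-ready mart table for BI and ML features.
gold = (spark.table("silver.enrollments")
        .groupBy("course_id")
        .agg(F.countDistinct("student_id").alias("enrolled_students")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.course_enrollment_summary")
```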
2. Platform Integration & Data Ingestion:
- Connect and ingest data from enterprise systems such as PeopleSoft, D2L, and Salesforce using APIs, JDBC, or other integration frameworks.
- Implement connectors and ingestion frameworks that accommodate structured, semi-structured, and unstructured data.
- Design standardized data ingestion processes with automated error handling, retries, and alerting (illustrated below).
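Below is a hedged sketch of what a standardized JDBC ingestion step with retries and alerting could look like; the connection URL, source and target table names, and the notify() hook are assumptions for illustration, not a prescribed integration pattern for PeopleSoft, D2L, or Salesforce.

```python
# A hedged sketch of one standardized ingestion step; URLs, table names, and
# the notify() hook are assumptions for illustration only.
import time

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def notify(message: str) -> None:
    # Placeholder alerting hook; in practice this could post to email, Slack, or a ticketing system.
    print(f"[ALERT] {message}")

def ingest_jdbc_table(jdbc_url: str, source_table: str, target_table: str,
                      max_retries: int = 3, backoff_seconds: int = 30) -> None:
    """Read one source table over JDBC and append it to a bronze Delta table."""
    for attempt in range(1, max_retries + 1):
        try:
            df = (spark.read.format("jdbc")
                  .option("url", jdbc_url)            # credentials would come from Databricks secrets
                  .option("dbtable", source_table)
                  .option("fetchsize", "10000")
                  .load())
            df.write.format("delta").mode("append").saveAsTable(target_table)
            return
        except Exception as exc:
            if attempt == max_retries:
                notify(f"Ingestion of {source_table} failed after {attempt} attempts: {exc}")
                raise
            time.sleep(backoff_seconds * attempt)     # simple linear backoff before retrying

# Example call with a hypothetical PeopleSoft source table and bronze target.
ingest_jdbc_table("jdbc:oracle:thin:@//example-host:1521/PSFT",
                  "PS_STDNT_ENRL", "bronze.peoplesoft_enrollments")
```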
3. Data Quality, Monitoring, and Governance:
- Develop data quality checks, validation rules, and anomaly detection mechanisms to ensure data integrity across all layers (sketched below).
- Integrate monitoring and observability tools (e.g., Databricks metrics, Grafana) to track ETL performance, latency, and failures.
- Implement Unity Catalog or equivalent tools for centralized metadata management, data lineage, and governance policy enforcement.
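As one possible shape for the quality checks mentioned above, the following sketch computes a few basic integrity metrics on a silver table and fails the run when rules are violated; the table, key columns, and rules are illustrative, and a production setup might instead rely on Delta Live Tables expectations or a dedicated data quality framework.

```python
# An illustrative sketch; the table, key columns, and rules are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def run_quality_checks(table_name: str) -> dict:
    """Compute a few basic integrity metrics for a silver table."""
    df = spark.table(table_name)
    total = df.count()
    return {
        "row_count": total,
        "null_student_ids": df.filter(F.col("student_id").isNull()).count(),
        "duplicate_keys": total - df.dropDuplicates(["student_id", "course_id"]).count(),
    }

metrics = run_quality_checks("silver.enrollments")
# Failing the task lets Databricks Workflows surface the error and trigger alerting.
if metrics["null_student_ids"] > 0 or metrics["duplicate_keys"] > 0:
    raise ValueError(f"Data quality check failed: {metrics}")
```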
4. Security, Privacy, and Compliance:
- Enforce data security best practices including row-level security, encryption at rest/in transit, and fine-grained access control via Unity Catalog.
- Design and implement data masking, tokenization, and anonymization for compliance with privacy regulations (e.g., GDPR, FERPA); an example sketch follows this list.
- Work with security teams to audit and certify compliance controls.
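The sketch below illustrates two of the controls listed above: pseudonymizing an identifier before it reaches a gold table, and granting group read access through Unity Catalog. The table, column, and group names are hypothetical, and hashing is only one of several masking/tokenization options.

```python
# An illustrative sketch; table, column, and group names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Pseudonymize a direct identifier with a one-way hash before publishing to gold,
# so downstream users can join on the key without seeing the raw value.
masked = (spark.table("silver.enrollments")
          .withColumn("student_key", F.sha2(F.col("student_id").cast("string"), 256))
          .drop("student_id"))
masked.write.format("delta").mode("overwrite").saveAsTable("gold.enrollments_masked")

# Fine-grained access control via Unity Catalog: grant read access to an analyst group.
spark.sql("GRANT SELECT ON TABLE gold.enrollments_masked TO `institutional_research`")
```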

