Data Architect
Job Description
Robert Half is seeking an experienced Data Architect to design and lead scalable, secure, and high-performing enterprise data solutions. This role will focus on building next-generation cloud data platforms, driving adoption of modern analytics technologies, and ensuring alignment with governance and security standards.
You’ll serve as a hands-on technical leader, partnering closely with engineering, analytics, and business teams to architect data platforms that enable advanced analytics and AI/ML initiatives. This position blends deep technical expertise with strategic thinking to help unlock the value of data across the organization.
Key Responsibilities:
- Design and implement end-to-end data architecture for big data and advanced analytics platforms.
- Architect and build Delta Lake–based lakehouse environments from the ground up, including Delta Live Tables (DLT) pipelines, PySpark jobs, workflows, Unity Catalog, and the Medallion architecture.
- Develop scalable data models that meet performance, security, and governance requirements.
- Configure and optimize clusters, notebooks, and workflows to support ETL/ELT pipelines.
- Integrate cloud data platforms with supporting services such as data storage, orchestration, secrets management, and analytics tools.
- Establish and enforce best practices for data governance, security, and cost optimization.
- Collaborate with data engineers, analysts, and stakeholders to translate business requirements into technical solutions.
- Provide technical leadership and mentorship to team members.
- Monitor, troubleshoot, and optimize data pipelines to ensure reliability and efficiency.
- Ensure compliance with organizational and regulatory standards related to data privacy and security.
- Create and maintain documentation for architecture, processes, and governance standards.
Required Qualifications:
- 7+ years of experience in data engineering and data modeling, including at least 2 years in an architecture or senior technical leadership role.
- Proven experience designing enterprise cloud data platforms (Azure).
- Hands-on experience with Databricks (or comparable cloud-based data engineering platforms) for building, orchestrating, and optimizing data pipelines.
- Hands-on experience building lakehouse architectures using Delta Lake, Delta Live Tables, PySpark, workflows, Unity Catalog, and Medallion design patterns.
Preferred Skills and Experience:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.
- Strong understanding of data security principles and best practices.
- Experience with platform monitoring and performance tuning.
- Solid knowledge of DevOps and Infrastructure-as-Code practices, including CI/CD pipelines and automated deployments.
- Strong analytical and problem-solving skills with the ability to resolve complex technical issues.
- Experience creating clear technical documentation and operating standards.
- Relevant cloud or data engineering certifications.
- Excellent organizational and communication skills, with the ability to work independently while managing multiple priorities.
- Comfortable working in a fast-paced, collaborative environment.