Data Engineer / Governance (Data Integration/Pipeline Development with Azure Data Factory, Databricks, Python)
Work Location: St. Louis, MO
Schedule: On-site Monday-Thursday; Fridays remote (4 days in the office). Candidates relocating to St. Louis must start on-site from Day 1.
Term: 1 Year
Rate: $75/hr on W2
Required:
- SQL skills
- Demonstrated ability to troubleshoot and resolve technical and data-related issues

Desired:
- Data modeling, data architecture, data analysis, and data management
- Master data management and solution configuration, preferably Profisee
- Alation, Collibra, and Purview
- Developing and interacting with APIs
RESPONSIBILITIES:
- Design, construct, deploy, test, and support solutions and applications using the MDM platform and related technologies, including Python, Azure Data Factory, Databricks, and APIs.
- Serve as the technical subject matter expert (SME) for the MDM platform, overseeing its integrations and data flows to ensure optimal performance and reliability.
- Develop and maintain data pipelines in a cloud environment, preferably Azure, utilizing cloud-based data environments like Azure Data Lake for storage and management.
- Design, construct, deploy, test, and support data storage in big data constructs, relational databases, and other database technologies to meet data governance needs.
- Assess and evaluate system designs to ensure adherence to architecture and design best practices, optimizing for efficiency and scalability.
- Translate and execute use cases into effective data stewardship models, ensuring compliance with all applicable Nestlé standards and regulatory requirements.
- Monitor resources and troubleshoot technical and data-related issues, providing timely resolution to ensure uninterrupted data management operations.
- Communicate issues, risks, concerns, and status to management promptly, ensuring transparency and effective risk management.
- Collaborate with various teams to align data governance processes with business needs and objectives, contributing to team goals through iterative and agile development practices.
- Develop and interact with APIs, performing data mapping of incoming data sources and configuring MDM and Data Catalog tools accordingly.
- Connect and integrate data sources with MDM and data catalog tools, ensuring accurate and reliable data integration and management.
BASIC QUALIFICATIONS (Minimum):
Education: A.S. Degree or equivalent experience in a related field
REQUIRED SKILLS:
- Experience with cloud-based data environments, such as Azure Data Lake.
- Experience in building, deploying, and maintaining data pipelines in a cloud environment, preferably Azure.
- Experience with Databricks for enhancing capabilities in data processing, analysis, and automation tasks.
DESIRED SKILLS:
- Extensive experience in data modeling, data architecture, data analysis, and data management.
- Moderate to expert-level SQL skills.
- Demonstrated ability to troubleshoot and resolve technical and data-related issues.
- Excellent interpersonal and communication skills, both verbal and written.
- Ability to create and convey detailed and accurate requirements to stakeholders, partners, and management.
- Experience with master data management and solution configuration, preferably Profisee.
- Experience with related technologies such as Alation, Collibra, and Purview is a plus.
- Proficiency in developing and interacting with APIs.
- Familiarity with data quality concepts and tools to ensure accurate and reliable data integration and management.