Data Engineer
Job Title: HVR Data Engineer
Location: Remote but must be located in one of these states (AL, AK, AR, AZ, DE, FL, GA, IA, ID, IL, IN, LA, KS, KY, ME, MI, MO, MS, MT, NC, ND, NE, NH, NM, NV, OH, OK, PA, SC, SD, TN, TX, UT, VA, WI, WV, WY)
Rate: Only W2
Department: IT/Data Engineering
Reports To: Data Engineering Manager
Job Summary:
We are seeking a skilled HVR Data Engineer to join our dynamic healthcare team. In this role, you will leverage your expertise in real-time data replication, Azure, and Epic systems to design, implement, and maintain data integration and transformation solutions that support our healthcare operations. Your work will ensure seamless data flow, improve data quality, and enable actionable insights that drive better patient care and operational efficiency.
- Top requirements:
- HVR (real-time data replication)
- Python
- SQL
- Azure Cloud Experience
- Snowflake
- Healthcare experience (a plus, not required)
- Are any of them flexible?
- Day to Day Responsibilities/project specifics:
- Additional selling points/info?
- The HVR engineer will focus on real-time data replication into Snowflake.
- The engineers will work with CSV, JSON, and YAML files, mostly coming from Epic, though there will be other data sources as well.
- The primary focus for the engineers will be data ingestion and transformation, with the overarching goal of automating the processes using Python scripts/code.
- Siva wants the engineers to use Python for these tasks, leveraging Python libraries to streamline the process.
- D2D:
- Pivot and unpivot data
- Parse unstructured data arising from source changes and vendor upgrades
- Use Python to automate tasks: transform and filter large datasets and extract changes
- The Python goal is to automate data ingestion and data transformation (a sketch follows this list)
- The HVR engineer will establish real-time data replication
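As a rough illustration of the kind of Python automation described above, the sketch below reads CSV, JSON, or YAML extracts into pandas, filters a large dataset down to changed rows, and pivots/unpivots the result. File names, column names (patient_id, change_flag), and the filter logic are hypothetical placeholders for illustration, not details taken from the actual Epic feeds.

```python
# Minimal sketch of the ingestion/transformation loop described above.
# All file names and column names are hypothetical placeholders.
import json

import pandas as pd
import yaml  # PyYAML


def load_source(path: str) -> pd.DataFrame:
    """Read a CSV, JSON, or YAML extract into a DataFrame."""
    if path.endswith(".csv"):
        return pd.read_csv(path)
    if path.endswith(".json"):
        with open(path) as f:
            return pd.json_normalize(json.load(f))
    if path.endswith((".yml", ".yaml")):
        with open(path) as f:
            return pd.json_normalize(yaml.safe_load(f))
    raise ValueError(f"Unsupported file type: {path}")


def unpivot(df: pd.DataFrame, id_cols: list[str]) -> pd.DataFrame:
    """Unpivot (melt) wide measure columns into long rows."""
    return df.melt(id_vars=id_cols, var_name="measure", value_name="value")


def pivot(df: pd.DataFrame, index_col: str) -> pd.DataFrame:
    """Pivot long rows back to a wide layout keyed on index_col."""
    return (df.pivot_table(index=index_col, columns="measure",
                           values="value", aggfunc="first")
              .reset_index())


if __name__ == "__main__":
    # Hypothetical Epic extract; keep only rows flagged as changed.
    df = load_source("epic_extract.csv")
    changed = df[df["change_flag"] == "Y"]
    long_form = unpivot(changed, id_cols=["patient_id"])
    print(long_form.head())
```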
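The replication target is Snowflake, and HVR itself is configured through its own channels and locations rather than Python code. Purely as a complementary illustration of how a transformed DataFrame from the sketch above might be pushed into Snowflake for ad-hoc validation, here is a minimal sketch using the snowflake-connector-python write_pandas helper; all connection parameters and the EPIC_CHANGES table name are hypothetical.

```python
# Illustrative only: the continuously replicated tables would be created and
# maintained by HVR; this just shows a one-off pandas-to-Snowflake load.
# All connection parameters and table names are hypothetical placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Stand-in for the transformed DataFrame produced by the ingestion sketch above.
long_form = pd.DataFrame(
    {"patient_id": [1], "measure": ["weight"], "value": [80]}
)

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical
    user="etl_user",           # hypothetical
    password="***",            # use a secrets manager in practice
    warehouse="ETL_WH",
    database="HEALTHCARE_DB",
    schema="STAGING",
)

# auto_create_table requires a reasonably recent connector version.
success, n_chunks, n_rows, _ = write_pandas(
    conn, long_form, table_name="EPIC_CHANGES", auto_create_table=True
)
print(f"Loaded {n_rows} rows into EPIC_CHANGES (success={success})")
conn.close()
```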