Data Engineer
Position: Data Engineer
Location: St. Louis, MO/Remote
Duration: 1 Year
Job Description
Experience Required: 10+ years, with at least 7 years as a Data Engineer
Competencies:
Matillion - REQUIRED; candidates without hands-on experience will be rejected
Digital: Amazon Web Services (AWS) Cloud Computing
Digital: Snowflake
Role Summary
We are seeking a Data Engineer (ELT-oriented) to design, build, and support cloud data pipelines with a primary focus on Matillion and Snowflake. The engineer will own ingestion, transformation, orchestration, quality, and performance for high-value data products, collaborating closely with product, analytics, and platform teams.
Technical/Functional Skills
8-12 years of total experience, with 5+ years in ETL/ELT engineering using industry tools; 1+ year hands-on with Matillion (or a strong equivalent with rapid ramp-up)
2+ years building and operating Snowflake (warehouse objects, stages, tasks, streams, performance tuning)
Advanced SQL: stored procedures, functions, triggers, and error handling; proven query tuning on large datasets
Demonstrated experience implementing CDC patterns (batch/near-real-time)
Experience with AWS components in data pipelines (Lambda, storage, and related services) is a plus
Strong problem-solving skills, excellent verbal and written communication, and the ability to multitask in a team-oriented environment
Demonstrated proficiency developing and supporting extract, transform, and load (ETL) and extract, load, transform (ELT) processes
Experience building self-service reporting solutions using business intelligence software (e.g., OBIEE, Looker, Power BI, etc.)
Ability to maintain documentation on ELT processes and procedures
Experience conducting project-specific data analysis, including analyzing and mapping required data and supporting all facets of the ELT process
Ability to assist with the creation and maintenance of ELT specifications (e.g., source-to-target maps)
Roles & Responsibilities
Design & Build ELT Pipelines: Develop end-to-end ELT pipelines that extract from APIs, RDBMS, files, and third-party sources, and load into Snowflake/data lake targets; operationalize using Matillion (or equivalent).
Modeling & Curation: Create and maintain schemas/data models (staging, curated, data marts); document source-to-target mappings, lineage, and standards.
CDC & Incremental Loads: Implement Change Data Capture (CDC) strategies (Snowflake features/Matillion patterns/AWS services) to minimize latency and cost.
Data Quality & Governance: Embed validation rules, reconciliation checks, exception handling, and error trapping; ensure security and compliance (RBAC/masking where applicable).
Performance & Cost Optimization: Tune SQL/warehouse sizing, job parallelism, partitioning, and caching to meet SLAs with predictable spend.
Ops & Reliability: Configure schedules/monitors; own incident triage and the on-call rotation for ELT runs; provide root-cause analysis and preventive fixes.
Documentation & Knowledge Transfer: Maintain runbooks, SOPs, and design notes; deliver knowledge transfer (KT) sessions to peers and cross-functional stakeholders.
Stakeholder Management: Present progress and risks to project and business teams; align deliverables to priorities and release timelines.