DATA ENGINEER
Job Description
Our client is looking for a Data Engineer to overhaul and expand the systems that power analytics and internal operations. This role centers on upgrading legacy workflows, strengthening reliability, and building scalable pipelines.
Key Responsibilities
- Develop and maintain data ingestion pipelines feeding into BigQuery
- Replace outdated ETL/ELT processes with modern solutions
- Set up monitoring, alerting, and workflow visibility
- Contribute to orchestration design and implementation (e.g., Airflow, Dagster, or Prefect)
- Support dbt development and CI/CD practices
- Tune BigQuery performance and manage access/security controls
- Maintain documentation, data flow diagrams, and catalogs
Required Experience
- 3+ years in data engineering
- Strong SQL skills with BigQuery, Snowflake, or Redshift
- Hands-on work with orchestration tools
- Familiarity with GCP services (Cloud SQL, Pub/Sub, Cloud Storage)
- Experience with dbt, Docker, and CI/CD pipelines
- Ingestion tooling experience (e.g., Fivetran)
- Knowledge of data governance or SOC 2 standards
Tech Stack Snapshot
BigQuery, dbt/dbt Cloud, Fivetran, Datastream, GCP (Pub/Sub, Cloud Storage, Cloud SQL), Databricks, Omni, Amplitude, Hightouch, Datadog
Compensation
Starting around $130K OTE (on-target earnings), adjusted for experience and location.
Location
U.S. remote (limited state availability).

