
Data Engineer

  • Posted on: Mar 15, 2026
  • TechDoQuest
  • Renton, United States
  • Salary: Not Available
  • Full-time

Job Title:

Data Engineer

Job Type:

Full-time

Job Location:

Renton, United States

Remote:

No

Job Description:

Develop and maintain data ingestion pipelines for service and repair data using Confluent Kafka for event streaming. Implement connectors and integrations between Kafka, AWS S3, Google Dataflow, and Snowflake to facilitate batch and real-time data flows. Work with APIs and Apigee to securely ingest and distribute data across internal and external systems, including dealer networks.

Data Cleansing & Transformation

  • Build and optimize data cleansing, normalization, and transformation pipelines in Google Dataflow for real-time processing.
  • Design and implement batch transformation jobs within Snowflake, building and maintaining the Operational Data Store (ODS).
  • Ensure data quality, consistency, and integrity across all processing stages.

Data Publishing & Reporting Support

  • Publish transformed and aggregated data to internal and external dashboards using APIs, Kafka topics, and Tableau.
  • Collaborate with data analysts and business stakeholders to support reporting and analytics requirements.
  • Monitor and troubleshoot data pipelines to ensure high availability and performance.
  • Partner with data architects, analysts, and external dealer teams to understand data requirements and source systems.
  • Document data workflows, processing logic, and integration specifications.
  • Adhere to best practices in data security, governance, and compliance.

Required Technologies & Skills

  • Cloud Storage & Data Warehousing: AWS S3, Snowflake
  • Data Processing: Google Dataflow
  • Batch & Real-Time Pipeline Development
  • Data Visualization Support: Tableau (basic understanding for data publishing)
  • Experience building Operational Data Stores (ODS) and data transformation pipelines in Snowflake
  • Familiarity with truck industry aftersales or automotive service and repair data is a plus

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 3+ years of proven experience in data engineering, especially with streaming and batch data pipelines.
  • Hands-on experience with the Kafka ecosystem (Confluent Kafka, connectors) and cloud data platforms (Snowflake, AWS).
  • Skilled in Python programming for data processing and automation.
  • Experience with Google Cloud Platform services, especially Google Dataflow, is highly desirable.
  • Strong understanding of data modeling, ETL/ELT processes, and data quality principles.
  • Ability to work collaboratively in cross-functional teams and communicate technical concepts effectively.


Position Details

Posted:

Mar 15, 2026

Reference Number:

14660_217781118C2C7114D4575AD484CF4EC0

Employment:

Full-time

Salary:

Not Available

City:

Renton

