
Data Engineer With Project Management

  • Posted on: Mar 25, 2026
  • Optimuss Inc
  • DC metro area, Washington
  • Salary: Not Available
  • Full-time

Data Engineer With Project Management

  • Job Title: Data Engineer With Project Management
  • Job Type: Full-time
  • Job Location: DC metro area, Washington, United States
  • Remote: No

Job Description:

About the Role

We are seeking a highly skilled Data Engineer with demonstrated project management capabilities to join our growing data and analytics team in the Washington, DC metropolitan area. This role bridges technical data engineering with cross-functional coordination, making it ideal for professionals who thrive at the intersection of technology and delivery leadership. You will architect and build enterprise-scale data infrastructure while concurrently managing project timelines, stakeholder expectations, and team deliverables in a dynamic, mission-focused environment.

Key Responsibilities

Data Engineering

  • Design, develop, and maintain scalable data pipelines using Apache Spark, Airflow, dbt, and cloud-native services (AWS, Azure, or GCP)
  • Build and optimize data warehouse and lakehouse architectures (Redshift, Snowflake, BigQuery, Databricks Delta Lake)
  • Implement ELT/ETL processes to ingest structured and unstructured data from diverse federal, commercial, and third-party sources
  • Develop and enforce data quality frameworks, validation rules, and monitoring alerts using tools such as Great Expectations or dbt tests
  • Collaborate with data scientists and analysts to productionize ML models and analytical datasets
  • Design and maintain data models (star/snowflake schemas, OBT) in alignment with business and reporting needs
  • Ensure data platform security, access control, and compliance with federal regulations (FedRAMP, FISMA, NIST) where applicable
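The data-quality bullet above (validation rules and monitoring alerts) can be sketched in plain Python. This is an illustrative, framework-free example of the kind of batch check such tools automate; the function name `validate_rows` and the `amount` field are assumptions for the sketch, not part of Great Expectations or dbt.

```python
def validate_rows(rows, required_fields, min_amount=0):
    """Split a batch of ingested records into valid rows and errors.

    Checks two hypothetical rules: required fields must be present and
    non-null, and `amount` must not fall below `min_amount`.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        # Null/missing-field check (analogous to a not_null test)
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
        # Range check (analogous to an accepted-range expectation)
        elif row["amount"] < min_amount:
            errors.append((i, f"amount below {min_amount}"))
        else:
            valid.append(row)
    return valid, errors

batch = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": -5.0},   # fails the range check
    {"id": 3, "amount": None},   # fails the null check
]
valid, errors = validate_rows(batch, required_fields=["id", "amount"])
```

In a production pipeline, the `errors` list would typically feed a monitoring alert or quarantine table rather than being silently dropped.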

Project Management

  • Lead end-to-end delivery of data engineering projects, from scoping and requirements gathering through deployment and post-launch support
  • Develop and maintain project plans, roadmaps, risk registers, and status reports for executive and stakeholder audiences
  • Facilitate Agile ceremonies (sprint planning, standups, retrospectives) and manage backlog prioritization in Jira or Azure DevOps
  • Coordinate cross-functional teams including data scientists, analysts, DevOps engineers, and business stakeholders
  • Proactively identify and mitigate technical and schedule risks; escalate blockers with proposed solutions
  • Manage vendor and contractor relationships, SOWs, and deliverable acceptance criteria
  • Track and report on project KPIs including velocity, burn rate, and milestone completion
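The KPI-tracking bullet above can be made concrete with a small sketch. The formulas and field names here are common conventions, not a mandated method: velocity as average story points completed per sprint, and burn rate as budget consumed relative to schedule elapsed.

```python
def sprint_velocity(completed_points):
    """Average story points completed per sprint."""
    return sum(completed_points) / len(completed_points)

def burn_rate(spent, budget, elapsed_weeks, total_weeks):
    """Fraction of budget consumed divided by fraction of schedule elapsed.

    A ratio above 1.0 means spending is running ahead of schedule,
    a common early-warning signal on a risk register.
    """
    return (spent / budget) / (elapsed_weeks / total_weeks)

velocity = sprint_velocity([21, 18, 24])
ratio = burn_rate(spent=60_000, budget=120_000,
                  elapsed_weeks=10, total_weeks=25)
```

A status report might surface both numbers per sprint, with the burn-rate ratio flagged whenever it drifts past an agreed threshold.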

Required Qualifications

  • 3 to 5+ years of experience in data engineering roles with progressively increasing responsibility
  • 2+ years of project or program management experience in a technical environment
  • Proficiency in SQL and at least one scripting/programming language (Python or Scala strongly preferred)
  • Hands-on experience with cloud platforms: AWS (Glue, Redshift, S3, Lambda), Azure (ADF, Synapse, ADLS), or GCP (Dataflow, BigQuery, Cloud Composer)
  • Experience with workflow orchestration tools (Apache Airflow, Prefect, or Dagster)
  • Solid understanding of data modeling principles, data warehousing, and lakehouse paradigms
  • Demonstrated experience managing Agile/Scrum delivery using Jira, Azure DevOps, or similar tools
  • Excellent written and verbal communication skills with the ability to present technical concepts to non-technical audiences
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field

Preferred Qualifications

  • Experience supporting federal government clients or working in a cleared environment (clearance a plus)
  • Familiarity with data governance frameworks, data cataloging tools (Collibra, Alation, Apache Atlas), and lineage tracking
  • PMP, PMI-ACP, or Scrum Master (CSM/PSM) certification
  • AWS Certified Data Analytics, Azure Data Engineer Associate, or equivalent cloud certification
  • Experience with real-time streaming data (Apache Kafka, Kinesis, or Pub/Sub)
  • Knowledge of CI/CD pipelines and infrastructure-as-code (Terraform, CloudFormation) for data platform deployments
  • Master's degree in a quantitative or technical discipline

Technical Stack

  • Languages: Python, SQL, Scala, Bash
  • Cloud Platforms: AWS, Azure, GCP
  • Data Warehouses: Snowflake, Redshift, BigQuery, Synapse
  • Orchestration: Apache Airflow, dbt, Prefect
  • Streaming: Apache Kafka, AWS Kinesis
  • BI & Visualization: Power BI, Tableau, Looker
  • DevOps / Infra: Terraform, Docker, Kubernetes, GitHub Actions
  • PM Tools: Jira, Confluence, Azure DevOps, MS Project

Position Details

  • Posted: Mar 25, 2026
  • Reference Number: 677-40829
  • Employment: Full-time
  • Salary: Not Available
  • City: DC metro area
  • Job Origin: CIEPAL_ORGANIC_FEED

