
Data Engineer Remote

  • Posted on: Feb 25, 2026
  • American IT Systems
  • San Jose, California
  • Salary: Not Available
  • CTC

Job Title: Data Engineer Remote
Job Type: CTC
Job Location: San Jose, California, United States
Remote: No

Job Description:

Data Engineer

Duration: 11 months

Location: US (PST), remote

Preferred domain: Finance, Banking

Role Overview:

We are seeking a highly skilled Data Engineer with strong experience in Snowflake and Python to support the design, modernization, and optimization of compliance data platforms.

This role will focus on building scalable data models, streamlining legacy-to-target data pipelines, and implementing robust ETL workflows within Snowflake. The ideal candidate brings a strong understanding of data architecture principles, regulatory/compliance data requirements, and hands-on engineering capabilities.

Key Responsibilities

Data Modeling & Architecture (Compliance Focus)

  • Design and implement scalable data models to support compliance, regulatory, and risk reporting needs
  • Develop logical and physical data models within Snowflake (a DDL sketch follows this list)
  • Define data domains, lineage, and element-level mapping across systems
  • Ensure data consistency, traceability, and auditability
  • Align data architecture with enterprise governance standards
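
As a rough illustration of this kind of modeling work, the Snowflake DDL below sketches a minimal compliance star schema with lineage and effective-dating columns. All schema, table, and column names (compliance, dim_account, fact_transaction, and so on) are invented for illustration, not taken from this posting.

    -- Minimal star schema sketch; every name here is hypothetical.
    CREATE SCHEMA IF NOT EXISTS compliance;

    -- Dimension: one row per account version, keyed by a surrogate key.
    CREATE TABLE IF NOT EXISTS compliance.dim_account (
        account_sk     NUMBER AUTOINCREMENT PRIMARY KEY,
        account_id     VARCHAR NOT NULL,         -- natural key from the source system
        account_name   VARCHAR,
        risk_rating    VARCHAR,
        effective_from TIMESTAMP_NTZ NOT NULL,   -- effective dating for auditability
        effective_to   TIMESTAMP_NTZ
    );

    -- Fact: one row per transaction, with lineage columns for traceability.
    CREATE TABLE IF NOT EXISTS compliance.fact_transaction (
        transaction_id VARCHAR NOT NULL,
        account_sk     NUMBER NOT NULL REFERENCES compliance.dim_account (account_sk),
        txn_ts         TIMESTAMP_NTZ NOT NULL,
        amount         NUMBER(18, 2),
        source_system  VARCHAR,                  -- lineage: originating system
        load_ts        TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
    );

Note that Snowflake records but does not enforce PRIMARY KEY and REFERENCES constraints, so the pipeline itself must guarantee key integrity.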

Current State & Target State Analysis

  • Perform detailed analysis of existing data pipelines at the data element level
  • Document source-to-target mappings and transformation logic (see the mapping-table sketch after this list)
  • Identify inefficiencies, redundancies, and data quality gaps
  • Design streamlined target-state architecture and optimized data flows
  • Support data rationalization and consolidation initiatives
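
One lightweight way to keep element-level mappings queryable, rather than buried in spreadsheets, is to store them as a table; a minimal sketch follows, with every name invented for illustration.

    -- Hypothetical element-level source-to-target mapping register.
    CREATE SCHEMA IF NOT EXISTS governance;

    CREATE TABLE IF NOT EXISTS governance.source_to_target_map (
        source_system  VARCHAR NOT NULL,
        source_table   VARCHAR NOT NULL,
        source_column  VARCHAR NOT NULL,
        target_table   VARCHAR NOT NULL,
        target_column  VARCHAR NOT NULL,
        transform_rule VARCHAR,   -- e.g. 'TRIM(value)' or 'direct copy'
        notes          VARCHAR
    );

    -- Example entry: a legacy column renamed and trimmed during consolidation.
    INSERT INTO governance.source_to_target_map VALUES
        ('legacy_crm', 'ACCTS', 'ACCT_NM',
         'compliance.dim_account', 'account_name',
         'TRIM(value)', 'renamed during consolidation');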

ETL / ELT Engineering (Hands-On)

  • Develop and optimize ETL/ELT pipelines using Python and Snowflake
  • Implement transformation logic using Snowflake SQL and Python-based frameworks
  • Build scalable ingestion and transformation workflows
  • Optimize performance and cost within Snowflake
  • Implement incremental loading and validation strategies (a MERGE sketch follows this list)
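
As one possible shape for such a pipeline, the sketch below performs a watermark-based incremental upsert from a staging table into the hypothetical fact table sketched earlier; staging.transactions_raw and all column names are likewise invented.

    -- Incremental upsert: merge only rows newer than the target's high-water mark.
    MERGE INTO compliance.fact_transaction AS tgt
    USING (
        SELECT transaction_id, account_sk, txn_ts, amount, source_system, load_ts
        FROM staging.transactions_raw
        WHERE load_ts > (SELECT COALESCE(MAX(load_ts), '1900-01-01'::TIMESTAMP_NTZ)
                         FROM compliance.fact_transaction)
    ) AS src
        ON tgt.transaction_id = src.transaction_id
    WHEN MATCHED THEN UPDATE SET
        amount  = src.amount,
        load_ts = src.load_ts
    WHEN NOT MATCHED THEN INSERT
        (transaction_id, account_sk, txn_ts, amount, source_system, load_ts)
        VALUES (src.transaction_id, src.account_sk, src.txn_ts,
                src.amount, src.source_system, src.load_ts);

A MERGE keyed on the natural identifier keeps reruns idempotent, which matters when a failed batch has to be replayed.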

Data Quality & Controls

  • Implement validation checks and reconciliation processes (see the reconciliation sketch after this list)
  • Ensure regulatory-grade accuracy and completeness
  • Support audit and compliance requirements
  • Maintain documentation for data lineage and transformations
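
A reconciliation control can be as simple as comparing row counts and totals between source and target (here over the full tables; in practice scoped to a load window) and flagging the run on mismatch. A minimal sketch using the same invented names:

    -- Row-count and amount reconciliation between staging and target.
    -- A FALSE in the reconciled column signals dropped or altered records.
    SELECT
        src.row_cnt      AS source_rows,
        tgt.row_cnt      AS target_rows,
        src.amount_total AS source_amount,
        tgt.amount_total AS target_amount,
        (src.row_cnt = tgt.row_cnt
         AND src.amount_total = tgt.amount_total) AS reconciled
    FROM (SELECT COUNT(*) AS row_cnt, SUM(amount) AS amount_total
          FROM staging.transactions_raw) AS src
    CROSS JOIN
         (SELECT COUNT(*) AS row_cnt, SUM(amount) AS amount_total
          FROM compliance.fact_transaction) AS tgt;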

Required Qualifications

Technical Skills

  • Strong programming expertise in Python
  • Hands-on experience with Snowflake (data modeling, performance tuning, query optimization)
  • Advanced SQL skills
  • Experience building production-grade ETL/ELT pipelines
  • Strong understanding of data modeling (3NF, dimensional modeling)
  • Experience working with large, complex datasets

Architecture & Analysis

  • Experience performing source-to-target mapping at the data element level
  • Understanding of data lineage and metadata management
  • Ability to analyze and redesign legacy pipelines into modern architectures
  • Familiarity with compliance or regulatory data environments (preferred)

Preferred Qualifications

  • Experience in financial services, risk, or regulatory reporting environments
  • Experience with workflow orchestration tools (Airflow or similar)
  • Familiarity with data quality frameworks
  • Exposure to cloud platforms (AWS/Azure/GCP)
  • Experience integrating structured and semi-structured data

Position Details

Posted: Feb 25, 2026
Reference Number: 527-44639
Employment: CTC
Salary: Not Available
City: San Jose
Job Origin: CIEPAL_ORGANIC_FEED
