
Data Developer w/ P&C Insurance

  • Posted on: Feb 18, 2026
  • Connvertex Technologies Inc
  • NYC, New York
  • Salary: Not Available
  • Full-time

Job Title: Data Developer w/ P&C Insurance
Job Type: Full-time
Job Location: NYC, New York, United States
Remote: No

Job Description:

Data Developer with Snowflake, P&C Insurance, Data Engineering, Python, and Talend

Role Overview

We are seeking an experienced Data Developer / Data Engineer with strong insurance domain expertise to design, develop, and maintain scalable data solutions supporting analytics, reporting, and operational use cases. This role requires hands-on experience with modern cloud data platforms, including Snowflake, strong programming skills in Python, and a deep understanding of insurance data.

The ideal candidate will bridge technical data engineering capabilities with insurance business knowledge to deliver high-quality, reliable data assets.

Key Responsibilities

  • Design, build, and maintain end-to-end data engineering pipelines (ETL/ELT) for insurance data

  • Develop and optimize data solutions using Snowflake as a cloud data warehouse

  • Use Python to support data ingestion, transformation, automation, and orchestration

  • Integrate data from core insurance systems (Policy, Claims, Billing, Underwriting, Reinsurance)

  • Model insurance data for analytical, actuarial, financial, and regulatory reporting use cases

  • Write complex, high-performance SQL queries and transformations

  • Ensure data quality, validation, lineage, and governance standards

  • Collaborate with business stakeholders, analysts, and architects to translate insurance requirements into technical solutions

  • Troubleshoot and resolve data pipeline, performance, and data integrity issues

  • Document data models, pipelines, and best practices
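
For illustration only (this sketch is not part of the posting), the ingestion and transformation work described above often boils down to normalizing raw source records before loading them into a warehouse table. The field names and record below are hypothetical:

```python
from datetime import date

def transform_claim(raw: dict) -> dict:
    """Normalize one raw claims record for loading into a warehouse table.

    Hypothetical source fields: 'claim_id', 'loss_dt' (an MM/DD/YYYY
    string), and 'paid' (a dollar amount string, possibly with '$'
    and thousands separators).
    """
    # Parse the US-style date into an ISO-8601 string for the warehouse.
    month, day, year = (int(p) for p in raw["loss_dt"].split("/"))
    return {
        "claim_id": raw["claim_id"].strip().upper(),
        "loss_date": date(year, month, day).isoformat(),
        # Strip currency formatting and store a plain numeric amount.
        "paid_amount": round(float(raw["paid"].replace("$", "").replace(",", "")), 2),
    }

raw = {"claim_id": " clm-1001 ", "loss_dt": "02/18/2026", "paid": "$1,250.50"}
clean = transform_claim(raw)
# clean == {'claim_id': 'CLM-1001', 'loss_date': '2026-02-18', 'paid_amount': 1250.5}
```

In practice a step like this would run inside an orchestrated pipeline (Airflow, ADF, or similar) and write into Snowflake; the sketch shows only the transformation logic itself.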

Required Qualifications

  • 5 years of experience in Data Development or Data Engineering roles

  • Strong insurance domain experience (P&C, Life, Health, or Specialty Insurance)

  • Hands-on experience with Snowflake (data modeling, performance tuning, security, cost optimization)

  • Advanced SQL skills

  • Proficiency in Python for data processing and automation

  • Experience building and maintaining scalable data pipelines

  • Strong understanding of insurance data concepts (policies, premiums, claims, losses, exposures)

  • Experience working with large, complex datasets

  • Strong analytical, troubleshooting, and communication skills
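
To give a flavor of the insurance data concepts listed above (premiums, claims, losses), a core P&C metric is the loss ratio: incurred losses divided by earned premium. A minimal, purely illustrative Python sketch with hypothetical figures:

```python
def loss_ratio(incurred_losses: float, earned_premium: float) -> float:
    """Loss ratio = incurred losses / earned premium, a core P&C metric."""
    if earned_premium <= 0:
        raise ValueError("earned premium must be positive")
    return incurred_losses / earned_premium

# Hypothetical book of business: $6.5M incurred losses on $10M earned premium.
ratio = loss_ratio(6_500_000, 10_000_000)
print(f"{ratio:.1%}")  # 65.0%
```

The same calculation is routinely expressed in warehouse SQL over policy and claims tables; the Python form just makes the definition concrete.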

Preferred / Nice-to-Have Skills

  • Cloud platforms: Azure, AWS, or GCP

  • Azure Data Factory, Azure Data Lake, Databricks, Synapse

  • Experience with orchestration tools (Airflow, Azure Data Factory, dbt, or similar)

  • Familiarity with BI and reporting tools (Power BI, Tableau, Looker)

  • Experience with insurance platforms such as Guidewire, Duck Creek, Majesco

  • Knowledge of data governance, metadata management, and regulatory reporting

  • Experience working in Agile/Scrum environments

Education

Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience)

Position Details

Posted: Feb 18, 2026
Employment: Full-time
Salary: Not Available
City: NYC
Job Origin: CIEPAL_ORGANIC_FEED


