Find Data Engineer Remote Work Job in Dallas, Texas | Snaprecruit


Data Engineer Remote Work

  • Posted on: Jul 22, 2024
  • PRIMUS Global Services Inc
  • Dallas, Texas
  • Salary: Not Available
  • Contract

Job Title: Data Engineer Remote Work

Job Type: Contract

Job Location: Dallas, Texas, United States

Remote: No

Job Description:

Data Engineer - REMOTE WORK - 56813

We have an immediate long-term opportunity with one of our prime clients for a Data Engineer position on a remote basis.

Required Skills

4+ years of enterprise data warehouse development with SQL, Python, Azure Data Factory (ADF), Azure Databricks, Kafka, and ideally Snowflake experience

Job Description

Looking for a Data Engineer with SQL, Python, ADF, Databricks, Kafka, and ideally Snowflake experience to work 100% remotely.

  • Develop CI/CD data pipelines and ETL processes to curate and transform pharmacy data from different sources (both on-premises and cloud).
  • Build large-scale databases that are robust and secure, to be stored in a data lake or data warehouse.
  • Automate data workflows and processes, including ingestion, cleaning, structuring, and formatting of data.
  • Build engineering solutions that support ML/data science projects.
  • Collaborate with Data Scientists and business partners to deploy machine learning models in production (preferred).
  • Work extensively with different kinds of datasets, including text, voice, image, structured, and unstructured data.
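As a rough illustration of the ingestion, cleaning, and structuring duties above, here is a minimal sketch in plain Python. The record formats, field names, and normalization rules are hypothetical, invented for this example; they are not part of the role's actual pipeline.

```python
import json

# Hypothetical raw pharmacy records from two sources (formats illustrative only):
# one cloud JSON feed and one on-premises pipe-delimited flat file.
raw_records = [
    '{"rx_id": "A100", "drug": " Lisinopril ", "qty": "30"}',
    "B200|Metformin|60",
]

def clean(record: str) -> dict:
    """Parse, normalize, and structure one raw record into a common shape."""
    if record.lstrip().startswith("{"):      # JSON source
        row = json.loads(record)
    else:                                    # pipe-delimited source
        rx_id, drug, qty = record.split("|")
        row = {"rx_id": rx_id, "drug": drug, "qty": qty}
    return {
        "rx_id": row["rx_id"].strip(),
        "drug": row["drug"].strip().title(), # trim whitespace, normalize case
        "qty": int(row["qty"]),              # enforce a numeric type
    }

curated = [clean(r) for r in raw_records]
```

A real pipeline would run logic like this inside ADF or Databricks against a data lake rather than in-memory lists, but the curate-and-transform shape is the same.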

Requirements:

  • 4-6 years of enterprise data warehouse development - preferably on SQL or Snowflake
  • 4-6 years of experience creating, enhancing, and maintaining ETL frameworks using ETL tools.
  • Hands-on experience with SQL, PL/SQL, Python and/or Shell Scripting is a must.
  • Experience migrating from RDBMS to Snowflake is a plus.
  • End-to-end dataflow design and development experience is a plus.
  • Hands-on experience with Azure Cloud, including Azure Data Factory and Databricks.
  • Kafka experience to set up and attach to Databricks.
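The hands-on SQL/Python requirement above amounts to warehouse-style querying from code. A toy aggregation, sketched with Python's built-in sqlite3 standing in for a real warehouse such as Snowflake (the table, columns, and data are made up for illustration):

```python
import sqlite3

# In-memory stand-in for a warehouse fact table (schema is illustrative)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fills (pharmacy TEXT, drug TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO fills VALUES (?, ?, ?)",
    [("Dallas-01", "Lisinopril", 30),
     ("Dallas-01", "Metformin", 60),
     ("Plano-02", "Lisinopril", 90)],
)

# Typical ETL-style aggregation: total quantity dispensed per drug
rows = conn.execute(
    "SELECT drug, SUM(qty) AS total_qty FROM fills GROUP BY drug ORDER BY drug"
).fetchall()
```

The same GROUP BY pattern carries over to Snowflake or any other SQL warehouse; only the connection layer changes.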

**All successful candidates for this position are required to work directly for PRIMUS. No agencies, please; W2 only.**

For immediate consideration, please contact:

Tejaswini
PRIMUS Global Services Inc
Direct (972) 798-2662
Desk (972) 753-6500 x204
Email: jobs@primusglobal.com

Position Details

Posted: Jul 22, 2024
Employment: Contract
Salary: Not Available
Snaprecruit ID: SD-20240728143007-56798-8707
City: Dallas
Job Origin: CIEPAL_ORGANIC_FEED

