Find Data Engineer Gcp Job in Phoenix, Arizona | Snaprecruit


Data Engineer Gcp

  • Posted on: Sep 30, 2024
  • Conch Technologies Inc
  • Phoenix, Arizona
  • Salary: Not Available
  • Full-time


Job Title: Data Engineer Gcp

Job Type: Full-time

Job Location: Phoenix, Arizona, United States

Remote: No

Job Description:


Maintain and build on our data warehouse and analytics environment.
Design, implement, test, deploy, and maintain stable, secure, and scalable data engineering solutions and pipelines in support of data and analytics projects.
General data manipulation skills: read in data, process and clean it, transform and recode it, merge different data sets together, reshape data between wide and long formats, etc.
Use APIs to push and pull data from various data systems and platforms.
Comfort with data management techniques and ETL platforms.
Strong SQL programming skills.
Hands-on experience with GCP-native services: GCS, GKE, Composer, Cloud Functions, etc.
Expert skill in writing Python code for data processing; familiarity with MySQL or other relational databases, and with navigating Unix or Linux.
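The data-manipulation skills listed above (read, clean, recode, merge, reshape wide/long) can be sketched in Python with pandas. This is a minimal illustration, not part of the posting; the data set, column names, and values are hypothetical:

```python
import pandas as pd

# Hypothetical raw data in "wide" form: one column per quarter.
wide = pd.DataFrame({
    "store": ["A", "B"],
    "q1": [100, 150],
    "q2": [110, 140],
})

# Reshape wide -> long: one row per (store, quarter) pair.
long_df = wide.melt(id_vars="store", var_name="quarter", value_name="sales")

# Merge with a second data set (store metadata) on the shared key.
stores = pd.DataFrame({"store": ["A", "B"], "region": ["West", "East"]})
merged = long_df.merge(stores, on="store", how="left")

# Recode and clean: normalize the quarter label, drop rows with missing sales.
merged["quarter"] = merged["quarter"].str.upper()
merged = merged.dropna(subset=["sales"])

# Reshape long -> wide again with a pivot.
back = merged.pivot(index="store", columns="quarter", values="sales")
```

The same melt/merge/pivot pattern covers most day-to-day reshaping work before data lands in a warehouse table.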

Actively working on serverless ETL with Python.
Cloud data warehouse: Snowflake (must) or RDS.
Proficient with BigQuery and GCP data warehousing tools.
At least one cloud DW project with Snowflake is a must.
Demonstrated ability to partner with peers and leaders, with strong written and verbal communication.
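The "serverless ETL with Python" requirement typically means a single entry-point function that a runtime such as Cloud Functions invokes on a trigger. Below is a self-contained sketch of that shape; the payload and field names are hypothetical, and a real job would load the rows into a warehouse (BigQuery, Snowflake) instead of returning them:

```python
import json
from datetime import datetime, timezone

def etl_handler(event: dict) -> list[dict]:
    """Entry point a serverless runtime would invoke per trigger.

    Extract: 'event' stands in for an HTTP/queue payload.
    Transform: coerce types, drop incomplete records, stamp load time.
    Load: returned here for illustration; a real job writes to the warehouse.
    """
    raw_records = json.loads(event["body"])
    loaded_at = datetime.now(timezone.utc).isoformat()
    rows = []
    for rec in raw_records:
        if rec.get("amount") is None:  # drop incomplete records
            continue
        rows.append({
            "order_id": str(rec["id"]),
            "amount": float(rec["amount"]),
            "loaded_at": loaded_at,
        })
    return rows

# Simulated trigger payload (what an HTTP trigger or queue message might carry).
event = {"body": json.dumps([
    {"id": 1, "amount": "19.99"},
    {"id": 2, "amount": None},  # incomplete record, filtered out
])}
rows = etl_handler(event)
```

Keeping extract/transform/load inside one small, stateless function is what makes the job cheap to schedule from Composer or Cloud Scheduler.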

Position Details

Posted: Sep 30, 2024

Employment: Full-time

Salary: Not Available

Snaprecruit ID: SD-CIE-8b93947c8ebe2add057437604980e2be0fdb8a0d64f175b204ea346b3754af15

City: Phoenix

Job Origin: CIEPAL_ORGANIC_FEED

