Data Engineer - w2

Irving, Texas / United States

JOB TITLE: Data Engineer - w2
JOB LOCATION: Irving, Texas / United States

JOB DESCRIPTION:

Title: Data Engineer – 2 Positions

Location: Bay Area, CA (Remote)

Duration: 6+ month contract

Pay: $60/hr on C2C

Interview process: Two rounds of client interviews

Responsibilities:

· Develop and automate large-scale, high-performance data processing systems (batch and/or streaming) to drive Airbnb business growth and improve the product experience.

· Build scalable Spark data pipelines leveraging the Airflow scheduler/executor framework.
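
The second responsibility centers on Airflow's core idea: tasks run in dependency order. As a minimal, standard-library-only sketch (the task names, toy data, and dependency graph are all hypothetical — a real pipeline would define these as Airflow operators inside a DAG):

```python
# Sketch of how an Airflow-style scheduler runs dependent batch tasks in
# topological order. Tasks and data are hypothetical stand-ins.
from graphlib import TopologicalSorter

def extract():
    return ["raw_event_1", "raw_event_2"]   # pretend source read

def transform(rows):
    return [r.upper() for r in rows]        # pretend cleanup/enrichment

def load(rows):
    return len(rows)                        # pretend warehouse write

# Dependency graph: transform depends on extract, load on transform.
dag = {"transform": {"extract"}, "load": {"transform"}}

results = {}
for task in TopologicalSorter(dag).static_order():
    if task == "extract":
        results[task] = extract()
    elif task == "transform":
        results[task] = transform(results["extract"])
    elif task == "load":
        results[task] = load(results["transform"])

print(results["load"])  # prints 2: the number of rows "loaded"
```

In Airflow itself, the `dag` dictionary would be expressed with operator dependencies (e.g. `extract >> transform >> load`), and the scheduler/executor pair would handle retries, backfills, and parallelism.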

 

Minimum Requirements:

· 4+ years of relevant industry experience

· Demonstrated ability to analyze large data sets to identify gaps and inconsistencies, provide data insights, and advance effective product solutions

· Working knowledge of relational databases and query authoring (SQL)

· Good communication skills, both written and verbal

· Strong experience using an ETL framework (e.g., Airflow, Flume, Oozie) to build and deploy production-quality ETL pipelines

· Experience building batch data pipelines in Spark (Scala)

· Strong understanding of distributed storage and compute (S3, Hive, Spark)

· General software engineering skills (Java or Python, GitHub)
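
The "gaps and inconsistencies" and SQL-authoring requirements above can be illustrated with a small self-contained sketch. The `events` table, its columns, and the sample rows are hypothetical; the two queries show the kind of data-quality checks the role describes:

```python
# Sketch: using SQL to surface gaps (missing values) and inconsistencies
# (duplicate keys) in a table. Table and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (id INTEGER, user_id INTEGER, amount REAL);
    INSERT INTO events VALUES (1, 10, 5.0), (2, NULL, 3.0), (2, 11, 7.0);
""")

# Gap check: rows missing a user_id.
missing = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id IS NULL").fetchone()[0]

# Consistency check: event ids that appear more than once.
dupes = conn.execute("""
    SELECT id, COUNT(*) FROM events
    GROUP BY id HAVING COUNT(*) > 1
""").fetchall()

print(missing, dupes)  # prints: 1 [(2, 2)]
```

At production scale the same queries would typically run against Hive or Spark SQL rather than SQLite, but the query-authoring pattern is identical.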

 

 



Position Details

Date: Jun 04, 2021
Category: Information Technology (IT)
Job ID: S1620313884439675
Location: Irving, Texas / United States
