Ingestion Framework – Python, pySpark GCP (REMOTE)

In United States

JOB TITLE:

Ingestion Framework – Python, pySpark GCP (REMOTE)

JOB TYPE:

JOB SKILLS:

JOB LOCATION:

San Jose, CA, United States

JOB DESCRIPTION:

Position: Ingestion Framework – Python, pySpark GCP
Client: TCS / CISCO
Duration: 12 months
Experience: 9+ years required
Skills: Ingestion Framework – Python, pySpark, GCP

A batch data ingestion framework was the method used for all data ingested before the rise of big data, and it continues to be commonly used. Batch processing ...

Python Developer responsibilities include writing and testing code, debugging programs, and integrating applications with third-party web services. To be ...

- Hands-on experience in PySpark and in ETL development.
- Working experience in Spark for data processing, aggregation, and transformation, with unit tests, and in designing data processing pipelines (a minimal sketch follows this description).
- Exception handling and performance optimization techniques on Python scripts using Spark DataFrames.
- Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
- Work with Agile and DevOps techniques and implementation approaches in the delivery.
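The following is a minimal, illustrative sketch of the kind of batch ingestion job described above, not the client's actual framework: it reads a raw CSV batch with an explicit schema, applies a simple transformation and a daily aggregation with Spark DataFrames, writes partitioned Parquet output, and wraps the work in basic exception handling. The bucket paths, schema, and column names are assumptions made up for the example.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)
from pyspark.sql.utils import AnalysisException


def run_batch_ingestion(spark, source_path, target_path):
    # Explicit schema avoids a costly inference pass over the raw files.
    schema = StructType([
        StructField("event_id", StringType(), nullable=False),
        StructField("event_ts", TimestampType(), nullable=True),
        StructField("device", StringType(), nullable=True),
        StructField("amount", DoubleType(), nullable=True),
    ])

    try:
        raw = spark.read.csv(source_path, header=True, schema=schema)

        # Transformation: drop malformed rows and derive a partition column.
        cleaned = (
            raw.dropna(subset=["event_id", "event_ts"])
               .withColumn("event_date", F.to_date("event_ts"))
        )

        # Aggregation: daily event counts and totals per device.
        daily = (
            cleaned.groupBy("event_date", "device")
                   .agg(F.count("*").alias("events"),
                        F.sum("amount").alias("total_amount"))
        )

        # Write partitioned Parquet to the curated zone (e.g. a GCS bucket).
        daily.write.mode("overwrite").partitionBy("event_date").parquet(target_path)
    except AnalysisException as exc:
        # Missing input path, bad column reference, and similar planning errors.
        print(f"Ingestion failed during planning: {exc}")
        raise


if __name__ == "__main__":
    spark = SparkSession.builder.appName("batch-ingestion-sketch").getOrCreate()
    run_batch_ingestion(
        spark,
        source_path="gs://example-raw-bucket/events/*.csv",      # hypothetical bucket
        target_path="gs://example-curated-bucket/daily_events",  # hypothetical bucket
    )
    spark.stop()

In a GCP deployment a script of this shape would typically be submitted to a managed Spark service such as Dataproc on a schedule, with unit tests exercising the transformation and aggregation logic on small local DataFrames.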

Position Details

POSTED:

Nov 29, 2022

EMPLOYMENT:

INDUSTRY:

SNAPRECRUIT ID:

S16588656450592412

LOCATION:

United States

CITY:

San Jose, CA

Job Origin:

OORWIN_ORGANIC_FEED
