
Big Data Developer

  • Posted on: Feb 20, 2026
  • San R&D Business Solutions LLC
  • Alpharetta, Georgia
  • Salary: Not Available
  • Full-time

Job Title: Big Data Developer
Job Type: Full-time
Job Location: Alpharetta, Georgia, United States
Remote: No

Job Description:

Job Title: Big Data Developer (GCP / Dataflow)

Experience: 8+ Years
Location: Alpharetta, GA

Work Type: Hybrid

Employment Type: Contract (C2C)

Visa Requirement: USC / GC only (U.S. citizens or Green Card holders)


Open to local Georgia (GA) candidates only.


Job Summary:


We are seeking an experienced Big Data Developer with strong expertise in Google Cloud Platform (GCP) and Dataflow to design, develop, and maintain scalable data processing systems. The ideal candidate will have at least 8 years of experience in Big Data technologies and a strong background in building high-performance data pipelines and distributed systems.


Key Responsibilities:


  • Design and develop scalable Big Data solutions using GCP services.
  • Build and optimize data pipelines using Google Cloud Dataflow (Apache Beam).
  • Work with large datasets in distributed environments.
  • Develop batch and real-time data processing workflows.
  • Integrate data from multiple sources (structured and unstructured).
  • Optimize data processing performance and cost efficiency in GCP.
  • Collaborate with data engineers, architects, and business stakeholders.
  • Implement data quality checks, monitoring, and logging frameworks.
  • Ensure data security, governance, and compliance standards are met.
  • Troubleshoot and resolve performance bottlenecks.


Required Skills & Qualifications:


  • 8+ years of experience in Big Data development.
  • Strong hands-on experience with:
    • Google Cloud Platform (GCP)
    • Cloud Dataflow
    • Apache Beam
    • BigQuery
    • Cloud Storage
    • Pub/Sub
  • Strong programming skills in Java or Python.
  • Experience with Apache Spark, Hadoop, Hive, or similar Big Data frameworks.
  • Experience building ETL/ELT pipelines.
  • Knowledge of SQL and NoSQL databases.
  • Understanding of data modeling and data warehousing concepts.
  • Experience with CI/CD pipelines and DevOps practices.
  • Strong problem-solving and analytical skills.


Preferred Qualifications:


  • GCP certifications (Professional Data Engineer preferred).
  • Experience with Kubernetes and containerization.
  • Experience in streaming technologies.
  • Knowledge of Airflow or other workflow orchestration tools.
  • Experience working in Agile/Scrum environments.


Soft Skills:


  • Strong communication skills.
  • Ability to work independently and in a team environment.
  • Strong documentation and stakeholder management skills.



Position Details

Posted: Feb 20, 2026
Employment: Full-time
Salary: Not Available
City: Alpharetta
Job Origin: ZipRecruiter

