
GCP Data Engineer

  • Posted on: Feb 11, 2025
  • Stellent IT LLC
  • Parsippany, New Jersey
  • Salary: Not Available
  • Full-time


Job Title: GCP Data Engineer

Job Type: Full-time

Job Location: Parsippany, New Jersey, United States

Remote: No

Job Description:

Job Title: GCP Data Engineer

Job Location: Parsippany, NJ (Onsite)

Long-Term Contract

Job Summary:

We are seeking a highly skilled GCP Data Engineer with DBA expertise to design, implement, and maintain data solutions on Google Cloud Platform (GCP). The ideal candidate will play a key role in managing and optimizing a Data Lake powered by BigQuery and AlloyDB, ensuring the system's performance, scalability, and reliability while collaborating with cross-functional teams to support data-driven business decisions.

Roles and Responsibilities:

Data Engineering:

  • Design, develop, and maintain a Data Lake architecture using GCP services, particularly BigQuery and AlloyDB.
  • Build robust data pipelines for ETL/ELT processes using tools like Dataflow, Cloud Composer, or other GCP services.
  • Integrate data from various sources (structured and unstructured) into the Data Lake, ensuring consistency and reliability.
  • Optimize BigQuery data models and queries for high performance and scalability.
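
The pipeline work above can be sketched as a minimal batch ETL in plain Python. The source records, field names, and cleaning rules here are hypothetical stand-ins for what Dataflow or Cloud Composer would orchestrate against real sources in practice:

```python
from datetime import date

def extract(raw_rows):
    """Extract stage: yield raw source records (an in-memory stand-in here)."""
    yield from raw_rows

def transform(row):
    """Transform stage: normalize field names and types before loading."""
    return {
        "order_id": int(row["id"]),
        "amount_usd": round(float(row["amount"]), 2),
        "order_date": date.fromisoformat(row["date"]),
    }

def load(rows, sink):
    """Load stage: append cleaned rows to the sink (a BigQuery table in practice)."""
    sink.extend(rows)

raw = [{"id": "1", "amount": "19.994", "date": "2025-02-11"}]
table = []
load([transform(r) for r in extract(raw)], table)
print(table[0]["amount_usd"])  # prints 19.99
```

The same extract/transform/load split maps directly onto Beam PTransforms or Composer tasks when the pipeline is lifted into GCP tooling.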

Database Administration:

  • Administer and maintain AlloyDB instances, ensuring high availability, security, and performance.
  • Perform backup, recovery, and disaster recovery planning for AlloyDB and BigQuery.
  • Monitor and optimize database performance, query execution, and resource utilization.
  • Manage schema design, indexing, partitioning, and clustering to enhance database efficiency.
  • Apply database governance and compliance best practices to ensure data security and regulatory adherence.
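
The partitioning and clustering work called out above can be illustrated with a small helper that emits BigQuery DDL. The table and column names are hypothetical; only the `PARTITION BY` / `CLUSTER BY` clauses reflect BigQuery's actual DDL syntax:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with time partitioning
    and clustering.

    Partitioning lets queries that filter on the date column scan only
    the matching partitions; clustering co-locates rows sharing the
    listed column values, reducing scanned bytes further.
    """
    return (
        f"CREATE TABLE `{table}` (\n"
        f"  event_ts TIMESTAMP,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl("sales.orders", "event_ts", ["customer_id"])
print(ddl)
```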

Data Governance and Management:

  • Implement data governance policies, including role-based access controls, data masking, and encryption.
  • Manage metadata, data cataloging, and data lineage tracking to support audit and compliance requirements.
  • Conduct regular health checks on the Data Lake to ensure data quality and integrity.
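
The data-masking responsibility above can be sketched as a pair of column-level masking functions. The field formats (email, US SSN) and masking rules are illustrative assumptions, not a prescribed policy:

```python
import re

def mask_email(value):
    """Mask the local part of an email, keeping its first character and the domain."""
    return re.sub(r"^([^@])[^@]*(@.+)$", r"\1***\2", value)

def mask_ssn(value):
    """Mask all but the last four digits of a US Social Security number."""
    return re.sub(r"^\d{3}-\d{2}", "***-**", value)

print(mask_email("jane.doe@example.com"))  # j***@example.com
print(mask_ssn("123-45-6789"))             # ***-**-6789
```

In BigQuery itself the same effect is typically achieved declaratively, with policy tags and dynamic data masking rather than application code.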

Collaboration and Reporting:

  • Collaborate with data scientists, analysts, and application developers to define data requirements and deliver solutions.
  • Design and deploy real-time and batch data analytics solutions using GCP's BigQuery and Looker.
  • Provide documentation and training to teams on using Data Lake features effectively.

Automation and Monitoring:

  • Automate database and data pipeline tasks.
  • Set up monitoring and alerting for BigQuery and AlloyDB using Cloud Monitoring and Cloud Logging.
  • Resolve incidents and troubleshoot performance issues in real time.
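
An alerting setup like the one above is usually expressed as a Cloud Monitoring alert-policy payload. The sketch below mirrors the general shape of that API's policy objects, but the metric filter, threshold, and display names are hypothetical placeholders:

```python
import json

# Minimal alert-policy payload: fire when a (placeholder) BigQuery slot
# metric stays above 0.9 for five minutes. Real deployments would create
# this via the Cloud Monitoring API or Terraform.
alert_policy = {
    "displayName": "BigQuery slot utilization high",
    "combiner": "OR",
    "conditions": [
        {
            "displayName": "Slot utilization above 90% for 5 minutes",
            "conditionThreshold": {
                "filter": 'metric.type = "bigquery.googleapis.com/slots/allocated"',
                "comparison": "COMPARISON_GT",
                "thresholdValue": 0.9,
                "duration": "300s",
            },
        }
    ],
}

print(json.dumps(alert_policy, indent=2))
```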

Phone: 321-641-0093

Email:

Position Details

Posted: Feb 11, 2025

Employment: Full-time

Salary: Not Available

Snaprecruit ID: SD-CIE-b10463af946f33d913a029587f4a445aba32334613dab7e34242770ceaac72c1

City: Parsippany

Job Origin: CIEPAL_ORGANIC_FEED

