Specialty Development Consultant Expert GCP Dearborn

  • Posted on: Sep 05, 2024
  • Saanvi Technologies
  • Dearborn Hts, Michigan
  • Salary: Not Available
  • Full-time

Job Title: Specialty Development Consultant Expert GCP Dearborn
Job Type: Full-time
Job Location: Dearborn Hts, Michigan, United States
Remote: No

Job Description:

Title: Specialty Development Consultant/Expert - GCP

Location: Dearborn, MI (Hybrid)

Job Type: Long Term Contract

Position Description:

The Materials Management Platform (MMP) is a multi-year initiative to transform Ford's Materials Requirement Planning and Inventory Management capabilities, and is part of a larger Industrial Systems IT Transformation effort. This position is responsible for designing and deploying a data-centric architecture in GCP for the Materials Management Platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.

Skills Required:

  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs.
  • Build ETL pipelines to ingest data from heterogeneous sources into the system.
  • Develop data processing pipelines using programming languages such as Java and Python to extract, transform, and load (ETL) data; a minimal ingestion sketch appears after this list.
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
  • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure. Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data
  • Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
  • Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
  • Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
  • Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.
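
A minimal Python sketch of the kind of batch ingestion work described above, assuming the google-cloud-bigquery client library; the project, dataset, table, and bucket names are placeholders for illustration only, not details from this posting.

from google.cloud import bigquery

# Hypothetical identifiers for illustration only.
PROJECT_ID = "example-project"
TABLE_ID = f"{PROJECT_ID}.materials.inventory_snapshots"
SOURCE_URI = "gs://example-bucket/inventory/*.json"

client = bigquery.Client(project=PROJECT_ID)

# Load newline-delimited JSON files from Cloud Storage into BigQuery,
# appending to the target table and letting BigQuery infer the schema.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # blocks until the load finishes; raises on failure

table = client.get_table(TABLE_ID)
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")

In a real pipeline a step like this would typically be parameterized per source system and run behind an orchestrator rather than hard-coded.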

Experience Required:

  • 8 years of professional experience in:
    • Data engineering, data product development, and software product launches.
    • At least three of the following languages: Java, Python, Spark, Scala, and SQL, plus performance-tuning experience.
  • 4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
    • Data warehouses like Google BigQuery.
    • Workflow orchestration tools like Airflow (a minimal DAG sketch appears after this list).
    • Relational database management systems such as MySQL, PostgreSQL, and SQL Server.
    • Real-time data streaming platforms such as Apache Kafka and GCP Pub/Sub.
    • Microservices architecture to deliver large-scale, real-time data processing applications.
    • REST APIs for compute, storage, operations, and security.
    • DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, Docker.
    • Project management tools like Atlassian JIRA.
  • Automotive industry experience is preferred.
  • Experience supporting an onshore/offshore delivery model is preferred.
  • Excellent at problem solving and prevention.
  • Knowledge of and practical experience with agile delivery.
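
As referenced in the orchestration bullet above, here is a minimal Airflow sketch of a daily extract-transform-load run, assuming Airflow 2.4+ with the standard PythonOperator; the DAG id and task callables are hypothetical placeholders rather than anything from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from an upstream source.
    print("extracting source data")


def transform():
    # Placeholder: clean and reshape the extracted records.
    print("transforming records")


def load():
    # Placeholder: write the transformed records to the warehouse.
    print("loading into the warehouse")


with DAG(
    dag_id="materials_daily_batch",
    start_date=datetime(2024, 9, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # The chaining below only declares run order; the real work lives in the callables.
    t_extract >> t_transform >> t_load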

Experience Preferred:

  • Experience in IDoc processing, APIs, and SAP data migration projects.
  • Experience working in an SAP S/4HANA environment.

Education Required:

  • Requires a bachelor's or foreign equivalent degree in computer science, information technology, or a technology-related field.

Education Preferred:

  • Master's preferred

Thanks,
Lingesh
Saanvi Technologies

Position Details

Posted: Sep 05, 2024
Employment: Full-time
Salary: Not Available
Snaprecruit ID: SD-CIE-fcc31bb5be210fe44dc54340267469bb8affeb9c0ffc4edc876dc987f8c7c67e
City: Dearborn Hts
Job Origin: CIEPAL_ORGANIC_FEED
