Azure Data Engineer with PHP & EDI

  • Posted on: Feb 10, 2025
  • JS Consulting
  • Kansas City, Kansas
  • Salary: Not Available
  • Full-time

Job Title: Azure Data Engineer with PHP & EDI

Job Type: Full-time

Job Location: Kansas City, Kansas, United States

Remote: Yes

Job Description:

Job Title: Azure Data Engineer with PHP & EDI

Location: Remote Role; Kansas City, KS

Duration: 12-24+ Months Contract to Hire (C2H).

Interview process: Phone followed by Video Interview.

Candidate Requirements (Must):

  • Must be a U.S. citizen or a Green Card holder.
  • Willing to participate in a brief video call with the vendor.
  • Comfortable providing two managerial references, along with their LinkedIn profiles.


Candidates must have a strong Microsoft tech stack:

  • SQL Server
  • Azure SQL
  • Azure Data Factory
  • Databricks
  • Python (PySpark)

The engineer will work on a client data integration application that requires a very specific skill set:

  • Development with PHP
  • Experience with EDI protocols
  • Experience with the Laravel and Symfony frameworks
  • Basic Linux and MySQL
  • Knowledge of Python is desirable
  • This is a hybrid role requiring solid data engineering fundamentals and PHP development experience.
  • Candidates should have a 10+ year profile.

We need two Data Engineers with EDW, Data Lake, AI/ML, and omni-channel experience.

The purpose of this position is to perform Data Development functions, which include the design of new, or enhancement of existing, enterprise database systems; maintenance and/or development of critical data processes; unit and system testing; and support and help desk tasks. It also requires defining and adopting best practices for each of the data development functions, as well as for visualization and ETL processes. This position is also responsible for architecting ETL functions between a multitude of relational databases and external data files.

Essential Duties and Responsibilities:

  • Work with a highly dynamic team focused on Digital Transformation.
  • Understand the domain and business processes to implement successful data pipelines.
  • Provide work status, and coordinate with Data Engineers.
  • Manage customer deliverables and regularly report the status via Weekly/Monthly reviews.
  • Design, develop, and maintain ETL processes as well as stored procedures, functions, and views.
  • Program in T-SQL with relational databases including currently supported versions of Microsoft SQL Server.
  • Write high-performance SQL queries using joins, CROSS APPLY, aggregate queries, MERGE, and PIVOT.
  • Design normalized database tables with proper indexing and constraints.
  • Perform SQL query tuning and performance optimization on complex and inefficient queries.
  • Provide guidance on the appropriate use of table variables, temporary tables, and CTEs when dealing with large datasets.
  • Collaborate with DBA on database design and performance enhancements.
  • Lead all phases of the software development life cycle in a team environment.
  • Debug existing code and troubleshoot issues.
  • Design and provide a framework for maintaining existing data warehouse for reporting and data analytics.
  • Follow best practices, design, develop, test and document ETL processes.
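To illustrate the kind of CTE-based aggregate query the duties above call for, here is a minimal sketch in Python with SQLite for portability; the actual role uses T-SQL on SQL Server, and the table and column names here are hypothetical, not from the posting:

```python
import sqlite3

# Hypothetical orders table; schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, amount REAL, status TEXT);
    INSERT INTO orders VALUES (1, 100.0, 'shipped'), (1, 50.0, 'shipped'),
                              (2, 75.0, 'pending'), (2, 25.0, 'shipped');
""")

# A CTE keeps the aggregation readable and lets the outer query
# filter on the aggregated result by name.
query = """
    WITH totals AS (
        SELECT customer_id, SUM(amount) AS total_spent
        FROM orders
        WHERE status = 'shipped'
        GROUP BY customer_id
    )
    SELECT customer_id, total_spent
    FROM totals
    WHERE total_spent > 60
    ORDER BY customer_id
"""
rows = conn.execute(query).fetchall()
print(rows)  # [(1, 150.0)]
```

The same shape carries over to T-SQL, where the CTE would typically replace a nested subquery or a temporary table for intermediate aggregates.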

General Comments:

  • Develop data cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.
  • Keep up to date with the latest database features, data techniques, and technologies.
  • Support current business applications including the implementation of bug fixes as needed.
  • Effectively communicate progress through the project execution phase.
  • Other duties as assigned.
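Data cleansing routines of the sort described above typically standardize raw fields before linking and matching records across sources. A minimal Python sketch, where the function names, field formats, and sample values are illustrative assumptions rather than anything from the posting:

```python
import re

def standardize_phone(raw: str) -> str:
    """Strip punctuation and normalize a US phone number to 10 digits."""
    digits = re.sub(r"\D", "", raw)
    return digits[-10:] if len(digits) >= 10 else digits

def match_key(name: str, city: str) -> str:
    """Build a simple linking key: lowercase, trim, collapse whitespace."""
    def norm(s: str) -> str:
        return re.sub(r"\s+", " ", s.strip().lower())
    return f"{norm(name)}|{norm(city)}"

print(standardize_phone("(913) 555-0142"))      # 9135550142
print(match_key("Acme  Corp ", "Kansas City"))  # acme corp|kansas city
```

In practice, routines like these feed a matching step (exact or fuzzy) that deduplicates and links records before they land in the warehouse.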

Qualifications and Requirements:

  • To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skills, and/or abilities required.
  • Minimum 6 years of work-related experience in T-SQL, SSRS, and ETL. If the candidate has relevant education in computer science, computer information systems, or a related field, education can substitute for experience as follows:
  1. Microsoft Certifications: 1 year
  2. Associate degree: 1 year
  3. Bachelor's degree: 2 years
  4. Master's degree: 3 years

  • Minimum 4 years of experience with development and maintenance of Data Factory data pipelines.
  • Minimum 4 years of experience with development and maintenance of data pipelines using the Databricks component in Azure.
  • Minimum of 3 years of experience using DevOps CI/CD in Azure.
  • Understanding of data lakehouse architecture in Azure.
  • Experience in developing, maintaining, and supporting database processes using Microsoft SQL Server with emphasis on .NET technologies.
  • Proficient at an expert level with the following: T-SQL, SSRS, SSIS, SSAS, Data warehousing, ETL, Python.
  • Demonstrated ability to perform above listed essential job functions.
  • Ability to work in a dynamic environment with changing requirements.
  • Good communication and presentation skills.
  • Working experience with a wide range of data technologies, data modeling, and metadata management.
  • Working experience using SQL and Python.

Position Details

Posted: Feb 10, 2025

Employment: Full-time

Salary: Not Available

Snaprecruit ID: SD-CIE-c5b9c80ef9bc4425fd78d270b2fefe5b30771392d00f874401306a2bef392157

City: Kansas City

Job Origin: CIEPAL_ORGANIC_FEED
