Data Engineer Software Engineer Phoenix And

  • Posted on: Jan 02, 2026
  • Amicis Global
  • Scottsdale, Arizona
  • Salary: Not Available
  • Full-time


Job Title: Data Engineer Software Engineer Phoenix And
Job Type: Full-time
Job Location: Scottsdale, Arizona, United States
Remote: No

Job Description:

Job Title: Software Engineer
Location: Phoenix and Scottsdale, AZ 85255 - Hybrid
Duration: 11+ Months
Summary:
The Software Engineer, Data will be responsible for designing, developing, and maintaining robust data solutions that enable efficient storage, processing, and analysis of large-scale datasets. This role focuses on building scalable data pipelines, optimizing data workflows, and ensuring data integrity across systems. The engineer collaborates closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to translate requirements into technical solutions that support strategic decision-making. A successful candidate will have strong programming skills, deep knowledge of data architecture, and experience with modern cloud and big data technologies, while adhering to best practices in security, governance, and performance optimization.

Principal Duties And Responsibilities:
  • Design and implement scalable, reliable, and secure data solutions.
  • Apply proficiency in SQL and data modeling concepts.
  • Develop data models and schemas optimized for performance and maintainability.
  • Build and maintain ETL/ELT pipelines to ingest, transform, and load data from multiple sources.
  • Optimize data workflows for efficiency and cost-effectiveness.
  • Collaborate closely with data analysts and business teams to understand their requirements.
  • Build frameworks for data ingestion pipelines that handle various data sources, including batch and real-time data.
  • Participate in technical decision-making.
  • Design, develop, test, deploy, and maintain data processing pipelines.
  • Design and build scalable, reliable infrastructure with a strong focus on quality, security, and privacy techniques.
  • Communicate complex technical concepts effectively to diverse audiences.
  • Understand business KPIs and translate them into technical solutions.
  • Create detailed technical documentation covering processes, frameworks, best practices, and operational support.
  • Provide constructive feedback during code reviews.

Position Specifications:
  • Bachelor's degree in Computer Science, Computer Engineering, or Information Systems Technology
  • 5+ years of overall experience in software development using Python
  • Strong experience with SQL and database technologies (Relational and NoSQL).
  • Hands-on experience with data pipeline frameworks (e.g., Apache Spark, Airflow, Kafka).
  • Familiarity with cloud platforms (GCP) and data services.
  • Knowledge of data modeling, ETL/ELT processes, and performance optimization.
  • Strong analytical and communication skills, both verbal and written.
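For candidates gauging fit, the ETL/ELT duties above can be illustrated with a minimal sketch. This is purely illustrative, not part of the employer's stack: the `sales` table, column names, and in-memory SQLite target are hypothetical stand-ins for the batch ingest, transform-with-integrity-checks, and idempotent load steps the posting describes.

```python
# A minimal batch ETL sketch: extract CSV records, transform with basic
# integrity checks, and load idempotently. All names here are hypothetical.
import csv
import io
import sqlite3


def extract(raw_csv: str) -> list:
    """Read raw records from a CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list) -> list:
    """Normalize types and drop rows that fail basic integrity checks."""
    out = []
    for r in rows:
        try:
            out.append((r["id"].strip(), float(r["amount"])))
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad rows for review
    return out


def load(rows: list, conn: sqlite3.Connection) -> int:
    """Upsert into the target table so reruns are idempotent."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
        rows,
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]


raw = "id,amount\na1,10.5\na2,oops\na3,7.0\n"
conn = sqlite3.connect(":memory:")
n = load(transform(extract(raw)), conn)
print(n)  # 2 -- the malformed row is skipped by the transform step
```

In production the same shape would typically be orchestrated by a scheduler such as Airflow and scaled out with Spark, per the frameworks listed in the specifications.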

Position Details

Posted: Jan 02, 2026
Employment: Full-time
Salary: Not Available
City: Scottsdale
Job Origin: CIEPAL_ORGANIC_FEED


