Find AWS Data Engineer III Job in Charlotte, North Carolina | Snaprecruit


AWS Data Engineer III

  • Posted on: Oct 15, 2024
  • Company: Talent Grub USA Inc.
  • Location: Charlotte, North Carolina
  • Salary: Not Available
  • Full-time


Job Title :

AWS Data Engineer III

Job Type :

Full-time

Job Location :

Charlotte, North Carolina, United States

Remote :

No

Job Description:

Client: Utility & Energy

Role: AWS Data Engineer IV

Location: Charlotte, NC

Duration: 24-36 Months

Core Technical Skills:

  • 5+ years of AWS experience
  • AWS services - S3, EMR, Glue Jobs, Lambda, Athena, CloudTrail, SNS, SQS, CloudWatch, Step Functions, QuickSight
  • Experience with Kafka/messaging, preferably Confluent Kafka
  • Experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB and Aurora
  • Experience with AWS data warehousing tools such as Amazon Redshift and Amazon Athena
  • Proven track record in the design and implementation of data warehouse solutions using AWS
  • Skilled in data modeling and executing ETL processes tailored for data warehousing
  • Competence in developing and refining data pipelines within AWS
  • Proficient in handling both real-time and batch data processing tasks
  • Extensive understanding of database management fundamentals
  • Expertise in creating alerts and automated solutions for handling production problems
  • Tools and languages: Python, Spark, PySpark, and Pandas
  • Infrastructure-as-Code tooling: Terraform/CloudFormation
  • Experience with a secrets management platform such as Vault or AWS Secrets Manager
  • Experience with Event Driven Architecture
  • DevOps pipeline (CI/CD); Bitbucket; Concourse
  • Experience with RDBMS platforms and strong proficiency with SQL
  • Experience with REST APIs and API Gateway
  • Deep knowledge of IAM roles and Policies
  • Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events
  • Deep understanding of networking: DNS, TCP/IP, and VPN
  • Experience with AWS workflow orchestration tools such as Airflow or Step Functions
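To illustrate the real-time and batch processing skills listed above, here is a minimal sketch of a tumbling-window aggregation in plain Python. It is a hypothetical example for context only (the event shape and window size are assumptions, not part of the client's stack); in practice this logic would run in PySpark, Kafka Streams, or a Lambda consumer.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group events into fixed-size (tumbling) time windows and count them.

    `events` is an iterable of (epoch_seconds, payload) tuples -- a
    simplified stand-in for records arriving from a Kafka topic or an
    SQS queue.
    """
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = ts - (ts % window_seconds)  # floor to window boundary
        counts[window_start] += 1
    return dict(counts)

# Batch mode is the same function applied to a full historical extract
# instead of a live stream.
events = [(0, "a"), (30, "b"), (65, "c"), (130, "d")]
print(tumbling_window_counts(events))  # {0: 2, 60: 1, 120: 1}
```

The same windowing function serves both modes, which is the usual reason batch and streaming skills are listed together in roles like this one.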

AWS Data Engineer IV Additional Technical Skills (nice to have, but not required for the role)

  • Experience with native AWS technologies for data and analytics such as Kinesis and OpenSearch
  • Databases: DocumentDB, MongoDB
  • Hadoop platform (Hive; HBase; Druid)
  • Java, Scala, Node.js
  • Workflow automation
  • Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
  • Strong background in Kubernetes, distributed systems, microservice architecture, and containers

Core Responsibilities

  • Provides technical direction, guides the team on key technical aspects, and is responsible for product tech delivery
  • Leads the design, build, test, and deployment of components, where applicable in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
  • Understands requirements and use cases to outline technical scope and leads delivery of the technical solution
  • Confirms required developers and skillsets specific to the product
  • Provides leadership, direction, peer review and accountability to developers on the product (key responsibility)
  • Works closely with the Product Owner to align on delivery goals and timing
  • Assists Product Owner with prioritizing and managing team backlog
  • Collaborates with Data and Solution Architects on key technical decisions, and on the architecture and design needed to deliver the requirements and functionality
  • Skilled in developing data pipelines, focusing on long-term reliability and maintaining high data quality
  • Designs data warehousing solutions with the end-user in mind, ensuring ease of use without compromising on performance
  • Manages and resolves issues in production data warehouse environments on AWS

Core Experience and Abilities

  • Ability to perform hands-on development and peer review for certain components/tech stack on the product
  • Stands up development instances and migration paths (with required security and access/roles)
  • Develop components and related processes (e.g. data pipelines and associated ETL processes, workflows)
  • Lead implementation of integrated data quality framework
  • Ensures optimal framework design and load testing scope to optimize performance (specifically for Big Data)
  • Supports data scientists with testing and validation of models
  • Performs impact analysis and identifies risk to design changes
  • Ability to build new data pipelines, identify existing data gaps and provide automated solutions to deliver analytical capabilities and enriched data to applications
  • Ability to implement data pipelines with appropriate attention to durability and data quality
  • Implements data warehousing products with the end user's experience in mind (ease of use with the right performance)
  • Ensures test-driven development
  • 5+ years of experience leading teams to deliver complex products
  • Strong technical skills and communication skills
  • Strong skills with business stakeholder interactions
  • Strong solutioning and architecture skills
  • 5+ years of experience building real-time, event-driven data ingestion streams
  • Ensure data security and permissions solutions, including data encryption, user access controls and logging
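The "integrated data quality framework" responsibility above can be sketched as a small rule-based validator that splits records into accepted and rejected sets. This is a hypothetical illustration only; the record shape, field names, and rules are assumptions, not the client's actual framework.

```python
def validate_records(records, required_fields, non_negative_fields=()):
    """Apply simple data quality rules to a batch of dict records.

    Rules: every field in `required_fields` must be present and non-None,
    and every numeric field in `non_negative_fields` must be >= 0.
    Returns (valid, rejects), where each reject is paired with the
    reasons it failed -- the basis for alerting and automated reprocessing.
    """
    valid, rejects = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) is None]
        negative = [f for f in non_negative_fields
                    if isinstance(rec.get(f), (int, float)) and rec[f] < 0]
        if missing or negative:
            rejects.append((rec, {"missing": missing, "negative": negative}))
        else:
            valid.append(rec)
    return valid, rejects

# Hypothetical utility-meter readings (kwh field name is an assumption).
rows = [
    {"id": 1, "kwh": 42.0},
    {"id": 2, "kwh": -5.0},    # fails the non-negative rule
    {"id": None, "kwh": 7.0},  # fails the required-field rule
]
valid, rejects = validate_records(rows, required_fields=("id", "kwh"),
                                  non_negative_fields=("kwh",))
print(len(valid), len(rejects))  # 1 2
```

In a Glue or Step Functions pipeline, the rejects would typically land in a quarantine S3 prefix with an SNS alert, while valid records continue downstream.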

Position Details

Posted:

Oct 15, 2024

Employment:

Full-time

Salary:

Not Available

Snaprecruit ID:

SD-CIE-6637838e12375a0d34a7ecbde3d406f25e54a3f93db029dc88a763cab69ce742

City:

Charlotte

Job Origin:

CIEPAL_ORGANIC_FEED
