
AWS Data Engineer - Snowflake, PySpark & Kafka

  • Posted on: Feb 11, 2025
  • Artmac Soft LLC
  • Dallas, Texas
  • Salary: Not Available
  • Employment: CTC


Job Title: AWS Data Engineer - Snowflake, PySpark & Kafka

Job Type: CTC

Job Location: Dallas, Texas, United States

Remote: No

Job Description:

Who we are:

Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.

Job Description:

Job Title: AWS Data Engineer - Snowflake, PySpark & Kafka

Job Type: W2

Experience: 8 to 10 years

Location: Dallas, Texas

Requirements:

  • 3+ years of experience in data engineering, big data, or cloud data solutions.
  • Experience with Airflow or similar workflow orchestration tools.
  • Knowledge of Terraform, CloudFormation, or other IaC tools.
  • Familiarity with machine learning pipelines and MLOps is a plus.
  • Experience with Apache Kafka for real-time data streaming and event-driven architectures.
  • Strong hands-on experience with AWS services such as AWS Glue, S3, Lambda, and Redshift.
  • Expertise in SQL and working with relational and NoSQL databases.
  • Proficiency in PySpark and Python for data processing and analytics.
  • Hands-on experience with Snowflake, including data modeling, query optimization, and performance tuning.
  • Understanding of CI/CD pipelines, Infrastructure-as-Code (IaC), and DevOps in cloud environments.
  • Strong problem-solving and communication skills.

Responsibilities:

  • Design and implement data models and warehouse solutions in Snowflake for optimized analytics and reporting.
  • Integrate streaming and batch data sources using Kafka and other messaging technologies.
  • Work with cross-functional teams to ingest, clean, transform, and store structured and unstructured data.
  • Optimize performance, scalability, and reliability of data workflows on AWS cloud infrastructure.
  • Implement best practices for data security, governance, and compliance.
  • Monitor, debug, and improve data pipelines to ensure high availability and consistency.

Qualification:

  • Bachelor's degree or equivalent combination of education and experience.

Position Details

Posted: Feb 11, 2025

Employment: CTC

Salary: Not Available

Snaprecruit ID: SD-CIE-be60aa7bdfa241de121a476a3d4849f8bdfba710f9a74ca00c7e0b38155deba7

City: Dallas

Job Origin: CIEPAL_ORGANIC_FEED



