
Data Bricks

  • Posted on: Jan 05, 2026
  • Cloudious LLC
  • Jersey City, New Jersey
  • Salary: Not Available
  • CTC


Job Title: Data Bricks

Job Type: CTC

Job Location: Jersey City, New Jersey, United States

Remote: No

Job Description:

Boston/Jersey City
Contract: 6 months+
8-12 years of experience

  • Design, develop, and maintain scalable data pipelines and ETL processes using Python, Databricks, and AWS Glue (see the sketch after this list).
  • Build and optimize data warehouses and data lakes on PySpark and AWS platforms.
  • Implement data ingestion, transformation, and orchestration workflows using AWS Step Functions, Lambda, and Glue Jobs.
  • Monitor, troubleshoot, and optimize pipeline performance using AWS CloudWatch and other monitoring tools.
  • Collaborate with data analysts, data scientists, and business stakeholders to define data requirements and deliver reliable datasets.
  • Ensure data quality, governance, and security across all data systems and workflows.
  • Automate repetitive data engineering tasks and contribute to building reusable frameworks and templates.
  • Support continuous improvement by adopting best practices for CI/CD, version control, and infrastructure-as-code (IaC).
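
For illustration only, here is a minimal PySpark ETL sketch of the kind of pipeline this role describes: ingest raw files, cleanse and type the data, and write a curated, partitioned dataset. The bucket paths, column names, and app name are assumptions for the example, not details from the posting.

```python
# Hypothetical PySpark ETL sketch (illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: read raw CSV files from an assumed S3 landing zone.
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: deduplicate, cast types, derive a partition column, drop bad rows.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

# Load: write partitioned Parquet to an assumed curated zone.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))
```

The same transformation could equally be packaged as an AWS Glue job or a Databricks notebook task and orchestrated with Step Functions and Lambda, as the bullets above describe.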

Position Details

Posted: Jan 05, 2026

Employment: CTC

Salary: Not Available

City: Jersey City

Job Origin: CIEPAL_ORGANIC_FEED

