ETL Developer

Full-time in New Jersey / United States

JOB TITLE:
ETL Developer
JOB TYPE:
Full-time

JOB SKILLS:
JOB LOCATION:
Trenton, New Jersey / United States

JOB DESCRIPTION:
Role: ETL Developer
Location: Trenton, NJ
Duration: 6 Months
Job Description:
-Extract, transform, and load data from various databases and
sequential files.
-Create ETL flows to integrate on-premises data into AWS S3 buckets in
the cloud.
-Use relevant test types and develop a test strategy for all data
integration scenarios.
-Create ETL design documents that support best practices and
development standards.
-Create a flow to load data from Amazon S3 into Redshift using AWS Glue
or other ETL tools (a sketch follows this list).
-Develop and maintain scheduling and sequence jobs with complex
dependencies.
-Create a staging strategy for optimal reusability and performance.
-Integrate data from multiple source systems into the data warehouse
using SCD Type 1 and Type 2 approaches.
-Write UNIX commands to load the data.
-Develop DataStage Parallel Extender jobs using stages such as
Aggregator, Join, Merge, Lookup, Source Dataset, External Filter, Row
Generator, Column Generator, Change Capture, Copy, Funnel, Sort, and
Peek.
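
As a minimal sketch of the S3-to-Redshift load described above, the snippet below
issues a COPY through the boto3 Redshift Data API from Python. The cluster,
database, user, bucket, table, and IAM role names are placeholders, and the COPY
options would depend on the actual file format in the landing zone.

import boto3

# Minimal sketch: run a COPY from S3 into Redshift through the Redshift Data API.
# The cluster, database, user, table, bucket, and IAM role below are placeholders.
client = boto3.client("redshift-data", region_name="us-east-1")

copy_sql = """
    COPY analytics.stg_customer
    FROM 's3://example-etl-bucket/landing/customer/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS CSV
    IGNOREHEADER 1;
"""

response = client.execute_statement(
    ClusterIdentifier="example-cluster",
    Database="dev",
    DbUser="etl_user",
    Sql=copy_sql,
)

# The Data API is asynchronous, so poll until the COPY finishes (or fails).
status = client.describe_statement(Id=response["Id"])
print(status["Status"])

In practice the same COPY could be scheduled from an AWS Glue job or another
orchestrator; the key design choice is letting Redshift pull the files from S3 in
parallel rather than pushing rows through the client.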
Qualifications:
-Must possess experience and skills in understanding data models and
developing ETL jobs.
-Demonstrates the ability to design, code, and test ETL routines.
-Understands the principles of good ETL design.
-Experience with Snowflake, AWS, and ETL processes in AWS is preferred.
-Intermediate to advanced experience with Red Hat Satellite, including
systems provisioning, content view management, host group management,
Discovery, and Capsule management.
-Experience working with AWS big data technologies (Redshift, S3, EMR).
-Experience loading data from S3 into Redshift.
-Demonstrates detailed knowledge and understanding of large enterprise
data warehouses and data marts.
-Expert in data integration and metadata management.
-Excellent knowledge of SQL and PL/SQL, with the ability to extract,
manipulate, and integrate enterprise data, develop reports, and expand
reporting capability using SQL back-end mining tools.
-Experience coding dimension and fact tables, with a general
understanding of dimensional modeling and star schemas (a rough SCD
Type 2 sketch follows this list).
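
As a rough illustration of the SCD Type 2 approach mentioned earlier, the sketch
below first expires changed dimension rows and then inserts new current versions,
again through the boto3 Redshift Data API. The dim_customer and stg_customer
tables and their columns are hypothetical placeholders, not part of this posting.

import boto3

# Rough SCD Type 2 sketch: expire changed dimension rows, then insert new
# current versions. All table and column names here are placeholders.
client = boto3.client("redshift-data", region_name="us-east-1")

expire_changed_rows = """
    UPDATE analytics.dim_customer
    SET effective_to = CURRENT_DATE, is_current = FALSE
    FROM analytics.stg_customer s
    WHERE dim_customer.customer_id = s.customer_id
      AND dim_customer.is_current = TRUE
      AND (dim_customer.customer_name <> s.customer_name
           OR dim_customer.segment <> s.segment);
"""

insert_new_versions = """
    INSERT INTO analytics.dim_customer
        (customer_id, customer_name, segment,
         effective_from, effective_to, is_current)
    SELECT s.customer_id, s.customer_name, s.segment,
           CURRENT_DATE, NULL, TRUE
    FROM analytics.stg_customer s
    LEFT JOIN analytics.dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL;
"""

# batch_execute_statement runs the statements in order in a single request.
client.batch_execute_statement(
    ClusterIdentifier="example-cluster",
    Database="dev",
    DbUser="etl_user",
    Sqls=[expire_changed_rows, insert_new_versions],
)

Because the UPDATE runs before the INSERT, customers whose attributes changed have
their old row closed out and a fresh current row added in the same pass, while
brand-new customers simply get their first current row.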
Preferred Skills:
-4+ years of DataStage ETL development experience; version 11.5 or
above is preferred. Should be able to design and develop data
extraction/integration solutions using DataStage.
-2 years of experience with AWS S3 and Redshift required.
-Strong professional experience loading data from S3 into Redshift.
-2 years of expertise in designing/implementing star and snowflake
schema data warehousing concepts (preferred).
-3 years of hands-on experience writing and understanding complex SQL.
-Able to diagnose and resolve performance issues.

Position Details

Jun 19, 2021
Full-time
S16174664993656438
Trenton, New Jersey / United States