
Data Engineer

  • Posted on: Nov 26, 2024
  • Centstone
  • Telecommute, Minnesota
  • Salary: Not Available
  • Full-time

Job Title: Data Engineer

Job Type: Full-time

Job Location: Telecommute, Minnesota, United States

Remote: No

Job Description:

Job Title: Data Engineer

Location: Telecommute, MN (Remote Work)

Position Type: Contract

Job Description:

Project:
  • Working with customers to design data extracts based on their reporting needs
  • Collaborating with Data Engineers on the data exports required from data pipelines
  • Analyzing data and combining datasets to develop reporting products/solutions
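
For illustration, a minimal sketch of the kind of dataset-combination work described above, in Python with pandas; the extract files and column names (employee_id, term_date, business_unit, report_month) are assumptions, not details from the posting:

    # Illustrative only: join a hypothetical headcount extract with a
    # hypothetical attrition extract and aggregate into a reporting dataset.
    import pandas as pd

    headcount = pd.read_csv("headcount_export.csv")   # assumed: employee_id, business_unit, report_month
    attrition = pd.read_csv("attrition_export.csv")   # assumed: employee_id, term_date

    report = (
        headcount.merge(attrition, on="employee_id", how="left")
                 .assign(is_term=lambda df: df["term_date"].notna())
                 .groupby(["business_unit", "report_month"], as_index=False)
                 .agg(headcount=("employee_id", "nunique"),
                      terminations=("is_term", "sum"))
    )

    report.to_csv("people_analytics_extract.csv", index=False)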

Responsibilities:

  • Architect all phases of software engineering including requirements analysis, application design, code development and testing with a focus on business intelligence dataset development
  • Design reusable components, frameworks, and libraries
  • Contribute and make recommendations to the design and architecture to enable secure, scalable, and maintainable solutions
  • Clearly articulate the implications of design/architectural decisions, issues and plans to technology leadership
  • Work collaboratively with People Analytics on the development and production of standard datasets to drive actionable decision making and reporting stability
  • Conduct design and code reviews to ensure the code developed meets business needs and follows guidelines for coding best practices, unit testing, security, scalability, and maintainability
  • Work very closely with architecture groups and drive solutions
  • Assist with updating infrastructure components
  • Use engineering best practices following an Agile methodology to deliver high-quality emerging tech solutions
  • Communicate with impact: influence and negotiate effectively with all internal and external stakeholders to achieve win-win solutions that advance organizational goals
  • Grow and maintain knowledge of and leverage emerging technologies
  • Develop and analyze highly complex system standards, thresholds, and recommendations to maximize system performance
  • Analyze project requirements and develop detailed specifications for new data warehouse reporting requirements
  • Research API calls and make necessary changes to meet business, contractual, security, and performance needs (a minimal sketch follows this list)
  • Assess and interpret customer requests for feasibility, priority, and complexity
  • Create and maintain internal process documentation
  • Support projects and change initiatives aligned to key priorities of People Analytics and People Analytics customers
  • Understand priorities and organize prescribed and non-prescribed work to meet or exceed deadlines and expectations
  • Proactively keep data secure and decommission legacy content in our environment
  • Serve as a resource to others within the People Analytics community; mentor other data engineers; provide explanations and information to others on difficult issues, problems, and solutions
  • Work with minimal guidance; seek guidance only on the most complex tasks
  • Coach, provide feedback, and guide others within the People Analytics community
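
As a purely illustrative sketch of the API-research responsibility referenced above: a short Python script that pages through a hypothetical REST endpoint. The URL, bearer-token auth, and response fields are assumptions, not an API named in the posting.

    # Illustrative only: explore a paginated REST endpoint before relying on it
    # in a pipeline. Endpoint, auth scheme, and field names are hypothetical.
    import requests

    BASE_URL = "https://api.example.com/v1/workers"   # hypothetical endpoint
    session = requests.Session()
    session.headers.update({"Authorization": "Bearer <token>"})  # placeholder credential

    page, rows = 1, []
    while True:
        resp = session.get(BASE_URL, params={"page": page, "per_page": 100}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break
        rows.extend(batch)
        page += 1

    print(f"Fetched {len(rows)} records across {page - 1} page(s)")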

Required:

  • 3+ years of data engineering experience
  • 3+ years of full-lifecycle application and software development experience
  • 3+ years of experience with a modern programming language such as Python, Java, or Scala
  • 2+ years of SDLC experience in an Agile environment
  • Working knowledge of the following business and technology concepts: APIs, CI/CD, Big Data, data architecture and governance
  • Experience with Cloud technologies and platforms such as Docker, OSE, Kubernetes, AWS, Snowflake, and Azure
  • Experience with Jenkins, GitHub, and Big Data technologies such as Spark
  • Experience using IDEs such as Eclipse, JBoss, IntelliJ
  • Relational database experience
  • Experience ingesting and working with large and complex datasets
  • Experience gathering requirements from end users
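
To make the Spark and large-dataset items above concrete, a minimal PySpark sketch under assumed paths and column names (event_date, employee_id, business_unit); this is illustrative only, not part of the role's actual pipeline.

    # Illustrative only: read a large raw dataset, filter and aggregate it, and
    # write a curated extract. Paths and columns are assumptions.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("curated_people_extract").getOrCreate()

    events = spark.read.parquet("s3://example-bucket/raw/hr_events/")  # hypothetical source

    curated = (
        events
        .filter(F.col("event_date") >= "2024-01-01")
        .groupBy("business_unit")
        .agg(F.countDistinct("employee_id").alias("active_employees"))
    )

    curated.write.mode("overwrite").parquet("s3://example-bucket/curated/people_summary/")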

Preferred:

  • Experience in data extraction, manipulation, and management
  • Experience researching issues, investigating data accuracy, and performing quality checks
  • Project Management skills
  • Ability to work independently and collaborate effectively with others

Position Details

Posted: Nov 26, 2024

Employment: Full-time

Salary: Not Available

Snaprecruit ID: SD-CIE-9cf192ba809d1e3307fcb59b74a19d5fe96da8457fe559ff1438e7d3912c5f47

City: Telecommute

Job Origin: CIEPAL_ORGANIC_FEED

