Data Engineer Telecommute

  • Posted on: Nov 26, 2024
  • Goli Tech
  • TELECOMMUTE, Minnesota
  • Salary: Not Available
  • Full-time


Job Title : Data Engineer Telecommute

Job Type : Full-time

Job Location : TELECOMMUTE, Minnesota, United States

Remote : No

Job Description :

100% telecommute

Hours: Regular daytime hours with some flexibility; must be available for most Central Time working hours, with a heavier focus on morning availability.

Project:

  • Working with customers to design data extracts based on their reporting needs
  • Collaborating with IT DevOps partners to ingest extracts into a data warehouse
  • Creating scalable data pipelines for reporting teams from the data warehouse to our BI tool

Team:

  • We are a team of 9 Data Engineers and Analysts, plus Engineering Managers/Leaders
  • We have 1 product owner on the team and work with several others directly
  • We have a Scrum Master aligned as well

Responsibilities:

  • Architect all phases of software engineering including requirements analysis, application design, code development and testing with a focus on business intelligence dataset development
  • Design reusable components, frameworks, and libraries
  • Contribute and make recommendations to the design and architecture to enable secure, scalable, and maintainable solutions
  • Clearly articulate the implications of design/architectural decisions, issues and plans to technology leadership
  • Work collaboratively with People Analytics on the development and production of standard datasets to drive actionable decision making and reporting stability
  • Conduct design and code reviews to ensure the code developed meets business needs and follows best-practice guidelines for coding, unit testing, security, scalability, and maintainability
  • Work very closely with architecture groups and drive solutions
  • Assist with updating infrastructure components
  • Use engineering best practices following an Agile methodology to deliver high-quality emerging tech solutions
  • Communicate with impact: influence and negotiate effectively with all internal and external stakeholders to achieve win-win solutions that advance organizational goals
  • Grow and maintain knowledge of and leverage emerging technologies
  • Develop and analyze highly complex system standards, thresholds, and recommendations to maximize system performance
  • Analyze project requirements and develop detailed specifications for new data warehouse reporting requirements
  • Research API calls and make necessary changes to meet business, contractual, security, and performance needs
  • Assess and interpret customer requests for feasibility, priority, and complexity
  • Create and maintain internal process documentation
  • Support projects and change initiatives aligned to key priorities of People Analytics and People Analytics customers
  • Understand priorities and organize prescribed and non-prescribed work to meet or exceed deadlines and expectations
  • Proactively keep data secure and decommission legacy content in our environment
  • Serve as a resource to others within the People Analytics community; mentor other data engineers; provide explanations and information to others on difficult issues, problems, and solutions
  • Work with minimal guidance; seek guidance only on the most complex tasks
  • Coach, provide feedback, and guide others within the People Analytics community

Required Qualifications:

  • 3+ years of data engineering experience
  • 3+ years of full-lifecycle application and software development experience
  • 3+ years of experience with a modern programming language such as Python, Java, or Scala
  • 2+ years of SDLC experience in an Agile environment
  • Working knowledge of the following business and technology concepts: APIs, CI/CD, Big Data, data architecture and governance
  • Experience with Cloud technologies and platforms such as Docker, OSE, Kubernetes, AWS, Snowflake, and Azure
  • Experience with Jenkins, GitHub, and Big Data technologies such as Spark
  • Experience using IDEs such as Eclipse, JBoss, IntelliJ
  • Relational database experience
  • Experience ingesting and working with large and complex datasets
  • Experience gathering requirements from end users

Preferred Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or Technology
  • Experience with People Data
  • Experience with disaster and recovery models
  • DevOps experience
  • Experience creating user stories in an Agile tool using the Gherkin format

Required Skills : Data Analysis

Basic Qualification :

Additional Skills : Data Engineer

This is a high PRIORITY requisition. This is a PROACTIVE requisition.

Background Check : No

Drug Screen : No

Position Details

Posted : Nov 26, 2024

Employment : Full-time

Salary : Not Available

Snaprecruit ID : SD-CIE-32f21e81fcc1feb7ba4ba951336a847e1db728c8879a860afe6f2c319cda69e2

City : TELECOMMUTE

Job Origin : CIEPAL_ORGANIC_FEED
