Technical Data Analyst

  • Company: Cloudious LLC
  • Location: San Jose, California
  • Job Type: CTC
  • Salary: $90 per hour
  • Posted on: Sep 13, 2024

Technical Data Analyst   

JOB TITLE:

Technical Data Analyst

JOB TYPE:

CTC

JOB LOCATION:

San Jose, California, United States

REMOTE:

No

JOB DESCRIPTION:

Title: Technical Data Analyst
Location: San Jose, CA (local candidates preferred)
Type: Contract
*** Hybrid role (2-3 days a week)
Job Description:
The Opportunity
The client is looking for a Technical Data Analyst with hands-on experience migrating Sales users and Sales data (Accounts, Opportunities, Contacts) from Salesforce (SFDC) to Dynamics 365. The candidate should also have strong experience in data profiling, data integration, and transformations, and should understand how data impacts downstream systems such as Sales and Finance data warehouses/data lakes and reporting.
What You'll Do
  • Work with Adobe internal data teams and business teams to decipher the data-related requirements for the project.
  • Bring hands-on experience understanding the SFDC-to-Dynamics field mappings and data models (a simplified mapping sketch follows this list).
  • Design an end-to-end strategy to stitch together objects from various systems and financial KPIs.
  • Build integrations between SFDC, Dynamics, SAP ECC, and DBX for Power BI reporting and financial metrics.
  • Leverage data sources across the enterprise to build sophisticated and insightful analyses and data models for Sales, Finance, and Marketing.
  • Bring hands-on experience migrating Marketing and Sales data and users from SFDC to Dynamics.
  • Work with the Product Managers to build detailed data requirements/specifications for Engineering teams to build the solution in downstream data management and reporting systems.
  • Understand the migration challenges from similar experiences and build creative solutions to help migrate data from SFDC to Dynamics.
  • Consolidate requirements and suggest new reporting capabilities for analysis using advanced BI techniques and tools.
  • Proactively collaborate with various product managers to bring a perspective on all data we work on.
  • Conduct QA testing and validations and provide inputs to the Engineering teams along with the PdMs.
  • Support release planning, scheduling backlog items into regular releases aligned to business priority while working with the PdMs.
  • Support production cutover and production acceptance testing.
  • Support post-go-live sessions with the business, addressing and driving technical issues raised during Hyper Care, together with the PdMs and Engineering teams.
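
For illustration only, here is a minimal sketch of the kind of SFDC-to-Dynamics field mapping and record transformation this work involves. The object names, field names, and user-mapping lookup are hypothetical placeholders, not the client's actual data model.

```python
# Illustrative sketch only: a minimal SFDC-to-Dynamics 365 field mapping and
# record transformation. All object, field, and lookup names are hypothetical
# placeholders, not the client's actual data model.

# Map Salesforce Account fields to Dynamics 365 account attributes.
ACCOUNT_FIELD_MAP = {
    "Id": "sfdc_legacy_id",        # keep the source key for reconciliation
    "Name": "name",
    "BillingCountry": "address1_country",
    "AnnualRevenue": "revenue",
    "OwnerId": "ownerid",          # re-keyed below via a user mapping
}

def transform_account(sfdc_record: dict, user_map: dict) -> dict:
    """Translate one SFDC Account row into a Dynamics-shaped record."""
    target = {dst: sfdc_record.get(src) for src, dst in ACCOUNT_FIELD_MAP.items()}
    # Owners must be re-keyed from SFDC user Ids to Dynamics systemuser Ids.
    target["ownerid"] = user_map.get(sfdc_record.get("OwnerId"))
    return target

# Example usage with toy data:
if __name__ == "__main__":
    sfdc_row = {"Id": "001A", "Name": "Acme Corp", "BillingCountry": "US",
                "AnnualRevenue": 1200000, "OwnerId": "005X"}
    user_map = {"005X": "guid-of-dynamics-user"}
    print(transform_account(sfdc_row, user_map))
```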
Qualifications
  • Requires a bachelor's degree. Preferred candidates will have a major in computer science, an MBA from a reputable institution, or equivalent experience.
  • 4+ years of data analytics, data BSA, or data product management experience with a solid understanding of how to deliver data solutions in an agile environment.
  • Strong proficiency in SQL/SparkSQL/Python to query and manipulate large data sets (see the profiling sketch after this list).
  • Experience with platforms like Databricks, Power BI, and Tableau.
  • You are a self-starter, independent, and hard-working, with a high degree of motivation to go above and beyond the task at hand. You anticipate and creatively implement next steps in a complex environment.
  • You have mastered the ability to influence outcomes and to navigate and mediate to consensus with integrity. You possess great interpersonal communication, presentation, and social skills, and a solid sense of humor.
  • Data requirement writing skills: collecting, prioritizing, and gathering input from multiple sources, providing accurate requirements with attention to detail.
  • You already know or can rapidly learn enterprise application capabilities in order to deliver transaction- and event-driven data solutions (examples: SAP/HANA, MS Dynamics or Salesforce data, ADLS/Hadoop/Databricks data lake/lakehouse solutions, and/or Kafka streams).
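
As a small illustration of the SQL/SparkSQL proficiency described above, here is a hypothetical pre-migration data-profiling query of the sort run against a source extract. The table name, columns, and sample rows are made up for the sketch.

```python
# Illustrative sketch only: basic pre-migration profiling in PySpark/SparkSQL.
# Table name, columns, and sample rows are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("profiling_sketch").getOrCreate()

# Tiny stand-in for an SFDC Account extract; in practice this would be read
# from the lake (e.g. a Databricks/ADLS table) rather than built inline.
accounts = spark.createDataFrame(
    [("001A", "Acme Corp"), ("001B", None), ("001A", "Acme Corp")],
    ["Id", "Name"],
)
accounts.createOrReplaceTempView("sfdc_accounts_raw")

# Row count, null rate on a required field, and duplicate keys -- the checks
# that flag records likely to fail or collide on load into Dynamics.
profile = spark.sql("""
    SELECT
        COUNT(*)                                      AS total_rows,
        SUM(CASE WHEN Name IS NULL THEN 1 ELSE 0 END) AS null_names,
        COUNT(*) - COUNT(DISTINCT Id)                 AS duplicate_ids
    FROM sfdc_accounts_raw
""")
profile.show()
```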

Position Details

POSTED:

Sep 13, 2024

EMPLOYMENT:

CTC

SALARY:

$90 per hour

SNAPRECRUIT ID:

SD-017cddb21f9cbf707cbac036ca529cdd444ed1ab2aa6f427766faf4cabcd303c

CITY:

San Jose

Job Origin:

CIEPAL_ORGANIC_FEED
