Data Architect

In Connecticut / United States

JOB TITLE: Data Architect
JOB TYPE:
JOB SKILLS:
JOB LOCATION: Hartford, Connecticut / United States

JOB DESCRIPTION:

Title: Data Architect

Location: Hartford, CT (Remote right now)

Duration: 6 - 12+ Months

Mandatory Skills: Talend, PySpark, Redshift

Job Description:

  • The Senior Data Architect will manage and provide expertise in data ingestion, wrangling, and cleansing technologies. In this role they will work with relational and unstructured data formats to create analytics-ready datasets for analytics solutions.
  • The Senior Data Architect will partner with the Data Analytics team to understand their data needs and build data pipelines using cutting-edge technologies.
  • They will perform hands-on development to create, enhance, and maintain data solutions that enable seamless integration and flow of data across our data ecosystem.
  • These projects will include designing and developing data ingestion and processing/transformation frameworks leveraging open-source tools such as Python, Spark, and PySpark (a minimal sketch follows this list).
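
For illustration only, a minimal PySpark sketch of the kind of ingestion and transformation framework described above. The bucket paths, column names, and schema are hypothetical placeholders and are not taken from this posting.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest_orders").getOrCreate()

    # Ingest semi-structured JSON landed in a (hypothetical) S3 landing zone.
    raw = spark.read.json("s3://example-landing-zone/orders/*.json")

    # Wrangle/cleanse: drop malformed rows, normalize types, deduplicate.
    clean = (
        raw.dropna(subset=["order_id", "order_ts"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"])
    )

    # Write an analytics-ready dataset, partitioned for downstream consumers.
    (clean.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://example-curated-zone/orders/"))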

Responsibilities:

  • Translate data and technology requirements into our ETL/ELT architecture.
  • Develop real-time and batch data ingestion and stream-analytic solutions leveraging technologies such as Kafka, Apache Spark, Java, NoSQL databases, and AWS EMR (see the streaming sketch after this list).
  • Develop data-driven solutions utilizing current and next-generation technologies to meet evolving business needs.
  • Develop custom cloud-based data pipelines.
  • Provide support for deployed data applications and analytical models by identifying data problems and guiding issue resolution with partner data engineers and source data providers.
  • Provide subject matter expertise in the analysis and preparation of specifications and plans for the development of data processes.

Qualifications:

  • Strong experience with data ingestion, gathering, wrangling, and cleansing tools such as Apache NiFi, Kylo, scripting, Power BI, Tableau, and/or Qlik.
  • Experience with data modeling, data architecture design, and large-scale data ingestion from complex data sources.
  • Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
  • Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
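
For illustration only, a minimal Spark Structured Streaming sketch of the real-time Kafka ingestion and stream-analytics work mentioned above. The broker address, topic, schema, and output/checkpoint locations are hypothetical, and the job assumes the spark-sql-kafka connector package is available.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = SparkSession.builder.appName("stream_events").getOrCreate()

    # Hypothetical schema for the JSON payloads on the topic.
    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("event_ts", TimestampType()),
        StructField("amount", DoubleType()),
    ])

    # Read a stream of events from Kafka and parse the JSON payload.
    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")
             .option("subscribe", "events")
             .load()
             .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
             .select("e.*")
    )

    # Simple stream-analytic step: 5-minute tumbling-window aggregates.
    agg = (
        events.withWatermark("event_ts", "10 minutes")
              .groupBy(F.window("event_ts", "5 minutes"))
              .agg(F.count("*").alias("event_count"),
                   F.sum("amount").alias("total_amount"))
    )

    # Sink the aggregates; a production pipeline might land them in S3 or Redshift.
    query = (
        agg.writeStream
           .outputMode("append")
           .format("parquet")
           .option("path", "s3://example-curated-zone/event_aggregates/")
           .option("checkpointLocation", "s3://example-checkpoints/event_aggregates/")
           .start()
    )
    query.awaitTermination()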

Position Details

Apr 17, 2021
S16174660446242646
Hartford, Connecticut / United States
