Job Description:
BIG DATA ARCHITECT

Location: Milwaukee, WI

Duration: 6+ Months

The Data Management team is looking for a Solution Architect, Big Data to join its ever-growing team. The Solution Architect is responsible for the design and management of the data warehouse and the processes that support the enterprise's Big Data initiatives. This person must be able to understand complex data relationships, business requirements, and data modeling, and to formulate efficient, reliable solutions to difficult problems using modern data architecture technologies aligned with industry best practices.

Primary Responsibilities
Partner with Data Management team leads to define the architectural vision and direction of a Big Data ecosystem that may comprise a mix of Big Data storage systems such as Hadoop for batch analytics, MPP-based analytical platforms for near-real-time analytics, and NoSQL databases for online application access
Present solutions for the Data Management organization as well as the users it serves around the use and design of Big Data, BI, and Data Warehouse platforms and supporting technologies
Work with various global business teams in defining problem statements and gathering requirements
Drive use case analysis and architectural design activities focused on determining how best to meet customer requirements with the tools of the ecosystem
Contribute to detailed project plans and lead technical project scoping and planning
Design and develop automated test cases that verify solution feasibility and interoperability, including performance assessments
Research new technologies and startups in the Big Data and BI space
Design and develop proof-of-concept (POC) solutions
Develop benchmarks, verification criteria, and statistical analyses
Ensure that all aspects of the ecosystem's solution architecture are optimized by working with subject matter experts (SMEs) in the areas of technology, information, and application architectures and disciplines
Develop and implement ELT/ETL processes and procedures


Requirements/Qualifications
Bachelor's degree or equivalent in Information Technology, Computer Science, or Computer Engineering
8+ years of IT experience
1+ year of experience working on large-scale Big Data projects
2-5 years of experience with Java/J2EE and a solid understanding of OO concepts
5+ years of experience on large-scale DW or Big Data projects
Understanding of technologies and strategy required to operate and mature an enterprise Data Lake
Experience programming in SQL and Java to perform data query, extract, transformation, and load functions
Broad understanding of BI concepts, best practices, and solutions
Experience in the use and design of data warehouses and data marts
Understanding of semantic layer design and architectures to enable analytics and reporting efficiencies
Deep knowledge around the selection and defining of appropriate data model strategies within traditional SQL as well as NoSQL technologies
Deep knowledge of core Hadoop components (HDFS, MapReduce, Hive, HBase, YARN) and substantial awareness of other components in the Big Data stack
Broad understanding of cloud solutions and experience working with the three major providers (Google Cloud, AWS, Microsoft Azure)
Experience architecting solutions for large data volumes (terabytes of data) with complex reporting needs
Some experience architecting solutions delivered on NoSQL stores (HBase, Cassandra, MongoDB, etc.)
Broad understanding of and experience with real-time analytics, NoSQL data stores, data modeling and data management, and analytical tools, languages, or libraries (e.g., SAS, SPSS, R, Mahout)
Ability to lead initiatives and people toward common goals
Excellent written and verbal communication, presentation, and analytical skills
Project management experience with agile methodologies (Scrum and/or Kanban)
Experience with the entire Software Development Lifecycle (SDLC)
Ability to quickly learn new technologies/tools, adapt to new environments, and function within a team
Proficiency in Cascading or Linux shell scripting a plus

Preferred Qualifications
Master's degree in Information Technology, Computer Science, or Computer Engineering
Experience with Microsoft BI tools, specifically around SSAS and Power BI
Experience with Hadoop technologies specific to Hortonworks
Experience in using Talend for data ingestion and integration on top of Hadoop

Looking forward to your response.
             
