Job Description:
Job Title: Big Data Solutions Architect
Rate: $60/hr on C2C all inclusive.
Location: Milwaukee, WI
Duration: 3-month contract-to-hire


IV Process: Two-step process: a 30-minute phone screen followed by a 30-minute onsite panel interview (interview times available May 5th and May 10th); a third 30-minute onsite round with senior leadership may follow.

Project: The client needs help with a large-scale data mapping/data modeling project. This candidate will design the data lake itself and define where information flows to and from.

Ideal candidates will have the following experience:

Bachelor's degree or equivalent in Information Technology, Computer Science, or Computer Engineering
8 years of IT experience
1+ year of experience working on large scale Big Data projects.
2-5 years of experience with Java/J2EE and an understanding of OO concepts
5+ years of experience on large-scale DW or Big Data projects
Understanding of technologies and strategy required to operate and mature an enterprise Data Lake
Experience programming in SQL and Java to perform data query, extract, transformation, and load functions
Broad understanding of BI concepts, best practices and solutions
Experience in the use and design of data warehouses and data marts
Understanding of semantic layer design and architectures to enable analytics and reporting efficiencies
Deep knowledge around the selection and defining of appropriate data model strategies within traditional SQL as well as NoSQL technologies
Deep knowledge of core Hadoop components (HDFS, MapReduce, Hive, HBase, YARN), and substantial awareness of other components in the Big Data stack
Broad understanding of cloud solutions and experience working with the big three providers (Google Cloud, AWS, Azure)
Experience architecting solutions for large data volumes (Terabytes of data) for complex reporting needs
Some experience architecting solutions delivered on NoSQL stores (HBase, Cassandra, MongoDB, etc.)
Broad understanding and experience of real-time analytics, NoSQL data stores, data modeling and data management, and analytical tools, languages, or libraries (e.g., SAS, SPSS, R, Mahout)
Ability to lead initiatives and people toward common goals.
Excellent written and verbal communication, presentation, and analytical skills
Project management experience with agile methodologies (Scrum and/or Kanban)
Experience with the entire Software Development Lifecycle (SDLC)
Ability to quickly learn new technologies/tools, adapt to new environments, and function within a team
Proficiency in Cascading or Linux shell is a plus
The Data Management organization within the client's enterprise is looking for a Solution Architect, Big Data to join their ever-growing team. The Solution Architect is responsible for the design and management of the data warehouse and the processes that support the enterprise's Big Data initiatives. This person must be able to understand complex data relations, business requirements, and data modeling, and to formulate efficient, reliable solutions to difficult problems using modern data architecture technologies aligned with industry best practices.

Primary Responsibilities

Partner with Data Management Team leads to define the architectural vision and direction of a Big Data ecosystem that may comprise a mix of Big Data storage systems such as Hadoop for batch analytics, MPP-based analytical platforms for near-time analytics, and NoSQL databases for online application access
Present solutions for the Data Management organization as well as the users it serves around the use and design of Big Data, BI, and Data Warehouse platforms and supporting technologies
Work with various global business teams in defining problem statements and gathering requirements
Drive use case analysis and architectural design around activities focused on determining how to best meet customer requirements within the tools of the ecosystem
Contribute to detailed project plans and lead technical project scoping and planning
Design and develop automated test cases that verify solution feasibility and interoperability, including performance assessments
Research new technologies and startups in the Big Data and BI space.
Proof-of-concept (POC) solution design and development
Develop benchmarks, verification criteria, and statistical analyses
Ensure that all aspects of the ecosystem's solution architecture are optimized by working with subject matter experts (SMEs) in the areas of technology, information, and application architectures and disciplines
Develop and implement ELT/ETL processes and procedures

Preferred Qualifications
Master's degree in Information Technology, Computer Science, or Computer Engineering
Experience with Microsoft BI tools, specifically around SSAS and Power BI
Experience with the Hadoop technologies specific to Hortonworks
Experience in using Talend for data ingestion and integration on top of Hadoop
             
