Job Description:
Position: Big Data Architect
Location: Chicago, IL
Mode: Full time
Salary Range: As per market
Relocation allowance will be provided.


Role Summary/Purpose:
The candidate will build, optimize, and maintain conceptual and logical database models in the enterprise data lake. The data architect will serve as a vital bridge between business applications and the technology architecture (TOGAF Architecture Development Method). The data architect will develop database solutions to ensure company information is stored effectively and securely, and will work closely with IT and functional teams to recommend database structures based on the data storage and retrieval needs within each data domain. The Data Architect will need to develop intimate knowledge of SYF's key data domains (originations, loan activity, collections, etc.) and maintain a holistic view across SYF functions to eliminate redundancies in data and data processing.

Essential Responsibilities:
Define logical and physical data model structures to store, integrate, govern, and maintain data in a secure and efficient manner while maintaining accuracy of the data in the enterprise data lake.
Map data to information entities that define how information should flow and be consumed by various business functions and IT customers. Create and own Data Flow Diagrams for data movement.
Assist in strategy to bring the existing data models and their transformational logic from legacy warehouses to a modern big data platform to support analytics, reporting, machine learning and AI applications.
Create source-to-target mappings between legacy warehouses and the future state in the data lake for various business domains.
Maintain integrity in data derivation across functions to ensure a consistent view.
Identify opportunities to reuse existing data structures.
Identify the impact of new or changed requirements on existing models and solutions while maintaining the integrity of the models.
Establish mechanisms to validate source data populated into integrated data repositories.
Work as one team with peer architects, modelers, and data engineers.
Work in collaboration with Provisioning / ETL architect, System Architect and Reporting architects to strategize and execute on target data architecture, standardize and manage key dimensions.
Lead data governance activities, including proper cataloging of data and capture of end-to-end lineage.
Provide insight into the changing database storage and utilization requirements for the company and offer suggestions for solutions.
Develop database design and architecture documentation for the management and executive teams.
Help maintain the integrity and security of the company data warehouses and the data lake.

Qualifications/Requirements:
Bachelor’s Degree in Computer Engineering or related field required (Master’s degree preferred)
5+ years' experience with data warehouses and big data platforms, including hands-on experience in rigorous data analysis using SQL in Oracle and various Hadoop technologies.
Hands-on experience with data modeling of refined data layers on top of raw data.
Performance Optimization of the physical models in the data lake.
2+ years' programming experience in ETL tools (Ab Initio or Informatica highly preferred), with the ability to read and reverse engineer the logic in Ab Initio graphs.
Working knowledge of Hive, Spark, Kafka and other Data Lake technologies.
Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution.
Experience analyzing system requirements and implementing migration methods for existing data.
Familiarity with predictive analysis and data visualization techniques using relevant tools (e.g., Tableau, D3.js, and R) required.
Experience working iteratively in a fast-paced agile environment.
Desired Characteristics:
Credit card/payment experience
Background in Financial Services
Some familiarity with TOGAF or another leading data architecture framework
Demonstrated experience building strong relationships with colleagues and partners