Job Description:
Bachelor’s degree in Computer Science or equivalent, with a minimum of 8 years of experience.
Hadoop Architect: Experience with Hadoop and an architecture background; the resource will be migrating Teradata to Hadoop. Experience with Teradata, SQL, and Oracle
5+ years’ experience as an architecture or development lead in data analytics, data science, or platform development
Extensive experience designing and implementing complex solutions for distributed systems
10+ years’ experience developing in the Big Data ecosystem (e.g., Hadoop, Hive, Spark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, S3, etc.)
Experience with emerging open source technologies (e.g., Apache NiFi, Apache Ignite, etc.)
Strong understanding of security protocols around authentication and authorization (e.g., Kerberos, LDAP, Sentry, Ranger, etc.)
Expert in data management programming languages such as SQL, PL/SQL, and Python
Experienced leader with excellent communication skills
Excellent written, verbal, and diagramming skills.
Expertise in using PowerPoint to clearly articulate findings and present solutions
Responsibilities:
Work with the LOB business leaders, Enterprise Data Architecture, Information Security and LOB Architects to develop and maintain current and future state architecture for Data Analytics Platforms
Serve as the highest-level technical consultant to internal clients and technical management to ensure conformity with Enterprise Data Architecture
Communicate architectural decisions, plans, goals and strategies
Maintain in-depth knowledge of the organization's technologies and architectures
Maintain in-depth knowledge of IT industry best practices, technologies, architectures, and emerging technologies as they relate to distributed computing and data management
Initiate and deliver technology evaluations and recommendations, especially in the Big Data space.
Design and help develop ETL, replication schemes, and query optimization techniques that support highly varied (structured, semi-structured, and unstructured) and high-velocity (near real-time) data processing and delivery
Ensure the data architecture is optimized for large dataset acquisition, analysis, storage, and cleansing
Develop standards and methodologies for benchmarking, performance evaluation, testing, data security, and data privacy
Recommend and set goals for unified data management practices such as metadata management, provenance management, governance, stewardship, data quality, and lifecycle management
Perform impact analysis, performance tuning, and capacity planning for the enterprise data warehouse and its infrastructure as source systems are added and new integration business rules and logic are introduced
Establish and enforce policies, procedures, standards, methodologies, and metrics for data quality, metadata management, and master data management
Provide recommendations, technical direction and leadership for Hadoop and NoSQL technologies as part of the overall architecture
Participate in regular status meetings to track progress, resolve issues, mitigate risks and escalate concerns in a timely manner
Contribute to the development, review, and maintenance of requirements documents, technical design documents and functional specifications