Job Description:
Sr. Big Data Developer
Location: Dallas, TX
Full Time

Technical Skills:
Apache Kafka/Confluent Kafka
Streaming programming: Spark, Scala, Java
HDFS technologies (HBase, Hive)
Other (optional): Fluentd, Syslog-ng, etc.

Experience with:

Implements and develops a Cloudera Hadoop data-driven platform with advanced capabilities to meet business and infrastructure needs
Leads the discovery, design, and development phases of medium- to large-scale complex projects using an agile approach and security standards
Leads and participates in proofs of concept to prototype and validate ideas, automating platform installation, configuration, and operations processes and tasks (site reliability engineering) for a global events data platform
Contributes to continuous improvement by introducing optimization and efficiency practices in current core service areas (platform and infrastructure)
Works with the offshore team and provides development opportunities for associates
Supports change management and operations for the security events platform in line with ITSM/ITIL standards

Qualifications:

Bachelor's degree in Computer Science, Information Systems, Math, or equivalent training and relevant experience
10+ years of work experience within one or more IT organizations; prior experience in technology engineering and development is a plus
5+ years of advanced Java/Python development experience (Spring Boot/Python server-side components preferred)
2+ years of Hadoop ecosystem experience (HDFS, HBase, Spark, ZooKeeper, Impala, Flume, Parquet, Avro) on high-volume platforms and scalable distributed systems
Experience working with data models, frameworks, and open-source software; RESTful API design and development; and software design patterns
Experience with Agile/Scrum methodologies, FDD (Feature-Driven Development), TDD (Test-Driven Development), Elasticsearch (ELK), automation of SRE for Hadoop technologies, Cloudera, Kerberos, encryption, performance tuning, and CI/CD (continuous integration and deployment)
Capable of full-lifecycle development: user requirements, user stories, development with a team and individually, testing, and implementation
Knowledge of technology infrastructure stacks a plus, including: Windows and Linux operating systems, networking (TCP/IP), storage, virtualization, DNS/DHCP, Active Directory/LDAP, cloud, source control/Git, ALM tools (Confluence, Jira), APIs (Swagger, gateways), and automation (Ansible/Puppet)
Production implementation experience on projects of considerable data size (petabytes) and complexity
Strong verbal and written communication skills, with the ability to be highly effective with both technical and business partners; ability to operate effectively and independently in a dynamic, fluid environment

Please send me your resume, or you can reach me directly.
