Job Description: Big Data Platform Engineering

You will perform the day-to-day administrative, engineering, and consulting tasks associated with our Hadoop, Cassandra, MongoDB, Big Data, MDM, and SQL/NoSQL environments. As a member of the Big Data Platform Engineering team, you will manage petabyte-scale Hadoop clusters and implement best practices in code promotion, DevOps, performance tuning, capacity planning, and troubleshooting. You will also have exposure to PostgreSQL, MongoDB, and related technologies. You are expected to have a deep understanding of the Hadoop stack, including HDFS, Java, the JVM, MapReduce, HBase, Hive, Kafka, Pig, DevOps practices, virtualization, distributed systems, and Spark. You will incorporate the requirements of Dell’s customers to improve Hadoop for large enterprise environments. You are expected to work in a very dynamic environment.



Requirements
Experience with Hadoop, Big Data technologies, distributed computing, and parallel programming. Experience with NoSQL databases such as MongoDB and Cassandra strongly desired.
Performance tuning and continuous improvement of Hadoop infrastructure.
Building Hadoop clusters, capacity planning, and expanding and upgrading clusters.
HDFS support and maintenance.
Cluster monitoring and troubleshooting.
Manage and review Hadoop and Big Data tool logs.
Experience with MDM and data governance tools such as Collibra.
Expertise in Java, Python, or Scala.
Experience with Hadoop/Spark and big data analysis.
Expertise in writing scripts and code in multiple languages.
Work with infrastructure teams to install, upgrade, and patch systems as required.
Work with data warehouse, security, network, and storage teams on implementation and troubleshooting.
Good Linux knowledge, including shell scripting and debugging performance issues.
Design, implement, and maintain security at the Hadoop and database levels.
Understanding of security, authentication, and authorization concepts, e.g., encryption, Kerberos, SSL, LDAP.
Experience with data ingestion into Hadoop and Big Data technologies.
Experience with StreamSets and Kafka.
Good verbal and written communication skills.
ETL, ELT, and/or reporting experience.
Knowledge of data science tools and technologies.
Knowledge of storage technologies.


Expected Level and Experience

Skill                            Level      Years’ experience
Hadoop                           Expert     7+
Big Data                         Advanced   5+
SQL/NoSQL                        Expert     5+
Scripting (Java/Python, Shell)   Expert     3+
Unix                             Expert     7+
DevOps                           Advanced   2+
Databases                        Expert     5+
Platform Administration          Expert     6+