Job Description:

Big Data and Hadoop Engineer/Administrator
Dallas, TX
12 months

Required Skills: Hadoop (able to create a framework from scratch), Spark or Python, Oracle, some DBA experience as well

Required:
* 2-3 years creating, maintaining and managing Hadoop clusters
* 4-7 years of development experience centered around big data applications and ad hoc transformation of unstructured raw data
* 1-2 years of relational DBA experience, preferably with Oracle and MySQL, and preferably SQL Server
* Design, build, and maintain Big Data workflows/pipelines to process continuous streams of data, with experience in the end-to-end design and build of Near-Real-Time and Batch Data Pipelines
* Demonstrated work experience with Big Data and distributed programming models and technologies
* Knowledge of database structures, theories, principles and practices (both SQL and NoSQL)
* Active development of ETL processes using Spark or other highly parallel technologies, and implementation of ETL/data pipelines
* Experience with data technologies and Big Data tools such as Spark, Kafka and Hive
* Understanding of MapReduce and other data query, processing and aggregation models
* Understanding of the challenges of transforming data across a distributed, clustered environment
* Experience with techniques for consuming, holding and aging out continuous data streams
* Ability to provide quick ingestion tools and corresponding access APIs for continuously changing data schemas, working closely with Data Engineers on specific transformation and access needs

Preferred:
* Experience as an Oracle Database Administrator (DBA); will be responsible for keeping the critical tools database up and running
* Building and managing high-availability environments for databases and HDFS systems
* Familiarity with transaction recovery techniques and database backup

Technical qualifications and experience level:
At least 10 years of combined proven working experience as a Spark/Big Data developer, relational DBA and Hadoop admin
1. 5-10 years in development using Java, Python, Scala, and object-oriented approaches in designing, coding, testing, and debugging programs
2. Ability to create simple scripts and tools
3. Development of cloud-based, distributed applications
4. Understanding of clustering and cloud orchestration tools
5. Working knowledge of database standards and end-user applications
6. Working knowledge of data backup, recovery, security, integrity and SQL
7. Familiarity with database design, documentation and coding
8. Previous experience with DBA CASE tools (frontend/backend) and third-party tools
9. Understanding of distributed file systems and their optimal use in the commercial cloud (HDFS, S3, Google File System, Datastax Delta Lake)
10. Familiarity with programming language APIs
11. Problem-solving skills and the ability to think algorithmically
12. Working knowledge of administering RDBMS/ORDBMS, preferably Oracle and MySQL
13. Working knowledge of Hadoop administration
14. Knowledge of SDLC (Waterfall, Agile and Scrum)
15. BS degree in a computer discipline or relevant certification

             
