Job Description:
Please find the job description below for your reference.

Role – Spark Developer

Location – McLean/Richmond/Plano

Duration – Long Term

Interview – Phone and Skype

Work Authorization – USC/GC/L2 EAD/TN only

Mandatory Skills: Java or Scala, Python, PySpark, Apache Spark, Kafka, AWS/Cloud, Hadoop

Job Description: Spark and Scala developer with a Java or data (ETL) background.
Your experience likely includes several of the following:
- Experience in data management solutions architecture, development, and deployment
- Data mining, machine learning, statistical modeling tools or underlying algorithms
- Proven track record of end-to-end implementation of integrated data management solutions leveraging the Hadoop Ecosystem and Spark with Java or Scala.
- A deep understanding of Core Java; you will be expected to perform complex data transformations in Spark using Scala.
- A strong willingness to learn new technologies and the ability to think critically are required.
- Exposure to the AWS ecosystem, preferably with hands-on experience in the AWS tool stack.
- UNIX/ Linux skills, including basic commands, shell scripting, and system administration/configuration
             
