Job Description:
Job #167359 | Big Data (Hadoop) - Lewisville - 268221 - Duration: 6 months - USC, GC and EAD GC - Rates: $56/hr w2

JD:

Skills/Experiences:

- At least 5-7 years of experience in software application development
- At least 3 years of experience with Big Data / Hadoop architecture and related technologies
- Hands-on experience with Spark: RDDs, Datasets, DataFrames, Spark SQL
- Hands-on experience with streaming technologies such as Spark Streaming and Kafka
- Hands-on experience using SQL, Spark SQL, HiveQL and performance tuning for big data operations
- Hands-on experience with Java 8, Scala or Python and use of IDEs for the same
- Hands-on experience using technologies such as Hive, Pig, Sqoop
- Experience building microservices-based applications
- Experience dealing with SQL and NoSQL databases such as Oracle, DB2, Teradata, Cassandra
- Experience using CI/CD processes for application software integration and deployment using Maven, Git, Jenkins, Jules
- Experience building scalable and resilient applications in private or public cloud environments and cloud technologies
- Experience using SDLC and Agile software development practices
- Experience building enterprise applications enabled for logging, monitoring, alerting and operational control
- Experience enabling scheduling for big data jobs
- Hands-on experience working in a Unix environment
- Good written, verbal, presentation and interpersonal communication skills; willing to work in a challenging, cross-platform environment

- Strong analytical and problem-solving skills; ability to quickly master new concepts and applications
- Preferable: experience in the financial industry
- Preferable: experience in Data Science, Machine Learning, Deep Learning, Business Intelligence and Visualization
