Job Description:
Title: Big Data Engineer
Location: McLean, VA
Length: 06 months Contract To Hire (CTH)

Skills:
Experience developing and deploying distributed Big Data applications on AWS using open-source frameworks such as Apache Spark, Apex, Flink, NiFi, Storm, and Kafka
Proficiency with programming languages such as Java, Scala, and Python; open-source RDBMS and NoSQL databases; and cloud-based data warehousing services such as Redshift
Experience with Hadoop modules such as YARN and MapReduce, and related Apache projects such as Hive, HBase, Pig, and Cassandra
Familiarity with DevOps practices such as Continuous Integration, Continuous Deployment, and test automation, using tools like Jenkins, Maven, Nexus, Chef, Terraform, Ruby, Git, and Docker
At least 5 years of Java development for modern data engineering
4+ years' experience with relational database systems and SQL (PostgreSQL or Redshift)
4+ years of UNIX/Linux experience
2+ years of experience with Cloud computing (AWS)
1+ years of experience with supervised machine learning
Solid understanding of the CAP theorem and experience addressing non-functional requirement (NFR) constraints
Experience developing large, complex event-processing architectures
Experience developing high-volume transaction processing solutions that scale to millions of daily transactions