Job Description :
Job Title: Big Data Architect
Duration: 12+ months
Location: Reston, VA
Interview: Phone + Face to Face
Pay Terms: W2 / 1099


Job Description & Responsibilities:
The role of a Big Data Solutions Architect is a highly technical one, but he or she should also have the skills needed to design the right architecture for the right need:

To be able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them;
To be able to clearly articulate the pros and cons of various technologies and platforms;
To be able to document use cases, solutions, and recommendations;
To have excellent written and verbal communication skills;
To be able to explain the work in plain language;
To be able to help program and project managers with the design, planning, and governance of implementation projects of any kind;
To be able to perform detailed analysis of business problems and technical environments and use this analysis in designing the solution;
To be able to work creatively and analytically in a problem-solving environment;
To be a self-starter;
To be able to work in teams, as a big data environment is developed by a team of employees from different disciplines;
To be able to work in a fast-paced agile development environment;
To be able to design the right solutions for machine learning algorithms;
To be able to train developers/programmers and review their code.
A Big Data Solutions Architect should have substantial experience in traditional solutions architecture before moving to big data solutions.
10-15 years of working experience is a must. He or she needs hands-on experience with the major big data technologies such as Hadoop, MapReduce, Hive, HBase, MongoDB, and Cassandra, as well as with ecosystem tools such as Impala, Oozie, Mahout, Flume, ZooKeeper, and/or Sqoop.
In addition to big data solutions, the Big Data Solutions Architect needs a firm grasp of major programming/scripting languages such as Java, PHP, Ruby, Python, and/or R, along with Linux shell scripting.
He or she should also have experience working with ETL tools such as Informatica, Talend, and/or Pentaho, and experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures as well as high-scale or distributed RDBMS and/or NoSQL platforms.
When the big data solution will be developed in the cloud, the Big Data Solutions Architect should have experience with one of the large cloud-computing infrastructure platforms, such as Amazon Web Services and its Elastic MapReduce service.
             
