Job Description:
Position: Data Architect
Location: Minneapolis, MN
Duration: Full Time

Role: Data Architect

Roles and Responsibilities:
• 10-15 years of working experience, with 3+ years of experience as a Big Data solutions architect. Must have experience with the major big data technologies such as Hadoop, Kafka, NiFi, MapReduce, Hive, HBase, MongoDB, Cassandra, Spark, Impala, Oozie, Flume, ZooKeeper, Sqoop, and NoSQL databases.
• Big Data Solution Architect certification preferred; hands-on experience with Hadoop implementations preferred.
• Big Data certification is a must.
• Work experience with various distributions such as Cloudera, Hortonworks, and MapR.
• Translate complex functional and technical requirements into detailed designs.
• Propose best practices/standards, with experience in data security and privacy handling.
• Knowledge of handling different kinds of source systems and different formats of data.
• Hands-on experience with Hadoop applications (e.g., administration, configuration management, monitoring, debugging, and performance tuning).
• Strong knowledge of major programming/scripting languages such as Java, R, and Scala, plus Linux shell scripting, as well as experience with ETL tools such as Informatica, Talend, and/or Pentaho.
• Experience designing multiple data lake solutions, with a good understanding of cluster and parallel architecture.
• Experience with cloud computing.
• Able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them.
• Able to clearly articulate the pros and cons of various technologies and platforms.
• Excellent written and verbal communication skills.
• Able to perform detailed analysis of business problems and technical environments and apply it in designing the solution.
• Able to work in a fast-paced agile development environment.
Required Technical / Functional Skills: Same as the roles and responsibilities listed above.
Desired Technical / Functional Skills:
• Experience with the Hortonworks Hadoop Platform
• Knowledge of the airline domain
Total experience in required skill: 10 years
Salary: $130K per annum
Work Location: Atlanta
Duration: Full Time


Client: TCS