Job Description:
Are you serious about your career change? We are a fast-growing company committed to helping with your next career move.
VLink is currently seeking a Big Data Architect to join one of our premier clients in Omaha, NE. The hiring manager is seeking a professional with excellent communication skills.

Title: Big Data Architect
Location: Omaha, NE
Job Description
The Data Architect is responsible for the design, development, and implementation of a Big Data and data warehouse platform for acquiring data from internal and external systems. Responsibilities include the analysis, design, development, implementation, and support of applications that benefit the business, as well as the design and development of a framework for common, consistent use of the intended design for real-time data streaming and batch-based data migration. The Architect is responsible for all deliverables, from reviewing requirements and developing technical design documentation to validating test scripts and test cases, coordinating implementation into production, and handling related support activities.
Core Responsibilities:
1. Lead the architecture, design, development, and implementation of Big Data applications
2. Select the right Big Data tool sets to meet requirements
3. Define the scope of multiple Big Data projects within the program
Day-to-Day Responsibilities:
1. Design, develop, and implement Big Data applications
2. Apply extensive experience in application optimization (performance tuning) and Big Data architecture strategy
3. Develop technical design documentation; participate in architecture reviews with senior architects in the organization; collaborate with the Data Modeling, FRD, and Testing teams to complete planned deliverables on time with zero defects
4. Apply hands-on experience developing routines using Spark/Scala, Hive, Sqoop, and Linux scripting
Key Deliverables:
1. Architect, design, lead, and implement Big Data applications
Qualifications
Minimum 10-12 years of relevant experience in enterprise data management and computing tools such as Java, ETL products, and BI/data warehousing products
Must have 2+ years of experience working with Big Data technologies, preferably Java, Spark, and Big Data tools such as Hive, Scala, Pig, etc.
BS or MS in Computer Science
Skills needed for this position include a combination of:
Big Data – Spark, Impala, Hive, Scala, Kafka, Hadoop (HDP), Sqoop
RDBMS – Teradata, SQL Server, Netezza, or Oracle
Azure cloud-based experience