Job Description:
Job Title: Data Engineer with Hadoop ecosystem experience
Location: Oakland, CA
Duration: 6+ Months

Extensive knowledge of the Hadoop ecosystem
Build end-to-end solutions involving both traditional Enterprise Data Warehouse infrastructure and Enterprise Data Lake infrastructure (Hadoop ecosystem)
Design and develop data transformation processes and pipelines
Migrate data from source systems such as mainframes, RDBMS, and files into a data lake solution
Translate business requirements into technical artifacts (ETL/ELT processes, etc.)
Participate in the reference architecture build-out of the data lake and data warehouse solutions
Build EDW data models (dimensional) as well as Hive/HCatalog schemas
Implement role-level security
Develop and operate an enterprise self-service reporting environment on data visualization tools such as Tableau (served through custom portals built on JavaScript or .NET)

Strong communication and collaboration skills
In-depth, expert-level experience with the Hadoop ecosystem
Strong experience with HDFS, MapReduce, Parquet files, Sqoop, Oozie, ZooKeeper, Pig, Hive, and Linux
Strong experience with dimensional (star schema) modeling, preferably on Microsoft SQL Server
Strong experience with Microsoft SSIS (SQL Server Integration Services)
In-depth experience implementing and using big data environments for analytics, storage, and integration with other technologies