Job Description:

· Responsibilities

o Build an end-to-end solution spanning both traditional Enterprise Data Warehouse infrastructure and Enterprise Data Lake infrastructure (Hadoop ecosystem)
o Design and develop data transformation processes and pipelines
o Migrate data from source systems such as mainframes, RDBMSs, and files into the data lake solution
o Translate business requirements into technical artifacts (ETL/ELT processes, etc.)
o Participate in the reference-architecture build-out of the data lake and data warehouse solutions
o Build EDW data models (dimensional) as well as Hive/HCatalog schemas
o Implement role-level security
o Develop and run an enterprise self-service reporting environment on data visualization tools such as Tableau (served through custom portals built on JavaScript or .NET)
· Qualifications
o Strong communication and collaboration skills
o In-depth experience with, and expert-level understanding of, the Hadoop ecosystem
o Strong experience with HDFS, MapReduce, Parquet files, Sqoop, Oozie, ZooKeeper, Pig, Hive, and the Linux OS
o Strong experience with dimensional (star schema) modeling, preferably on Microsoft SQL Server
o Strong experience with Microsoft SSIS (SQL Server Integration Services)
o In-depth experience implementing and using big data environments for analytics, storage, and integration with other technologies

Reach me at anil(at)mysbscorp(dot)com