Job Description:
Bachelor’s degree in Engineering or Master’s degree in Computer Science/Applications.
Should have a minimum of 14 years of experience in DW/BI setups.
Should have good knowledge of the Healthcare/Life Sciences domain.
Possess excellent verbal and written communication skills.
Should have experience in building large-scale data warehouses and in development/migration to Teradata.
Should be hands-on in Teradata, with a strong understanding of its architecture and underlying data distribution.
Strong knowledge of ELT, in-database processing, and workload management (WLM).
Strong understanding of duplicate processing, key generation techniques, FastLoad/MultiLoad, etc.
Excellent skills in navigating the Unix OS and shell scripting; Perl is a plus.
Experience with ETL tools, including large-scale migration and data movement.
Strong knowledge of reporting standards, best practices, and frameworks, preferably Business Objects.
Strong knowledge of data warehouse project constraints, technical risks, and business users’ viewpoints.
Strong knowledge of Data Governance and Stewardship.
Strong knowledge of ad-hoc/OLAP reporting, along with an understanding of advanced data analytics.
Experience in architecting and designing large-scale enterprise data warehouse solutions and full-lifecycle DW/BI projects.
Experience in creating Logical Data Models and Physical Data Models, with experience in at least one industry-standard model.
Hands-on experience with one or more industry-standard ETL and reporting tools.
Good hands-on experience working with large volumes of data and multi-terabyte data warehouses requiring high throughput and low response times.
Provide technical oversight and support for solution implementation.
Should have experience in building relational (3NF) and dimensional models for enterprise data warehouses.
Strong hands-on experience with one or more data modeling tools, including repository and version management.
Strong database design skills involving Teradata 14/15 databases, with an emphasis on design for performance.
Good understanding of data governance, master data and metadata standards.
Must possess practical experience with Big Data and Analytics tools and technologies, including hands-on experience in designing and building big-data platforms using Hadoop (MapReduce), Spark, etc.
Experience with cloud and virtualization technologies, solutions, and architectures, including Big Data and Analytics.
Hands-on work with Hortonworks Hadoop and the Teradata Aster appliance is a big plus.