Job Description:
Position: Big Data Analytics Software Engineer

Location: Philadelphia

Duration: Long Term



Responsibilities:

Development and maintenance of ETL applications written using Apache Spark, Java, SQL/Pentaho, and Python.
Performance tuning and management of SQL-based databases such as Oracle, MySQL, and Redshift.
Backup, restore, and archiving of databases.
Exploring new data store technologies for business scenarios such as Business Intelligence, Data Warehousing, and Real-Time Analytics.


Required Skills:

Java/Python experience is a must.
Extensive ETL and database experience with SQL-based databases.
5 years' experience in DevOps, especially test automation and application deployment/monitoring.
Experience with OpenStack and/or AWS cloud services.
Experience with Linux systems (Ubuntu preferred).




Responsibilities:

Development and Testing of ETL pipelines written using one or more of the following technologies:
Apache Spark
Java
SQL/Pentaho
Python
Functional and performance testing of the Analytics Platform, including but not limited to:
ETL pipelines
Data warehouse based on Redshift
Tableau-based dashboards
Deployment, hardening and monitoring of applications and services using DevOps practices.
Configuring Linux/Unix systems and databases, including performance tuning and monitoring.


Required Skills:

Programming experience with Java, SQL, and one or more scripting languages.
Database experience with SQL-based databases such as MySQL, Redshift, and Oracle.
Experience with OpenStack and/or Amazon (AWS) cloud services.
Big Data experience is desirable.
Experience with Linux systems (Ubuntu preferred).