Job Description:

Responsibilities:

  • Collaborate and coordinate with stakeholders to ensure project delivery from requirements through user acceptance testing.
  • Establish scalable, efficient, automated processes for data analyses, model development, validation, and implementation.
  • Work closely with data scientists and analysts to create and deploy new features.
  • Write efficient and well-organized software to ship products in an iterative, continual-release environment.
  • Report key insight trends, using statistical rigor to simplify and inform the larger team of noteworthy storylines that impact the business.
  • Monitor and plan out core infrastructure enhancements.
  • Contribute to and promote good software engineering practices across the team.
  • Mentor and educate team members to adopt best practices in writing and maintaining production code.
  • Communicate clearly and effectively to technical and non-technical audiences.
  • Actively contribute to and re-use community best practices.

Minimum Qualifications:

  • University or advanced degree in engineering, computer science, mathematics, or a related field
  • Strong experience working with a variety of relational SQL and NoSQL databases.
  • Strong experience working with big data tools such as Hadoop, Spark, and Kafka.
  • Experience with at least one cloud provider solution (AWS, Azure, or GCP; GCP preferred).
  • Strong experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
  • Ability to work on the Linux platform.
  • Strong knowledge of data pipeline and workflow management tools
  • Expertise in standard software engineering methodology, e.g. unit testing, code reviews, design documentation
  • Experience creating data pipelines that prepare data appropriately for ingestion and consumption.
  • Experience in setting up, maintaining, and optimizing databases/filesystems for production usage in reporting and analytics.
  • Experience with workflow orchestration (Airflow, Tivoli, etc.)
  • Working knowledge of GitHub and the Git toolkit.
  • Ability to work in a collaborative environment and interact effectively with both technical and non-technical team members (good verbal and written English).
  • Relevant working experience with Containerization (Docker and Kubernetes) preferred.
  • Experience working with APIs (Data as a Service) preferred.

Experience with data visualization using Tableau, Power BI, Looker, or similar tools is a plus.
