Job Description:

Specialized knowledge and skills required:

·         Demonstrated experience providing customer-driven solutions, support or service.

·         In-depth knowledge of SQL or NoSQL and experience using a variety of data stores (e.g. RDBMS, analytic databases, scalable document stores).

·         Extensive hands-on Python programming experience, with an emphasis on building ETL workflows and data-driven solutions. Able to employ design patterns and generalize code to address common use cases. Capable of authoring robust, high-quality, reusable code and contributing to the division’s inventory of libraries.

·         Expertise in big data batch computing tools (e.g. Hadoop or Spark), with demonstrated experience developing distributed data processing solutions.

·         Applied knowledge of cloud computing (AWS, GCP, Azure).

·         Knowledge of open source machine learning toolkits, such as sklearn, SparkML, or H2O.

·         Solid data understanding and business acumen in data-rich industries such as insurance or financial services.

·         Applied knowledge of data modeling principles (e.g. dimensional modeling and star schemas).

·         Strong understanding of database internals, such as indexes, binary logging, and transactions.

·         Experience using infrastructure-as-code tools (e.g. Docker, CloudFormation, Terraform).

·         Experience with software engineering tools and workflows (e.g. Jenkins, CI/CD, Git).

·         Practical experience authoring and consuming web services.

