Job Description:
Data Engineer
Location: Tampa, FL
Duration: 6+ months contract

As a Data Engineer on the Assurance Innovation team, you will work alongside Data Scientists, Software Engineers, and Product Managers to drive innovation and technical solutions into the practice. Data Engineers focus on designing and building data models, codifying business rules, mapping data sources (structured and unstructured) to those models, engineering scalable ETL pipelines, developing data quality solutions, and continuously evaluating new technologies to enhance the capabilities of the Data Engineering team and the broader Innovation group.
Minimum Years of Experience: 2, preferably as a data engineer, business systems analyst, data analyst, or in a similar role.
Minimum Degree Required: Bachelor's degree in one of the following: Accounting, Finance/Economics, Management Information Systems, Computer Science, Business Administration, Statistics, Mathematics, Regulatory Compliance, Science, Technology, Engineering, and/or another business field of study.

Technical skills required:
Object-oriented/object-function scripting languages: Python, R, C/C++, Java, Scala, etc.
Relational SQL databases: MSSQL, PostgreSQL, MySQL, etc.
Distributed SQL databases: MemSQL, CrateDB, etc.
NoSQL databases: MongoDB, Cassandra, etc.
Graph databases: Neo4j, AllegroGraph, ArangoDB, etc.
Big data tools such as Hadoop, Spark, Kafka, etc.
Data modeling tools such as ERWin, Enterprise Architect, Visio, etc.
Data integration tools such as SSIS, Informatica, SnapLogic, etc.
Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
Business Intelligence Tools such as Tableau, PowerBI, Zoomdata, Pentaho, etc.
Cloud technologies such as SaaS, IaaS, and PaaS within Azure, AWS, or Google Cloud
Linux, including comfort with bash scripting
Docker and Puppet

Knowledge Preferred:
Working knowledge of Python and experience with data extraction, data cleansing and data wrangling;
Working knowledge of SQL and experience with relational databases;
Experience in codification of business rules (analytics) in one of the programming languages listed above;
Experience working with business teams to capture and define data models and data flows to enable downstream analytics;
Experience with data modeling, data mapping, data governance and the processes and technologies commonly used in this space;
Experience with data integration tools (e.g. Talend, SnapLogic, Informatica) and data warehousing / data lake tools;
Experience with systems development life cycles such as Agile and Scrum methodologies; and
Demonstrated experience in API-based data acquisition and management.

Skills Preferred:
Has built enterprise data pipelines and can craft code in SQL, Python, and/or R
Has built batch data pipelines with relational and columnar database engines as well as Hadoop or Spark, and understands their respective strengths and weaknesses
Ability to build scalable and performant data models
Possesses strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval
Experience with agile development processes
Execution-focused: knows how to get things done
Possesses a keen analytical mind with attention to detail and accuracy
Excellent verbal and written communication skills with ability to present technical and non-technical information to various audiences
Excellent organization and prioritization skills, with a strong ability to multitask and switch focus as needed to meet deadlines or respond to changing priorities
Experience working with large data sets and deriving insights from data using various BI and data analytics tools
Ability to think creatively to solve complex business problems
Understanding of the security requirements for handling data both in motion and at rest such as communication protocols, encryption, authentication, and authorization.
Understanding of Graph databases and graph modeling
Understanding of the requirements of data science teams