Job Description:
Data Engineer
Contract: 12 months plus
Location: Houston, TX
Video interviews are a MUST for this role.

The successful candidate will have more than five (5) years of experience as a Data Engineer, along with cloud architecture experience.
Data Engineer Job Details
Work with other data engineers, data ingestion specialists, and experts across the company to consolidate methods and tool standards where practical.
Work independently on complex data engineering problems to support data science strategy of products.
Use broad and deep technical knowledge in the data engineering space to tackle complex data problems for product teams, with a core focus on using technical expertise.
Improve data availability by acting as a liaison between Lab teams and source systems.
Collect, blend, and transform data using ETL tools, database management system tools, and code development.
Implement data models and structure data in formats ready for business consumption.
Aggregate data across various warehousing models (e.g., OLAP cubes, star schemas, etc.) for BI purposes.
Collaborate with business teams to understand how data needs to be structured for consumption.
Data Engineer Must-Have Skills
Five or more years of experience as a Data Engineer.
Five years in an agile environment.
Cloud Architecture experience, preferably in an Azure environment.
Building and maintaining optimal data pipeline architecture.
Assembling large, complex data sets that meet functional and non-functional business requirements.
Ability to identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, adding data quality checks, minimizing cloud costs, etc.
Experience building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Databricks, and NoSQL.
Experience building analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Document and communicate standard methods and tools used.
Experienced using the following software/tools:
Big data tools: Hadoop, HDI, & Spark
Relational SQL and NoSQL databases, including Azure Cosmos DB
Data pipeline and workflow management tools: Databricks (Spark), Azure Data Factory (ADF), Dataflow
Microsoft Azure
Stream-processing systems: Storm, Azure Stream Analytics, IoT Hub, Event Hub
Object-oriented/functional scripting languages: Python, Scala, SQL.
Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.