Job Description:
Role: Sr. Big Data Engineer
Location: Burbank, CA
Duration: Long term
Skills: Big Data, Cloud Dataflow, Google Cloud Platform, Bigtable, Unix/Linux operating systems, APIs, SQL, Google BigQuery, Python, Airflow.


We have opportunities for data engineers to join our growing tech team at Universal’s offices and contribute to the development of our innovative approach to finding new and impactful insights to grow the business.
We have moved all of our Data and Analytics to Google Cloud Platform (GCP).
As a data engineer, you will be responsible for designing and implementing new GCP-based data solutions – new data processing, data sets and systems to support various advanced analytics needs.
This involves working with the existing engineering team, data scientists, analysts and the business to understand requirements and data needs and definitions, all the while thinking creatively about what data can be best exploited to solve a wide array of business problems.
You will create data flows to integrate with multiple external sources using APIs, database connections and flat files. You will liaise with members of the wider Universal Data & Analytics teams to ensure alignment with existing systems and consistency with internal standards and best practice.

Job Function:

Build understanding of data sources and downstream systems
Liaise with key stakeholders to understand requirements, business definitions and the potential value of different data
Design, document, and implement suitable solutions for loading, piping and exposing data from multiple sources
Design and build well-engineered data systems to support analytical needs using Google Cloud Platform (Cloud Dataflow, BigQuery, and Bigtable are musts)
Assure accuracy of data processing and outputs through consistently strong software development practices, adherence to best practices, thorough testing, and peer reviews
Habitually approach problem solving with creativity and resourcefulness; carefully evaluate risks and determine correct courses of action when completing tasks

Job Requirements:

Bachelor’s Degree in Computer Science or closely related discipline
Demonstrable professional experience designing, building, and maintaining data systems and processes using cloud-based platforms (Google BigQuery and Cloud Dataflow extremely desirable; Airflow is a big plus), including experience working in Unix/Linux operating systems and tools
Expertise using cloud-based systems and services to acquire and deliver data via APIs and flat files
Demonstrable, hands-on professional software development skills using Java. Python is a plus.
Extensive hands-on experience working with data using SQL
Excellent verbal and written communication skills with the ability to clearly present ideas, concepts, and solutions
Demonstrated willingness and ability to effectively work with various team members when gathering requirements, delivering solutions, and eliciting suggestions and feedback
Extremely quick learner both in terms of new technical skills and acquiring domain knowledge