Job Description:
Role: Data Engineer
Location: Seattle, WA
Duration: 8+ Months

This position contributes to the Organization's success by building enterprise data services for analytics solutions. The role is responsible for the design, development, testing, and support of data pipelines that enable continuous data processing for data exploration, data preparation, and real-time business analytics.

Responsibilities and essential job functions include but are not limited to the following:
1. Demonstrate deep knowledge of data and the ability to lead others on the data engineering team in building and supporting non-interactive (batch, distributed) and real-time, highly available data pipelines and technology capabilities.
2. Demonstrate focus in working toward defined business objectives and understanding the business value of the work performed.
3. Demonstrate a deep understanding of the ETL process (and variants thereof), including orchestration and development of data products.
4. Translate strategic requirements into business requirements to ensure solutions meet business needs.
5. Work with infrastructure provisioning and configuration tools to develop scripts that automate deployment of physical and virtual environments, and to build tools that monitor usage of virtual resources.
6. Assist in defining an architecture that ensures solutions are built within a consistent framework.
7. Lead resolution activities for complex data issues.
8. Define and implement data retention policies and procedures.
9. Define and implement data governance policies and procedures.
10. Identify improvements to team coding standards and help implement them.
11. Leverage subject matter expertise to coordinate issue resolution efforts across peer support groups, technical support teams, and vendors.
12. Develop and maintain documentation for all assigned systems and projects.
13. Perform systems and applications performance characterization and trade-off studies through analysis and simulation.
14. Perform root cause analysis to identify permanent resolutions to software or business process issues.

The candidate must have experience with processing and decrypting streaming data.

In addition:

1. Ability to apply knowledge of multidisciplinary business principles and practices to achieve successful outcomes in cross-functional projects and activities
2. Effective communication skills
3. Strong problem-solving skills
4. Strong working knowledge of Python, Java, Scala or C#
5. Strong working knowledge of SQL and NoSQL platforms
6. Proficiency in debugging, troubleshooting, performance tuning, and relevant tooling
7. Strong working knowledge of Hadoop, YARN, MapReduce, Pig or Hive, and Spark
8. Demonstrated ability to productionize at least two big data implementations
9. Experience using a public cloud (AWS or Azure preferred) for data applications
10. Proficiency in shell scripting
11. Solid understanding of data design patterns and best practices
12. Proficiency in CI/CD tools
13. Proficiency in logging and monitoring tools, patterns, and implementations
14. Understanding of enterprise security, REST/SOAP services, and best practices for enterprise deployments
15. Proven ability and desire to mentor others in a team environment
16. Working knowledge of data visualization tools such as Tableau is a plus
17. Practice, evangelize, and be an ambassador for agile and DevOps culture
18. Proven ability and desire to lead others in a team environment