Job Description:
DevOps, AWS, Hadoop – 6 months, Dallas, TX


Build data pipeline frameworks to automate high-volume and real-time data delivery for our Hadoop and streaming data hub
Build data APIs and data delivery services that support critical operational and analytical applications for our internal business operations, customers and partners
Transform complex analytical models into scalable, production-ready solutions
Continuously integrate and deploy code into cloud environments
Develop applications from the ground up using a modern technology stack such as Java/Scala/Python, Spark, NoSQL, and Postgres/Snowflake
Build robust systems with an eye toward the long-term maintenance and support of the application
Leverage reusable code modules to solve problems across the team and organization
Utilize a working knowledge of multiple development languages
Drive cross-team design and development through technical leadership and mentoring
Understand complex multi-tier, multi-platform systems

Skill Set:
· Big data/Spark with SQL – Required
· AWS/Cloud computing – Required; must have hands-on AWS experience, including working with EC2 clusters and moving large volumes of data to the cloud
· Programming languages (Java/Python/Scala) – Java is the preferred language
· DevOps experience
             
