Job Description:
Title: Hadoop/Big Data Developer
Location: Costa Mesa, CA
Duration: 12+ months


MUST HAVES: Data lake experience

A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed. While a hierarchical data warehouse stores data in files or folders, a data lake uses a flat architecture to store data.
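For illustration only (not a requirement of the role): a minimal Python sketch of the flat-ingest idea described above. It assumes a local directory stands in for the lake (in practice this would be HDFS or cloud object storage), and the file-naming scheme and metadata fields are hypothetical.

```python
# Minimal sketch of flat data-lake ingestion: copy raw data in its native
# format into a single flat namespace, with context carried by the object
# name and a metadata sidecar rather than a folder hierarchy.
# Assumptions: LAKE_ROOT, the naming scheme, and the metadata fields are
# hypothetical placeholders, not taken from the job posting.
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

LAKE_ROOT = Path("/tmp/data_lake")  # assumed lake location for this sketch


def ingest_raw(source_file: str, source_system: str) -> Path:
    """Copy a file into the lake unchanged (native format)."""
    LAKE_ROOT.mkdir(parents=True, exist_ok=True)
    src = Path(source_file)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    # Flat key: source system + timestamp + original name, no nested folders.
    target = LAKE_ROOT / f"{source_system}__{stamp}__{src.name}"
    shutil.copy2(src, target)  # raw bytes, no transformation applied
    # Sidecar metadata describing where the raw object came from.
    sidecar = target.with_suffix(target.suffix + ".meta.json")
    sidecar.write_text(json.dumps({
        "source_system": source_system,
        "ingested_at": stamp,
        "original_name": src.name,
    }))
    return target
```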
Technologies:
Needs strong skills within the Apache Hadoop ecosystem and with CI/CD tools. Candidates who know some of these tools can learn others that serve the same purpose, but they must be willing and eager to pick up new tools and have the right mindset.
Example: someone who knows Kafka and understands its concepts should be able to pick up and learn Kudu, as in the sketch below.
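For illustration only: a minimal producer sketch using the kafka-python client, showing the kind of Kafka concepts (brokers, topics, keyed messages, serialization) that are expected to carry over to ecosystem tools like Kudu. The broker address, topic name, and event payload are hypothetical placeholders.

```python
# Minimal Kafka producer sketch (kafka-python client).
# Assumptions: the broker address, topic name, and payload below are
# hypothetical and not taken from the job posting.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",               # assumed local broker
    key_serializer=lambda k: k.encode("utf-8"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a keyed event to a topic; downstream consumers (for example a
# pipeline landing data into Kudu) would read from the same topic.
producer.send(
    "user-events",                                     # hypothetical topic
    key="user-123",
    value={"action": "login", "ts": "2024-01-01T00:00:00Z"},
)
producer.flush()   # block until the broker acknowledges the batch
producer.close()
```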
             
