Job Description :
Location - Downers Grove, IL

They are willing to dial back some of their “must haves” for the Cloud Data Engineer role.

Google Cloud is no longer a requirement
Must have at least four solid years developing in two or more of the following: Java, Scala, Python. Those years should be development experience, not time spent learning the languages
No longer able to sponsor visas
Looking for a Data Engineer, not an Application Developer.
Looking to hire ASAP!!

Technical Requirements and/or Certifications »
Data, Big Data, and Machine Learning, including learning new technologies
Logical/physical database design, development, analysis, architecture, and modeling
Designing and developing large scale applications utilizing Big Data tech
Engineering trade-offs, with an ability to understand the impact of software changes on extensibility, scalability, performance, and maintainability

Education and Experience »
5+ years of experience in Java or Python and with data transformation
3 years of experience with the Hadoop stack
1 year of experience with Apache Beam
Experience working with GCP services such as Dataflow, Bigtable, BigQuery, and Cloud Storage buckets
Experience in architecting multi-tier, distributed database applications
Experience with Kafka/Pub/Sub, SQL programming, and performance tuning

Job Responsibilities »
Design, architect, and build a data platform using a variety of Big Data technologies
Work closely with the team to analyze and develop data architecture: ETL processes, ERD modeling, and physical database implementation with GCP data services (BigQuery, Bigtable, Dataflow)
Design, develop, and roll out new application features that impact databases
Develop and maintain an in-depth understanding of the data/ETL architecture and the general application functionality used to maintain data integrity
Develop Dataflow jobs to answer complex analytical and real-time operational questions
Innovate by exploring, recommending, benchmarking, and implementing data-centric platform technologies
Provide hardware architectural guidance, estimate cluster capacity, and create roadmaps for Hadoop or Big Data cloud services
Provide support for both analytics and operational platforms
Work closely with team members, including IT managers, to deliver defect-free solutions in a timely manner. Update work status frequently (as often as daily)
Follow and improve upon processes and policies for database application development methodologies and lifecycles
Work on multiple projects at a time either independently or as a team member
Work with developers and business owners to provide database needs for the entire company platform
Oversee the development and release of solutions to non-production environments
Are you willing to collaborate with some of the best Java architects to establish platform standards when new technologies are introduced into the company platform?
Are you curious, with a desire to continually investigate new technologies and their possible application to the company's business requirements?
Do you wish to assist in the development of application development processes, policies, and standards?