Job Description:
Cerebra Consulting Inc is a System Integrator and IT Services Solution provider with a focus on Big Data, Business Analytics, Cloud Solutions, Amazon Web Services, Salesforce, Oracle EBS, PeopleSoft, Hyperion, Oracle Configurator, Oracle CPQ, Oracle PLM, and Custom Application Development. Utilizing solid business experience, industry-specific expertise, and proven methodologies, we consistently deliver measurable results for our customers. Cerebra has partnered with leading enterprise software companies and cloud providers such as Oracle, Salesforce, and Amazon, and is able to leverage these partner relationships to deliver high-quality, end-to-end customer solutions targeted to the needs of each customer.

Position: Data Engineer
Location: Waltham, MA / San Jose, CA
Duration: 1+ year

Position Description:
Our client in Waltham, MA is looking for a Data Engineer to join their team on a long-term basis.

Scope:
The ideal candidate will perform testing on our client's cloud platform, which supports multiple deployed routers (BHR4, BHRx1, K2, CHR), as well as testing of a hybrid cloud developed by our client. Testing will include acceptance testing, end-to-end testing of API parameters from the routers to the cloud API, and end-to-end testing of TR-069 parameters from the routers. Work will also include GUI and other feature testing of the cloud platform.

Day-to-Day Responsibilities Include:
- Create a 1-click automation test suite for existing cloud-platform test cases using one of the following scripting languages: PySpark, Scala, or Python
- Perform manual functional testing of each parameter from router to cloud
- Perform performance testing of each parameter from router to cloud
- GUI testing for the cloud platform
- Feature testing of newly developed features for the cloud platform
- Participate in each sprint's testing
- Deliver automation scripts to automate Representational State Transfer ("RESTful") API testing from the cloud platform
- Perform testing on the router for APIs sent from cloud to router and verify end-to-end functionality
- Create ad hoc database queries based on use cases for data analysis
- Help perform exploratory data analysis on the data collected in the cloud

Required Skills:
- 4+ years' experience in an AWS cloud environment
- Very good understanding of big data pipelines and common databases such as MySQL, MongoDB, and HBase
- Good understanding of the Hadoop framework
- Experience performing batch/real-time processing using Spark
- Experience designing and developing appropriate test automation frameworks and data validation techniques to ensure optimized product performance
- Understanding of the networking stack or any wireless protocol, preferably 802.11
- Experience working with embedded consumer products; router experience preferable
- Experience with cloud testing tools
- Very strong scripting experience using Spark (preferred) or Python; SQL is a must

Additional Requirements:
Candidates may be subject to a background check and drug test.

Required Skills:
The following are all must-have skills:
- Amazon Web Services (AWS)
- Python
- SQL
- Big data: Hadoop, Spark, etc.

Basic Qualification:
4+ years with AWS, Python, SQL

Additional Skills:
4+ years with AWS, Python, SQL

Please share your resume to or you can reach me at