Job Description:

Role: DevOps Automation Lead
Location: Appleton, Wisconsin (remote until COVID restrictions lift)
Rate: $65/hr
No. of Positions: 1

Requirements:
- 10+ years of experience in data pipeline engineering for both batch and streaming applications.
- Experience with data ingestion processes, building data pipelines, and performance tuning with Snowflake and AWS.
- Able to help build DevOps practices: facilitating the development process and operations, establishing continuous build environments to speed up software development, designing efficient practices, and identifying setbacks and shortcomings.
- Able to help prepare test automation solutions and design automation frameworks.
- Experience implementing SQL query tuning, cache optimization, and parallel execution techniques.
- Must be hands-on coding capable in at least one core language (Python, Java, or Scala) with Spark.
- Expertise in working with distributed data warehouses and cloud services (e.g., Snowflake, Redshift, AWS) via scripted pipelines.
- Experience leveraging orchestration frameworks such as Airflow for ETL pipelines.
- Note: this role intersects with the "big data" stack to enable varied analytics, ML, etc., not just data-warehouse-type workloads.
- Experience handling large and complex sets of XML, JSON, Parquet, and CSV data from various sources and databases.
- Solid grasp of database engineering and design.
- Ability to identify bottlenecks and bugs in the system.

Nice to have:
- Knowledge of highly scalable "big data" data stores, StreamSets, Databricks.
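To illustrate the kind of multi-format ingestion work the role involves, here is a minimal sketch in Python using only the standard library; the function names, file contents, and record shapes are hypothetical, and real pipelines would use Spark or similar for data at scale:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def records_from_json(text):
    """Parse a JSON array of objects into a list of dicts."""
    return json.loads(text)

def records_from_csv(text):
    """Parse CSV text (with a header row) into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def records_from_xml(text):
    """Parse <record> elements into dicts keyed by child tag."""
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in rec} for rec in root.iter("record")]

# Hypothetical inputs carrying the same logical record in three formats.
json_src = '[{"id": "1", "name": "alice"}]'
csv_src = "id,name\n1,alice\n"
xml_src = "<rows><record><id>1</id><name>alice</name></record></rows>"

# Each parser normalizes its source into the same list-of-dicts shape.
assert records_from_json(json_src) == records_from_csv(csv_src) == records_from_xml(xml_src)
```

Normalizing heterogeneous sources into one common record shape early, as sketched above, is what keeps downstream pipeline stages format-agnostic.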