Job Description:
TECHNOGEN, Inc. has been a proven leader in providing full IT services, software development, and solutions for 15 years.

TECHNOGEN is a small, woman-owned minority business with GSA Advantage certification. We have offices in VA and MD, and offshore development centers in India. We have successfully executed 100+ projects for clients ranging from small businesses and non-profits to Fortune 50 companies and federal, state, and local agencies.

This is Tony from TechnoGen Inc., and I am writing to see whether you are interested in an exciting and challenging opportunity in Austin, Texas. Kindly reach out to me.

Role: ETL Developer
Location: Austin, Texas
Duration: Long Term

Notes: Hadoop, Informatica BDM, CDC, and ETL experience are a must.

The ETL Developer delivers full-stack data solutions across the entire data processing pipeline. The role relies on systems engineering principles to design and implement solutions that span the data lifecycle: collect, ingest, process, transform, store, persist, access, and deliver data at scale and at speed. It requires knowledge of local, distributed, and cloud-based technologies; data transformation and smart caching; and the security and authentication mechanisms required to protect the data.
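By way of illustration, the sketch below walks that lifecycle end to end in PySpark, one representative tool from the stack listed under Technical Skills; the bucket paths and column names are hypothetical placeholders, not details of the actual engagement.

    # Minimal batch ETL sketch in PySpark: ingest, transform, persist.
    # All paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Collect/ingest: read raw source files.
    raw = spark.read.option("header", True).csv("s3a://raw-bucket/trips/")

    # Process/transform: enforce types and drop bad records.
    trips = (
        raw.withColumn("trip_date", F.to_date("trip_date"))
           .withColumn("fare", F.col("fare").cast("double"))
           .dropna(subset=["trip_id"])
    )

    # Store/persist: write partitioned Parquet for downstream access.
    trips.write.mode("overwrite").partitionBy("trip_date").parquet(
        "s3a://curated-bucket/trips/")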

Technical Skills:

·         Experience in data management, data integration, and analytics in diverse contexts, with depth in data and information architectures (structured and unstructured), preferably in transportation or large government settings, supporting descriptive, diagnostic, predictive, and prescriptive analytical needs

·         Past or current experience in ETL development using Informatica or another compatible ETL solution

·         Experience using Informatica BDM (Big Data Management)

·         Expertise in big data application data architecture and in supporting its implementation

·         Experience with one or more State of Texas government agencies

·         Knowledge of modern enterprise data architectures, design patterns, and data toolsets and the ability to apply them

·         Strong knowledge of and experience with SQL, proficiency in data modeling techniques, and an understanding of normalization

·         Experience implementing or supporting data integration of big data with Sqoop or similar tools

·         Experience working with AWS S3 storage and Parquet and ORC file formats (see the Parquet-to-ORC sketch after this list).

·         Strong problem-solving, conceptualization, and communication skills

·         Understanding of enterprise service bus architectures and REST services using Kafka or a compatible solution

·         Leveraging big data and streaming technologies within AWS for data ingestion, transformation, and persistence: Sqoop, Hive, Kafka, NiFi, Oozie, Java, Python, Spark, HBase, Hortonworks Hadoop (a streaming ingestion sketch follows this list)

·         Experience designing and managing data marts and data warehouse platforms (see the star-schema sketch after this list).

·         Strong analytics and reporting skills; experience with BI tools such as Tableau, Qlik, and MS Power BI; data modeling for data lakes and data warehousing, including OLTP, normalized, de-normalized, and dimensional methods

·         Strong data integration skills and experience desired, especially around moving large data sets in batch and near real time across cloud and on-premises environments

·         Demonstrated success engaging business partners in a consultative manner and turning business concepts into well-designed technology solutions.
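As referenced in the streaming bullet above, here is a minimal sketch of near-real-time ingestion with Spark Structured Streaming: it reads change events from a Kafka topic and persists Parquet to S3. The broker address, topic name, and paths are assumptions for illustration, not the team's actual design.

    # Streaming ingestion sketch: Kafka -> Parquet on S3.
    # Requires the spark-sql-kafka connector package on the classpath.
    # Broker, topic, and bucket names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("stream-ingest-sketch").getOrCreate()

    # Ingest: subscribe to a (hypothetical) topic of change events.
    events = (
        spark.readStream.format("kafka")
             .option("kafka.bootstrap.servers", "broker1:9092")
             .option("subscribe", "trip-events")
             .load()
    )

    # Transform: Kafka delivers key/value as binary; cast value to string.
    parsed = events.select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )

    # Persist: append micro-batches as Parquet with checkpointing.
    query = (
        parsed.writeStream.format("parquet")
              .option("path", "s3a://curated-bucket/trip-events/")
              .option("checkpointLocation", "s3a://checkpoints/trip-events/")
              .outputMode("append")
              .start()
    )
    query.awaitTermination()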
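For the S3/Parquet/ORC bullet, a small sketch that reads Parquet from S3 and rewrites it as ORC; the bucket and prefix names are hypothetical.

    # Parquet-to-ORC sketch in PySpark; paths are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("parquet-to-orc-sketch").getOrCreate()

    # Read columnar Parquet data from S3.
    df = spark.read.parquet("s3a://curated-bucket/trips/")

    # Rewrite the same data as ORC, preserving the schema.
    df.write.mode("overwrite").orc("s3a://curated-bucket/trips_orc/")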
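Finally, for the data mart and dimensional modeling bullets, a sketch of a simple star-schema aggregation with Spark SQL; the fact and dimension tables (fact_trips, dim_date, dim_agency) and the output table name are hypothetical.

    # Star-schema sketch: aggregate a fact table against two dimensions
    # into a daily data mart table. All table names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder.appName("mart-sketch")
             .enableHiveSupport().getOrCreate())

    daily = spark.sql("""
        SELECT d.calendar_date,
               a.agency_name,
               SUM(f.fare) AS total_fare,
               COUNT(*)    AS trip_count
        FROM   fact_trips f
        JOIN   dim_date   d ON f.date_key   = d.date_key
        JOIN   dim_agency a ON f.agency_key = a.agency_key
        GROUP  BY d.calendar_date, a.agency_name
    """)

    # Persist the aggregate as a managed table for BI tools to query.
    daily.write.mode("overwrite").saveAsTable("mart_trips_daily")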