Job Description:
A LinkedIn profile is mandatory; please send the resume along with the LinkedIn profile link.

Job 1: Power BI Admin - L2/L3 Support - Location: Los Angeles, CA
Maintaining workspaces; deploying reports to production; monitoring node capacity and gateways; creating and monitoring AD groups; sharing reports with end users; handling production report bugs; managing embed codes (if this feature is used in the future).

The resource should know BI tool administration tasks, understand the tool's various monitoring and housekeeping activities, and be conversant with production deployment activities and change management, basic administration activities, and user administration. Good analytical and problem-solving skills to analyze production issues. Familiar with ITIL methodologies. Understands the report development lifecycle for the BI tool.


1. Monitoring the production and other environments to make sure all processes are running fine.
2. Monitoring scheduled/interactive reports and performing log analysis.
3. Granting access to different users.
4. Monitoring memory and CPU utilization of reporting servers.
5. Migrating objects (reports, packages, etc.) from one environment to another.
6. Publishing packages and providing security to the required users.
7. Security patching of reporting servers.
8. Creating or modifying jobs in AutoSys or a similar environment and configuring reports with AutoSys jobs.
9. Sending advisory notes for P1 and P2 incidents and managing the incident lifecycle.
10. Triaging tickets and user queries.
11. Acknowledging all incidents and queries as per the defined SLA.
12. Resolving all incidents and user queries as per the defined SLA.

Job 2: Hadoop Architect - Dallas, TX (verifiable references and a LinkedIn profile required)
Big Data - Hadoop Architect (with Scala & Spark): 6 positions

Work location: Dallas

We need a Big Data (Hadoop) Hortonworks Data Platform (HDP) Data Engineer who is well versed in Hortonworks DataFlow (HDF), Apache NiFi/MiNiFi, and Kafka. The candidate should have strong experience with Scala and Spark.

The resource should have 6+ years of experience, with good in-depth knowledge of and experience across the Hadoop ecosystem (HDP, HDF, NiFi, MapReduce, Hive, Pig, Spark/Scala, Kafka, HBase), and an ETL background. The resource should be able to develop the framework for data ingestion into the data lake, with utilities around this platform.

6+ years of total development experience, mainly in Java and related technologies in the Java stack (e.g., Spring)
6+ years of in-depth knowledge of and experience across the Hadoop ecosystem (HDP, HDF, MapReduce, Hive, Pig, Spark/Scala, Kafka, HBase; Elasticsearch and Logstash a plus)
4+ years of experience working in Linux/Unix
Good understanding of and experience with performance tuning for complex software projects, mainly large-scale and low-latency
Experience leading design and architecture
Hadoop/Java certifications are a plus
Excellent communication skills
Ability to work in a fast-paced, team-oriented environment

Client: Telecom client