We have openings with our client. Please find below the details.
Duration: 6 Months
Required Skills and Experience:
• 8 - 12 years of experience, in various capacities, on data warehousing or data engineering projects.
• 3+ years' experience leading a team of 8 Hadoop data engineers delivering Hadoop data & Informatica integration implementations on on-premise Hadoop clusters / AWS / GCP.
• Should be very strong in Sqoop, Hive, Spark, Kafka, BigQuery, and Oozie / Azkaban / Airflow; hands-on experience with Airflow / Azkaban would be a plus.
• Should have very good experience in Linux / Unix shell scripting.
• Should be very strong in performance tuning and optimization techniques when processing data with Spark and storing it in BigQuery or any other data warehouse database.
• Should apply his / her creative and innovative skills to end-to-end business solution activities by creating, reviewing, controlling, and improving processes and reusable / unique solutions.
• Should have good written and verbal communication skills and the ability to communicate with colleagues at all levels.
Mandatory Skills : Big Data, ETL, BI
Big Data : Kafka, Flume, Spark, HBase, Hive, Impala, Cloudera distribution (understanding of Cloudera Manager)
ETL : Informatica BDM, BDQ, Analyst
BI : MicroStrategy
• Introducing coding best practices in delivery
• Implementing optimization techniques
• Connecting the business and technical dots
• Implementing industry-standard practices
• Owning the CART-approved architecture through delivery
• Participating in stakeholder meetings and architecture discussions
• Experience in data management and governance
RESPONSIBILITIES:
• Should be able to spearhead a team of 10.
• Should be able to communicate with the customer and manager, understand and analyze the requirements, come up with the best possible solutions, and implement the most optimized one.
• Should be able to work well with on-site leads during requirement analysis and design / solution decisions.
• Should be able to prepare the STM document for the development team and ensure that the team works to the designed solutions and delivers quality work on time.
• Should be able to work with the team to deliver an optimized solution to the customer.
• Should be able to take end-to-end project delivery ownership and accountability.
• Should report daily and weekly status to management.
*** Apply on the following link ***
https://www.naukrigulf.com/technical-lead-bigdata-etl-b-jobs-in-dubai-uae-in-confidential-5-to-10-years-jid-160819000027