You will be responsible for designing and optimizing big data and data warehouse architecture, as well as optimizing data flow and pipelines for cross-functional teams. You are a technical guru when it comes to selecting the right tools for data ingestion, processing, and storage. Security, performance, scalability, availability, accessibility, and maintainability are your top priorities when designing data solutions. You have deep, broad, hands-on experience with a wide range of technologies across the Hadoop ecosystem, NoSQL, RDBMSs, data ingestion, and data processing.
Desired Candidate Profile
• 9-12 years of experience in data warehousing and big data projects.
• Deep and broad experience in the Hadoop ecosystem, including HDFS, MapReduce, Hive, HBase, Impala, Kudu, Solr, etc.
• Hands-on experience with multiple NoSQL databases, such as Cassandra, MongoDB, Neo4j, Elasticsearch, and the ELK Stack.
• Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc.
• Experience with real-time messaging platforms such as Kafka, Kinesis, etc.
• Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases, including distributed relational databases such as SingleStore and Vitess.
• Experience building and optimizing big data pipelines.
• Strong analytical skills for working with unstructured datasets.
• Experience with object stores such as MinIO and Ceph.
• Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
• Proven track record of building highly available, always-on data platforms.
• Proficiency in Linux shell scripting.
• Programming languages: Python, Java, Scala, etc.
• Fluency in English and Arabic.
https://www.naukrigulf.com/technical-architect-jobs-in-riyadh-saudi-arabia-in-devoteam-international-9-to-12-years-n-cd-10002691-jid-281221500559