24 Oct
Big Data/Hadoop Architect
Vacancy expired!
- The candidate should have strong experience with a focus on open-source technologies and handling large data volumes in the petabyte range.
- Strong hands-on experience in Hadoop administration and design.
- Minimum 10 years of hands-on experience in planning, designing, and defining the strategic roadmap that determines how the organization moves forward.
- Hands-on experience working with Hadoop distribution platforms such as Hortonworks, Cloudera, and MapR.
- Exposure to multi-cloud architectures and data platforms.
- Experience with, and understanding of, the underlying infrastructure for Big Data solutions (clustered/distributed computing, storage, data center networking).
- Experience with data pipeline and workflow management using components such as Kafka, NiFi, and Storm.
- Working knowledge of Splunk.
- Strong hands-on experience in optimizing jobs and the platform in Data/Analytics areas.
- Expertise in Big Data technologies in the Hadoop ecosystem: Hive, HDFS, MapReduce, YARN, Kafka, Pig, Oozie, HBase, Sqoop, Spark, Solr/Elasticsearch, etc.
- Expertise in SQL and NoSQL database technologies.
- Strong understanding of data analytics and visualization.
- System design and data modeling skills.