10 Mar
Hadoop Developer
Dearborn, MI, USA

Vacancy expired!

Responsibilities:

1. Design and develop data ingestion pipelines.
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for the various datasets ingested into the big data platform.

Required:

1. Java/J2EE
2. Web applications, Tomcat (or an equivalent app server), RESTful services, JSON
3. Spring, Spring Boot, Struts, design patterns
4. Hadoop (Cloudera CDH), HDFS, Hive, Impala, Spark, Oozie, HBase
5. Scala
6. SQL
7. Linux

Good to Have:

8. Google Analytics, Adobe Analytics
9. Python, Perl
10. Flume, Solr
11. Strong database design skills
12. ETL tools
13. NoSQL databases (MongoDB, Couchbase, Cassandra)
14. JavaScript UI frameworks (Angular, Node.js, Bootstrap)
15. Good understanding and working knowledge of Agile development

Job Summary:

The Java/Hadoop/ETL Developer will provide expertise in a wide range of technical areas, including but not limited to: the Cloudera Hadoop ecosystem, ETL, Java, integration of collaboration toolsets using SSO, configuration management, hardware and software configuration and tuning, software design and development, and the application of new technologies and languages aligned with other clients' internal projects.

Job Title: Hadoop Developer

Minimum Qualifications and Job Requirements:

1. Must have a Bachelor's degree in Computer Science or a related IT discipline.
2. Must have at least 5 years of IT development experience.
3. Must have knowledge of Scala/Spark programming.
4. Must have relevant professional experience working with Hadoop (HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, .NET, SQL, Perl, Python, or an equivalent scripting language.
5. Must have extensive experience with ETL tools.
6. Must have experience integrating web services.
7. Must have knowledge of standard software development methodologies such as Agile and Waterfall.
8. Must have strong communication skills.
9. Must be willing to flex work hours to support application launches and manage production outages as necessary.

Other Responsibilities:

1. Document and maintain project artifacts.
2. Suggest best practices and implementation strategies using Hadoop, Java, and ETL tools.
3. Maintain comprehensive knowledge of industry standards, methodologies, processes, and best practices.
4. Other duties as assigned.


