06 May
Hadoop Developer
Charlotte, North Carolina 28201, USA

Vacancy expired!

  • Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture.
  • Design and build data services on container-based architectures such as Kubernetes and Docker.
  • Partner with enterprise data teams such as Data Management & Insights and Enterprise Data Environment (Data Lake) to identify the best place to source the data.
  • Work with business analysts, development teams, and project managers to gather requirements and business rules.
  • Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable and performant data solutions.
  • Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.
  • Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.
  • Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.
  • Support ongoing data management efforts for Development, QA, and Production environments.
  • Utilize a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.
  • Act as an expert technical resource to programming staff in the program development, testing, and implementation process.


