28 Jun
ETL Developer with AWS, Java and Spark @ Philadelphia, PA
Atlanta, Georgia 30301, USA

Vacancy expired!

Hello, hope you’re doing well.

Title: ETL Developer with AWS, Java and Spark

Location: Philadelphia, PA

Duration: Long term

Rate: DOE

Mandatory Skills: ETL, AWS, Java, and Spark

Responsibilities:
  • Hands-on architecture/development of ETL pipelines using our internal framework written in Java
  • Hands-on architecture of real-time REST APIs or other solutions for streaming data from Graph using Spark
  • Interpret data, analyze results using statistical techniques and provide ongoing reports
  • Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify, analyze, and interpret trends or patterns in complex data sets

Qualifications:
  • 8+ years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset.
  • 4+ years of experience with Java, particularly within the data space
  • Technical expertise in data models, database design and development, data mining, and segmentation techniques
  • Good experience writing complex SQL and ETL processes
  • Excellent coding and design skills, particularly in Java/Scala and Python.
  • Experience working with large data volumes, including processing, transforming and transporting large-scale data
  • Experience with AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS analytics services required.
  • Experience with big-data-related AWS technologies such as Hive, Presto, and Hadoop required.
  • AWS certification is preferable: AWS Developer/Architect/DevOps/Big Data
  • Excellent working knowledge of Apache Hadoop, Apache Spark, Kafka, Scala, Python, etc.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Good understanding and use of algorithms and data structures
  • Good experience building reusable frameworks.
  • Experience working in an Agile Team environment.
