28 Jul
Senior Data Engineer
Raleigh / Durham / CH, North Carolina 27601, USA

Vacancy expired!

Atyeti is an Inc. 500 & 5000 Honoree Company for 2012, 2013, 2014, 2015, 2016, and 2017. Atyeti ranked No. 270 on the 2012 Inc. 500 list and was named one of NJ's 50 Fastest Growing Companies in 2012, 2016, and 2017. Our direct client is a Global Investment Bank. Please send an updated CV to priya.suman(at)atyeti.com.

Job Description:

We are looking for a data engineer who will help us build new data pipelines and improve existing ones. You are comfortable consuming streaming data from Apache Kafka in Python, are familiar with stream processing primitives (joins, aggregations, co-partitioning, ...), and have a solid understanding of modern database technologies as well as a software engineering mindset. This individual will report to the Head of CM&A Data Strategy and will partner closely with product, data science, and project management to deliver a best-in-class client coverage data tool. This individual's work will be critical to team success, as data access is at the core of the Hubble platform.

Candidate Qualifications

You'll be a good fit if you:
· Are proficient in modern, efficient Python development (asynchronous generators and asyncio are in your Python tool belt)
· Have a solid understanding of both classical and modern database technologies (SQL and NoSQL / document-based)
· Have practical experience with streaming data and have worked with Apache Kafka
· Are comfortable in a Linux environment, use Git (or another version control system), and are into automating everything
· Have worked with graph databases (e.g. Neo4j) and/or search indices (Solr, Elastic)
· Have used machine learning packages and libraries and understand fundamental model-building techniques
· Are interested in working with a diverse stack of tools and learning new technologies as different problems arise

Nice-to-Have Skills:
· Experience with public cloud services (Azure preferred)
· Experience with Kafka Streams or Faust
· Experience with larger datasets and distributed computing tools (e.g. Apache Spark)
· Practical experience with Snowflake
· Proficiency in reading and understanding enterprise-grade Java code (a Java development background is a huge plus)
· Experience presenting data in custom dashboards built on NodeJS / React / JavaScript is a plus
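The posting asks for comfort with asyncio, asynchronous generators, and stream-processing primitives such as keyed aggregation. A minimal illustrative sketch of that combination is below; the event schema, client names, and function names are invented for the example, and in production the records would come from a Kafka consumer rather than an in-memory generator.

```python
import asyncio
from collections import defaultdict

# Hypothetical event stream: in a real pipeline these records would be
# consumed from a Kafka topic; here an async generator simulates them.
async def trade_events():
    events = [
        {"client": "acme", "notional": 100},
        {"client": "globex", "notional": 250},
        {"client": "acme", "notional": 50},
    ]
    for event in events:
        await asyncio.sleep(0)  # yield control, as a real consumer would
        yield event

# Keyed aggregation: total notional per client, the kind of stateful
# operation Kafka Streams or Faust performs over a co-partitioned topic.
async def aggregate_by_client(stream):
    totals = defaultdict(int)
    async for event in stream:
        totals[event["client"]] += event["notional"]
    return dict(totals)

totals = asyncio.run(aggregate_by_client(trade_events()))
print(totals)  # {'acme': 150, 'globex': 250}
```

Keying the aggregation by client is what makes co-partitioning matter in the Kafka setting: all events for one key must land on the same partition so a single worker holds that key's state.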


