27 Feb
Data Engineer [Must have Python, SQL, Airflow & PySpark experience]
Atlanta, Georgia 30301, USA

Vacancy expired!

Job Role: Data Engineer [Must have Python, SQL, Airflow & PySpark experience]
Location: GA, US [Remote work during COVID]

As the Data Engineer within Online Analytics & BI, you will be responsible for development of the workflow orchestration and ETL pipelines within marketing's analytics and data science platforms. You will ensure that the data pipeline infrastructure meets the analysis, reporting, and data science needs of the marketing organization. This position calls for top technical talent to implement continued design, development, and optimization of the marketing data pipelines and data prep infrastructure built on cutting-edge cloud technologies.

Roles & Responsibilities:
- Apply strong technical skills in a data engineering team building industry-leading technology
- Take an active team role in designing, implementing, and launching efficient and reliable data pipelines that move data across a number of platforms, including the data warehouse, online caches, and real-time systems
- Create data architecture that is flexible, scalable, consistent for cross-functional use, and aligned to stakeholder business requirements
- Deploy workflow orchestration and demonstrate expertise in data modeling, ETL development, and data warehousing
- Build industry-leading tools to increase the productivity of data analysts, data scientists, and marketers
- Help the Marketing organization become a 100% data-driven organization by building a next-generation data platform that brings accurate and timely data to marketers
- Validate Data Engineering business data elements and organizational and business intelligence architecture designs for engineering functional areas spanning dashboards, data lakes, data operations, ML/AI, and upstream/downstream intake and output processes

Required Qualifications:
- BA/BS degree in Computer Science, any Engineering discipline, Statistics, Information Systems, or another quantitative field
- 4+ years of industry experience in data engineering, data science, or a related field, with a track record of manipulating, processing, and extracting value from large datasets
- Experience building and managing data pipelines and repositories in cloud environments such as Google Cloud, Microsoft Azure, or AWS
- Experience with Airflow is a must
- Experience extracting/cleansing data and generating insights from large transactional data sets using Spark SQL, SQL, Python, and PySpark in the cloud
- Experience optimizing Spark pipelines on Dataproc, Databricks, or similar technologies
- Strong verbal and written communication skills at all levels; ability to communicate complex customer behavior information to both functional partners and executive leadership
- Open to idea exploration, with strong problem-solving/analytical abilities
- Demonstrated strength in creating partnerships and building relationships with other functions and associates within the organization

Please let us know your interest and availability, or text me to discuss ASAP.


