16 Apr
Senior Data Engineer (Kafka)
San Francisco, California 94102, USA

Vacancy expired!

Position: Senior Data Engineer (Kafka)
Location: San Francisco, CA (Remote to start)
Interview Type: Phone + Video (Webex)
Visa Type: Only and Citizen
Duration: Full-Time Contract

Job Responsibilities:

  • Designs, develops, and maintains scalable data pipelines, and builds out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using HQL and 'Big Data' technologies
  • Assembles large, complex data sets that meet functional and non-functional business requirements, and fosters data-driven decision making across the organization
  • Implements processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it
  • Exerts data architecture influence across multiple domains
  • Owns the delivery of data solutions by partnering across multiple teams
  • Defines and drives best practices and coding standards across the data organization
  • Represents the data organization in interactions with other engineering teams on enterprise-level architecture designs
  • Performs root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Job Requirements:
  • 8+ years of relevant experience, with an eagerness to apply current skills and grow their knowledge base
  • BS or MS degree in Computer Science or a related technical field
  • Experience with data pipelines, data streaming, data analytics, data warehousing, and big data
  • Experience with SQL/NoSQL, schema design, and dimensional data modeling
  • Extensive knowledge of SQL fine-tuning, query optimizers, and execution plans
  • Experience with Data Management and Data Governance processes and standards
  • Experience with the Big Data technology stack, such as Kafka, Cassandra, Airflow, HBase, Hadoop, Hive, Oozie, and MapReduce
  • Experience in AWS/cloud, Spark, Java, and Python development
  • Proficiency in programming languages such as Java and Scala
  • Familiar with Agile methodology, test-driven development, source control management, DevOps, Continuous Integration/Delivery, and testing automation
