25 Sep
Sr. Data Ops Engineer
Vacancy expired!
- Serve as a technical leader on the Operations team for the video data ingest platform and build next-generation operations systems; support and improve key aspects of infrastructure services, including security, availability, performance, and analytics.
- Create procedures/runbooks for operational and security aspects of all platforms.
- Recommend best practices for security, cost optimization, and infrastructure monitoring.
- Continuously integrate and deliver in all aspects of the software development life cycle across multiple, interconnected software products.
- Provide advanced business and engineering support services to end users; escalate issues as needed.
- Gather business requirements from the Product Owner and business stakeholders, and provide status updates.
- Research and deploy new tools and frameworks to maintain the big data platform; implement new monitoring tools while balancing strategic design with tactical needs.
- Assist with creating programs for training and onboarding new customers
- Troubleshoot and resolve customer issues; review and triage new service/support requests.
- Track all details in the issue-tracking system (JIRA) and the PagerDuty on-call system.
- Use DevOps automation tools, including GitLab build jobs, and code development/testing tools
- Fulfill any ad-hoc data or report request queries from different functional groups
- Provide on-call support for P1 issues on weekends/holidays on a rotating basis.
- 10+ years of total IT experience
- 5+ years supporting systems infrastructure operations, upgrades, deployments, and monitoring
- 3+ years of experience as a Data Engineer working with large, complex data sources, specifically big data (or video) and analytics
- 3+ years of experience with DevOps automation and building CI/CD pipelines - Terraform, GitLab Runner/Jenkins, CloudFormation, etc.
- 2+ years providing customer service directly to end users, especially with crisis management
- Experience with data pipeline and workflow management tools: Airflow, Step Functions, etc.
- Experience designing, building, or operationalizing large-scale data applications using AWS services (Spark, EMR, Redshift, Kinesis, Firehose, Athena, Lambda, Glue) in combination with other platforms
- Working experience and excellent understanding of AWS services
- Experience implementing role-based security/management, AD integration, IAM security policies, and auditing in a Linux/Hadoop/AWS environment; familiarity with penetration testing and scanning tools for remediating security vulnerabilities
- Hands-on experience with monitoring tools such as AWS CloudWatch, Datadog, and Elasticsearch
- Demonstrated success learning new technologies quickly
- Experience developing with Java or Python
- Experience working with video data pipelines in both batch and streaming environments, using computer vision technologies such as AWS Rekognition, Kinesis, or Panorama; experience working with video formats and codecs/compression formats such as MP4, AVI, MOV, and H.264
- Ability to work across different geographic regions
- Data science tools (nice to have): Jupyter, TensorFlow