01 Aug
Senior IT Data Engineer
Fort Meade, Maryland 20755, USA

Vacancy expired!

Job Description: Leidos is seeking a Senior IT Data Engineer to be based in the Fort Meade, MD area. As the Senior IT Data Engineer, you will be a member of a multi-disciplinary team that supports the Joint Regional Security Stack (JRSS). You will interface with other technical teams, DISA personnel, vendor technical support, and technical representatives from the DoD services. You will work as part of an integrated, cross-platform team to provide military base/post/camp/station migration support services DoD-wide as the JRSS stacks are deployed and used for operations.

You will leverage your understanding of Kafka, modern data architectures, and data streaming to support the design, development, implementation, and automation of enterprise-level solutions for our DoD customers. You will support requirements analysis, systems design, integration, testing, and associated documentation. You will also provide Tier 3 support for the operations and maintenance of fielded systems.

Primary Responsibilities:

  • Assist in designing, publishing, and maintaining schemas
- Schema design and optimization
- Maintain Schema Registry
- Work with mission partners to leverage schemas and data
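The schema maintenance work above centers on evolving schemas without breaking existing consumers. As a minimal sketch (the dict-based schema model and function name are assumptions for illustration, not the program's actual tooling), a backward-compatibility check of the kind a schema registry enforces might look like:

```python
def is_backward_compatible(old_schema, new_schema):
    """Check whether records written with old_schema can be read with new_schema.

    Schemas are modeled as {field_name: {"type": ..., "default": ...}} dicts,
    a simplified stand-in for Avro-style schemas in a registry.
    """
    for name, spec in new_schema.items():
        if name not in old_schema:
            # A new field without a default cannot be filled in for old records.
            if "default" not in spec:
                return False
        elif spec["type"] != old_schema[name]["type"]:
            # Type changes are treated as incompatible in this simplified model.
            return False
    return True
```

Under this rule, adding a field is safe only with a default, which is why registries reject defaultless additions in backward-compatible subjects.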
  • Develop connectors for non-standard data sources
  • Manage Topics
- Establish naming conventions
- Design topic hierarchies
- Optimize and manage partitions
- Design and manage partition placement
- Handle partition balancing
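The topic-management items above can be sketched in miniature. The naming pattern below (org.system.dataset.version) is an assumed convention for illustration, and the key-to-partition mapping is a simplified stand-in: real Kafka producers use murmur2 over the key bytes so that all clients agree on placement.

```python
import re

# Illustrative convention: <org>.<system>.<dataset>.<version>, e.g. "jrss.netflow.raw.v1".
TOPIC_PATTERN = re.compile(r"^[a-z0-9]+(\.[a-z0-9]+){2,3}$")

def is_valid_topic(name: str) -> bool:
    """Return True if a proposed topic name follows the assumed convention."""
    return bool(TOPIC_PATTERN.match(name))

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition (stand-in hash, not Kafka's murmur2)."""
    return sum(key) % num_partitions
```

A consistent key-to-partition mapping is what preserves per-key ordering, which is why partition counts are usually fixed up front rather than grown casually.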
  • Support the development and implementation of automation
- Design and build out Ansible deployment of Confluent components
- Design and build out Ansible deployment of replication configuration
- Manage Ansible for operations (upgrades, scaling, etc.)
- Provide Ansible toolkit for mission partners
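As a rough sketch of the Ansible automation described above, a playbook fragment for rolling out a Kafka broker role might look like the following. The inventory group, role name, and variable names here are assumptions for illustration, not the program's actual layout.

```yaml
# Illustrative playbook fragment -- group/role/variable names are assumed.
- name: Deploy Kafka brokers
  hosts: kafka_brokers
  become: true
  roles:
    - role: confluent_kafka_broker
      vars:
        kafka_listeners: "SSL://{{ inventory_hostname }}:9093"
        kafka_log_dirs: /var/lib/kafka/data
```

Keeping deployment, replication configuration, and upgrades in playbooks like this is what makes them repeatable enough to hand off as a toolkit to mission partners.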
  • Develop User Defined Functions (UDFs) to support real-time streaming data processing and analytics in KSQL
  • Develop, build, optimize, and maintain KStreams processing jobs
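A KStreams job of the kind described above is essentially a filter/transform/aggregate topology over an unbounded stream. As a minimal pure-Python analogue (field names and the normalization step are assumptions for the sketch; a real job would use the Kafka Streams DSL in Java, e.g. stream.filter(...).mapValues(...).groupBy(...).count()):

```python
from collections import Counter

def process_events(events):
    """Pure-Python analogue of a simple stream topology:
    drop malformed records, normalize values, then count per source."""
    valid = (e for e in events if "src" in e and "bytes" in e)          # ~ KStream.filter
    normalized = ({"src": e["src"].lower(), "bytes": int(e["bytes"])}   # ~ KStream.mapValues
                  for e in valid)
    return Counter(e["src"] for e in normalized)                        # ~ groupBy + count
```

The same filter/map/aggregate shape applies whether the logic lives in a KStreams application or in a KSQL query with a custom UDF doing the per-record transformation.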
  • Assist in designing and implementing the Data Ingest pipeline
- Ensure throughput and capacity of Kafka Connect
- Normalize data feeds into a common schema for ingestion into Elastic and into mission partner systems
- Enrich data feeds with derived metadata
- Route incoming data feeds into the topic topology
- Support the addition of new data sources
- Support changes in schemas
- Implement data reduction strategies such as duplicate event aggregation

Basic Qualifications:

  • Bachelor's degree or equivalent experience/combined education, with 12+ years of experience; or 10+ years of professional experience with a related Master's degree
  • Experience with Kafka
  • Experience with Red Hat Linux
  • Experience with KSQL
  • Experience with Ansible
  • Experience with data architectures and data streaming
  • U.S. citizenship and an active DoD Secret clearance; in addition, you must be able to successfully obtain up to Top Secret based on requirements from the customer and program
  • DoD 8570 IAT Level II certification

Preferred Qualifications:

  • Experience with enterprise-level architecture design and development
  • Experience with container technologies
  • Experience with multiple scripting/programming languages (e.g., Python, Java)
  • Good debugging and troubleshooting skills
  • Good written and oral communication skills

External Referral Bonus: Eligible
External Referral Bonus $: $3,000
Potential for Telework: Yes, 10%
Clearance Level Required: Secret
Travel: Yes, 10% of the time
Scheduled Weekly Hours: 40
Shift: Day
Requisition Category: Professional
Job Family: Software Engineering
Pay Range:


