Kafka Developer
Charger Logistics Inc, Canada

Experience: 1 Year
Traveling: No
Telecommute: No
Qualification: Bachelor's Degree
Total Vacancies: 1
Posted on: Mar 4, 2021
Last Date: Apr 4, 2021

Job Description

Charger Logistics is hiring, and we are looking for an experienced Kafka Developer to join our team.

As a Kafka Developer, you will apply your experience and knowledge to implement cloud-native, event-based architectures using the Apache Kafka platform and complementary on-premise or cloud-based systems. This includes the development and support of Kafka integrations, including topics, producers, consumers, Schema Registry, Kafka Control Center, KSQL, and streaming applications. You will contribute to application lifecycle requirements, architecture design, development and build, integration and release configuration, system testing, production operations, application optimization, and the adoption of best practices.
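
To give a concrete sense of the producer side of this work, below is a minimal Java sketch of publishing an event to a Kafka topic. The broker address (localhost:9092), the topic name (shipment-events), and the JSON payload are illustrative assumptions rather than details taken from this posting.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ShipmentEventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // assumed local broker
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            props.put("acks", "all");                            // wait for the full ISR to acknowledge

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // "shipment-events" is a hypothetical topic name used only for illustration.
                producer.send(new ProducerRecord<>("shipment-events", "order-123",
                        "{\"status\":\"IN_TRANSIT\"}"));
            }
        }
    }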

Job Duties:

  • Develop and maintain reusable data streaming solutions to support various initiatives, ensuring adherence to the overall solution direction.
  • Build and maintain robust, real-time data exchange solutions using Kafka or other real-time technologies.
  • Develop techniques to process and analyze events on a real-time streaming platform, and build high-quality, scalable, tested, and reliable data services using industry best practices (see the sketch after this list).
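
As a rough illustration of the kind of real-time processing described above, the following Kafka Streams sketch reads events from one topic, filters them, and forwards the results to another. The application id, broker address, and topic names are hypothetical.

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class DeliveredShipmentsStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "shipment-status-app"); // hypothetical application id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed local broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read raw events, keep only delivery updates, and forward them to a downstream topic.
            KStream<String, String> events = builder.stream("shipment-events");
            events.filter((key, value) -> value != null && value.contains("DELIVERED"))
                  .to("delivered-shipments");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }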

Requirements

  • Hands-on experience with Apache Kafka, Java, and Linux is a must.
  • Experience building RESTful APIs and working with Docker and Kubernetes.
  • Experience with Kafka Streams / KSQL architecture and the associated clustering model.
  • Strong fundamentals and experience in Kafka configuration and troubleshooting.
  • Understanding of and experience with Kafka clustering and its fault-tolerance model supporting HA and DR.
  • Practical experience scaling Kafka, KStreams, and connector infrastructure, with the motivation to build efficient platforms.
  • Experience developing KSQL queries and knowledge of best practices for choosing between KSQL and Kafka Streams.
  • Strong knowledge of the Kafka Connect framework, with experience using several connector types: JMS, File, SFTP, JDBC, Splunk, Elasticsearch.
  • Experience developing KStreams pipelines and deploying KStreams clusters.
  • Knowledge of the connectors available from Confluent and the community.
  • Familiarity with the Schema Registry and its management.
  • Knowledge of best practices for optimizing the Kafka ecosystem based on use case and workload, e.g., how to use topics, partitions, and consumer groups effectively to provide optimal routing and support for UDFs and UDAFs (see the consumer sketch after this list).
  • Scripting proficiency with Java/Node.js and knowledge of development best practices.
  • Experience monitoring Kafka infrastructure and related components (connectors, KStreams).
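
As a hedged illustration of the consumer-group point above, the sketch below shows a Java consumer that joins a group and polls a topic; running several instances with the same group.id spreads the topic's partitions across them. The broker address, group id, and topic name are assumptions made for the example.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class ShipmentEventConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // assumed local broker
            props.put("group.id", "shipment-analytics");        // hypothetical consumer group
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            props.put("auto.offset.reset", "earliest");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("shipment-events"));
                while (true) {
                    // Each instance in the group is assigned a subset of the topic's partitions.
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("partition=%d key=%s value=%s%n",
                                record.partition(), record.key(), record.value());
                    }
                }
            }
        }
    }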
