[Lenses/kafka] Fix the problem “Cannot extract connector information from the configuration provided” when creating a Kafka-connect connector

Posted on February 23, 2022 (updated April 3, 2022) by nim

If you use Kafka-connect, you usually end up adding quite a few plugins to the Kafka-connect server.
I ran into a problem where Lenses could not validate some of these plugins.

For example, this plugin: MongoDB Connector (Source and Sink).

When I created a connector, the following error appeared:

Cannot extract connector information from the configuration provided.

Lenses cannot validate some Kafka-connect plugins.
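For context, this is roughly what a MongoDB source connector configuration looks like when you create it (all values below are placeholders, not taken from the original post). Lenses shows the error above because it has no metadata for this plugin yet:

# Example MongoDB source connector config (placeholder values)
name=mongo-source-example
connector.class=com.mongodb.kafka.connect.MongoSourceConnector
tasks.max=1
connection.uri=mongodb://mongo:27017
database=demo
collection=orders
topic.prefix=mongo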

A colleague pointed me to this doc:
https://docs.lenses.io/5.0/configuration/static/options/topology/#configure-lenses-to-load-a-custom-connector

We need to configure Lenses to load a custom connector.

If you run Lenses with Docker Compose, it looks like this:

version: '3'
services:
  lenses:
    image: lensesio/lenses:latest
    container_name: lenses
    ...
    environment:
      ...
      LENSES_CONNECTORS_INFO: |
        [
            {
            class.name = "com.splunk.kafka.connect.SplunkSinkConnector"
            name = "Splunk Sink",
            instance = "splunk.hec.uri"
            sink = true,
            extractor.class = "io.lenses.config.kafka.connect.SimpleTopicsExtractor"
            icon = "splunk.png",
            description = "Stores Kafka data in Splunk"
            docs = "https://github.com/splunk/kafka-connect-splunk",
            author = "Splunk"
            },
            {
            class.name = "io.debezium.connector.sqlserver.SqlServerConnector"
            name = "CDC MySQL"
            instance = "database.hostname"
            sink = false,
            property = "database.history.kafka.topic"
            extractor.class = "io.lenses.config.kafka.connect.SimpleTopicsExtractor"
            icon = "debezium.png"
            description = "CDC data from RDBMS into Kafka"
            docs = "//debezium.io/docs/connectors/mysql/",
            author = "Debezium"
            }
        ]
    ...

The value of LENSES_CONNECTORS_INFO that I used for the MongoDB source connector:

[
  {
  class.name = "com.mongodb.kafka.connect.MongoSourceConnector"
  name = "MongoSourceConnector"
  instance = "mongo.source.offical"
  sink = false
  extractor.class = "io.lenses.config.kafka.connect.SimpleTopicsExtractor"
  icon= "mongodb.png"
  description = "Mongo Offical source connector"
  author = "Mongodb team"
  property = "topic"
  }
]

A deployment on Kubernetes works exactly the same way.
I use Rancher, so setting this environment variable is very easy there; a plain-manifest version is sketched below.
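If you are not using the Rancher UI, here is a minimal sketch of how the same LENSES_CONNECTORS_INFO environment variable could be set on a Lenses Deployment manifest (resource names, labels and replica count are assumptions, not from the original post):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: lenses
spec:
  replicas: 1
  selector:
    matchLabels:
      app: lenses
  template:
    metadata:
      labels:
        app: lenses
    spec:
      containers:
        - name: lenses
          image: lensesio/lenses:latest
          env:
            # Same HOCON-style value as in the Docker Compose example above
            - name: LENSES_CONNECTORS_INFO
              value: |
                [
                  {
                  class.name = "com.mongodb.kafka.connect.MongoSourceConnector"
                  name = "MongoSourceConnector"
                  instance = "mongo.source.offical"
                  sink = false
                  extractor.class = "io.lenses.config.kafka.connect.SimpleTopicsExtractor"
                  icon = "mongodb.png"
                  description = "Mongo Official source connector"
                  author = "Mongodb team"
                  property = "topic"
                  }
                ]

After changing the variable, restart the Lenses pod so the new connector metadata is picked up.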

Many of you will ask: how do you know the name of the icon file?

My trick is to look at the icon images that are already available in Lenses; at the moment they look like what is in the screenshot. You can also list them from a terminal, as sketched below.
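A rough way to list the icon files that ship with Lenses is to search inside the running container (the container name lenses comes from the Compose file above; the search is filesystem-wide because the exact asset path inside the image is not given here):

docker exec lenses sh -c 'find / -name "*.png" 2>/dev/null | grep -iE "mongo|debezium|splunk"'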
