How to create a Kafka consumer group in Golang?


An available library is sarama (or its extension, sarama-cluster); however, no consumer group examples are provided, neither in sarama nor in sarama-cluster.

I do not understand the API. May I have an example of creating a consumer group for a topic?

2 Answers




The consumer group is specified by the second argument of the cluster consumer "constructor". Here's a very basic sketch:

import (
    "github.com/Shopify/sarama"
    "github.com/bsm/sarama-cluster"
)

conf := cluster.NewConfig()
// add config values

brokers := []string{"kafka-1:9092", "kafka-2:9092"}
group := "Your-Consumer-Group"
topics := []string{"topicName"}
consumer, err := cluster.NewConsumer(brokers, group, topics, conf) // remember to handle err

And so you'll have a consumer belonging to the specified consumer group.
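To actually receive records with that consumer, you read from its `Messages()` channel and mark offsets as you go. Here is a minimal self-contained sketch; the broker address, group name, and topic are placeholders, not values from the question:

```go
package main

import (
	"fmt"
	"log"
	"os"
	"os/signal"

	"github.com/Shopify/sarama"
	cluster "github.com/bsm/sarama-cluster"
)

func main() {
	conf := cluster.NewConfig()
	conf.Consumer.Return.Errors = true
	conf.Consumer.Offsets.Initial = sarama.OffsetOldest

	brokers := []string{"kafka-1:9092"} // placeholder broker address
	consumer, err := cluster.NewConsumer(brokers, "Your-Consumer-Group", []string{"topicName"}, conf)
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()

	signals := make(chan os.Signal, 1)
	signal.Notify(signals, os.Interrupt)

	for {
		select {
		case msg, ok := <-consumer.Messages():
			if !ok {
				return // channel closed
			}
			fmt.Printf("%s/%d/%d: %s\n", msg.Topic, msg.Partition, msg.Offset, msg.Value)
			consumer.MarkOffset(msg, "") // mark message as processed
		case err := <-consumer.Errors():
			log.Println(err)
		case <-signals:
			return
		}
	}
}
```

Marked offsets are committed periodically in the background, per the interval configured in `conf.Consumer.Offsets.CommitInterval`.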

There is no need to use the sarama-cluster library: it is DEPRECATED. For Apache Kafka integration, the original sarama library itself provides a way to connect to a Kafka cluster using a consumer group.

We need to create a client, then initialize a consumer group, in which we create claims and wait on the message channel to receive messages.

Initializing the client:

kfversion, err := sarama.ParseKafkaVersion(kafkaVersion) // kafkaVersion is the version of kafka server like 0.11.0.2
if err != nil {
    log.Println(err)
}

config := sarama.NewConfig()
config.Version = kfversion
config.Consumer.Return.Errors = true

// Start with a client
client, err := sarama.NewClient([]string{brokerAddr}, config)
if err != nil {
    log.Println(err)
}
defer func() { _ = client.Close() }()

Connecting to the consumer group:

// Start a new consumer group
group, err := sarama.NewConsumerGroupFromClient(consumer_group, client)
if err != nil {
    log.Println(err)
}
defer func() { _ = group.Close() }()

Start consuming messages from topic partitions:

// Iterate over consumer sessions.
ctx := context.Background()
for {
    topics := []string{topicName}
    handler := exampleConsumerGroupHandler{}
    err := group.Consume(ctx, topics, handler)
    if err != nil {
        log.Println(err)
    }
}

The last part is to wait on the message channel and consume the messages. We need to implement all three methods of the ConsumerGroupHandler interface.

type exampleConsumerGroupHandler struct{}

func (exampleConsumerGroupHandler) Setup(_ sarama.ConsumerGroupSession) error   { return nil }
func (exampleConsumerGroupHandler) Cleanup(_ sarama.ConsumerGroupSession) error { return nil }
func (h exampleConsumerGroupHandler) ConsumeClaim(sess sarama.ConsumerGroupSession, claim sarama.ConsumerGroupClaim) error {
    for msg := range claim.Messages() {
        fmt.Printf("Message topic:%q partition:%d offset:%d\n", msg.Topic, msg.Partition, msg.Offset)
        sess.MarkMessage(msg, "")
    }
    return nil
}

For more information on using Kafka from Golang, check the sarama library documentation.
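Putting the fragments above together, a minimal end-to-end sketch might look like the following; the broker address `localhost:9092`, the version string, the group name, and the topic name are placeholder assumptions:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/Shopify/sarama"
)

// exampleConsumerGroupHandler implements sarama.ConsumerGroupHandler.
type exampleConsumerGroupHandler struct{}

func (exampleConsumerGroupHandler) Setup(_ sarama.ConsumerGroupSession) error   { return nil }
func (exampleConsumerGroupHandler) Cleanup(_ sarama.ConsumerGroupSession) error { return nil }
func (h exampleConsumerGroupHandler) ConsumeClaim(sess sarama.ConsumerGroupSession, claim sarama.ConsumerGroupClaim) error {
	for msg := range claim.Messages() {
		fmt.Printf("Message topic:%q partition:%d offset:%d\n", msg.Topic, msg.Partition, msg.Offset)
		sess.MarkMessage(msg, "") // mark message as processed
	}
	return nil
}

func main() {
	version, err := sarama.ParseKafkaVersion("2.1.0") // placeholder server version
	if err != nil {
		log.Fatalln(err)
	}

	config := sarama.NewConfig()
	config.Version = version
	config.Consumer.Return.Errors = true

	client, err := sarama.NewClient([]string{"localhost:9092"}, config)
	if err != nil {
		log.Fatalln(err)
	}
	defer func() { _ = client.Close() }()

	group, err := sarama.NewConsumerGroupFromClient("my-consumer-group", client)
	if err != nil {
		log.Fatalln(err)
	}
	defer func() { _ = group.Close() }()

	ctx := context.Background()
	for {
		// Consume blocks for the lifetime of a session and returns on rebalance,
		// so it is called again in a loop to rejoin the group.
		if err := group.Consume(ctx, []string{"topicName"}, exampleConsumerGroupHandler{}); err != nil {
			log.Println(err)
		}
	}
}
```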

douyalin2258: If you are stuck on Kafka 0.10.1, you still need to use sarama-cluster. (10 months ago)
