We're in the process of moving from AMQP to Google Pub/Sub.
The docs suggest that pull might be the best choice for us, since we're running on Compute Engine and can't open our workers to receive via the push service.
They also say that pull might incur additional costs depending on usage:
If polling is used, high network usage may be incurred if you are opening connections frequently and closing them immediately.
We've created a test subscriber in Go that runs in a loop like so:
    package main

    import (
        "io/ioutil"
        "log"

        "golang.org/x/oauth2"
        "golang.org/x/oauth2/google"
        "google.golang.org/cloud"
        "google.golang.org/cloud/pubsub"
    )

    func main() {
        jsonKey, err := ioutil.ReadFile("pubsub-key.json")
        if err != nil {
            log.Fatal(err)
        }
        conf, err := google.JWTConfigFromJSON(
            jsonKey,
            pubsub.ScopeCloudPlatform,
            pubsub.ScopePubSub,
        )
        if err != nil {
            log.Fatal(err)
        }
        ctx := cloud.NewContext("xxx", conf.Client(oauth2.NoContext))
        msgIDs, err := pubsub.Publish(ctx, "topic1", &pubsub.Message{
            Data: []byte("hello world"),
        })
        if err != nil {
            log.Println(err)
        }
        log.Printf("Published a message with a message id: %s\n", msgIDs[0])
        for {
            msgs, err := pubsub.Pull(ctx, "subscription1", 1)
            if err != nil {
                log.Println(err)
            }
            if len(msgs) > 0 {
                log.Printf("New message arrived: %v, len: %d\n", msgs[0].ID, len(msgs))
                if err := pubsub.Ack(ctx, "subscription1", msgs[0].AckID); err != nil {
                    log.Fatal(err)
                }
                log.Println("Acknowledged message")
                log.Printf("Message: %s", msgs[0].Data)
            }
        }
    }
The question I have, though, is whether this is the correct / recommended way to pull messages.
We receive about 100 messages per second throughout the day. I'm not sure whether running an endless loop like this is going to bankrupt us, and I can't find any other decent Go examples.