I have a Go gRPC server with a streaming endpoint, and a client that streams a large file to it. Performance seems low when streaming the large file. How can I improve it?
Initially the server was set up on a VM, and it took about 1 second to stream a file of about 8 MB. Then I moved the server to my local machine, where the client is located, and now it takes about 250 ms. I tried playing around with the ReadBufferSize server option, but that made no difference. I am using the code recommended by the gRPC documentation for receiving data from the stream.
Here is how I'm configuring the server:
gRPCServer := grpc.NewServer(
    // Hook up credentials for SSL/TLS
    grpc.Creds(creds),
    // Inject interceptors
    grpc_middleware.WithUnaryServerChain(
        // Hook up the ctxtags to allow for tracing
        grpc_ctxtags.UnaryServerInterceptor(ctxtagsOpts...),
        // Hook up the logrus logging lib to gRPC requests
        grpc_logrus.UnaryServerInterceptor(logrusEntry, logrusOpts...),
        // Hook up the open tracer to allow for distributed logging
        grpc_opentracing.UnaryServerInterceptor(),
        // Hook up custom middlewares (i18n and error)
        middleware.ServerInterceptor,
        // Hook up custom middleware for stack trace
        middleware.Stacktrace(),
    ),
    // Hook up the chained stream interceptors
    grpc_middleware.WithStreamServerChain(
        // Hook up ctxtags to allow for tracing
        grpc_ctxtags.StreamServerInterceptor(ctxtagsOpts...),
        // Hook up the logrus logging lib to gRPC requests
        grpc_logrus.StreamServerInterceptor(logrusEntry, logrusOpts...),
        // Hook up the open tracer to allow for distributed logging
        grpc_opentracing.StreamServerInterceptor(),
    ),
    grpc.MaxRecvMsgSize(config.MaxRecvMsgSize),
)
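For reference, this is roughly how I experimented with the transport-level buffer and window options (a sketch; the 1 MB values are just what I tried, not recommendations):

```go
// Sketch of the grpc-go ServerOptions I experimented with, alongside
// the credentials and message-size options from my real config.
gRPCServer := grpc.NewServer(
    grpc.Creds(creds),
    grpc.MaxRecvMsgSize(config.MaxRecvMsgSize),
    // Per-connection read buffer (the default is 32 KB).
    grpc.ReadBufferSize(1024*1024),
    // HTTP/2 flow-control windows; raising these lets the client keep
    // more data in flight per stream and per connection.
    grpc.InitialWindowSize(1024*1024),
    grpc.InitialConnWindowSize(1024*1024),
)
```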
How I'm receiving the stream:
var data []byte
log.Debug("Start reconstruction of stream")
start := time.Now()
// Loop over the stream; exit when the stream reaches EOF
for {
    content, err := stream.Recv()
    if err != nil {
        if err == io.EOF {
            log.Debug("End reconstruction of stream")
            break
        }
        log.Errorf("Reading chunks from stream failed unexpectedly with: %s", err)
        // Return the error if reading the stream fails
        return err
    }
    // Append the chunk to the accumulated data
    data = append(data, content.Data...)
}
fmt.Println(time.Since(start))
Is there any way to improve performance, or does it come down to a resource issue (i.e. hardware or connection speed)?