I want to deploy a Google Cloud Function, written in Go, with a code structure containing a subdirectory, like this:
function
├── module1
│   ├── go.mod
│   └── module1.go
├── go.mod
└── entrypoint.go
But when I deploy the function, using the GCP Console or the gcloud command:
# from function/ directory
gcloud functions deploy myfunction --runtime go111 [...]
only go.mod and entrypoint.go are uploaded (I checked on the Source tab of the function details in the GCP Console). The function therefore fails to deploy, because entrypoint.go obviously uses methods from module1/module1.go.
The same happens if the source is a .zip (containing multiple directories) on Google Cloud Storage:
gcloud functions deploy myfunction \
    --runtime go111 \
    --source gs://${BUCKET}/function.zip [...]
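For reference, the archive was built from the project root with something like the following (the bucket name is a placeholder):

# from function/ directory; ${BUCKET} is a placeholder
zip -r function.zip .
gsutil cp function.zip gs://${BUCKET}/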
Is it possible to deploy functions using a code structure with subdirectories? I don't know whether the same happens for other runtimes (Python, Node.js), or whether the problem is specific to Go.
Edit
I tried to follow this guide: https://cloud.google.com/functions/docs/writing/#functions-writing-file-structuring-go (2nd point: a package at the root of your project that imports code from a sub-package and exports one or more functions), as suggested in the comments, but without success. Here's the structure I used (it works locally):
.
├── function.go
├── go.mod
└── shared
    ├── go.mod
    └── shared.go
go.mod
module testcloudfunction

require testcloudfunction/shared v0.0.0

replace testcloudfunction/shared => ./shared
function.go
package function

import (
    "fmt"
    "net/http"

    "testcloudfunction/shared"
)

// HelloWorld responds with the greeting returned by the shared sub-package.
func HelloWorld(w http.ResponseWriter, r *http.Request) {
    fmt.Fprint(w, shared.Hello())
}
shared/go.mod
module testcloudfunction/shared
shared/shared.go
package shared

// Hello returns the greeting served by the function entry point.
func Hello() string {
    return "Hello World!"
}
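By "works locally" I mean I can serve the function with a small main package and call it with curl; a minimal sketch of that harness (the cmd/main.go path and the port are illustrative, not part of the deployed code):

// cmd/main.go (local test harness only, not deployed)
package main

import (
    "log"
    "net/http"

    function "testcloudfunction"
)

func main() {
    // Expose the cloud function handler on a local port for testing.
    http.HandleFunc("/", function.HelloWorld)
    log.Fatal(http.ListenAndServe(":8080", nil))
}

With this in place, go run ./cmd followed by curl localhost:8080 prints "Hello World!".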