# google-cloud
m
I am having trouble getting gcp.sql.DatabaseInstance to play well with gcp.cloudfunctionsv2:

1. Each gcp.cloudfunctionsv2.Function is implemented by a gcp.cloudrun.Service that remains hidden from view in Pulumi-land.
2. To expose a gcp.sql.DatabaseInstance to a gcp.cloudrun.Service, I should set a `run.googleapis.com/cloudsql-instances` template metadata annotation equal to the gcp.sql.DatabaseInstance connection name when constructing the gcp.cloudrun.Service (see the sketch below). This exposes the database as a unix socket at `/cloudsql/${DATABASE_INSTANCE_NAME}`.

I have already managed to mess with the IAM settings of the underlying gcp.cloudrun.Service by passing the location, project, and name of the cloudfunctionsv2 Function to gcp.cloudrun.IamPolicy, and I suspect it may also be possible to adjust the template metadata annotation with gcp.cloudrun.get_service. Does anyone have experience with this?
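For reference, on a plain gcp.cloudrun.Service the annotation I mean looks roughly like this (a sketch; assume `instance` is the gcp.sql.DatabaseInstance, and the image name is a placeholder):
```python
import pulumi_gcp as gcp

service = gcp.cloudrun.Service(
    "my-service",
    location="europe-west1",
    template=gcp.cloudrun.ServiceTemplateArgs(
        metadata=gcp.cloudrun.ServiceTemplateMetadataArgs(
            annotations={
                # connection_name is "project:region:instance"; the socket then
                # appears at /cloudsql/<connection_name> inside the container.
                "run.googleapis.com/cloudsql-instances": instance.connection_name,
            },
        ),
        spec=gcp.cloudrun.ServiceTemplateSpecArgs(
            containers=[
                gcp.cloudrun.ServiceTemplateSpecContainerArgs(
                    image="gcr.io/my-project/my-image",
                ),
            ],
        ),
    ),
)
```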
l
I don't do anything special to get my Cloud Run services to access Cloud SQL -- no need for annotations or anything like that. I just grant the service account that runs the service `roles/cloudsql.client` and `roles/cloudsql.instanceUser`:
```python
import pulumi
from pulumi_gcp import projects

# sa, sql_users, project_id, and sql_user_resources are defined elsewhere
# in the stack. Allow each SQL user's service account to connect...
projects.IAMBinding(
    "cloud-sql-client-binding",
    role="roles/cloudsql.client",
    members=[
        pulumi.Output.concat("serviceAccount:", sa.service_accounts[user].email)
        for user in sql_users
    ],
    project=project_id,
    opts=pulumi.ResourceOptions(depends_on=sql_user_resources),
)

# ...and to log in as an IAM database user.
projects.IAMBinding(
    "cloud-sql-user-binding",
    role="roles/cloudsql.instanceUser",
    members=[
        pulumi.Output.concat("serviceAccount:", sa.service_accounts[user].email)
        for user in sql_users
    ],
    project=project_id,
    opts=pulumi.ResourceOptions(depends_on=sql_user_resources),
)
```
Then in your service/function code, you just make use of Google's Cloud SQL Connector library, e.g.:
```python
import os

from google.cloud.sql.connector import Connector

# Create the connector once at module scope; exiting a `with` block would
# shut down the connector's background refresh before the connection is used.
connector = Connector()

def create_connection():
    return connector.connect(
        os.getenv("CLOUD_SQL_CONNECTION_NAME"),
        "pg8000",
        db="my-db",
        user=os.getenv("CLOUD_SQL_USER"),
        enable_iam_auth=True,
    )
```
m
Thanks. That was my fallback as well, and it works. It's just quite a bit slower at creating the connection compared to using a Unix socket. I don't mind this on a Cloud Run instance, but it is unpleasant with short-lived Cloud Functions.
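For reference, the unix-socket connection I mean is the standard pattern from Google's docs, roughly like this (env var names are placeholders):
```python
import os

import sqlalchemy

# Cloud Run / Cloud Functions mount the socket under /cloudsql/<connection name>.
engine = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL.create(
        drivername="postgresql+pg8000",
        username=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        database=os.environ["DB_NAME"],
        query={
            "unix_sock": f"/cloudsql/{os.environ['INSTANCE_CONNECTION_NAME']}/.s.PGSQL.5432"
        },
    )
)
```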
l
Ah... yeah, fair enough.
b
@miniature-computer-95401 did you happen to figure this out?
l
Funnily enough, because I wanted to use psycopg3, I ended up creating a custom solution that connects to Cloud SQL via unix sockets while still leveraging Cloud IAM auth: https://github.com/GoogleCloudPlatform/cloud-sql-python-connector/issues/219#issuecomment-2038917429
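The general shape of it is roughly this (not the exact code from that comment, which has the full details; the function and parameter names here are placeholders): fetch an OAuth2 access token and pass it as the database password over the socket.
```python
import google.auth
import google.auth.transport.requests
import psycopg

def connect_with_iam(instance_connection_name: str, user: str, db: str):
    # Get an OAuth2 access token scoped for Cloud SQL IAM database login.
    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/sqlservice.login"]
    )
    credentials.refresh(google.auth.transport.requests.Request())
    # Connect over the unix socket mounted at /cloudsql/<connection name>,
    # using the short-lived token as the password.
    return psycopg.connect(
        host=f"/cloudsql/{instance_connection_name}",
        user=user,  # for a service account: its email without ".gserviceaccount.com"
        dbname=db,
        password=credentials.token,
    )
```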
b
Ah this is interesting. I'll give this a shot.