# general
r
Is anyone else trying to use the pulumi-kafka package to manage topics and acls in confluent cloud? I am attempting to create topics with a provider that looks like this:
provider = kafka.Provider(
    "kafka-provider",
    # NB: str.lstrip() strips a character set, not a literal prefix; split on
    # the scheme separator instead so the hostname is never mangled.
    bootstrap_servers=confluent_environment.apply(
        lambda e: [e.bootstrap_servers.split("://", 1)[-1]]
    ),
    opts=opts,
    sasl_mechanism="plain",
    sasl_username=confluent_environment.apply(
        lambda e: e.environment_credentials.api_key
    ),
    sasl_password=confluent_environment.apply(
        lambda e: e.environment_credentials.api_secret
    ),
    tls_enabled=True,
    timeout=120,
)
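(As a side note for anyone copying the snippet above: `str.lstrip` strips a leading run of *characters from a set*, not a literal prefix, so `lstrip("SASL_SSL://")` can silently eat the start of the hostname. A minimal sketch of the pitfall and a safer alternative, using a made-up endpoint:)

```python
# str.lstrip("SASL_SSL://") removes any leading run of the CHARACTERS
# {S, A, L, _, :, /}, not the literal prefix "SASL_SSL://".
endpoint = "SASL_SSL://SLA.example.com:9092"  # hypothetical endpoint

# The leading "SLA" of the hostname is eaten along with the scheme:
mangled = endpoint.lstrip("SASL_SSL://")
assert mangled == ".example.com:9092"

# Splitting on the scheme separator (or str.removeprefix on Python 3.9+)
# removes exactly the scheme and nothing else:
host = endpoint.split("://", 1)[-1]
assert host == "SLA.example.com:9092"
assert endpoint.removeprefix("SASL_SSL://") == "SLA.example.com:9092"
```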
and getting this error:
kafka:index:Topic (logs-topic):
    error: 1 error occurred:
        * kafka server: The client is not authorized to access this topic.
I have given the service account attached to my API key the CloudClusterAdmin entitlement and have confirmed with the Confluent CLI that the API key can create a topic in the cluster... Any ideas what might be going wrong here?
b
Hi @rough-hydrogen-27449 - how are you using that provider that you instantiated?
r
I'm passing it in via the opts parameter in the Topic constructor, like this
kafka.Topic(
    "logs-topic",
    opts=pulumi.ResourceOptions(provider=provider),
    name="logs",
    replication_factor=1,
    partitions=1,
)
Is it possible that the underlying provider code is unable to use an API key/secret combo? IIRC the v1 ccloud CLI required a meatspace username (e.g. email) and password. It didn't seem to work with service account credentials.
b
if you run pulumi up with debug logging enabled, i.e. TF_LOG=1 pulumi up --v=9 --logtostderr, are you able to see the correct config for the provider?
r
yes, and I see
I1216 16:59:49.658760   14465 eventsink.go:59] resource registration successful: ty=pulumi:providers:kafka, urn=urn:pulumi:jgrillo-sandbox::grapl::pulumi:providers:kafka::kafka-provider
and I saw this:
I1216 16:59:51.467635   14465 eventsink.go:59] lazy client init %!!(MISSING)s(<nil>); config, {0xc00012d8d8 120   ***** ***** true true <API_KEY> ***** plain}
I1216 16:59:51.468102   14465 eventsink.go:62] eventSink::Debug(<{%reset%}>lazy client init %!s(<nil>); config, {0xc00012d8d8 120   ***** ***** true true <API_KEY> ***** plain}<{%reset%}>)
I1216 16:59:51.471148   14465 eventsink.go:59] Timeout is 2m0s 
I1216 16:59:51.471599   14465 eventsink.go:62] eventSink::Debug(<{%reset%}>Timeout is 2m0s <{%reset%}>)
I1216 16:59:51.477125   14465 provider_plugin.go:1531] provider received rpc error `Unknown`: `1 error occurred:
        * kafka server: The client is not authorized to access this topic.

`
I1216 16:59:51.477394   14465 provider_plugin.go:1534] rpc error kind `Unknown` may not be recoverable
I1216 16:59:51.477632   14465 provider_plugin.go:760] Provider[kafka, 0xc0010db1a0].Create(urn:pulumi:jgrillo-sandbox::grapl::kafka:index/topic:Topic::logs-topic) failed: 1 error occurred:
kafka server: The client is not authorized to access this topic.
in these logs I replaced my API key with <API_KEY> out of an abundance of caution
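(For anyone who finds this thread later: in Confluent Cloud, the SASL principal behind a Kafka API key still needs topic-level permissions, so a "not authorized to access this topic" error often means the service account is missing an ACL or role binding on the topic. Purely as a hedged sketch, not a confirmed fix: granting one via the same pulumi-kafka provider might look roughly like the following. The `sa-XXXXX` principal id and resource names are placeholders, and the exact argument names should be checked against the pulumi-kafka Acl docs.)

```python
import pulumi
import pulumi_kafka as kafka

# Hypothetical sketch: grant the service account WRITE on the "logs" topic.
# The principal format ("User:sa-XXXXX") is an assumption; verify it against
# your cluster before relying on this.
logs_acl = kafka.Acl(
    "logs-topic-acl",
    resource_name="logs",
    resource_type="Topic",
    acl_principal="User:sa-XXXXX",
    acl_host="*",
    acl_operation="Write",
    acl_permission_type="Allow",
    opts=pulumi.ResourceOptions(provider=provider),  # the provider from above
)
```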