# general
e
I've run through the sample to get an `onPut` action in an S3 bucket to trigger a lambda using this code:
```typescript
import * as cloud from "@pulumi/cloud";

// A storage bucket
const bucket = new cloud.Bucket("eshamay-test-bucket");
const bucketName = bucket.bucket.id;

// Trigger a Lambda function when something is added
bucket.onPut("onNewObject", (bucketArgs: any) => {
    console.log(`*** New Item in Bucket`);
    console.log(bucketArgs);
});
```
However, I plan to trigger the lambda from multiple buckets, and will create/update some DynamoDB items to index the contents of the various bucket objects across multiple regions. Currently, the lambda above outputs something like this (taken directly from the CloudWatch logs):
```
2018-11-29T12:52:08.159-08:00[onNewObject] { key: 'some/foo/path/myfile.ts',  size: 384,  eventTime: '2018-11-29T20:52:07.166Z' }
```
However, my indexing will need to add a couple of extra attributes to track the region, the bucket name, and perhaps some other info about the `put`, such as the principal ID, IP address, etc. I know that S3 events emit something like this: https://docs.aws.amazon.com/lambda/latest/dg/eventsources.html#eventsources-s3-put

Is there a way to get access to that original event object? Also, is there a way to create a single lambda to do the DynamoDB update, and then create several buckets across regions that trigger that single lambda `onPut`?
l
Hi @early-musician-41645
Instead of calling `bucket.onPut`, call `bucket.bucket.onXXX`, where `onXXX` is one of `onObjectCreated`, `onObjectRemoved`, or `onEvent`. These will give you the real S3 events with all of their details.
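For example, a minimal sketch of subscribing on the underlying `aws.s3.Bucket` (the record fields follow the standard S3 notification payload linked above; treat the exact accesses as illustrative):
```typescript
import * as cloud from "@pulumi/cloud";

const bucket = new cloud.Bucket("eshamay-test-bucket");

// Subscribe on the underlying aws.s3.Bucket to get the raw S3 event.
bucket.bucket.onObjectCreated("onNewObjectRaw", async (event: any) => {
    // The raw notification carries a Records array; each record includes
    // the region, bucket name, object key, and the principal of the PUT.
    for (const record of event.Records || []) {
        console.log(record.awsRegion, record.s3.bucket.name, record.s3.object.key);
        console.log(record.userIdentity && record.userIdentity.principalId);
    }
});
```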
e
Great! Now the follow-up: what would it look like to trigger a lambda with each `onObjectCreated`? I'm assuming the code would be something like
```typescript
// Create a lambda

// Create the bucket

// Set up the bucket onObjectCreated
    // Trigger the lambda somehow from in here using the... lambda.arn?
```
l
is it an existing lambda you have?
e
In this case, no, it will be created as part of the same project. The S3 buckets are pre-existing, though, which is a separate issue
l
Oh, if it's created in the same project, it's trivial: just pass the one you created in as the `handler` arg.
```typescript
onObjectCreated(
    name: string, handler: BucketEventHandler,
```
A `handler` is either a JS function that we'll do magic Pulumi serialization on, or just an `aws.lambda.Function` instance. Since you already have the latter, you can just pass it in.
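Something like this, for instance (a sketch; the names are placeholders, and it assumes `aws.lambda.CallbackFunction` is available in your version of `@pulumi/aws`):
```typescript
import * as aws from "@pulumi/aws";

// The lambda that will do the DynamoDB indexing. A CallbackFunction is
// an aws.lambda.Function built from a serialized JS callback.
const indexer = new aws.lambda.CallbackFunction("indexer", {
    callback: async (event: any) => {
        console.log(JSON.stringify(event));
    },
});

const bucket = new aws.s3.Bucket("eshamay-test-bucket");

// Pass the existing Function instance directly as the handler.
bucket.onObjectCreated("indexOnCreate", indexer);
```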
e
👍 Can you confirm: will the `event` pass through as the arg to the `handler`? Or, what's the signature for the handler? I'm assuming it takes a single arg.
l
In this case, we aren't really doing anything in Pulumi. We just make a `BucketNotification` (an AWS resource) that links the bucket id and the lambda id together, with a filter saying "for object-created events". So you should get whatever Amazon normally passes along in those circumstances.
My expectation would be that it takes 3 args:
First, the `event` arg. This most naturally corresponds to what you think of as the S3 data about the bucket change.
Second, the `context` arg. This gives you information about the invoked lambda.
Third, the `callback` arg, which is how you signal to AWS that you're done (for asynchronous lambdas that aren't using promises).
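That's the standard Node.js lambda handler shape, which would look roughly like this (the field accesses are illustrative):
```typescript
// Standard Node.js lambda handler: (event, context, callback).
function handler(event: any, context: any, callback: (err?: any, result?: any) => void) {
    // For S3 notifications, the event carries a Records array.
    for (const record of event.Records || []) {
        console.log(record.s3.bucket.name, record.s3.object.key);
    }
    // Signal completion back to AWS.
    callback();
}
```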
Does that help? 🙂
e
Okay, one more question - I have pre-created S3 buckets that are managed by a separate team. I'd like to attach this same lambda/handler to each of those buckets. I know the bucket names, regions, arns, etc. How can I load them as objects in Pulumi for use in the same way as if I'd created them?
Yes, that helps!!
l
`aws.s3.Bucket.get(..., arn, ...)`
e
very cool. I'll go try all this out. Thanks for the help!!
l
`.get` is how you say: this already exists, go fetch it and hydrate the info about it on the Pulumi side.
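A minimal sketch of that, reusing the hypothetical `indexer` function from the earlier sketch (note that for `aws.s3.Bucket`, the resource ID you pass is the bucket name):
```typescript
import * as aws from "@pulumi/aws";

// Hydrate a pre-existing bucket; the ID for an S3 bucket is its name.
const existing = aws.s3.Bucket.get("prod-artifacts", "sdp-tsm-prod-eu1-artifacts");

// It can then be used just like a bucket created in this project.
existing.onObjectCreated("indexOnCreate", indexer);
```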
my pleasure!
e
One more thing - will Pulumi create all the needed IAM resources and policies for the bucket events to trigger the lambda if the buckets are pre-created?
l
here's what we do:
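Roughly: a lambda permission that lets S3 invoke the function, plus the `BucketNotification` mentioned earlier. A sketch (this is a reconstruction with illustrative resource names, not the exact resources Pulumi creates):
```typescript
import * as aws from "@pulumi/aws";

// 1. Permission for S3 to invoke the lambda.
const permission = new aws.lambda.Permission("allow-bucket", {
    action: "lambda:InvokeFunction",
    function: indexer,
    principal: "s3.amazonaws.com",
    sourceArn: existing.arn,
});

// 2. The notification linking the bucket to the lambda for created objects.
const notification = new aws.s3.BucketNotification("on-create", {
    bucket: existing.id,
    lambdaFunctions: [{
        lambdaFunctionArn: indexer.arn,
        events: ["s3:ObjectCreated:*"],
    }],
}, { dependsOn: [permission] });
```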
e
Okay, looks straightforward - very much like if I'd created the object instead of importing it via `.get`.
l
If you need more than that permission set, you'd have to do it yourself currently. If there are additional things you want us to do, definitely let us know.
e
thanks again!
l
Definitely!
e
I'm trying out the `.get`, but the buckets I need are in multiple regions. I tried this with no luck:
```typescript
const bucket = aws.s3.Bucket.get('prod-us-west-1', 'sdp-tsm-prod-eu1-artifacts', { region: 'us-west-1' });
```
```
Diagnostics:
  aws:s3:Bucket (prod-us-west-1):
    error: Preview failed: refreshing urn:pulumi:s3-object-indexer-dev::s3-object-indexer::aws:s3/bucket:Bucket::prod-us-west-1: error reading S3 Bucket (sdp-tsm-prod-eu1-artifacts): BucketRegionError: incorrect region, the bucket is not in 'us-west-2' region
        status code: 301, request id: , host id:
```
The provider I set up has this config:
```yaml
config:
  aws:region: us-west-2
```
Not sure how to specify the region otherwise.
l
I don't know enough about this. I don't believe Pulumi would let you use things from different regions, which I believe is a bit of an AWS limitation as well - i.e., I don't think you could set up a bucket to invoke a lambda in a different region.
@incalculable-sundown-82514 @microscopic-florist-22719 is there some work scheduled at some point to make 'regions' not something specified for an entire app, but more like per-instance state? I feel like that was discussed, but I'm not sure if anything ever came of it.
m
Settled this in a different thread
Short answer is that this is possible
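If it's the per-region provider approach, a sketch of what that might look like: pass an explicit `aws.Provider` through the resource options when getting the bucket (note the third argument to `.get` is resource state, not options, which is likely why the `{ region: ... }` above didn't take effect):
```typescript
import * as aws from "@pulumi/aws";

// An explicit provider pinned to the bucket's actual region.
const usWest1 = new aws.Provider("us-west-1", { region: "us-west-1" });

// The provider goes in the fourth (options) argument of .get.
const bucket = aws.s3.Bucket.get(
    "prod-us-west-1",
    "sdp-tsm-prod-eu1-artifacts",
    undefined,
    { provider: usWest1 },
);
```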
l
Interesting. Do you happen to know if lambda eventing works with that? I.e., can I have a bucket in one region trigger a lambda in another?
m
No clue
e
Oh, that's something I hadn't considered - can lambdas get called across regions? I'll do some reading