# general
l
Hey Tirke! I've pushed changes here to show how to do things:
There are two interesting issues that i had to address
First, and probably easiest to understand is: "mutation after capture"
this relates to "slotsTableName" not showing up on "index"
the issue here is due to how pulumi executes your program and how it converts your JS/TS arrow functions into AWS Lambdas.
Specifically, as your program executes, if we see you doing something that makes a Lambda, and you've passed a JS arrow function, then at that point we do all the work to convert the JS function into the Lambda.
so, in context, when executing your program, we started first with 'index.js'
and in index.js you did this:
```ts
const api = new serverless.apigateway.API('api', {
    routes: [
        {method: 'GET', path: '/events', handler: listEvents}
    ]
})
```
so we wanted to make the Lambda right then for that API.
but notice a small problem, in your original code, you did:
```ts
const api = new serverless.apigateway.API('api', {
    routes: [
        {method: 'GET', path: '/events', handler: listEvents}
    ]
})

export const slotsTableName = slotsTable.name
```
so you assigned that export after creating the API
so as we're introspecting listEvents, we see it uses getAllBofEvents
getAllBofEvents then does slotsTableName.get()
however, slotsTableName hasn't even been assigned yet 🙂
so that was easily fixed by just changing the order
we first assign to slotsTableName before trying to create the apigateway API.
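the timing issue can be sketched in plain TypeScript — here `makeLambda` is a hypothetical stand-in for the serialization step, not a real Pulumi API:

```typescript
// Minimal sketch of "mutation after capture": captured values are
// frozen at the moment the resource is created, so assignments that
// happen afterwards are never seen. makeLambda is a hypothetical
// stand-in for Pulumi's closure serialization, not a real API.
function makeLambda(env: { tableName?: string }) {
    const captured = { ...env };        // snapshot taken at creation time
    return () => captured.tableName;
}

// Broken order: the lambda is created before tableName is assigned.
const env: { tableName?: string } = {};
const broken = makeLambda(env);
env.tableName = "slots-abc123";        // mutation after capture: too late

// Fixed order: assign first, then create the lambda.
const fixed = makeLambda({ tableName: "slots-abc123" });
```

calling `broken()` still yields undefined, while `fixed()` sees the name — same reason the export had to move above the API creation.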
Issue #2 is how to properly import in a pulumi program
this is subtle and i'd like to discuss it when you're around
but you can take a look at my commit to see what i changed
there are important patterns to follow with respect to the imports used for code that needs to execute at deployment time, versus how to import things for the code you want running inside an AWS lambda.
so, the docClient part is important.
the intent of your code is: in my AWS lambdas i want to be able to use the aws-sdk
so there are a few important reasons why i've moved the 'imports' into the JS arrow functions (some necessary, some are just about clarity)
ok. so the more "stylistic/clarity" reason to do this is that these lambdas (or at least some of them) represent your AWS lambda entrypoints.
when someone interacts with your cloud (i.e. through your apigateway) these are the entrypoints that will actually be invoked.
so it's nice and clean to use that as the point where you say: let me acquire my imports, and then use them (optionally passing them along to any helpers you need, instead of importing them in each function).
for packages you need only at cloud-runtime (as opposed to deployment time) this is our recommendation.
now, the more "necessary, because it will break otherwise" reason we do this is as follows:
When we convert a TS/JS function into an AWS lambda, we can easily convert the code itself into similar code that will execute on the AWS side. However, what's harder to convert are the values you use.
i.e. when your TS/JS-function uses some shared value that exists outside of it.
let's take a simple example:
```ts
const val = { a: 1, b: 2 };

async function myApiGatewayEntrypoint() {
    // do something with 'val'
}
```
when converting that function over, we see that it uses 'val'. so we need some way for it to have the value of 'val' at cloud runtime.
in a case like this, it's pretty easy for us to handle this. we just serialize out that entire value, and embed it in your AWS lambda.
where it gets complex is when you're capturing something like "DocumentClient"
that's a very complex object, which does all sorts of things (like call native code and whatnot). We cannot actually serialize that object over.
however, if you move your imports inside your callback functions, then they just execute and give you the DocumentClient at cloud-runtime. and there's no value to 'capture' and try to inject into your code.
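a minimal runnable sketch of that pattern — Node's built-in crypto stands in for aws-sdk here; in a real Pulumi program the inner import would be the sdk itself:

```typescript
// Sketch: the dependency is acquired inside the handler, so there is
// no heavyweight object (like DocumentClient) for the serializer to
// capture. Node's built-in "crypto" stands in for aws-sdk here.
async function handler(event: { body: string }): Promise<string> {
    const crypto = await import("crypto");   // resolved at cloud runtime
    return crypto.createHash("sha256").update(event.body).digest("hex");
}
```

the handler's body serializes cleanly because the only captured name is the import specifier, not a live module object.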
a
First, I didn't see that there were messages in this thread, so I've caught up. The first interesting thing is that export order is important. Which is perfectly normal and logical, but my mind was too focused on the fact that export is kind of the opposite of import, and since imports are hoisted I thought exports were hoisted too, which was really stupid. So thanks for that one.
l
nothing here is stupid
this is definitely subtle stuff
🙂
a
Keeping the context of my code you fixed, you moved the imports to the top of the lambda.
```ts
import { getAllBofEvents } from "./dynamo";
import { createResponse } from "./utils";
```
I was quite sure that I did that many times and each time it failed with something like cannot find module ./dynamo
But if I understand correctly this is perfectly fine and it will produce an inlined (serialized) version of getAllBofEvents inside my lambda
?
Or will it bundle dynamo.js in the final package
l
sorry, slack wasn't notifying me
one sec, trying to figure out notification options 🙂
> I was quite sure that I did that many times and each time it failed with something like cannot find module ./dynamo
hrmm... i'd have to see the code at that point.
it really should work.
a
Yes, it's working fine, just saw the actual output after deploying your code
l
so the general rules that you can mostly follow are:
1. if you are using a node package at deployment time, i.e. you literally want to use the package when "pulumi update" runs, then use a top-level 'import'.
2. if you want to reference your own code, use a top-level 'import'.
3. if you want to use a node package at cloud runtime (i.e. it runs inside an AWS lambda), then use an inner 'import'.
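the three rules can be sketched in one file; the modules below are Node builtins standing in for a deployment-time package and a cloud-runtime package respectively:

```typescript
// Rules 1 & 2: deployment-time packages and your own code use a
// top-level import ("path" stands in for such a package here).
import * as path from "path";

// A helper that receives the runtime module as a parameter instead of
// importing it itself.
function describeHost(os: typeof import("os")): string {
    return path.join("/tmp", os.platform());
}

// Rule 3: cloud-runtime packages use an inner import ("os" stands in
// for aws-sdk), acquired in the entrypoint and passed to helpers.
export async function handler(): Promise<string> {
    const os = await import("os");
    return describeHost(os);
}
```

the entrypoint is the only place that pays the import cost; helpers just take the module as an argument.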
> Yes, it's working fine, just saw the actual output after deploying your code
let me rephrase
if you have trouble with a top level import. i.e. some message like
cannot find module ./dynamo
then let me know.
and we can figure out what's up
overall, those rules should get you quite far 🙂
a
Yes I'm quite sure there is nothing blocking me anymore
l
awesome!
a
But I do have concerns about the actual use of await import on a massive package like aws-sdk
l
what could we have done to make this clearer?
note: we want to create some docs on this
a
This is why I was trying to break things a little in the first place
l
but that necessitates people finding those docs.
> But I do have concerns about the actual use of await import on a massive package like aws-sdk
a couple of things there:
first, consider how the AWS lambda is going to work
even if you wrote your own AWS lambda... you'd have to import this sdk
there's no way to avoid this.
a
Dont worry cyrus I know that aws-sdk is bundled in every lambda
l
ok 🙂
a
as a base dependency
l
right
that was my next bit
so the import should ideally actually be not that expensive
also, in terms of style. i imported it in each lambda that needed it.
you don't have to do that
a
I'm just wondering if there is any difference between an await dynamic import and a classic import like I used to do with serverless
l
you can import in your 'root arrow functions'
and then pass it along to your helper functions.
> I'm just wondering if there is any difference between an await dynamic import and a classic import like I used to do with serverless
based on what i know... no, there shouldn't be a difference. but i'm only about 80% confident on that 🙂
if the module has already been loaded, the module loader will give you the same one
so you still only take the perf hit once.
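that caching behavior is easy to check; Node's built-in "path" stands in for aws-sdk in this sketch:

```typescript
// Sketch: the module loader caches modules per specifier, so repeated
// dynamic imports resolve to the same already-loaded module and the
// load cost is paid only once ("path" stands in for aws-sdk).
async function loadTwice(): Promise<boolean> {
    const first = await import("path");
    const second = await import("path");
    return first.join === second.join;   // same binding both times
}
```

so importing inside every handler invocation costs a cache lookup after the first call, not a fresh load.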
a
Ok this is fine for me, I will do a little bit of perf testing once I've finished porting my whole project to pulumi
l
but the hit is deferred if possible.
that would be excellent!
sorry, i hope my above explanations didn't come across as condescending BTW 🙂
sometimes i like to just write out all my thoughts, so i can actually check them myself to make sure what i'm thinking is reasonable
a
No no, don't worry, you don't know what my current understanding of the whole aws/serverless ecosystem is, so this is fine
l
ok great
a
And your explanations are crystal clear so explicit is always great
l
ok thanks 🙂
a
I'm just trying to get to the meat of my problems a bit faster. You asked about documentation, but I have maybe one or two questions left first
l
ok. feel free to ping me. note that there may be whole swaths of topics that i'm useless at 🙂
but i can hopefully find the right person to help out
a
I asked about how Pulumi discovers resources to deploy a bit earlier today. For example, if I'm declaring my dynamo table in dynamo.ts, is there a way to make Pulumi find that I want a table without importing it in index.ts?
l
good question.
so we follow the node/js execution model
effectively we run your app as a real node app
meaning: we'll see any resources that were created as node actually executes your .js code
so if your .js code executes, but doesn't execute the code in dynamo.js, then we'd never know about it
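a toy sketch of that discovery model (the registry and newTable below are hypothetical stand-ins, not the real engine):

```typescript
// Resources are only known if the code that creates them executes.
const registry: string[] = [];
function newTable(name: string): { name: string } {
    registry.push(name);                 // "deployment" records the resource
    return { name };
}

// Stands in for dynamo.ts, which creates the table as a side effect.
function dynamoModule(): void {
    newTable("slots");
}

// An index.ts that never runs dynamo.ts leaves the registry empty; one
// that does (like `import "./dynamo"`) gets the table recorded.
export function run(importDynamo: boolean): string[] {
    registry.length = 0;
    if (importDynamo) dynamoModule();
    return [...registry];
}
```

nothing scans the source tree for declarations; only execution registers resources.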
a
yeah it's only my lambda that references the dynamo.js so this is why
I'm trying to find a logical way to structure a pulumi backend
l
note: i'm curious why importing is undesirable
the general approach we would recommend is:
index.ts is your entrypoint
it can have code in it, or not.
but it imports the things you want, ideally in the order you'd like your resources created
so, generally, in whatever layers make sense for your domain
i.e.
```ts
import "./layer1"
import "./layer2"
```
etc.
(doesn't have to be called those names) just whatever your logical layers are
layer 1 then sets up whatever it needs
a
It's just that I think dynamo.ts is my main file for dynamo interactions. I want to declare my table there because it is a direct concern of my dynamo workflow. I don't want to import it in index.ts, because I would just be importing it to export the name, and I don't need to export the name.
l
layer 2 can setup whatever it needs, calling onto layer1 if it wants.
so, to me, imports in JS serve two purposes
one is to actually get access to the things in that import
and one is to just have the import execute
i don't want to import it in index.ts because I would just import it to export the name and I don't need to export the name.
note: you could still do import "./dynamo", but then not use anything from it
(or does TS elide that...)
a
Yeah so you would import it just so it is executed. I don't really like that but why not.
l
🙂
i get the distaste
🙂
but i do think that's idiomatic
bbiab
a
Working as expected
Really cool, I think that everything is ok on my side.
Going back to documentation. I've discovered pulumi the week it was announced and immediately started promoting it to my team.
Nobody really understood how powerful Pulumi is conceptually, so I had to explain that. Some went and looked at the documentation. They told me that it was not good. I do find the documentation quite good, but it is lacking in some ways, and the really important stuff doesn't really stand out.
There are also some inconsistencies and errors that I will PR later.
And going back to the original problem, yes the whole Pulumi way of doing imports needs to be documented with precision and global guidelines like you gave me.
l
👍
thanks so much
i totally agree
a
Because many customers will certainly hit the problems highlighted here.
l
indeed
note: we're also investigating a model that may lead people down the pit of success more often
a
If I can help you further, feel free to ask. You are like a life saver for me today 😛
l
specifically, that you could do an "import aws-sdk" on the outside
but that would transform to that 'import' happening on the inside
a
l
yes. those are some of the issues to deal with
a
Will read thanks
Ah, one last one for you. The final serialized lambda is exported as __index without the .js extension, and this is messing up the Lambda console's Cloud9 editor.
l
oh really?
a
Do you have a fix for that? Or another way to inspect the generated code?
l
our internet has gotten bad
are these messages getting through?
a
this one yeah :d