Okta/jwt-verifier cache on AWS Lambda

Greetings!

I am using the okta/jwt-verifier NPM package in my back-end app to verify tokens. I’ve got working local code that consumes this package, configures a verifier, and then verifies tokens. Sweet.

Of course, in order to verify a token, the package internally needs to pull auth server keys from Okta, which would be absurdly slow if it had to happen on every request. Okta appears to have anticipated this: the package will cache the keys (presumably someplace in the filesystem), and it does expose some userland params for cache lifetime and max remote requests per minute.
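For context, the caching knobs I mean look roughly like this (the `issuer`/`clientId` values below are placeholders, not my real config):

```javascript
const OktaJwtVerifier = require('@okta/jwt-verifier');

const oktaJwtVerifier = new OktaJwtVerifier({
  issuer: 'https://my-org.okta.com/oauth2/default', // placeholder
  clientId: '{clientId}',                           // placeholder
  cacheMaxAge: 60 * 60 * 1000,   // how long fetched keys are cached (ms)
  jwksRequestsPerMinute: 10,     // throttle on remote key fetches
});

// per request:
// const jwt = await oktaJwtVerifier.verifyAccessToken(token, 'api://default');
```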

However, my back-end runs on AWS Lambda, where there is no default persistent storage (such as a filesystem) into which it can cache. The package doesn’t seem to have any options for specifying the type or location of its persistent cache storage.

Does this mean I have to abandon the package and replace it with my own implementation of token verification, which would have to include:

  • key retrieval and persistence (easy enough, really)
  • a reimplementation of all the verification functionality of the package (ugh…)?

Any thoughts/ideas welcome.

Did you ever resolve this? I have the same implementation and was wondering about the caching

@alex.lewis Sorry for the delay in responding. Don’t really get into the forums much.

Ultimately I had to persist keys in my db and use them to validate requests from clients. That is: I maintain a collection of current keys from the auth server in my db. When my back-end receives a request from the client with a signed JWT, it extracts the key ID (`kid`) from the token header, matches it against one of my stored keys, and then attempts to validate the signature against that key. There is even a mechanism to fetch fresh keys from the auth server when the old key expires.

It was a little bit of a pain, but it was straightforward and seems to be working well. :crossed_fingers: