ExpressJS caching middleware #151

Open

TapaiBalazs opened this issue Jan 3, 2023 · 3 comments

@TapaiBalazs
Contributor

TapaiBalazs commented Jan 3, 2023

What do you want to build?

A set of Express middlewares that can be set up with Redis (or any other caching solution) and handle API endpoint caching simply by dropping the middlewares into the route handlers.

Where did the idea come from?

While working with ExpressJS APIs, @BigAB and I had an idea: what would a set of caching middlewares for Express look like? Could it be done?

What if one middleware could check the cache based on the request and, on a hit, return the cached result, skipping the rest of the middlewares/route handlers? And conversely, another middleware could look at the response after the route handlers have run and cache the result.

Would it be a good idea? With this approach the individual controllers would not need to know anything about the cache (or maybe they would still need to know which bits to invalidate on update, if we can't generalize that behaviour).

How long will it take to build?

First we would need a POC, maybe with in-memory storage, just to see if it can be done.
We are not entirely sure whether res.send() stops the following middlewares/handlers from executing; that needs to be tested.
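
For reference, a minimal sketch of what the read-side middleware could look like, assuming a hypothetical connector with an async get(key) method (none of these names are settled). In this sketch the chain stops on a hit because next() is never called, rather than because of res.send() itself:

// Sketch only: `connector` is a hypothetical cache connector with async get/set.
function cacheCheckMiddleware(connector, options = {}) {
  return async function (req, res, next) {
    try {
      const key = `${req.method}:${req.originalUrl}`;
      const cached = await connector.get(key);
      if (cached != null) {
        // Cache hit: respond immediately and do NOT call next(),
        // so the remaining middlewares/handlers never run.
        res.set('X-Cache', 'HIT');
        return res.send(cached);
      }
      // Cache miss: remember the key so the write side can store the response later.
      res.locals.cacheKey = key;
      res.locals.cacheTtl = options.ttl;
      next();
    } catch (err) {
      next(err);
    }
  };
}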

After the POC (and if it is successful), there are some tasks that need to be done:

  • Create an express API example in the repo that uses the middlewares
  • Design a cache-connector interface, so people can implement their own connectors with cache mechanisms other than Redis (a rough connector sketch follows this list)
  • Create our own redis connector
  • Document everything properly (tsdoc, readme, etc)
  • Proper testing should be done
  • Package release
  • Additional features (force re-cache, skip-cache, etc)
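
As a rough illustration of the connector interface item above, the contract could be as small as get/set/del (the method names here are placeholders, not a settled API), with an in-memory implementation doubling as the POC storage:

// Sketch of the connector contract; any backing store (Redis, Memcached, a Map)
// just needs to provide get/set/del with the same semantics.
class InMemoryConnector {
  constructor() {
    this.store = new Map();
  }

  // Resolve the cached value for `key`, or null when missing/expired.
  async get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (entry.expiresAt && entry.expiresAt < Date.now()) {
      this.store.delete(key);
      return null;
    }
    return entry.value;
  }

  // Store `value` under `key`, optionally expiring after `ttl` seconds.
  async set(key, value, ttl) {
    const expiresAt = ttl ? Date.now() + ttl * 1000 : undefined;
    this.store.set(key, { value, expiresAt });
  }

  // Remove a key, e.g. for cache invalidation on updates.
  async del(key) {
    this.store.delete(key);
  }
}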

I'd say that, depending on the POC results, the first version of the package could be done rather quickly, and then we could add more features and deeper configuration options to the lib.

I'm not good with estimations, but it seems reasonable that a first version supporting only TTL-based caching could be done in 25-30 hours (including proper planning and architecture discussions).

Additional note:

If this package works out, we might also be able to extend it with another package that provides the same functionality for NestJS, maybe with decorators or something similar.

@dustinsgoodman
Contributor

How is this different from https://www.npmjs.com/package/express-cache-middleware?

@TapaiBalazs
Contributor Author

@dustinsgoodman we weren't aware that such a package existed.

That said, the above package caches every route according to their documentation:

Then, attach it to your Express app instance or Router. Any routes attached to the app or Router after this will be intercepted and cached.


What we want is the ability to configure caching on a per-route basis:

const connector = new RedisConnector({ /* redis client config */ }); // the connector needs to implement an interface
const [cacheMiddleware, saveToCache] = createMiddlewares(connector, { ttl: 500 }); // creates the middleware functions from the connector and some additional config

app.get('/technology', cacheMiddleware(), handler);

app.get('/technology/:id', cacheMiddleware({ ttl: 300 }), handler); // configuration overrides can be added

app.post('/technology', handler); // no caching here

// saveToCache should save the responses of the routes above without any extra
// logic in the handlers themselves. This is the part we currently don't know
// how to do (or whether it can be done); we want to determine that with the POC.
app.use(saveToCache);

The POC would only be for determining whether we can access the response body somehow after the request handler has run. Based on the package you linked, it appears to be possible, but how to do it still needs to be investigated.
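
One candidate mechanism, sketched here with placeholder names: have the per-route middleware wrap res.send before calling next(), so the body can be captured at the moment the handler responds. (A trailing app.use(saveToCache) would probably only run if the handler also called next() after sending, which is why the wrapping likely has to happen before the handler.)

// Sketch only: patch res.send so the body can be captured when the handler responds,
// without the handler knowing anything about the cache.
function saveToCacheMiddleware(connector, ttl) {
  return function (req, res, next) {
    const originalSend = res.send.bind(res);
    res.send = function (body) {
      // Example policy: only cache successful responses.
      if (res.statusCode >= 200 && res.statusCode < 300) {
        const key = `${req.method}:${req.originalUrl}`;
        // Fire-and-forget: a failed cache write should not break the response.
        Promise.resolve(connector.set(key, body, ttl)).catch(() => {});
      }
      return originalSend(body);
    };
    next();
  };
}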

I think determining the above should only take a couple of hours. If we figure out how to do it, the larger part of the package is still left to implement (a rough sketch of the per-route options for a few of these items follows the list):

  • Redis connector
  • Maybe an in-memory connector
  • TTL support
  • force re-cache support
  • skipCacheWhen support
  • cache invalidation support
  • support for only updating cache on certain status codes (like cache only if 2xx status)
  • proper testing
  • documentation
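
Purely for illustration, the per-route options for a few of these features might look something like this (every option name below is a placeholder, not a settled API):

// Hypothetical option names, shown only to illustrate the kind of per-route
// configuration the features above would need.
app.get(
  '/technology',
  cacheMiddleware({
    ttl: 300,                                        // TTL support
    forceRecache: (req) => 'refresh' in req.query,   // force re-cache support
    skipCacheWhen: (req) => req.headers['cache-control'] === 'no-cache',
    cacheableStatusCodes: [200, 203],                // only cache certain status codes
  }),
  handler
);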

@dustinsgoodman
Contributor

I think this would still be theoretically possible using that package, since you could attach the cache middleware to just a subset of routes via Express's Router.use method. For example, I think you could import a set of routes:

import { Router } from "express";
import routesToCache from "./some-routes";
import routesToNotCache from "./other-routes";

// cacheMiddleware would be an express-cache-middleware instance created elsewhere
cacheMiddleware.attach(routesToCache);

const routes = Router();
routes.use("/cached-routes", routesToCache);
routes.use("/not-cached-routes", routesToNotCache);

I'd need to test that other library in full to verify, but I think that's the idea, so a lot of the logic may already be in place. I think you're looking to add functionality on top of that implementation, though.


Regarding your proposal though, it sounds like you have a timeboxed SPIKE to verify whether this is even possible before we proceed with any work here. I have some additional questions, but I'm going to move them to Slack.
