Creating Your Own Cache Plugin in Fastify Using Redis
Last updated on September 21, 2022
Introduction

APIs that face high I/O can benefit from implementing a cache system in conjunction with Fastify and Redis. A cache is data stored (usually in memory) that can be used to serve responses faster. For example, a user requests a price estimate for a house; the API checks whether that data is in the cache, and if it isn't, it passes the request on to its algorithm and then stores the response in the cache. The next time someone requests the data (usually within some relevant time frame), they get served the data from the cache.

Whether or not you are getting responses from the cache is usually referred to as a HIT or a MISS. You can find these values in the response headers.
  • HIT: Response from the cache.
  • MISS: Response is not from the cache.
Example Cached Response Headers
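For example, a cached JSON response might come back with headers along these lines (illustrative values):

HTTP/1.1 200 OK
content-type: application/json; charset=utf-8
x-cache: HIT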
You can see that there is an x-cache header with the value of HIT, signifying that the response came from the cache.

Cached data is usually set to expire some time after it is stored, depending on the type of data. For example, with the Module API, NFT collection data is set to expire 30 minutes after it is set, while other data, such as new NFT listings, is set to expire 30 seconds after it is set. It mainly depends on how frequently the requested data changes, and on whether having the most up-to-date data is important.

Some APIs also include an option to completely bypass the cache, usually by setting a header such as x-cache-bypass to true and passing along an API key.

Redis

Redis is an open source, in-memory data store used by millions of developers as a database, cache, streaming engine, and message broker. It can be used as a place to store cached responses. Redis is usually run on its own and is connected to from the app that sets or gets the cache. This is useful for large APIs running many containers, as the cache is stored in a place that all of the containers can access, instead of each container having its own independent cache.

Node App

To get started, let's initialize a simple Fastify Node.js app:
npm install fastify
Create a file called server.js and copy and paste this example code for a basic fastify app inside:
// Require the framework and instantiate it
const fastify = require('fastify')({
    logger: true
})

// Declare a route
fastify.get('/', (request, reply) => {
    reply.send({
        hello: 'world'
    })
})

// Run the server!
fastify.listen({ port: 3000 }, (err) => {
    if (err) {
        fastify.log.error(err)
        process.exit(1)
    }
})
Launch the server with:
node server
and you can test it with:
curl http://localhost:3000
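It should reply with a JSON response like:

{"hello":"world"}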
Now that you have your app running, let's edit the default route so that it responds after a 5-second timeout, which we can use to test our cache later:
// Replies after 5 seconds
fastify.get('/', (request, reply) => {
    setTimeout(() => {
        reply.send({ hello: 'world' })
    }, 5000)
})
Now, run the server and test it with:
curl http://localhost:3000/
or visit http://localhost:3000/ in your browser. It should respond with { hello: 'world' } after five seconds.

Setting up Redis

For the next step of adding a cache to your API, you will need access to a Redis instance. You can either set it up locally or use a managed Redis host. Once you have Redis set up, either locally or through a hosting provider, you are ready to go to the next step.

IORedis

Start by installing ioredis, which is a robust Redis client for Node.js:
npm i ioredis
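If you don't have a Redis instance available yet and you have Docker installed, one quick way to run one locally (just one option; a native install or a managed host works equally well) is:

docker run -d --name redis -p 6379:6379 redis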
Next, make a new file called cache.js, and add the following code to it:
const Redis = require('ioredis')
const redis = new Redis({
    port: 6379, // Redis port
    host: "127.0.0.1", // Redis host
    // password: '', // Redis password (optional)
    // family: 4, // 4 (IPv4) or 6 (IPv6)
    // db: 0, // Database index to use
})
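If you want an explicit confirmation that the connection works (beyond the absence of errors), you could temporarily append a quick ping check to cache.js, for example:

// Optional: verify the connection by sending a PING command
redis.ping()
    .then((result) => console.log('Redis replied:', result)) // should log "PONG"
    .catch((err) => console.error('Could not reach Redis:', err))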
Make sure to replace host with your Redis host, and uncomment password, family, and db and replace them with the respective values if necessary. Run node cache.js and make sure no errors are thrown.

Fastify Plugin

To add the cache to our app, we will be using the fastify-plugin library, which is a plugin helper for fastify.
npm i fastify-plugin
We will use fastify-plugin to wrap our cache function and expose it to our app:
const fp = require('fastify-plugin');

function cache(fastify, _options, next) {

    next()
}

module.exports = fp(cache)
Now, inside the cache function, we will add the caching logic by using the onRequest fastify hook:
fastify.addHook('onRequest', (req, reply, done) => {

})
The logic will look something like this:
  • Get the URL path of the request and check whether that key is present as an entry in Redis.
    • If it is, set the x-cache header to HIT and reply with the cached data from Redis.
    • Also, set the content-type when applicable.
  • If it is not, set the x-cache header to MISS and let the request continue to execute.
This code looks like:
fastify.addHook('onRequest', (req, reply, done) => {
    // Uncomment this if you need cache bypassing as an option
    // if (req.headers['x-cache-bypass'] === 'true') {
    //     return done();
    // }

    let key = req.routerPath;
    redis.get(key, (err, val) => {
        if (!err && val) {
            const cachedData = JSON.parse(val);
            reply.header('x-cache', 'HIT')
            reply.type('application/json')
            reply.send(cachedData)
            return;
        }
        // If there is no cached data, then we just proceed with the request
        reply.header('x-cache', 'MISS')
        done()
    })
})
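If you uncomment the bypass check above, a client can skip the cache for a single request by sending that header, for example:

curl -H "x-cache-bypass: true" http://localhost:3000/

As mentioned earlier, a real API would usually also require an API key before honoring this header.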
Now, we also need to add the logic to cache data from responses inside the cache function:
fastify.addHook('onSend', function (req, reply, payload, next) {
    if (reply.getHeader('x-cache') === 'HIT') {
        next();
        return;
    }

    let key = req.routerPath;
    let ttl = 5; // seconds for the cache to last before it expires
    redis.set(key, payload, 'ex', ttl);
    next()
})
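Putting the pieces together, the complete cache.js might look something like this (a sketch that combines the snippets above and assumes the local Redis settings from earlier):

const fp = require('fastify-plugin');
const Redis = require('ioredis')

const redis = new Redis({
    port: 6379, // Redis port
    host: "127.0.0.1", // Redis host
})

function cache(fastify, _options, next) {
    // Serve a cached response when an entry exists for the requested route
    fastify.addHook('onRequest', (req, reply, done) => {
        let key = req.routerPath;
        redis.get(key, (err, val) => {
            if (!err && val) {
                reply.header('x-cache', 'HIT')
                reply.type('application/json')
                reply.send(JSON.parse(val))
                return;
            }
            reply.header('x-cache', 'MISS')
            done()
        })
    })

    // Store outgoing responses so later requests can be served from Redis
    fastify.addHook('onSend', function (req, reply, payload, done) {
        if (reply.getHeader('x-cache') === 'HIT') {
            done();
            return;
        }
        let key = req.routerPath;
        let ttl = 5; // seconds before the cached entry expires
        redis.set(key, payload, 'ex', ttl);
        done()
    })

    next()
}

module.exports = fp(cache)

Note that req.routerPath is used as the cache key, so every request that matches the same route shares a single cached entry.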
Wrapping up

To add the cache plugin to your app, import it in your server.js:
const cache = require('./cache.js')
And then register the plugin:
fastify.register(cache)
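Once the plugin is registered, start the server again with node server. To see whether a given response was a HIT or a MISS, curl's -i flag prints the response headers along with the body:

curl -i http://localhost:3000/

The first request should show x-cache: MISS and take about five seconds; repeat it within the five-second TTL and you should see x-cache: HIT with an immediate response.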
You can also test the cache in the browser by going to http://localhost:3000/! The first time you visit, it should take 5 seconds to respond, and if you refresh again it'll respond instantly!

Full Code on GitHub