Limitly
Getting Started

Quick Start

Get your first rate limiter up and running in just a few lines of code. Examples for Next.js, Express, and more.

Basic Example

Recommended: Use your own Redis

import { createClient } from 'limitly-sdk';

// Recommended for production: Use your own Redis
const client = createClient({
  redisUrl: process.env.REDIS_URL || 'redis://localhost:6379',
  serviceId: 'my-app'
});

const result = await client.checkRateLimit('user-123');

if (result.allowed) {
  console.log(`Allowed. Remaining: ${result.remaining}`);
} else {
  console.log('Rate limit exceeded!');
}

Without Redis URL (development/testing):

// ⚠️ Shares hosted Redis - may collide with other users
const client = createClient({ serviceId: 'my-app' });
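
In both cases, the examples in this guide only read two fields from the value returned by checkRateLimit. As a rough sketch of that shape (the actual SDK response may include additional fields not shown here):

// Minimal result shape as used in these examples; the real SDK response
// may carry more information.
interface RateLimitResult {
  allowed: boolean;   // true if this request is within the limit
  remaining: number;  // requests left before the limit is exceeded
}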

Next.js

App Router

// app/api/route.ts
import { createClient } from 'limitly-sdk';
import { NextResponse } from 'next/server';

// Recommended: Use your own Redis
const client = createClient({
  redisUrl: process.env.REDIS_URL,
  serviceId: 'nextjs-api'
});

export async function GET(request: Request) {
  const userId = request.headers.get('x-user-id') || 'anonymous';
  const result = await client.checkRateLimit(userId);
  
  if (!result.allowed) {
    return NextResponse.json({ error: 'Rate limit exceeded' }, { status: 429 });
  }
  
  return NextResponse.json({ success: true });
}
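
If you also want to expose the remaining quota to clients, one option is to replace the final return above with a response that sets a header. This is a sketch, not something the SDK does for you; X-RateLimit-Remaining is just a common header convention.

// Optional: surface the remaining quota on successful responses.
return NextResponse.json(
  { success: true },
  { headers: { 'X-RateLimit-Remaining': String(result.remaining) } }
);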

Pages Router

// pages/api/route.ts
import { createClient } from 'limitly-sdk';
import type { NextApiRequest, NextApiResponse } from 'next';

const client = createClient({ serviceId: 'nextjs-api' });

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const userId = (req.headers['x-user-id'] as string) || 'anonymous';
  const result = await client.checkRateLimit(userId);
  
  if (!result.allowed) {
    return res.status(429).json({ error: 'Rate limit exceeded' });
  }
  
  res.status(200).json({ success: true });
}
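
To avoid repeating the check in every Pages Router handler, you could wrap handlers in a small helper. A minimal sketch, assuming the same client as above; withRateLimit is a hypothetical helper, not part of the SDK.

import type { NextApiHandler } from 'next';

// Hypothetical helper: wraps any Pages Router handler with the rate-limit check.
export function withRateLimit(handler: NextApiHandler): NextApiHandler {
  return async (req, res) => {
    const userId = (req.headers['x-user-id'] as string) || 'anonymous';
    const result = await client.checkRateLimit(userId);
    if (!result.allowed) {
      return res.status(429).json({ error: 'Rate limit exceeded' });
    }
    return handler(req, res);
  };
}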

Express.js

// middleware/rate-limit.ts
import { createClient } from 'limitly-sdk';
const client = createClient({ serviceId: 'express-api' });

export async function rateLimitMiddleware(req, res, next) {
  const identifier = req.user?.id || req.ip || 'anonymous';
  const result = await client.checkRateLimit(identifier);
  
  if (!result.allowed) {
    return res.status(429).json({ error: 'Rate limit exceeded' });
  }
  next();
}

// app.ts
import { rateLimitMiddleware } from './middleware/rate-limit';

app.use('/api', rateLimitMiddleware);

Other Frameworks

Fastify

const client = createClient({ serviceId: 'fastify-api' });

fastify.addHook('onRequest', async (request, reply) => {
  const identifier = request.user?.id || request.ip || 'anonymous';
  const result = await client.checkRateLimit(identifier);
  
  if (!result.allowed) {
    return reply.code(429).send({ error: 'Rate limit exceeded' });
  }
});

Hono

const client = createClient({ serviceId: 'hono-api' });

app.use('*', async (c, next) => {
  const identifier = c.req.header('x-user-id') || 'anonymous';
  const result = await client.checkRateLimit(identifier);
  
  if (!result.allowed) {
    return c.json({ error: 'Rate limit exceeded' }, 429);
  }
  await next();
});

Custom Limits

const client = createClient({ serviceId: 'my-api' });

// Per-endpoint limits
const limits = {
  '/api/search': { capacity: 100, refillRate: 10 },
  '/api/upload': { capacity: 10, refillRate: 1 }
};

// `userId` and `endpoint` come from your request context (e.g. req.path)
const result = await client.checkRateLimit({
  identifier: userId,
  ...limits[endpoint]
});

// User tier limits
const tierLimits = {
  free: { capacity: 100, refillRate: 10 },
  premium: { capacity: 1000, refillRate: 100 }
};

await client.checkRateLimit({ identifier: user.id, ...tierLimits[user.plan] });
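
The same per-call options can be spread into a middleware check. A sketch for Express, assuming req.user (with id and plan) is populated by your auth layer rather than by the SDK:

// Sketch: tier-aware Express middleware built on the tierLimits table above.
export async function tieredRateLimit(req, res, next) {
  const plan = req.user?.plan === 'premium' ? 'premium' : 'free';
  const result = await client.checkRateLimit({
    identifier: req.user?.id || req.ip || 'anonymous',
    ...tierLimits[plan]
  });
  
  if (!result.allowed) {
    return res.status(429).json({ error: 'Rate limit exceeded' });
  }
  next();
}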

For production deployments, always use your own Redis URL for full tenant isolation:

// Recommended for production
const client = createClient({
  redisUrl: process.env.REDIS_URL || 'redis://localhost:6379',
  serviceId: 'my-app'
});

// All rate limit data stored in your Redis - no collisions
const result = await client.checkRateLimit('user-123');

Why use your own Redis?

  • Full tenant isolation - No collisions with other Limitly users
  • Data privacy - Your rate limit data stays in your Redis
  • Better performance - Direct Redis connection (no HTTP overhead)
  • Production ready - Recommended for all production deployments

Without redisUrl:

  • ⚠️ Shares hosted Redis with other users
  • ⚠️ Potential collisions if multiple users use the same serviceId
  • ✅ Works out of the box (good for development/testing)
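
If you want a single configuration that uses your own Redis when REDIS_URL is set and falls back to the shared hosted Redis for local development, a minimal sketch:

// One client config for both modes: your Redis when REDIS_URL is set,
// the shared hosted Redis (with the caveats above) when it is not.
const client = createClient({
  serviceId: 'my-app',
  ...(process.env.REDIS_URL ? { redisUrl: process.env.REDIS_URL } : {})
});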