Chapter 74: Node.js MongoDB Limit
limit() in MongoDB queries with Node.js + Mongoose (2025–2026 style).
We will go through this slowly and carefully, as if I’m sitting next to you right now — typing code, running it, looking at results in the terminal and in MongoDB Compass, explaining every decision, every trap, and every production nuance.
1. Mental model — What does .limit() actually do?
```ts
await Task.find().limit(10)
```
→ Returns at most 10 documents that match the query.

```ts
await Task.find().skip(20).limit(10)
```
→ Skips the first 20 matching documents, then returns the next 10 (documents 21–30).
Together, .skip() + .limit() = pagination.
But:
- .limit() alone is very useful even without skip
- .skip() becomes very slow when the number is large (tens of thousands+)
- Most real applications use cursor-based pagination instead of skip for large datasets
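To make the cursor-based alternative concrete, here is a minimal sketch. The helper name `buildCursorFilter` and the ascending-`_id` ordering are assumptions for illustration — adapt the sort field to your schema:

```typescript
// Sketch of cursor-based ("keyset") pagination: instead of skipping N
// documents, ask for everything after the last _id the client has seen.
type CursorFilter = Record<string, unknown>

export function buildCursorFilter(lastId?: string): CursorFilter {
  // First page: no filter. Later pages: everything after the cursor.
  return lastId ? { _id: { $gt: lastId } } : {}
}

// Hypothetical usage with Mongoose — cost stays proportional to the page
// size no matter how deep into the collection the client is:
//
// const page = await Task.find(buildCursorFilter(lastId))
//   .sort({ _id: 1 })
//   .limit(pageSize)
// const nextCursor = page.at(-1)?._id   // return this to the client
```

Because `_id` is always indexed, the `$gt` filter is an index seek, which is why this stays fast where `.skip(50000)` does not.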
2. Why we care so much about LIMIT
Real-world problems without proper limit:
- API returns 10,000+ documents → frontend hangs, network timeout, memory crash
- Attacker requests ?limit=1000000 → server dies or becomes very slow
- No pagination metadata → frontend doesn’t know how many pages exist
Good LIMIT usage solves:
- Performance
- Security (rate limiting + max limit)
- Good API UX (total count, hasNext, hasPrev)
3. Project setup (realistic & modern)
```bash
mkdir mongodb-limit-demo
cd mongodb-limit-demo
npm init -y
npm pkg set type=module
npm install express dotenv mongoose zod
npm install -D typescript tsx nodemon @types/express @types/node
```
tsconfig.json
```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true,
    "outDir": "./dist",
    "rootDir": "./src",
    "sourceMap": true
  },
  "include": ["src/**/*"]
}
```
package.json scripts
```json
"scripts": {
  "dev": "tsx watch src/index.ts",
  "start": "node dist/index.js"
}
```
.env.example
```
PORT=5000
MONGODB_URI=mongodb://localhost:27017/limit_demo
```
4. MongoDB connection
src/config/mongodb.ts
```ts
import mongoose from 'mongoose'
import { env } from './env.js' // named export — see src/config/env.ts below

export async function connectDB() {
  try {
    await mongoose.connect(env.MONGODB_URI, {
      maxPoolSize: 10,
      minPoolSize: 2,
      serverSelectionTimeoutMS: 5000,
      socketTimeoutMS: 45000,
      family: 4
    })
    console.log('MongoDB connected →', mongoose.connection.db?.databaseName)
  } catch (err) {
    console.error('MongoDB connection failed:', err)
    process.exit(1)
  }
}

// Call connectDB() from your entry point (or the seed script) — not at
// import time, or every file that imports this module triggers a connect.
```
src/config/env.ts
```ts
import 'dotenv/config'
import { z } from 'zod'

export const env = z.object({
  PORT: z.coerce.number().default(5000),
  MONGODB_URI: z.string().url().startsWith('mongodb')
}).parse(process.env)
```
5. Realistic model with many documents
src/models/task.model.ts
```ts
import mongoose from 'mongoose'

const taskSchema = new mongoose.Schema({
  title: String,
  priority: { type: String, enum: ['low', 'medium', 'high'], default: 'medium' },
  completed: Boolean,
  dueDate: Date,
  createdAt: { type: Date, default: Date.now },
  points: Number
})

taskSchema.index({ createdAt: -1 }) // for sorting by date
taskSchema.index({ priority: 1 })   // for priority filtering

export const Task = mongoose.model('Task', taskSchema)
```
src/seed-many.ts — create 100 tasks
```ts
import { connectDB } from './config/mongodb.js'
import { Task } from './models/task.model.js'

async function seed() {
  await connectDB()
  await Task.deleteMany({})

  const priorities = ['low', 'medium', 'high']
  const tasks = []

  for (let i = 1; i <= 100; i++) {
    tasks.push({
      title: `Task ${i}`,
      priority: priorities[i % 3],
      completed: i % 4 === 0,
      dueDate: i % 5 === 0 ? new Date(Date.now() + i * 86400000) : null,
      points: Math.floor(Math.random() * 100)
    })
  }

  await Task.insertMany(tasks)
  console.log('Inserted 100 tasks')
  process.exit(0)
}

seed().catch(console.error)
```
Run once:
```bash
npx tsx src/seed-many.ts
```
Now we have 100 tasks — perfect for testing pagination.
6. Basic LIMIT usage
src/controllers/task.controller.ts
```ts
import { Request, Response } from 'express'
import { Task } from '../models/task.model.js'

export const getFirst10Tasks = async (_req: Request, res: Response) => {
  try {
    // Returns only the first 10 documents (in natural order)
    const tasks = await Task.find().limit(10)

    res.json({ success: true, count: tasks.length, data: tasks })
  } catch (err) {
    res.status(500).json({ success: false, message: 'Failed to fetch tasks' })
  }
}
```
Note: Without .sort(), order is not guaranteed — usually insertion order, but you should never rely on it.
7. LIMIT + sort (most common real usage)
```ts
export const getRecentTasks = async (_req: Request, res: Response) => {
  try {
    const limit = 15

    const tasks = await Task.find()
      .sort({ createdAt: -1 }) // newest first
      .limit(limit)

    res.json({ success: true, returned: tasks.length, data: tasks })
  } catch (err) {
    res.status(500).json({ success: false, message: 'Failed to fetch recent tasks' })
  }
}
```
8. Full pagination — page + limit (classic style)
```ts
export const getPaginatedTasks = async (req: Request, res: Response) => {
  try {
    const page = Math.max(1, Number(req.query.page) || 1)
    const limit = Math.min(100, Math.max(5, Number(req.query.limit) || 10))
    const skip = (page - 1) * limit

    const tasks = await Task.find()
      .sort({ createdAt: -1 })
      .skip(skip)
      .limit(limit)

    const total = await Task.countDocuments()

    res.json({
      success: true,
      data: tasks,
      pagination: {
        page,
        limit,
        total,
        totalPages: Math.ceil(total / limit),
        hasNext: page * limit < total,
        hasPrev: page > 1
      }
    })
  } catch (err) {
    res.status(500).json({ success: false, message: 'Failed to fetch paginated tasks' })
  }
}
```
Test URLs
- GET /api/tasks/paginated?page=1&limit=10
- GET /api/tasks/paginated?page=3&limit=20
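The pagination arithmetic in the controller above is easy to get off by one, so it can help to isolate it. Here is a sketch of a pure helper — `buildPaginationMeta` is a hypothetical name, not part of the controller above — that computes the same metadata and can be unit-tested without a database:

```typescript
// Pure pagination math, mirroring the controller's pagination object.
export interface PaginationMeta {
  page: number
  limit: number
  total: number
  totalPages: number
  hasNext: boolean
  hasPrev: boolean
}

export function buildPaginationMeta(total: number, page: number, limit: number): PaginationMeta {
  return {
    page,
    limit,
    total,
    totalPages: Math.ceil(total / limit),   // last partial page still counts
    hasNext: page * limit < total,          // documents remain beyond this page
    hasPrev: page > 1
  }
}

// With our 100 seeded tasks:
// buildPaginationMeta(100, 3, 10) → { page: 3, limit: 10, total: 100,
//                                     totalPages: 10, hasNext: true, hasPrev: true }
```

The controller can then call this helper after `countDocuments()` instead of inlining the math.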
9. Dynamic LIMIT (user-controlled — safe way)
src/controllers/task.controller.ts
```ts
export const getLimitedTasks = async (req: Request, res: Response) => {
  try {
    let limit = Number(req.query.limit) || 10

    // Safety: enforce reasonable bounds
    limit = Math.max(1, Math.min(100, limit))

    const tasks = await Task.find()
      .sort({ createdAt: -1 })
      .limit(limit)

    res.json({
      success: true,
      returned: tasks.length,
      limitUsed: limit,
      data: tasks
    })
  } catch (err) {
    res.status(500).json({ success: false, message: 'Failed to fetch tasks' })
  }
}
```
Security & UX rules
- Never trust req.query.limit directly
- Always clamp it: Math.min(100, Math.max(1, limit))
- 100 is a reasonable max for most APIs (adjust per use-case)
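To see how the clamp behaves on hostile or malformed input, here is a sketch that pulls it into a standalone helper. `clampLimit` and its bounds (default 10, max 100) are assumptions for illustration, not part of the controller above:

```typescript
// Clamp an untrusted ?limit= value into [1, max], falling back to a default.
export function clampLimit(raw: unknown, def = 10, max = 100): number {
  const n = Number(raw)
  // NaN (missing, or non-numeric like "abc") falls back to the default
  const candidate = Number.isFinite(n) ? n : def
  // Truncate fractions, then clamp into bounds
  return Math.max(1, Math.min(max, Math.trunc(candidate)))
}

// clampLimit('1000000') → 100   (attacker-sized limit is capped)
// clampLimit(undefined) → 10    (parameter absent → default)
// clampLimit('abc')     → 10    (NaN → default)
// clampLimit('0')       → 1     (clamped up to the floor)
```

Note one deliberate difference from the `|| 10` one-liner: `?limit=0` becomes `1` here rather than the default, since `0` is a real number the client sent. Either choice is defensible — just pick one and document it.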
Very common beginner mistake
```ts
const limit = req.query.limit   // ← string or undefined, not a number
await Task.find().limit(limit)  // ← type error in TS, wrong behavior in JS
```
Correct — always coerce & validate
```ts
const limit = Math.max(1, Math.min(100, Number(req.query.limit) || 10))
```
10. Summary – MongoDB .limit() best practices in Node.js 2025–2026
| Best Practice | Why it matters | Recommended pattern |
|---|---|---|
| Never trust client limit value | Prevents DoS / memory explosion | `Math.min(100, Math.max(1, Number(req.query.limit) \|\| 10))` |
| Always combine with .sort() | Order is otherwise undefined | `.sort({ createdAt: -1 }).limit(10)` |
| Use .lean() for read-only lists | 2–5× faster – plain objects | `.find().lean().limit(20)` |
| Use cursor-based pagination for large data | .skip() becomes very slow on big offsets | `.find({ _id: { $gt: lastId } }).limit(pageSize)` |
| Use indexes on sorted fields | 10×–1000× faster | `schema.index({ createdAt: -1 })` |
| Return pagination metadata | Great API UX | `{ total, totalPages, hasNext, hasPrev }` |
| Log query time in production | Find slow endpoints | Add timing middleware or use `mongoose.set('debug', true)` temporarily |
Which direction would you like to go much deeper into next?
- Login + JWT authentication with MongoDB
- Full task CRUD (create/read/update/delete + ownership check)
- Advanced pagination (cursor-based, infinite scroll, keyset pagination)
- Text search + relevance sorting
- Aggregation pipeline examples (group, match, unwind, etc.)
- Performance tuning (indexes, .explain(), profiling)
Just tell me what you want to build or understand next — I’ll continue with complete, secure, production-ready code and explanations. 😊
