
Mastering TypeScript's Reduce Method

You're calculating a cart total from an array of items. You try using forEach but end up declaring variables outside the loop, mutating state, and the code feels clunky. Or maybe you're transforming an array of database records into a lookup map, and you're three nested loops deep wondering if there's a better way.

The reduce method can handle both scenarios cleanly. It processes an array to produce a single value, whether that's a number, string, object, or even another array. But here's the catch: reduce is also one of the trickiest methods to type correctly in TypeScript, and it's easy to write code that's harder to read than a simple loop.

In this guide, we'll cover practical patterns for using reduce in TypeScript, when it's the right tool for the job, and how to avoid the common typing mistakes that trip up even experienced developers.

Adding Up Numbers in an Array

A straightforward use of reduce is to add all numbers in an array:

const numbers: number[] = [1, 2, 3, 4, 5];

const sum: number = numbers.reduce(
  (total: number, current: number) => total + current,
  0
);

console.log(sum); // Output: 15

This simple example shows how reduce works. The callback function takes two parameters: the accumulator (total) and the current value being processed. Each time the function runs, it adds the current number to the total, starting from 0.

For type safety, we explicitly annotate both the accumulator and current value as numbers. This ensures TypeScript can validate our operations and catch errors at compile time.

Transforming an Array of Objects

Reduce can also convert an array of objects into another form:

interface User {
  id: number;
  name: string;
}

const users: User[] = [
  { id: 1, name: 'John' },
  { id: 2, name: 'Jane' },
  { id: 3, name: 'Bob' },
];

const userDictionary: { [key: number]: string } = users.reduce(
  (acc: { [key: number]: string }, user: User) => {
    acc[user.id] = user.name;
    return acc;
  },
  {}
);

console.log(userDictionary); // Output: { '1': 'John', '2': 'Jane', '3': 'Bob' }

In the example above, we convert an array of user objects into a dictionary for quick lookups by ID.

The type annotation { [key: number]: string } uses an index signature to define the shape of the result. You can also write it as Record<number, string>, one of TypeScript's built-in utility types, which expresses the same shape more concisely. This pattern is useful when working with Convex database queries that return multiple records you need to organize by ID.

Whenever you need to index your data differently from how it's stored, this pattern provides an efficient lookup structure.
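For comparison, here is the same lookup written with Record<number, string> as reduce's generic type parameter, a sketch equivalent to the index-signature version above:

```typescript
interface User {
  id: number;
  name: string;
}

const users: User[] = [
  { id: 1, name: 'John' },
  { id: 2, name: 'Jane' },
];

// Record<number, string> as the generic parameter replaces the index signature
const userDictionary = users.reduce<Record<number, string>>((acc, user) => {
  acc[user.id] = user.name;
  return acc;
}, {});

console.log(userDictionary[2]); // Output: Jane
```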

Flattening a Nested Array

Another application of reduce is flattening a nested array into a single-level array:

const nestedArray: number[][] = [[1, 2], [3, 4], [5, 6]];

const flatArray: number[] = nestedArray.reduce(
  (total: number[], current: number[]) => total.concat(current),
  []
);

console.log(flatArray); // Output: [1, 2, 3, 4, 5, 6]

Flattening arrays is another common use case for reduce. While newer JavaScript methods like flat() exist, the reduce approach gives you more control over the flattening process.

This pattern works by starting with an empty array and concatenating each nested array to it. When working with complex data structures in typescript array operations, this technique can be extended to handle arbitrary nesting levels through recursion.
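As a sketch of that recursive extension for number arrays (the Nested type and deepFlatten helper are illustrative names, not standard APIs):

```typescript
// A recursive type for numbers nested to any depth
type Nested = number | Nested[];

function deepFlatten(input: Nested[]): number[] {
  return input.reduce<number[]>((acc, item) => {
    // Recurse into nested arrays, concatenate plain values
    return acc.concat(Array.isArray(item) ? deepFlatten(item) : item);
  }, []);
}

console.log(deepFlatten([1, [2, [3, [4]]], 5])); // Output: [1, 2, 3, 4, 5]
```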

If you're working with Convex and need to flatten query results from related tables, this pattern can help simplify the data structure for frontend consumption.

Starting with an Initial Value

When using reduce, you can start with a specific value for the accumulator. This is helpful when you need a particular starting point.

const numbers: number[] = [1, 2, 3, 4, 5];

const startingValue: number = 10;

const sum: number = numbers.reduce(
  (total: number, current: number) => total + current,
  startingValue
);

console.log(sum); // Output: 25

When the initial value is omitted, reduce uses the first array element as the starting accumulator: TypeScript infers the accumulator type from the element type, and calling reduce on an empty array throws a runtime TypeError. An explicit starting value avoids both problems and makes your code more readable and intentional.
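A minimal sketch of the empty-array pitfall (the error message shown is what V8-based runtimes like Node produce):

```typescript
const empty: number[] = [];

// Without an initial value, reduce on an empty array throws a TypeError
try {
  empty.reduce((total, current) => total + current);
} catch (e) {
  console.log((e as Error).message); // "Reduce of empty array with no initial value"
}

// With an initial value, the result is simply the initial value
const safeSum = empty.reduce((total, current) => total + current, 0);
console.log(safeSum); // Output: 0
```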

For complex use cases with Convex schemas, initializing your reducer with a properly typed object ensures consistent type checking throughout your data transformation pipeline.

Typing Mistakes with Reduce

The reduce method is one of the most common sources of TypeScript type errors. You'll often see messages like "Element implicitly has an 'any' type" or "Expression of type 'string' can't be used to index type '{}'." Here's what's happening and how to fix it.

The Empty Object Problem

The most common mistake is passing an empty object {} as the initial value without specifying its type:

interface Product {
  id: string;
  category: string;
  price: number;
}

const products: Product[] = [
  { id: 'p1', category: 'electronics', price: 100 },
  { id: 'p2', category: 'books', price: 20 },
];

// This will cause a TypeScript error
const byCategory = products.reduce((acc, product) => {
  if (!acc[product.category]) { // Error: Expression of type 'string' can't be used to index type '{}'
    acc[product.category] = [];
  }
  acc[product.category].push(product);
  return acc;
}, {}); // TypeScript infers this as type {}

TypeScript sees the {} and types it as an empty object literal with no properties. When you try to access acc[product.category], TypeScript rightfully complains because the type {} doesn't allow indexing.

The Solution: Explicit Type Parameters

Use the generic type parameter to tell TypeScript what type the accumulator should be:

// Option 1: Use the generic type parameter
const byCategory = products.reduce<Record<string, Product[]>>((acc, product) => {
  if (!acc[product.category]) {
    acc[product.category] = [];
  }
  acc[product.category].push(product);
  return acc;
}, {});

// Option 2: Type the initial value directly
const byCategory = products.reduce((acc, product) => {
  if (!acc[product.category]) {
    acc[product.category] = [];
  }
  acc[product.category].push(product);
  return acc;
}, {} as Record<string, Product[]>);

Both approaches work, but the generic parameter (Option 1) is generally preferred because it's more explicit and less prone to type assertion issues.

When Return Types Don't Match

Another common mistake is when your reducer returns a different type than your initial value suggests:

const numbers = [1, 2, 3, 4, 5];

// This won't work as expected
const result = numbers.reduce((acc, num) => {
  return acc + num.toString(); // Error: Type 'string' is not assignable to type 'number'
}, 0);

The initial value 0 tells TypeScript the accumulator is a number, but we're returning a string. Fix this by explicitly typing the accumulator:

const result = numbers.reduce<string>((acc, num) => {
  return acc + num.toString();
}, ''); // Initial value now matches the return type

When building apps with Convex, these typing issues often surface when transforming query results. Getting the types right from the start saves debugging time later.

Grouping Objects by a Property

Reduce can also group objects by a property. For example, you can group user objects by their roles.

interface User {
  id: number;
  name: string;
  role: string;
}

const users: User[] = [
  { id: 1, name: 'John', role: 'admin' },
  { id: 2, name: 'Jane', role: 'moderator' },
  { id: 3, name: 'Bob', role: 'admin' },
];

const usersByRole: Record<string, User[]> = users.reduce(
  (acc: Record<string, User[]>, user: User) => {
    // Initialize the array if this is the first user with this role
    if (!acc[user.role]) {
      acc[user.role] = [];
    }
    acc[user.role].push(user);
    return acc;
  },
  {}
);

console.log(usersByRole);
/* Output:
{
  admin: [
    { id: 1, name: 'John', role: 'admin' },
    { id: 3, name: 'Bob', role: 'admin' }
  ],
  moderator: [
    { id: 2, name: 'Jane', role: 'moderator' }
  ]
}
*/

When building apps with Convex, this pattern works well with data retrieved from queries before displaying it in your UI. You can combine it with filter operations to further refine your groups.

The combination of reduce with TypeScript's strict typing ensures that your grouped data maintains its structure throughout your application, reducing runtime errors from missing properties or incorrect types.

Combining Objects with Reduce

Object merging is a clean use case for reduce. This example combines multiple partial objects into a complete one using the spread operator, which is more readable than manual property assignment.

// Named UserPartial to avoid shadowing TypeScript's built-in Partial<T> utility type
interface UserPartial {
  name?: string;
  age?: number;
  active?: boolean;
}

const userPartials: UserPartial[] = [
  { name: "Alice" },
  { age: 30 },
  { active: true }
];

const completeUser = userPartials.reduce<UserPartial>(
  (result, partial) => ({ ...result, ...partial }),
  {}
);

console.log(completeUser); // Output: { name: "Alice", age: 30, active: true }

This pattern is useful when you're working with typescript partial types or when building objects from multiple data sources. It's especially handy for merging configuration objects or updating state in frontend applications.

When developing with Convex, you might use this pattern to prepare data before inserting it into your database, combining user input with generated fields or defaults.

Handling Errors in Reduce

When processing data from APIs or user input, you can't always trust the data types. Here's how to handle potential errors gracefully without breaking your entire reduction.

interface ApiProduct {
  id: string;
  price: unknown; // API might return string or number
  quantity: number;
}

const apiResponse: ApiProduct[] = [
  { id: 'p1', price: 10, quantity: 2 },
  { id: 'p2', price: '15', quantity: 1 }, // Oops, price is a string
  { id: 'p3', price: null, quantity: 3 }, // Oops, price is null
];

// Approach 1: Skip invalid items
const validTotal = apiResponse.reduce<number>((total, product) => {
  const price = typeof product.price === 'number' ? product.price : 0;
  return total + (price * product.quantity);
}, 0);

console.log(validTotal); // Output: 20 (only counted the valid product)

// Approach 2: Collect errors while processing
interface ReduceResult {
  total: number;
  errors: string[];
}

const resultWithErrors = apiResponse.reduce<ReduceResult>((acc, product) => {
  if (typeof product.price !== 'number' || product.price <= 0) {
    return {
      ...acc,
      errors: [...acc.errors, `Invalid price for product ${product.id}`]
    };
  }

  return {
    total: acc.total + (product.price * product.quantity),
    errors: acc.errors
  };
}, { total: 0, errors: [] });

console.log(resultWithErrors);
// Output: { total: 20, errors: ['Invalid price for product p2', 'Invalid price for product p3'] }

The second approach is particularly useful because it doesn't throw away data. You can log the errors, display warnings to users, or trigger fallback behavior, all while continuing to process valid items.

Error handling is essential when processing data that might not match your expected types. This example uses the unknown type together with runtime type checks to safely represent uncertain data from external sources.

When working with external data in Convex, validating inputs before processing is crucial. You can use try/catch blocks to gracefully handle these situations, and this approach works well with the data validation patterns in Convex's validation system to ensure type safety from your database to your UI.

When to Use Reduce vs Map vs ForEach

You have several options for processing arrays in TypeScript. So when should you reach for reduce instead of map, forEach, or a plain for loop? Here's a practical breakdown.

Use Map When You're Transforming Every Element

If you're creating a new array with the same length as the original, map is clearer:

// Use map for simple transformations
const prices = [10, 20, 30];
const withTax = prices.map(price => price * 1.1);

// Don't use reduce for this
const withTax = prices.reduce<number[]>((acc, price) => {
  acc.push(price * 1.1);
  return acc;
}, []);

The map version is more readable and communicates intent better. For one-to-one transformations, check out our guide on typescript array map to understand when to use this method. Save reduce for when you're actually reducing the array to something different.

Use Reduce When You're Aggregating or Transforming Structure

Reach for reduce when you're:

  • Calculating a single value (sum, max, min)
  • Converting array structure (array to object, grouping)
  • Combining multiple operations that would otherwise require chaining

interface OrderItem {
  productId: string;
  quantity: number;
  price: number;
}

const items: OrderItem[] = [
  { productId: 'a1', quantity: 2, price: 10 },
  { productId: 'b2', quantity: 1, price: 25 },
  { productId: 'a1', quantity: 3, price: 10 },
];

// Good use of reduce: aggregate quantities by product
const itemsByProduct = items.reduce<Record<string, number>>((acc, item) => {
  acc[item.productId] = (acc[item.productId] || 0) + item.quantity;
  return acc;
}, {});
// Result: { a1: 5, b2: 1 }

Use ForEach for Side Effects

When you're not building up a result, forEach or a traditional loop makes your intent clearer:

// Good: forEach for side effects
orders.forEach(order => {
  sendEmail(order.email);
  updateDatabase(order.id);
});

// Awkward: reduce with no meaningful accumulator
orders.reduce((_, order) => {
  sendEmail(order.email);
  updateDatabase(order.id);
  return undefined;
}, undefined);

Performance Considerations

For most applications, readability trumps performance. But if you're processing large datasets (think hundreds of thousands of elements), here's what matters:

Traditional for loops are fastest, typically 2-3x faster than reduce or map. The callback overhead adds up with large arrays.
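As an illustration, a plain for loop computing the same kind of sum avoids the per-element callback entirely:

```typescript
const numbers: number[] = [1, 2, 3, 4, 5];

// Equivalent of numbers.reduce((t, c) => t + c, 0) without callback overhead
let total = 0;
for (let i = 0; i < numbers.length; i++) {
  total += numbers[i];
}

console.log(total); // Output: 15
```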

However, reduce can be more efficient than chaining multiple array methods:

// Two passes over the array
const result = items
  .filter(item => item.quantity > 1)
  .map(item => item.price * item.quantity);

// One pass with reduce
const result = items.reduce<number[]>((acc, item) => {
  if (item.quantity > 1) {
    acc.push(item.price * item.quantity);
  }
  return acc;
}, []);

When processing data from Convex queries, this single-pass approach can matter if you're handling large result sets. But measure first. Premature optimization usually isn't worth sacrificing readability.

Practical Applications with TypeScript and Reduce

Beyond the basic examples, reduce shines when working with real-world data structures. Here are some practical scenarios you'll encounter in TypeScript applications.

Calculating Order Totals

interface CartItem {
  productId: string;
  name: string;
  price: number;
  quantity: number;
  discount?: number;
}

const cart: CartItem[] = [
  { productId: "p1", name: "Laptop", price: 999, quantity: 1, discount: 0.1 },
  { productId: "p2", name: "Mouse", price: 25, quantity: 2 },
  { productId: "p3", name: "Keyboard", price: 75, quantity: 1, discount: 0.15 }
];

const orderSummary = cart.reduce<{ subtotal: number; totalDiscount: number; items: number }>(
  (acc, item) => {
    const itemPrice = item.price * item.quantity;
    const itemDiscount = itemPrice * (item.discount || 0);

    return {
      subtotal: acc.subtotal + itemPrice,
      totalDiscount: acc.totalDiscount + itemDiscount,
      items: acc.items + item.quantity
    };
  },
  { subtotal: 0, totalDiscount: 0, items: 0 }
);

const finalTotal = orderSummary.subtotal - orderSummary.totalDiscount;
console.log(`Items: ${orderSummary.items}, Total: $${finalTotal.toFixed(2)}`);
// Output: Items: 4, Total: $1012.85

This pattern is common in e-commerce applications where you need to calculate multiple aggregates in a single pass. The generic type parameter reduce<{ subtotal: number; totalDiscount: number; items: number }> explicitly defines the return type, making your code more robust when refactored.

This approach integrates well with Convex's type system when building full-stack TypeScript applications. When building user interfaces that need to derive values from arrays of objects, consider whether a forEach loop or a reduce operation would be clearer for your specific use case.

Building Type-Safe Reducers with Generics

// Generic reducer function
function safeReduce<T, R>(
  array: T[],
  reducer: (accumulator: R, item: T, index: number) => R,
  initialValue: R
): R {
  return array.reduce(reducer, initialValue);
}

// Example usage with different types
const strings = ["Hello", "TypeScript", "Reduce"];
const stringLengths = safeReduce<string, number[]>(
  strings,
  (lengths, str, index) => {
    lengths.push(str.length);
    return lengths;
  },
  []
);

console.log(stringLengths); // Output: [5, 10, 6]

Creating reusable, type-safe reducers with TypeScript generics improves code maintainability and safety. The safeReduce function above ensures that both input and output types are properly defined and enforced.

This pattern is particularly valuable when working with Convex database queries that might return records of different types that need consistent processing.

By abstracting the reducer logic into a typed function, you can avoid type errors when processing collections of different types while maintaining the flexibility of the reduce method.

Using Reduce with Async/Await

When you need to process array items with async operations, reduce can handle sequential execution. This is useful when each operation depends on the previous one, or when you need to avoid overwhelming an API with concurrent requests.

interface ApiResponse {
  id: string;
  data: string;
}

async function fetchData(id: string): Promise<ApiResponse> {
  const response = await fetch(`/api/data/${id}`);
  return response.json();
}

const ids = ['user1', 'user2', 'user3'];

// Process sequentially with reduce
const results = await ids.reduce<Promise<ApiResponse[]>>(
  async (accPromise, id) => {
    const acc = await accPromise; // Wait for previous iteration
    const result = await fetchData(id); // Fetch current item
    acc.push(result);
    return acc;
  },
  Promise.resolve([]) // Initialize with a resolved Promise
);

console.log(results); // All API responses in order

Key Points for Async Reduce

The initial value must be a Promise (use Promise.resolve(initialValue)). Without this, TypeScript will complain about type mismatches.

You need to await the accumulator in each iteration because it's always a Promise from the previous execution.

The return type annotation uses Promise<T> as the generic parameter, not just T.

When to Use Async Reduce

This pattern is valuable when:

  • Each async operation depends on the result of the previous one
  • You need to process items sequentially to avoid rate limiting
  • You're building up state that must be updated in order
interface DatabaseRecord {
  id: string;
  value: number;
  timestamp: Date;
}

// Sequential processing with accumulation
const records = ['rec1', 'rec2', 'rec3'];

const summary = await records.reduce<Promise<{ total: number; records: DatabaseRecord[] }>>(
  async (accPromise, recordId) => {
    const acc = await accPromise;

    try {
      const record = await database.getRecord(recordId);
      return {
        total: acc.total + record.value,
        records: [...acc.records, record]
      };
    } catch (error) {
      // Handle errors but continue processing
      console.error(`Failed to fetch ${recordId}:`, error);
      return acc;
    }
  },
  Promise.resolve({ total: 0, records: [] })
);

Alternative: Promise.all for Parallel Processing

If you don't need sequential execution, Promise.all with map is usually clearer and much faster:

// Parallel processing (faster)
const results = await Promise.all(
  ids.map(id => fetchData(id))
);

// vs Sequential with reduce (slower but ordered)
const results = await ids.reduce<Promise<ApiResponse[]>>(
  async (accPromise, id) => {
    const acc = await accPromise;
    const result = await fetchData(id);
    return [...acc, result];
  },
  Promise.resolve([])
);

Use async reduce when the sequential nature is required, not just as a default pattern. When working with Convex database queries, you'll typically want parallel processing for better performance.

Mutation vs Immutability in Reduce

When building up arrays or objects with reduce, you have a choice between mutating the accumulator or creating new objects each iteration. This choice affects both performance and code clarity.

interface LogEntry {
  level: 'info' | 'error' | 'warn';
  message: string;
}

const logs: LogEntry[] = [
  { level: 'info', message: 'App started' },
  { level: 'error', message: 'Connection failed' },
  { level: 'info', message: 'Retrying...' },
  { level: 'error', message: 'Timeout' },
];

// Mutating approach (faster)
const byLevelMutable = logs.reduce<Record<string, LogEntry[]>>((acc, log) => {
  if (!acc[log.level]) {
    acc[log.level] = [];
  }
  acc[log.level].push(log); // Mutating the accumulator
  return acc;
}, {});

// Immutable approach (safer but slower)
const byLevelImmutable = logs.reduce<Record<string, LogEntry[]>>((acc, log) => {
  return {
    ...acc,
    [log.level]: [...(acc[log.level] || []), log] // Creating new objects each time
  };
}, {});

Which Should You Use?

For arrays and objects in reduce, mutation is usually fine because the accumulator is local to the reducer function. The immutable approach creates new objects on every iteration, which can be slow with large datasets.

However, immutability shines when debugging or when your reducer logic is complex. If you're ever unsure whether the accumulator is being shared elsewhere, immutability prevents surprising bugs.

A practical rule: mutate for performance-critical code with large datasets, but use immutability when correctness and debugging ease matter more. For complex data transformations in Convex applications, measure actual performance before optimizing. Most of the time, readability wins.
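If you do want to measure, here's a rough micro-benchmark sketch using console.time; the array size and log shape are made up for illustration, and timings vary by runtime:

```typescript
// Hypothetical dataset: 10,000 log-like entries
const bigLogs = Array.from({ length: 10_000 }, (_, i) => ({
  level: i % 2 === 0 ? 'info' : 'error',
  message: `entry ${i}`,
}));

console.time('mutating');
const grouped = bigLogs.reduce<Record<string, { level: string; message: string }[]>>((acc, log) => {
  if (!acc[log.level]) {
    acc[log.level] = [];
  }
  acc[log.level].push(log); // In-place mutation: O(1) per element
  return acc;
}, {});
console.timeEnd('mutating');

console.time('immutable');
bigLogs.reduce<Record<string, { level: string; message: string }[]>>((acc, log) => ({
  ...acc,
  [log.level]: [...(acc[log.level] || []), log], // Copies the growing array every iteration
}), {});
console.timeEnd('immutable');
```

On typical runtimes the immutable version is noticeably slower here because each iteration re-copies the accumulated arrays, which is the quadratic cost the section above describes.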

Final Thoughts on TypeScript Reduce

The reduce method is powerful, but it's not always the right choice. When you're simply transforming each element, reach for map. When you're performing side effects, use forEach or a loop. But when you need to aggregate, restructure, or combine operations, reduce gives you the flexibility to handle complex transformations in a single pass.

The key to using reduce effectively in TypeScript:

  • Always specify the generic type parameter when working with objects or complex types
  • Initialize with an explicit value that matches your accumulator type
  • Use Promise.resolve() when working with async operations
  • Measure performance if you're processing large datasets, but prioritize readability first

With proper typing and the right use cases, reduce becomes an essential tool for data transformation in TypeScript applications.