Array.fromAsync() and the End of Promise.all Map Patterns
Every JavaScript developer has written await Promise.all(items.map(async item =>...)). It works — until you hit a rate-limited API, a paginated async generator, or a ReadableStream. Array.fromAsync() is the purpose-built replacement you didn't know you needed.
Every JavaScript developer has written this at least once:
const results = await Promise.all(items.map(async item => {
  return await processItem(item);
}));

It works. It runs everything in parallel. For most cases, parallel is exactly what you want. But there are real situations where Promise.all is the wrong tool — and Array.fromAsync() is the replacement you didn't know you needed.
The problem with Promise.all(arr.map(async ...))
The pattern has two invisible behaviours that bite you when you're not expecting them.
It runs everything in parallel, always. You cannot throttle it. If items has 500 elements and processItem makes an API call, you're firing 500 concurrent requests the moment that line executes. Rate-limited APIs return 429s. Databases get hammered. Memory spikes from 500 in-flight operations.
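A minimal sketch of the burst, with a stand-in processItem that just waits (the counter variables are illustration only): map() invokes every callback synchronously, so all 500 operations are in flight before the first one finishes.

```javascript
// Stand-in for real async work (an API call, a DB write, ...)
let inFlight = 0;
let peak = 0;

async function processItem(item) {
  inFlight++; // runs synchronously the moment map() calls us
  peak = Math.max(peak, inFlight);
  await new Promise(resolve => setTimeout(resolve, 10));
  inFlight--;
  return item * 2;
}

const items = Array.from({ length: 500 }, (_, i) => i);
await Promise.all(items.map(item => processItem(item)));
console.log(peak); // 500 — every operation was in flight at once
```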
It requires an actual array. You can't use it with an async generator, a ReadableStream, a paginated API that yields results in batches, or any other async iterable. You have to materialise the entire source into an array first.
// This doesn't work — you can't map an async generator directly
const gen = fetchPagedResults(); // AsyncGenerator<Item>
const results = await Promise.all(gen.map(async item => transform(item)));
// TypeError: gen.map is not a function

What Array.fromAsync() actually is
Array.fromAsync() is the async counterpart to Array.from(). It takes anything iterable — sync or async — and returns a Promise that resolves to an array once every item has been processed.
// Array.from() — synchronous
const arr = Array.from({ length: 3 }, (_, i) => i * 2);
// [0, 2, 4]
// Array.fromAsync() — asynchronous, sequential
const arr = await Array.fromAsync({ length: 3 }, async (_, i) => {
  await delay(100);
  return i * 2;
});
// [0, 2, 4] — after 300ms (sequential: 100ms × 3)

The second argument is an optional mapFn — identical in shape to the one in Array.from(), but it can be async and Array.fromAsync() awaits each result before moving to the next. Available in Node.js 22+, Chrome 121+, Firefox 115+, Safari 16.4+. ECMAScript 2025 standard.
Basic usage — replacing the simple case
const userIds = [1, 2, 3, 4, 5];
// Old pattern — all 5 requests fire simultaneously
const users = await Promise.all(
  userIds.map(id => fetchUser(id))
);

// Array.fromAsync — one request at a time
const users = await Array.fromAsync(userIds, id => fetchUser(id));

If you still want parallel execution but a cleaner collection syntax, Array.fromAsync without a mapFn over an array of promises is equivalent to Promise.all:
// Still parallel — just cleaner collection
const promises = userIds.map(id => fetchUser(id));
const users = await Array.fromAsync(promises);
// Equivalent to Promise.all(promises)

Where it actually shines — async iterables
This is the use case Promise.all cannot touch at all. Async generators — processing the output of a generator that fetches data lazily:
async function* fetchPages(endpoint) {
  let cursor = null;
  do {
    const res = await fetch(`${endpoint}?cursor=${cursor ?? ''}`);
    const data = await res.json();
    yield* data.items;
    cursor = data.nextCursor;
  } while (cursor);
}
// Collect all items from a paginated API — no upfront array materialisation
const allUsers = await Array.fromAsync(
  fetchPages('/api/users'),
  user => ({ id: user.id, name: user.name.trim() }) // map while collecting
);

Node.js ReadableStream — reading a large file line by line without loading it all into memory:
import { createReadStream } from 'fs';
import { createInterface } from 'readline';
async function* readLines(filepath) {
  const rl = createInterface({
    input: createReadStream(filepath),
    crlfDelay: Infinity,
  });
  yield* rl; // readline is async iterable in Node.js 18+
}
// Process a large CSV without loading the whole file into memory
const rows = await Array.fromAsync(
  readLines('./data/users.csv'),
  line => line.split(',').map(cell => cell.trim())
);

Web Streams API — consuming a ReadableStream from a fetch response:
async function* streamChunks(response) {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    yield decoder.decode(value, { stream: true });
  }
}
const res = await fetch('/api/large-export');
const chunks = await Array.fromAsync(streamChunks(res), chunk => chunk.trim());
const full = chunks.join('');

The real difference — sequential vs parallel
const ids = [1, 2, 3];
// Promise.all — parallel, total time ≈ slowest single request
console.time('parallel');
await Promise.all(ids.map(id => slowFetch(id))); // ~300ms if each takes 300ms
console.timeEnd('parallel');
// Array.fromAsync — sequential, total time = sum of all requests
console.time('sequential');
await Array.fromAsync(ids, id => slowFetch(id)); // ~900ms
console.timeEnd('sequential');

Sequential is correct when operations depend on each other, you're writing to a database where order matters, you're hitting a rate-limited API, or you're consuming an async iterable that must be read in order. Parallel is correct when operations are fully independent and speed is the priority.
For controlled parallelism — say, max 5 concurrent requests — neither Promise.all nor Array.fromAsync solves it natively. Use p-limit for that case:
import pLimit from 'p-limit';
const limit = pLimit(5); // max 5 concurrent
const results = await Promise.all(
  ids.map(id => limit(() => fetchUser(id)))
);

Common mistakes
- Using Array.fromAsync expecting parallel execution — it is sequential. Switching 100 independent API calls from Promise.all to Array.fromAsync turns a ~300ms response into a ~30s one. Know which one you need before you switch
- Forgetting Array.fromAsync returns a Promise — without await, you get Promise<Item[]> not Item[]. Always await it or chain .then()
- Passing a regular Array when you want async iterable behaviour — if your source is already an in-memory array, Array.fromAsync gives you nothing over Promise.all except sequential execution. The real value is with generators and streams
- Confusing the mapFn signature — Array.fromAsync(source, mapFn) passes (element, index), not (element, index, array). There is no third array argument because the array doesn't exist yet while you're building it
- Not polyfilling for older targets — Node.js 22+ and Chrome 121+ only. If you're targeting Node.js 20 or below, the function doesn't exist. Add the polyfill or check your engine target before shipping
// Polyfill for older environments — covers sync and async iterables
// (array-likes and the thisArg parameter are omitted)
if (!Array.fromAsync) {
  Array.fromAsync = async (iterable, mapFn) => {
    const result = [];
    let i = 0;
    for await (const item of iterable) {
      result.push(mapFn ? await mapFn(item, i++) : item);
    }
    return result;
  };
}

The takeaway
Array.fromAsync() does not replace Promise.all for parallel execution — and it's not trying to. It replaces the awkward for await loop you write when collecting results from an async iterable into an array. It replaces the Promise.all pattern when you need sequential processing, when your source is a generator or stream, or when you want to map and collect in one readable expression. Reach for Promise.all when you need speed. Reach for Array.fromAsync() when you need order, streams, or async iterables.