With #71286, we implemented deduping of cache entries under certain circumstances. For example, the following constructed example was fixed with the PR:

```js
async function getCachedRandom() {
  'use cache'
  return Math.random()
}

const rand1 = await getCachedRandom()
const rand2 = await getCachedRandom()
assert(rand1 === rand2)
```

However, this implementation relied on awaiting the two calls sequentially, which usually can't be guaranteed when rendering components. For example, the following was not properly deduped:

```jsx
async function Cached() {
  'use cache'
  return <p>{Math.random()}</p>
}

export default function Page() {
  return (
    <>
      <Cached />
      <Cached />
    </>
  )
}
```

This did render the same value, but only because we triggered two render passes, and the last cached value was used for both elements in the final render pass. During the first render pass, however, the `Cached` function was called twice, and the cache entry was also set twice.

With #75786, we also fixed the render scenario by wrapping the cached function in `React.cache`. This did not work for route handlers, though. For example, the first example rewritten as follows, and used in a route handler, still wouldn't be deduped:

```js
const [rand1, rand2] = await Promise.all([
  getCachedRandom(),
  getCachedRandom(),
])
```

Furthermore, with this solution, nested cached functions could not be deduped across different outer cache scopes, because each cache scope creates its own `React.cache` scope. Example:

```jsx
async function Inner() {
  'use cache'
  return <p>{Math.random()}</p>
}

async function Outer1() {
  'use cache'
  return <Inner />
}

async function Outer2() {
  'use cache'
  return <Inner />
}

export default function Page() {
  return (
    <>
      <Outer1 />
      <Outer2 />
    </>
  )
}
```

We can dedupe those two remaining scenarios, without changing how the cache handlers are implemented, by adding cache handler transactions to the cache wrapper.
The mechanism is similar to the `set` pending promises of the default cache handler, and complements them. A cache handler transaction wraps a set of `get` and `set` calls for a given cache key. Each invocation first awaits an already existing transaction before proceeding with its own `get` call. This works across concurrent invocations to ensure that a cached function is only called once, and its result is only set once, while still preserving the streaming capabilities of the cached function's result. Incidentally, this also allows deduping of cache entries across concurrent requests, which is a nice bonus.

> [!NOTE]
> This PR is best reviewed with hidden whitespace changes.
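The transaction mechanism can be sketched roughly as follows. This is a simplified illustration, not the actual implementation: the names `withCacheTransaction`, `cache`, and `transactions`, and the use of plain in-memory `Map`s, are assumptions for the sake of the example, and streaming is omitted.

```js
const cache = new Map() // cacheKey -> cached value
const transactions = new Map() // cacheKey -> promise that resolves after `set`

async function withCacheTransaction(cacheKey, fn) {
  // Await an in-flight transaction for this key, if any, so that this
  // invocation observes the `set` call of the invocation that opened it.
  const pending = transactions.get(cacheKey)
  if (pending) {
    await pending
  }

  if (cache.has(cacheKey)) {
    // Cache hit: no transaction needed.
    return cache.get(cacheKey)
  }

  // Cache miss: open a transaction that concurrent invocations will await.
  let settle
  transactions.set(
    cacheKey,
    new Promise((resolve) => {
      settle = resolve
    })
  )

  try {
    const value = await fn()
    cache.set(cacheKey, value)
    return value
  } finally {
    // Close the transaction so that awaiting invocations proceed with their
    // own `get` calls, which will now hit the cache.
    settle()
    transactions.delete(cacheKey)
  }
}
```

With such a wrapper, the `Promise.all` example from above would invoke the underlying function only once: the second call awaits the transaction opened by the first, and then reads the value from the cache.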