
feat: memo separate key for reading and writing cached values #333

Open
hugo082 wants to merge 3 commits into main

Conversation

@hugo082 commented Dec 27, 2024

Summary

You can optionally use a separate key for reading and writing cached values by providing a setKey function.
This allows for more flexible cache management and sharing of cached values between different function calls.
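As a standalone sketch of the mechanism described above (illustrative code only, not the actual implementation in src/curry/memo.ts; the `getKeys`/`setKey` option names are assumptions based on this summary):

```typescript
// Illustrative sketch, not radashi's memo: a lookup may try several
// read keys, while a computed result is written under a single set key.
function memoSketch<A extends unknown[], R>(
  fn: (...args: A) => R,
  options: {
    // Keys to try, in order, when reading from the cache.
    getKeys: (...args: A) => string[]
    // Key under which a freshly computed result is stored.
    setKey: (...args: A) => string
  },
): (...args: A) => R {
  const cache = new Map<string, R>()
  return (...args: A) => {
    for (const key of options.getKeys(...args)) {
      if (cache.has(key)) return cache.get(key) as R
    }
    const result = fn(...args)
    cache.set(options.setKey(...args), result)
    return result
  }
}
```

When `getKeys` returns more than one candidate key, two different call signatures can be served by the same cached value.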

For any code change,

  • Related documentation has been updated, if needed
  • Related tests have been added or updated, if needed
  • Related benchmarks have been added or updated, if needed
  • Release notes in next-minor.md or next-major.md have been added, if needed

Does this PR introduce a breaking change?

No

Bundle impact

Status | File              | Size¹ | Difference
M      | src/curry/memo.ts | 573   | +266 (+87%)

Footnotes

  1. Function size includes the import dependencies of the function.

@hugo082 marked this pull request as ready for review December 27, 2024 10:01
@hugo082 requested a review from aleclarson as a code owner December 27, 2024 10:01
@hugo082 changed the title from "Dissociate memo cache read keys from set keys" to "feat: memo separate key for reading and writing cached values" Dec 27, 2024
Contributor

@MarlonPassos-git left a comment

LGTM

@MarlonPassos-git added the "new feature" label Dec 27, 2024
@aleclarson
Member

Hey, thanks for the PR. 🎉 Not sure about this one. It seems very niche at first glance. Especially considering the +87% size increase, we'll need to justify the feature with real-world use cases (feel free to list some). Another option would be to come up with another function built purposely for this need.

@MarlonPassos-git added the "awaiting more feedback" label Dec 28, 2024
@hugo082
Author

hugo082 commented Dec 31, 2024

The size increase is mostly due to the import of selectFirst; we can probably reduce it with a .find(...) approach.
As for the use case, it's mostly about being able to share the cache across multiple calls / parameters.

@aleclarson
Member

Hey @hugo082, good to hear from you!

Allow me to clarify what I'm asking for. A “use case” is a practical scenario where a function solves a specific problem or fulfills a need. It emphasizes why and when the function is used, rather than delving into the technical details of how it works. For instance, the use case for a capitalize() function might be standardizing user input in a form for consistent formatting.

  1. In what scenarios is it common to share the memo cache across multiple call signatures?

  2. Are these scenarios frequent enough to justify expanding the memo function, or would it be better to introduce a separate function? (This functionality may be too niche to warrant adding to Radashi at all. 🤔)

  3. Is this functionality present in another widely-used utility library, even outside of JavaScript? (While not a strict requirement, it could help demonstrate historical demand.)

@hugo082
Author

hugo082 commented Jan 7, 2025

Use Case:

In some scenarios, caching results across multiple call signatures is critical for performance and consistency, especially when working with shared datasets or computations. For instance:

  • Data Fetching and Transformation: Imagine a scenario where you fetch data from an API and transform it based on different parameters.
    Example: fetchUserData(userId) and fetchUserData({ userId, includeMetadata: false }).
    Here, both calls rely on the same underlying dataset (userId), so caching across these signatures avoids redundant API calls and computations. The same is not necessarily true for fetchUserData({ userId, includeMetadata: true }).
  • Expensive Computations with Shared Dependencies:
    When working with a mathematical function or data transformation where multiple inputs map to the same intermediate result.
    Example: calculateStats(dataSet) and calculateStats({ dataSet, normalize: false }) vs calculateStats({ dataSet, normalize: true }).
    Sharing cache ensures you’re not recomputing the stats unnecessarily for normalized and non-normalized results of the same dataset.

Are these scenarios frequent enough?

These scenarios appear in data-heavy applications, libraries dealing with API integrations, and anywhere complex computations are shared between layers, which is increasingly common in modern development given functional programming patterns and growing reliance on APIs.

Comparison with Other Libraries:

The concept of separating getKeys and setKeys aligns with strategies used in some caching libraries or frameworks in other ecosystems:

  • Java's Guava Cache: Provides explicit control over cache keys and allows flexible strategies for shared caches.
    While not prevalent in JavaScript utility libraries, introducing it into a widely-used library like Radashi could set a precedent and fill a gap.
Code example

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

import java.util.Objects;

public class CustomKey {
    private final String part1;
    private final String part2;

    public CustomKey(String part1, String part2) {
        this.part1 = part1;
        this.part2 = part2;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        CustomKey that = (CustomKey) o;
        return part1.equals(that.part1) && part2.equals(that.part2);
    }

    @Override
    public int hashCode() {
        return Objects.hash(part1, part2);
    }
}

// Usage
Cache<CustomKey, String> cache = CacheBuilder.newBuilder().build();
CustomKey key = new CustomKey("value1", "value2");
cache.put(key, "cachedData");

Proposal:

If this functionality feels too niche for the main memo function, perhaps a complementary createSharedMemoCache utility supporting shared-cache use cases would be a middle ground. That way, the core function stays lean while power users can still achieve their goals.
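Purely as a sketch of what such a complementary utility could look like (the name createSharedMemoCache comes from the proposal above; nothing here exists in radashi): a factory that hands out memoized wrappers all backed by one cache, so functions with different signatures can share entries through compatible keys.

```typescript
// Hypothetical sketch of a shared-cache factory; not a real radashi API.
function createSharedMemoCache<R>() {
  const cache = new Map<string, R>()
  return {
    // Wrap a function so its results are cached in the shared map,
    // keyed by the provided key function.
    memo<A extends unknown[]>(
      fn: (...args: A) => R,
      key: (...args: A) => string,
    ): (...args: A) => R {
      return (...args: A) => {
        const k = key(...args)
        if (cache.has(k)) return cache.get(k) as R
        const result = fn(...args)
        cache.set(k, result)
        return result
      }
    },
  }
}
```

Two differently-typed functions whose key functions produce the same string would then transparently share cached values.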

Labels
awaiting more feedback Wait on this until more people comment. new feature This PR adds a new function or extends an existing one

3 participants