This repository has been archived by the owner on Nov 11, 2023. It is now read-only.

Add cache #150

Closed
stereobooster opened this issue Sep 18, 2019 · 11 comments
Labels
enhancement New feature or request longer discussion

Comments

@stereobooster
Contributor

stereobooster commented Sep 18, 2019

Is your feature request related to a problem? Please describe.
Add simplest caching mechanism (LRU with expiration time for cache) for useGet. Similar to https://github.com/CharlesStover/fetch-suspense/blob/master/src/fetch-suspense.ts

Describe the solution you'd like
By default, the cache is disabled (for backward compatibility). If a TTL option is provided (and it is greater than 0), then results are cached for that duration. Time is specified in seconds.

Describe alternatives you've considered
none

Additional context
N/A
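The LRU-with-TTL idea above could be sketched roughly like this. This is a minimal illustration only; the class name, constructor options, and integration with useGet are all made up, not an existing restful-react API:

```typescript
// A minimal LRU cache with per-entry expiration, relying on the fact that
// a JS Map preserves insertion order (so the first key is the oldest).
class LruTtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  constructor(private maxSize: number, private ttlSeconds: number) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(key); // expired: drop it
      return undefined;
    }
    // Re-insert to mark this entry as most recently used.
    this.entries.delete(key);
    this.entries.set(key, entry);
    return entry.value;
  }

  set(key: string, value: V): void {
    if (this.entries.size >= this.maxSize && !this.entries.has(key)) {
      // Evict the least recently used entry (first key in insertion order).
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(key, {
      value,
      expiresAt: Date.now() + this.ttlSeconds * 1000,
    });
  }
}
```

A hook could then consult such a cache before issuing a network request, keyed by the request path.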

@stereobooster stereobooster added the enhancement New feature or request label Sep 18, 2019
This was referenced Sep 18, 2019
@threehams

Caching is what stopped me the last time I tried to make a library like this. The cache needs to be selectively updated or invalidated when another call which returns the same resource is made (think: PUT /user needs to invalidate/refetch or update the cache from the GET /user call for the same ID, and update GET /users). This becomes extremely difficult to get right since most data from REST APIs can't be automatically normalized - there's no way to automatically connect routes together, making that a very manual process.

The simplest solution would just invalidate the entire cache on any mutation. It would be an improvement over no cache, at least.
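That blunt strategy is easy to sketch. This is an illustration only (the cache shape, a plain URL-keyed Map, is an assumption, not restful-react internals):

```typescript
// "Invalidate everything on any mutation": crude, but it can never serve
// stale data after a write, and needs no knowledge of how routes relate.
const responseCache = new Map<string, unknown>();

function readFromCache(url: string): unknown | undefined {
  return responseCache.get(url);
}

function writeToCache(url: string, body: unknown): void {
  responseCache.set(url, body);
}

// Call this before issuing any request; non-GET verbs wipe the whole cache.
function onRequest(method: string): void {
  if (method.toUpperCase() !== "GET") {
    responseCache.clear();
  }
}
```

The cost is obvious: one PUT throws away every cached GET, even unrelated ones, so cold reads follow every mutation.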

@TejasQ
Contributor

TejasQ commented Oct 28, 2019

I agree. That's the most straightforward approach and a path of least resistance, but a more "clever" cache is what I'm more interested in. It's something we've been thinking about for months and have not yet come up with a solution that is easy to grok.

Perhaps we start with a basic cache and then continue?

@minipai

minipai commented Oct 30, 2019

Maybe we could expose an API like getCache and let the user do the rest.

@TejasQ
Contributor

TejasQ commented Oct 30, 2019

That doesn't seem like it would be helpful if we expect users to roll their own. The point of the library is to handle such concerns for users.

Have y'all worked with @apollographql's cache on the client? That seems like a great DX model for a caching solution in restful-react.

@fabien0102
Contributor

The apollo DX is not so good regarding the cache… It easily becomes a huge pain with complex queries (with some dynamic filters in the query, for example: a simple search param and you can say bye bye to the cache layer if you want to stay sane ^^)

This has been an open discussion on the apollo-client project for a long time (3 years) -> apollographql/apollo-feature-requests#4

We also have an even harder problem in the REST context, since the "spec" is not really a spec 😬

I'm totally in favor of giving the user full responsibility for defining cache keys and cache invalidation. I really want this to be simple and flexible; nothing is more painful than an API that doesn't let you invalidate your cache easily (my personal old apollo-client experience ^^)

// a collection
const { data } = useGet({path: "/poney", cacheKey: (param, res) => `poney`})

// an item
const { data } = useGet({path: "/poney/the-pink-one", cacheKey: (param, res) => `poney:the-pink-one`})

// a mutation
const { mutate } = useMutate({verb: "delete", path: "/poney/the-pink-one", updateCache: (prevCache) => {
return deleteCacheEntries(prevCache, "poney*")
}})

We can also provide some utilities such as deleteCacheEntries, updateCacheEntries, etc. to help the user maintain the cache, but since it's a simple reducer pattern, you can escape and bring your own logic!
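A hedged sketch of what such a deleteCacheEntries helper could look like, matching keys against a trailing-`*` glob as in the "poney*" example above. The name and the cache shape (a plain key/value record) are hypothetical, taken from the proposal rather than from any existing restful-react API:

```typescript
type Cache = Record<string, unknown>;

// Returns a new cache with every entry matching `pattern` removed.
// A pattern ending in "*" is treated as a prefix match; anything else
// must match the key exactly (reducer-style: never mutates the input).
function deleteCacheEntries(prevCache: Cache, pattern: string): Cache {
  const isGlob = pattern.endsWith("*");
  const prefix = isGlob ? pattern.slice(0, -1) : pattern;
  const next: Cache = {};
  for (const [key, value] of Object.entries(prevCache)) {
    const matches = isGlob ? key.startsWith(prefix) : key === pattern;
    if (!matches) next[key] = value; // keep only non-matching entries
  }
  return next;
}
```

Because it is a pure function over the previous cache, it composes with whatever custom invalidation logic a user wires into updateCache.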

@yosbelms

yosbelms commented Nov 3, 2019

This is REST, so we can take advantage of the HTTP caching mechanism. Why not keep it simple and rely on HTTP?
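In other words, the server's own Cache-Control headers already tell the client how long a response stays fresh; browser fetch consults the HTTP cache automatically when its cache mode is left at "default". As a rough illustration (an assumption about one possible approach, not a restful-react feature), a client-side layer that wanted to respect HTTP semantics would only need to read max-age for its expiry:

```typescript
// Extract the freshness lifetime, in seconds, from a Cache-Control header.
// Returns 0 when the header is missing or carries no max-age directive
// (e.g. "no-store"), meaning the response should not be reused.
function maxAgeSeconds(cacheControl: string | null): number {
  if (!cacheControl) return 0;
  const match = /max-age=(\d+)/.exec(cacheControl);
  return match ? parseInt(match[1], 10) : 0;
}
```

The obvious caveat, raised below, is that this only works when the backend actually sets sensible caching headers.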

@TejasQ
Contributor

TejasQ commented Nov 4, 2019

@yosbelms how do you mean?

@fabien0102
Contributor

@yosbelms Since it's just HTTP headers on the client side, I guess this is already possible with restful-react ^^

I really need to experiment a bit with this to see how it behaves in real life 😃

The idea of an "in memory cache" is more to cover use cases where you don't control the backend; it's a bit more flexible but also more complicated (nothing is free in this world 😁). But I really like the idea of using native HTTP caching, I will definitely give it a try! 💯

@CMCDragonkai

If you have a cache, then you also need cache coherence and some form of garbage collection.

I suggest a reference-counting mechanism, so that objects no longer used by any component are moved to a generation that can be garbage-collected by a TTL, which itself can be prioritized by LRU or LFU.

My concern is that if two components fetch the same resource, that resource should be available to both components without repeated fetching.

Idempotency is key here too, so we have to assume that GET, PUT, and DELETE are all idempotent.
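The reference-counting idea could be sketched per cache entry like this. A minimal illustration under the assumptions above (mounted components retain an entry; the TTL clock only starts once the count reaches zero); all names are made up for the sketch:

```typescript
// One cache entry tracked by reference count. While any component holds a
// reference, the entry is pinned; once released by all of them, it becomes
// collectible after `ttlMs` of disuse.
class RefCountedEntry<V> {
  private refs = 0;
  private releasedAt: number | null = null;

  constructor(public value: V) {}

  retain(): void {
    this.refs += 1;
    this.releasedAt = null; // in use again: cancel any pending collection
  }

  release(now: number = Date.now()): void {
    this.refs = Math.max(0, this.refs - 1);
    if (this.refs === 0) this.releasedAt = now; // start the TTL clock
  }

  collectible(ttlMs: number, now: number = Date.now()): boolean {
    return (
      this.refs === 0 &&
      this.releasedAt !== null &&
      now - this.releasedAt >= ttlMs
    );
  }
}
```

Two components fetching the same resource would then retain the same entry, which also answers the repeated-fetching concern: the second mount reads the pinned value instead of hitting the network.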

@nediamond

Bumping this because a caching mechanism in this library would be 🚀


8 participants