Do you want to cache some data in your Android app, but you’re unsure how to implement it in your codebase? I’ve been in this situation myself and would like to tell you how I tackled this problem and give a few examples of how you could do this.

In one of my previous apps, I found a few GET endpoints that were called very frequently. The network calls were initiated from many different places – view models, use cases, repositories, etc. Each caller was making an individual network call, using the response locally, and wasn’t sharing it with the other callers. This was very inefficient. Do you have a similar situation in your codebase?

This resulted in unnecessary network calls and battery drain. It was noticeable to the users since we were operating in emerging markets, where network traffic is costly and phones aren’t the most capable. We were also putting a redundant load on the backend, and as our user base kept growing, our requests started to be blocked by the backend rate limiter.

To fix this, I cached the data in runtime memory and created a global single source of truth to read and mutate it. This allowed me to share the data and replace most network calls with a cache read.

This was a good start, but there was another problem: the fetching of the data wasn’t synchronized. If multiple callers needed the data at the same time but the cache was empty, they were still making individual network calls – for example, on login or app launch. To fix this, I added a synchronous fetch to the existing solution. This allowed me to synchronize the calls and share the data when it was simultaneously requested by multiple callers.

Finally, there was a problem that this data was needed on app launch, but it wasn’t available in runtime memory after a process restart. I had to make a network call to fetch it and display a loading spinner in the meantime. This slowed down the app launch and even made it fail if the network call failed. To fix this, I added persistence to the existing solution.
This allowed me to use a cached version of the data on app launch and unblock it from waiting for a network call.

Do you also have data in your app that could benefit from caching?

Impact

As a result, the number of requests per second (RPS) to the cached endpoints decreased by 66%, as measured on the backend side. Given that these were the most frequently accessed endpoints, this significantly cut down the cost of running our backend services.

Secondly, the app now launches faster and always successfully since it doesn’t depend on a network call. Instead, it reads from persistent storage, which is much more reliable. Previously, if the network call failed, an error UI with a retry prompt was displayed.

Runtime Cache

As the first step, I cached the data in runtime memory and created a global single source of truth to read and mutate it. This allowed me to share the data and replace most network calls with a cache read. Previously, each caller was making an individual network call and using the data only locally. Now, callers can access the shared data through a repository.

Internally, I stored the data in a reactive data holder – MutableStateFlow – which allowed me to make it observable. This turned out to be very handy since my screens were already implemented using MVVM and could react to the data changes.

For example, imagine the following scenario: the data is used in Screen A and Screen B, the user navigates forward from Screen A to Screen B, changes the data in Screen B, and then returns to Screen A. In this case, Screen A will have to be manually reloaded unless it can observe the data and react to its changes.

https://gist.github.com/vlad-kasprov/ca73e916708493381c33f2aaa6999873?embedable=true

Notes

All operations are thread-safe because value and update are thread-safe.

In this example, null is the initial and empty value, but you can adjust the code to have other behavior.
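The gist above contains the full implementation. To make the idea concrete, here is a minimal sketch of what such a MutableStateFlow-backed repository might look like; the class and model names here are my own illustrative assumptions, not necessarily those in the gist:

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.asStateFlow

// Hypothetical cached model; substitute whatever data your app fetches.
data class Example(val id: String, val name: String)

class ExampleRepository {
    // null represents "no cached value yet".
    private val state = MutableStateFlow<Example?>(null)

    // Thread-safe read of the current cached value.
    fun get(): Example? = state.value

    // Thread-safe replacement of the cached value.
    fun update(example: Example?) {
        state.value = example
    }

    // Exposes the cache as a stream; emits the current value and subsequent changes.
    fun observe(): Flow<Example?> = state.asStateFlow()
}
```

A view model can then collect observe() and re-render whenever any caller invokes update(), which is what makes the Screen A / Screen B scenario work without manual reloads.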
observe() emits only distinct values because MutableStateFlow skips values equal to the current one.

If you prefer RxJava, you can use BehaviorSubject instead of MutableStateFlow.

Runtime Cache with Synchronous Fetch

As the second step, I added the ability to synchronously fetch the data by calling updateAndGet. This is needed when multiple callers try to fetch the data simultaneously – for example, on login or app launch.

Previously, if multiple callers tried to fetch data at the same time, this could result in a race condition and multiple network calls. Now, the fetch operation is synchronized, only one network call will happen, and the response will be shared with all the awaiting callers.

Furthermore, by passing an update strategy, callers can specify when to fetch fresh data and when a cached version can be returned. You can implement other strategies based on your use case.

https://gist.github.com/vlad-kasprov/83337787c0026fec59a88e0723fcebef?embedable=true

Notes

All operations are thread-safe because they are synchronized with Mutex.

You need to write your own implementation of ExampleRemoteDataSource.

Runtime & Persistent Cache with Synchronous Fetch

Finally, along with caching the data in runtime memory, I persisted it as well. This is needed on app launch, when the data isn’t available in runtime memory after a process restart but is required to display the first screen.

Previously, on app launch, before showing the first screen UI, I had to make a network call and display a loading spinner until it was completed. If it failed, I had to display an error UI with a retry prompt. This meant that the user couldn’t interact with the app until that network call had been successfully completed.

Now, on app launch, I simply load the previously cached data from the persistent storage. This made the app launch faster and always successful.
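As a condensed sketch of how the runtime cache, the Mutex-synchronized fetch, and persistence can fit together, consider the following. Every name here (UpdateStrategy, warmUp, the data source interfaces) is an illustrative assumption rather than the article’s exact code; in particular, this sketch loads from storage via an explicit warmUp() call instead of during construction:

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

// Hypothetical cached model and data sources; supply your own implementations.
data class Example(val id: String, val name: String)

interface ExampleRemoteDataSource {
    suspend fun fetch(): Example
}

interface ExampleLocalDataSource {
    suspend fun load(): Example?
    suspend fun save(example: Example)
}

// When to hit the network versus returning the cached value.
enum class UpdateStrategy { IF_EMPTY, ALWAYS }

class ExampleRepository(
    private val remote: ExampleRemoteDataSource,
    private val local: ExampleLocalDataSource,
) {
    private val mutex = Mutex()
    private val state = MutableStateFlow<Example?>(null)

    // Loads the persisted value into runtime memory, e.g. on app launch.
    suspend fun warmUp() = mutex.withLock {
        if (state.value == null) {
            state.value = local.load()
        }
    }

    // Synchronized fetch: concurrent callers queue on the Mutex, so only one
    // network call happens and later callers see the already-cached result.
    suspend fun updateAndGet(strategy: UpdateStrategy = UpdateStrategy.IF_EMPTY): Example =
        mutex.withLock {
            val cached = state.value
            if (cached != null && strategy == UpdateStrategy.IF_EMPTY) {
                cached
            } else {
                val fresh = remote.fetch() // only one caller at a time reaches this
                local.save(fresh)          // persist for the next process start
                state.value = fresh
                fresh
            }
        }

    fun get(): Example? = state.value

    fun observe(): Flow<Example?> = state
}
```

On app launch you would call warmUp() before showing the first screen, so the UI renders from persistent storage immediately, and refresh the data in the background with updateAndGet(UpdateStrategy.ALWAYS) if freshness matters.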
https://gist.github.com/vlad-kasprov/c5e3f0a2efc104f654eb7ebd9856834e?embedable=true

Notes

All operations are thread-safe because they are synchronized with Mutex.

The initial value is loaded from the persistent storage during ExampleRepository construction. Until the load has been completed, observe delays emissions and get returns null.

It’s up to you how to implement persistence, so write your own implementation of ExampleLocalDataSource. You can store the data in a database using Room or simply serialize it into JSON and use SharedPreferences.

Conclusion

This three-step refactoring process allowed me to safely and incrementally introduce caching into the existing codebase. Each step delivered measurable value to the company and the users, such as fewer network calls and a faster and more reliable app launch.

Can caching benefit your app as well? If the answer is yes, but you’re unsure where to start, consider my story as a possible guide and example.