This is an attempted improvement on Eser's answer (Version2). The Lazy class is thread-safe by default, so the lock can be removed. It is possible that multiple Lazy objects will be created for a given key, but only one will have its Value property queried, causing the start of the heavy Task. The other Lazy instances will remain unused, fall out of scope, and soon be garbage collected.
The first overload is the flexible and generic one, and accepts a Func<CacheItemPolicy> argument. I included two more overloads for the most common cases of absolute and sliding expiration. Many more overloads could be added for convenience.
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

static partial class MemoryCacheExtensions
{
    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, Func<CacheItemPolicy> cacheItemPolicyFactory = null)
    {
        var lazyTask = (Lazy<Task<T>>)cache.Get(key);
        if (lazyTask == null)
        {
            var newLazyTask = new Lazy<Task<T>>(valueFactory);
            var cacheItem = new CacheItem(key, newLazyTask);
            var cacheItemPolicy = cacheItemPolicyFactory?.Invoke();
            var existingCacheItem = cache.AddOrGetExisting(cacheItem, cacheItemPolicy);
            lazyTask = (Lazy<Task<T>>)existingCacheItem?.Value ?? newLazyTask;
        }
        return ToAsyncConditional(lazyTask.Value);
    }

    // Returns an asynchronous continuation of an incomplete task, so that blocking
    // code after an awaited call doesn't affect other awaiters of the same task.
    private static Task<TResult> ToAsyncConditional<TResult>(Task<TResult> task)
    {
        if (task.IsCompleted) return task;
        return task.ContinueWith(t => t,
            default, TaskContinuationOptions.RunContinuationsAsynchronously,
            TaskScheduler.Default).Unwrap();
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, DateTimeOffset absoluteExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory, () => new CacheItemPolicy()
        {
            AbsoluteExpiration = absoluteExpiration,
        });
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, TimeSpan slidingExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory, () => new CacheItemPolicy()
        {
            SlidingExpiration = slidingExpiration,
        });
    }
}
Usage example:
string html = await MemoryCache.Default.GetOrCreateLazyAsync("MyKey", async () =>
{
    return await new WebClient().DownloadStringTaskAsync("https://stackoverflow.com");
}, DateTimeOffset.Now.AddMinutes(10));
The HTML of the site is downloaded and cached for 10 minutes. Multiple concurrent callers will await the completion of the same task.
The System.Runtime.Caching.MemoryCache class is easy to use, but offers limited support for prioritizing cache entries: there are only two priority options, Default and NotRemovable, so it is hardly adequate for advanced scenarios. The newer Microsoft.Extensions.Caching.Memory.MemoryCache class (from this package) offers more options regarding cache priorities (Low, Normal, High and NeverRemove), but is otherwise less intuitive and more cumbersome to use. It offers asynchronous capabilities, but not lazy ones. So here are the equivalent LazyAsync extensions for this class:
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

static partial class MemoryCacheExtensions
{
    public static Task<T> GetOrCreateLazyAsync<T>(this IMemoryCache cache, object key,
        Func<Task<T>> valueFactory, MemoryCacheEntryOptions options = null)
    {
        if (!cache.TryGetValue(key, out Lazy<Task<T>> lazy))
        {
            var entry = cache.CreateEntry(key);
            if (options != null) entry.SetOptions(options);
            var newLazy = new Lazy<Task<T>>(valueFactory);
            entry.Value = newLazy;
            entry.Dispose(); // Dispose actually inserts the entry in the cache
            if (!cache.TryGetValue(key, out lazy)) lazy = newLazy;
        }
        // Uses the ToAsyncConditional helper defined in the first part of this
        // partial class; redefining it here would be a duplicate-member error.
        return ToAsyncConditional(lazy.Value);
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this IMemoryCache cache, object key,
        Func<Task<T>> valueFactory, DateTimeOffset absoluteExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory,
            new MemoryCacheEntryOptions() { AbsoluteExpiration = absoluteExpiration });
    }

    public static Task<T> GetOrCreateLazyAsync<T>(this IMemoryCache cache, object key,
        Func<Task<T>> valueFactory, TimeSpan slidingExpiration)
    {
        return cache.GetOrCreateLazyAsync(key, valueFactory,
            new MemoryCacheEntryOptions() { SlidingExpiration = slidingExpiration });
    }
}
Usage example:
var cache = new MemoryCache(new MemoryCacheOptions());
string html = await cache.GetOrCreateLazyAsync("MyKey", async () =>
{
    return await new WebClient().DownloadStringTaskAsync("https://stackoverflow.com");
}, DateTimeOffset.Now.AddMinutes(10));
Update: I just became aware of a peculiarity of the async-await mechanism. When an incomplete Task is awaited multiple times concurrently, the continuations run synchronously (on the same thread) one after the other, assuming there is no synchronization context. This can be an issue for the above implementations of GetOrCreateLazyAsync: if blocking code exists immediately after an awaited call to GetOrCreateLazyAsync, the other awaiters will be affected (delayed, or even deadlocked). A possible solution is to return an asynchronous continuation of the lazily created Task instead of the task itself, but only while the task is still incomplete. This is the reason for introducing the ToAsyncConditional method above.
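This synchronous-continuation behavior can be observed with a minimal console sketch (the name AwaitAndBlock and the 500 ms delay are illustrative, not part of the implementation above):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class SyncContinuationsDemo
{
    static void Main()
    {
        // No RunContinuationsAsynchronously flag, mimicking a plain Lazy<Task<T>>.
        var tcs = new TaskCompletionSource<int>();

        async Task AwaitAndBlock(string name)
        {
            await tcs.Task;
            Console.WriteLine($"{name} resumed on thread {Environment.CurrentManagedThreadId}");
            Thread.Sleep(500); // blocking code after the await delays the next awaiter
        }

        Task a = AwaitAndBlock("A");
        Task b = AwaitAndBlock("B");

        // Both continuations run synchronously here, one after the other,
        // on the thread that calls SetResult.
        tcs.SetResult(0);
        Task.WaitAll(a, b);
    }
}
```

Running this shows both awaiters resuming on the same thread, with B delayed by A's Thread.Sleep; routing the awaited task through ToAsyncConditional pushes the continuations to the ThreadPool instead.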
Note: This implementation caches any errors that may occur during the invocation of the asynchronous lambda. This may not be desirable behavior in general. A possible solution would be to replace the Lazy<Task<T>> with the AsyncLazy<T> type from Stephen Cleary's Nito.AsyncEx.Coordination package, instantiated with the RetryOnFailure option.
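As a rough sketch of that idea (assuming the Nito.AsyncEx.Coordination package; the method name GetOrCreateRetryableAsync is made up for this example), the first extension method could store an AsyncLazy<T> instead of a Lazy<Task<T>>:

```csharp
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;
using Nito.AsyncEx; // from the Nito.AsyncEx.Coordination package

static partial class MemoryCacheExtensions
{
    // Sketch: same shape as GetOrCreateLazyAsync, but a failed task is not kept,
    // because AsyncLazyFlags.RetryOnFailure re-invokes the factory after a failure.
    public static Task<T> GetOrCreateRetryableAsync<T>(this MemoryCache cache, string key,
        Func<Task<T>> valueFactory, Func<CacheItemPolicy> cacheItemPolicyFactory = null)
    {
        var lazy = (AsyncLazy<T>)cache.Get(key);
        if (lazy == null)
        {
            var newLazy = new AsyncLazy<T>(valueFactory, AsyncLazyFlags.RetryOnFailure);
            var cacheItem = new CacheItem(key, newLazy);
            var existing = cache.AddOrGetExisting(cacheItem, cacheItemPolicyFactory?.Invoke());
            lazy = (AsyncLazy<T>)existing?.Value ?? newLazy;
        }
        // The ToAsyncConditional wrapper from the answer could still be applied here.
        return lazy.Task;
    }
}
```

By default AsyncLazy<T> also runs the factory on the ThreadPool (there is no ExecuteOnCallingThread flag here), which sidesteps the "heavy factory blocks the first caller" issue as well.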