Achieving zero heap memory allocations with LINQ operations.
This is a follow-up to my exploration of improving LINQ performance. It is a work in progress; see my previous article to find out what triggered it.
The source code is available at https://github.com/NetFabric/NetFabric.Hyperlinq
I’ve been emphasizing the importance of using value types for raw performance, but another advantage is that they can be allocated on the stack. This means they don’t add pressure to the garbage collector, and their deallocation is deterministic: it’s performed immediately when they go out of scope.
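As a minimal sketch of this point (the type names below are hypothetical, not from the library), we can compare the bytes allocated on the managed heap when instantiating a struct versus a class:

```csharp
using System;

struct PointStruct { public int X, Y; }   // value type: a local lives on the stack
class PointClass { public int X, Y; }     // reference type: always heap-allocated

static class StackVsHeap
{
    static void Main()
    {
        var before = GC.GetAllocatedBytesForCurrentThread();
        var s = new PointStruct { X = 1, Y = 2 };   // no heap allocation
        var afterStruct = GC.GetAllocatedBytesForCurrentThread();
        var c = new PointClass { X = 1, Y = 2 };    // one heap allocation, tracked by the GC
        var afterClass = GC.GetAllocatedBytesForCurrentThread();

        Console.WriteLine($"struct: {afterStruct - before} bytes");  // typically 0
        Console.WriteLine($"class:  {afterClass - afterStruct} bytes");
    }
}
```

The struct local simply disappears when the stack frame is popped, while the class instance lingers on the heap until a collection runs.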
One of my objectives for this project is to get as close as possible to zero heap allocations.
To achieve zero allocation in this project, I declare all enumerables and enumerators as struct, but this is still not enough. We also have to avoid copies of these value types caused by passing-by-value, boxing, or defensive copies. Check my previous articles in this series to find out how I take care of that.
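To see how boxing can sneak in, consider List&lt;T&gt; from the BCL: its GetEnumerator() returns List&lt;T&gt;.Enumerator, a struct, but the moment that struct is assigned (or cast) to an interface type it gets boxed onto the heap. A minimal sketch:

```csharp
using System.Collections.Generic;

static class BoxingDemo
{
    static void Main()
    {
        var list = new List<int> { 1, 2, 3 };

        // The concrete return type is List<int>.Enumerator, a struct:
        // it stays on the stack and no heap allocation occurs.
        List<int>.Enumerator byValue = list.GetEnumerator();

        // Assigning the same struct to its interface type boxes it:
        // a hidden heap allocation that the GC will later have to collect.
        IEnumerator<int> boxed = list.GetEnumerator();

        byValue.Dispose();
        boxed.Dispose();
    }
}
```

Generics constraints exist precisely to keep the first, unboxed form while still writing code that works for any enumerator type.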
There was a particular pattern in my code that I missed and where there was a heap allocation in disguise.
public static int Count&lt;TEnumerable, TEnumerator, TSource&gt;(this TEnumerable source, Func&lt;TSource, bool&gt; predicate)
    where TEnumerable : IEnumerable&lt;TSource&gt;
    where TEnumerator : IEnumerator&lt;TSource&gt;
{
    if (source == null) ThrowHelper.ThrowArgumentNullException(nameof(source));
    var count = 0;
    using (var enumerator = (TEnumerator)source.GetEnumerator())
    {
        while (enumerator.MoveNext())
            if (predicate(enumerator.Current))
                count++;
    }
    return count;
}
A new instance of an enumerator is created by calling GetEnumerator() and stored in the enumerator variable. The variable is of type TEnumerator which, through the generics constraints, should allow the enumerator not to be boxed. Well, in this case, that’s not true… The GetEnumerator() method returns IEnumerator&lt;TSource&gt;, which boxes the enumerator. The cast to