Posts with Tag: Code Kata

An LRU cache is a container that ensures its maximum capacity is never exceeded, using a Least Recently Used strategy to discard elements. The LRU algorithm keeps track of the order in which cache elements are accessed, so it knows which ones to discard when the container is full. We present here a simple implementation based on a linked hash map, which gives O(1) time for both access and insertion. The space complexity is O(1) as well, since the number of elements can't exceed the cache's capacity.

Continue reading
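The linked-hash-map idea described above can be sketched as follows. This is a minimal Python version (the post's implementation is in C#), using `collections.OrderedDict` as the ordered hash map; the class and method names are illustrative, not the post's.

```python
from collections import OrderedDict


class LruCache:
    """LRU cache backed by an ordered hash map (order tracks recency of use)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()

    def get(self, key):
        # Accessing a key makes it the most recently used: move it to the end.
        value = self.entries.pop(key)
        self.entries[key] = value
        return value

    def put(self, key, value):
        if key in self.entries:
            self.entries.pop(key)
        elif len(self.entries) >= self.capacity:
            # Evict the least recently used entry (the front of the ordered map).
            self.entries.popitem(last=False)
        self.entries[key] = value
```

Both `get` and `put` do a constant number of hash-map and linked-list operations, which is where the O(1) bounds come from.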

In this post I am sharing a code kata we did recently at Sinbadsoft: sorting large files with limited memory. By large files I mean files that would not fit in the available amount of RAM. A regular sort algorithm will not perform well in this case, as the whole data cannot be loaded into memory at once. The solution to this classic problem is external sorting. It consists of two steps: first, split the file into small chunks that fit in memory, load each chunk, sort it, and write it back to disk. Second, perform a k-way merge of all the sorted chunks to produce the final result.
Continue reading
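The two steps above can be sketched as follows. This is a minimal Python version for line-based files (the post's kata is in C#); `external_sort` and `lines_per_chunk` are illustrative names, and the k-way merge is delegated to the standard library's `heapq.merge`.

```python
import heapq
import os
import tempfile


def external_sort(input_path, output_path, lines_per_chunk=1000):
    """Sort a large newline-terminated file using limited memory."""
    chunk_paths = []
    # Step 1: split into chunks that fit in memory, sort each, write it back to disk.
    with open(input_path) as src:
        while True:
            chunk = [line for line in (src.readline() for _ in range(lines_per_chunk)) if line]
            if not chunk:
                break
            chunk.sort()
            fd, path = tempfile.mkstemp(text=True)
            with os.fdopen(fd, "w") as out:
                out.writelines(chunk)
            chunk_paths.append(path)
    # Step 2: k-way merge of the sorted chunks into the final result.
    chunks = [open(p) for p in chunk_paths]
    try:
        with open(output_path, "w") as out:
            out.writelines(heapq.merge(*chunks))
    finally:
        for f in chunks:
            f.close()
        for p in chunk_paths:
            os.remove(p)
```

Only one chunk (plus one line per chunk during the merge) is ever held in memory, which is what makes the approach work on files larger than RAM.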

In this post I am sharing another code kata: an implementation of a binary heap. Once the heap is implemented, a heap sort and a priority queue follow easily from it. The whole thing takes about 100 lines of C# code.

Continue reading
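A binary heap of the kind described above can be sketched as follows. This is a minimal Python min-heap (the post's implementation is in C#), with a `heap_sort` deduced from it as the excerpt suggests; the names are illustrative.

```python
class BinaryHeap:
    """Array-backed min-heap: parent of node i is (i-1)//2, children are 2i+1 and 2i+2."""

    def __init__(self):
        self.items = []

    def push(self, item):
        self.items.append(item)
        self._sift_up(len(self.items) - 1)

    def pop(self):
        items = self.items
        top = items[0]
        last = items.pop()
        if items:
            # Move the last element to the root and restore the heap property.
            items[0] = last
            self._sift_down(0)
        return top

    def _sift_up(self, i):
        items = self.items
        while i > 0:
            parent = (i - 1) // 2
            if items[i] >= items[parent]:
                break
            items[i], items[parent] = items[parent], items[i]
            i = parent

    def _sift_down(self, i):
        items = self.items
        n = len(items)
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < n and items[child] < items[smallest]:
                    smallest = child
            if smallest == i:
                break
            items[i], items[smallest] = items[smallest], items[i]
            i = smallest


def heap_sort(values):
    """Heap sort deduced from the heap: push everything, then pop in order."""
    heap = BinaryHeap()
    for v in values:
        heap.push(v)
    return [heap.pop() for _ in values]
```

A priority queue falls out of the same class: `push` enqueues and `pop` dequeues the minimum-priority item.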

I find merge sort elegant and easy to implement and understand, in both its iterative and recursive forms. In this post I'll share quick (and probably dirty) iterative and recursive implementations of merge sort. Both versions share exactly the same merge operation. The implementation takes less than 30 lines of C#.
Continue reading
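The two versions, built on a single shared merge operation, can be sketched as follows. This is a minimal Python version (the post's implementation is in C#); the function names are illustrative.

```python
def merge(left, right):
    """Merge two sorted lists; shared by the recursive and iterative versions."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result


def merge_sort_recursive(values):
    # Top-down: split in half, sort each half, merge the results.
    if len(values) <= 1:
        return list(values)
    mid = len(values) // 2
    return merge(merge_sort_recursive(values[:mid]), merge_sort_recursive(values[mid:]))


def merge_sort_iterative(values):
    # Bottom-up: start from single-element runs and merge pairs until one run remains.
    runs = [[v] for v in values]
    if not runs:
        return []
    while len(runs) > 1:
        merged = [merge(runs[k], runs[k + 1]) for k in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:
            merged.append(runs[-1])  # odd run out: carry it to the next pass
        runs = merged
    return runs[0]
```

Both functions return the same result; only the order in which sub-lists get merged differs.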