I think you’ve generally answered your own question; however, this:

> Not as compact as tree. (for practical purposes load factors is never 1)
is not necessarily true. Each node of a tree (we’ll assume it’s a red-black tree) for a type `T` uses at least `2 * pointer_size + sizeof(T) + sizeof(bool)` of space. That becomes `3 * pointer_size + sizeof(T) + sizeof(bool)` if the tree also stores a parent pointer in each node.
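As a rough sketch (this is not any particular standard library’s real node type, just an illustration of where the space goes), such a node might look like:

```cpp
// Hypothetical red-black tree node, for illustration only.
template <typename T>
struct RbNode {
    RbNode* left;    // child pointer
    RbNode* right;   // child pointer
    RbNode* parent;  // the optional third pointer (makes iterator increment cheap)
    bool    red;     // node colour: the sizeof(bool) term
    T       value;   // the stored element itself
};
```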
Compare this to a hash map: there will be wasted bucket-array space because the load factor is less than 1, as you’ve said. However, assuming the hash map uses singly linked lists for chaining (and really, there’s no real reason not to), each inserted element takes only `sizeof(T) + pointer_size`, plus its share of the bucket array.
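For comparison, a sketch of a node in a singly linked chain (again hypothetical, not the real `std::unordered_set` node layout) needs only one pointer per element:

```cpp
// Hypothetical chained hash node, for illustration only.
template <typename T>
struct HashNode {
    HashNode* next;  // single link to the next node in the same bucket
    T         value; // the stored element
};
// On top of this, each element pays a share of the bucket array:
// roughly sizeof(HashNode*) / load_factor bytes.
```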
Note that this analysis ignores any overhead from alignment padding.
For any element type `T` with a small size (so, any basic type), the size of the pointers and other overhead dominates. At a load factor above 0.5 (for example), `std::unordered_set` may indeed use less memory than the equivalent `std::set`.
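To put rough numbers on that, here is a sketch assuming a typical 64-bit platform (8-byte pointers), `T = int`, the hypothetical node layouts above, and a load factor of 0.5:

```cpp
#include <cstddef>
#include <iostream>

// Same hypothetical layouts as above, repeated so this compiles on its own.
struct RbNodeInt   { RbNodeInt* left; RbNodeInt* right; RbNodeInt* parent; bool red; int value; };
struct HashNodeInt { HashNodeInt* next; int value; };

int main() {
    const double load_factor  = 0.5;  // assumed for this example
    const double bucket_share = sizeof(HashNodeInt*) / load_factor;  // bucket-array bytes per element

    // Typically 32 bytes on a 64-bit platform (3 * 8 + 1 + 4, padded).
    std::cout << "tree node:                " << sizeof(RbNodeInt) << " bytes\n";
    // Typically 16 + 16 = 32 bytes; the hash map pulls ahead as the load factor rises above 0.5.
    std::cout << "hash node + bucket share: " << sizeof(HashNodeInt) + bucket_share << " bytes\n";
}
```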
The other big missing point is that iterating through a `std::set` is guaranteed to produce the elements in order from smallest to largest, based on the given comparison function, while iterating through a `std::unordered_set` returns the values in a “random” (unspecified) order.
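A minimal illustration of the difference (the unordered output is just one possible order; the standard doesn’t specify it):

```cpp
#include <iostream>
#include <set>
#include <unordered_set>

int main() {
    std::set<int>           ordered   {3, 1, 4, 1, 5, 9, 2, 6};
    std::unordered_set<int> unordered {3, 1, 4, 1, 5, 9, 2, 6};

    // Always prints 1 2 3 4 5 6 9: iteration follows the comparator (std::less<int> by default).
    for (int x : ordered) std::cout << x << ' ';
    std::cout << '\n';

    // Same values, but in an unspecified, implementation-dependent order.
    for (int x : unordered) std::cout << x << ' ';
    std::cout << '\n';
}
```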