forked from boostorg/unordered
Clean up wording on Iterator Invalidation to recommend using reserve()
== Iterator Invalidation

It is not specified how member functions other than `rehash` and `reserve` affect
the bucket count, although `insert` is only allowed to invalidate iterators
when the insertion causes the load factor to be greater than or equal to the
maximum load factor. For most implementations this means that `insert` will only
change the number of buckets when this happens. While iterators can be
invalidated by calls to `insert`, `rehash` and `reserve`, pointers and references to the
container's elements are never invalidated.

In a similar manner to using `reserve` for ``vector``s, it can be a good idea
to call `reserve` before inserting a large number of elements. This will get
the expensive rehashing out of the way and let you store iterators, safe in
the knowledge that they won't be invalidated. If you are inserting `n`
elements into container `x`, you could first call:

```
x.reserve(n);
```

Note:: `reserve(n)` reserves space for at least `n` elements, allocating enough buckets
so as to not exceed the maximum load factor.
+
Because the maximum load factor is defined as the number of elements divided by the total
number of available buckets, this function is logically equivalent to:
+
```
x.rehash(std::ceil(n / x.max_load_factor()))
```
+
See the <<unordered_map_rehash,reference for more details>> on the `rehash` function.