This code takes around 1300 ms. (The size of the set should never exceed 1 at any time, since I am inserting the same element every time.)
But after commenting out the s.erase() call, it took just 15 ms.
Can anyone please explain the reason behind this? Thanks in advance :)
My guess is that the compiler sees the set isn't actually used for anything, so it's optimized out.
Can we somehow avoid these optimizations? This was the only moment in my life when I thought of using the keyword volatile. But after I did that, the code refused to compile, and I don't understand why. I'd be glad if someone explained what is happening. The compiler actually reports errors on the erase and insert calls, so I suspect these member functions simply aren't implemented for the volatile case. That tempts me to think maybe there is no optimization at all?
I think it has to do with memory allocation (e.g., malloc(..)), which is quite an expensive operation. Without erasing, only the first insertion causes an allocation (an RB-tree node and all), and subsequent insertions only require reads (if the operations are not optimized away). With erasing, on the other hand, every insertion invokes a memory allocation. And 7 million times is a lot!