waipoli's blog

By waipoli, history, 10 months ago, In English

Hello Codeforces!

Recently, in a problem, I came across this data structure, which can perform these operations:

  • add an element

  • delete an element

  • return the maximum element

with O(1) per query.

So the question is simple: does it exist?


»
10 months ago, # |

Auto comment: topic has been updated by waipoli (previous revision, new revision, compare).

»
10 months ago, # |

You can use a heap, but it is O(log n) per operation. I think that is the best you can do for your needs.
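
For reference, a minimal sketch of that approach in C++ (the wrapper and its method names are mine, not from the comment): std::multiset gives O(log n) insert and erase, and the maximum can be read in O(1) from the last element.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Illustrative wrapper (names are hypothetical, not from the comment):
// add/remove cost O(log n), getMax is O(1).
struct MaxStructure {
    multiset<int> s;
    void add(int x) { s.insert(x); }
    void remove(int x) {                  // deletes one occurrence of x
        auto it = s.find(x);
        if (it != s.end()) s.erase(it);
    }
    int getMax() const { return *s.rbegin(); }  // assumes non-empty
};

int main() {
    MaxStructure ms;
    ms.add(3); ms.add(7); ms.add(5);
    cout << ms.getMax() << '\n';  // 7
    ms.remove(7);
    cout << ms.getMax() << '\n';  // 5
}
```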

»
10 months ago, # |

No. Suppose there exists some data structure that performs the given operations in O(1). Insert N elements into it.

Then repeatedly store the current maximum (say, in an array) and delete it, N times in total. The resulting array contains the elements in sorted order, and the whole process takes only O(N) time, which is not possible.

(It can be proved that the worst-case time complexity of any comparison-based sort cannot be less than N log N.)
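
To spell the reduction out, here is a compilable sketch (my own illustration; the MaxDS stand-in uses a multiset, so its operations really cost O(log n)). The point is that if each call were O(1) as hypothesised, the two loops would sort N numbers in O(N) total, contradicting the lower bound above.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Stand-in for the hypothetical structure. Here the operations cost
// O(log n); the argument assumes a version where each one is O(1).
struct MaxDS {
    multiset<int> s;
    void add(int x) { s.insert(x); }
    int getMax() const { return *s.rbegin(); }
    void removeMax() { s.erase(prev(s.end())); }
};

// If add/getMax/removeMax were all O(1), this would sort in O(N),
// which contradicts the Omega(N log N) comparison-sorting lower bound.
vector<int> sortViaMaxDS(const vector<int>& a) {
    MaxDS ds;
    for (int x : a) ds.add(x);              // N inserts
    vector<int> res;
    for (size_t i = 0; i < a.size(); ++i) { // N extract-max steps
        res.push_back(ds.getMax());
        ds.removeMax();
    }
    reverse(res.begin(), res.end());        // ascending order
    return res;
}

int main() {
    for (int x : sortViaMaxDS({5, 1, 4, 2, 3})) cout << x << ' ';
    cout << '\n';  // 1 2 3 4 5
}
```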

»
10 months ago, # |

You can write a Fibonacci heap, which can add an element and get the min/max in $$$O(1)$$$. But it can only delete in $$$O(\log n)$$$ amortized time.
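
If you don't want to implement such a heap yourself, GCC's pb_ds library ships a pairing heap with a similar profile (O(1) push, O(1) top, O(log n) amortized pop). A small sketch, assuming a GCC toolchain; this is an illustration of a comparable structure, not the commenter's suggestion:

```cpp
#include <bits/stdc++.h>
#include <ext/pb_ds/priority_queue.hpp>
using namespace std;

// Pairing heap from GCC's pb_ds: push and top are O(1),
// pop is O(log n) amortized. With std::less it is a max-heap.
using max_heap = __gnu_pbds::priority_queue<int, less<int>,
                                            __gnu_pbds::pairing_heap_tag>;

int main() {
    max_heap h;
    auto it = h.push(3);      // push returns an iterator usable for erase/modify
    h.push(7);
    h.push(5);
    cout << h.top() << '\n';  // 7
    h.erase(it);              // delete an arbitrary element via its iterator
    h.pop();                  // delete the maximum (7)
    cout << h.top() << '\n';  // 5
}
```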

»
4 months ago, # |

You can use a soft heap. It can add an element or delete the minimum (and some other operations) in O(1), but it cannot delete arbitrary elements. Deletion can be simulated lazily: keep the number of pending deletions of each value in a hash map, and when x comes up as the minimum, check that map. If the count stored for x is 0, then x really is the minimum (for a maximum, multiply all numbers by -1); otherwise decrease its count, discard it, and repeat the query.

The only issue is that elements can be corrupted: you choose some parameter E, operations then take O(log(1/E)) time, and at most E*n out of the n elements are corrupted. It is also even more complicated than a Fibonacci heap.
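
Here is a sketch of the lazy-deletion bookkeeping described above. An ordinary std::priority_queue stands in for the soft heap (so the bounds here are the usual O(log n), not the soft-heap ones), but the pending-deletion logic is the same; the class and method names are mine.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Lazy deletion on top of a heap-like structure: deletions are only
// recorded in `pending`; the real pop happens when the deleted element
// surfaces as the current maximum.
struct LazyMaxHeap {
    priority_queue<int> heap;          // max-heap standing in for the soft heap
    unordered_map<int, int> pending;   // value -> number of pending deletions

    void add(int x) { heap.push(x); }

    void remove(int x) { ++pending[x]; }  // O(1): just remember the deletion

    int getMax() {
        // Skip maxima that were already deleted lazily.
        while (!heap.empty()) {
            int x = heap.top();
            auto it = pending.find(x);
            if (it == pending.end() || it->second == 0) return x;
            --it->second;
            heap.pop();
        }
        throw runtime_error("empty");
    }
};

int main() {
    LazyMaxHeap h;
    h.add(4); h.add(9); h.add(6);
    h.remove(9);                 // O(1), deferred
    cout << h.getMax() << '\n';  // 6 (the 9 is skipped when it surfaces)
}
```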