fiire_ice's blog

By fiire_ice, history, 3 months ago, In English

This is a LeetCode problem from a recent contest: https://leetcode.com/problems/minimum-number-of-seconds-to-make-mountain-height-zero/

For some reason, this solution gives a TLE: https://ideone.com/00tSfJ.

I have an alternative solution that passes; my question is why the above solution gives TLE.

This solution should not be giving TLE because:

- the while loop runs at most ~56 iterations (since 2^56 > 1e8 * 1e8 = 1e16)

- the for loop runs up to 10^4 iterations

- and each map.upper_bound() call takes at most ~17 steps (since 2^17 > 10^5 >= the number of items in the map, so log2 of the map size is below 17)

So overall that is about 56 * 10^4 * 17 ≈ 9.52 * 10^6 operations, but I still get TLE. Please help me understand this better.

PS: I know the above is a simplification, since complexity describes order of growth with respect to input size, but I still don't see why this solution should not pass.
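
For reference, here is a minimal sketch of the structure I'm describing. It is not the code behind the ideone link; the table layout, variable names, and search bounds are just illustrative assumptions. It only shows the pieces being counted above: the binary-search while loop, the per-worker for loop, and the map.upper_bound() lookup.

```
#include <bits/stdc++.h>
using namespace std;

class Solution {
public:
    long long minNumberOfSeconds(int mountainHeight, vector<int>& workerTimes) {
        // Precompute the cumulative cost 1 + 2 + ... + k for k = 1..mountainHeight,
        // keyed by cost, so upper_bound answers "how many units fit in this budget".
        map<long long, long long> costToUnits;
        long long sum = 0;
        for (long long k = 1; k <= mountainHeight; ++k) {
            sum += k;
            costToUnits[sum] = k;
        }

        // Largest k with w * (1 + 2 + ... + k) <= t, via one map lookup (~log2(1e5) steps).
        auto removable = [&](long long t, long long w) -> long long {
            auto it = costToUnits.upper_bound(t / w);
            return it == costToUnits.begin() ? 0 : prev(it)->second;
        };

        // Binary search on the answer; the range is below 1e16, so ~54-56 iterations.
        long long lo = 1, hi = (long long)1e16, ans = hi;
        while (lo <= hi) {
            long long mid = lo + (hi - lo) / 2;
            long long total = 0;
            for (int w : workerTimes) {              // up to 1e4 workers
                total += removable(mid, w);
                if (total >= mountainHeight) break;
            }
            if (total >= mountainHeight) { ans = mid; hi = mid - 1; }
            else lo = mid + 1;
        }
        return ans;
    }
};
```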


» 3 months ago, # |

Why would you use a map to store it? You can use binary search to find it in O(log n) with a lower constant factor, or you can find it in O(1) with some math.
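
For example, the O(1) version just solves w * k * (k + 1) / 2 <= t for k with a square root. A rough sketch; the function name removableO1 and the rounding fix-ups are only illustrative:

```
#include <cmath>

// Largest k such that w * (k * (k + 1) / 2) <= t, in O(1).
// Solving k^2 + k - 2 * (t / w) <= 0 gives k ~ (sqrt(1 + 8 * (t / w)) - 1) / 2;
// the two adjustment loops fix any floating-point rounding error.
long long removableO1(long long t, long long w) {
    long long budget = t / w;  // how many "cost units" this worker can afford
    long long k = (long long)((std::sqrt((long double)(8 * budget + 1)) - 1) / 2);
    while (k > 0 && k * (k + 1) / 2 > budget) --k;
    while ((k + 1) * (k + 2) / 2 <= budget) ++k;
    return k;
}
```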

  • » 3 months ago, # ^ |

    I did find it in O(1); my question is why this solution fails when it still falls within the limit on the total number of calculations LeetCode allows.

    • » 3 months ago, # ^ |

      I think one of the following happened:

      a) There's a part of your code that causes undefined behavior, making the program crash or hang (which shows up as TLE)

      b) The constant factor of the map is simply too large
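
      As one concrete way (a) turns into TLE rather than a runtime error: signed overflow is undefined behavior, and an overflowed intermediate can keep a loop condition from ever becoming false. A toy illustration, not taken from your code:

      ```
      #include <iostream>

      int main() {
          const long long target = 3000000000LL;  // ~3e9, larger than INT_MAX
          int sum = 0;                             // BUG: should be long long
          // 'sum' overflows int around k = 65536 (1 + 2 + ... + k exceeds 2^31 - 1),
          // which is undefined behavior; and since an int can never reach 3e9 anyway,
          // 'sum < target' stays true and the loop spins forever. On a judge that
          // shows up as TLE instead of a wrong answer.
          for (long long k = 1, guard = 0; sum < target; ++k) {
              sum += (int)k;
              if (++guard > 1000000) {             // safety cap so this demo terminates
                  std::cout << "after 1e6 iterations sum = " << sum << " (wrapped)\n";
                  break;
              }
          }
          return 0;
      }
      ```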

      • » 3 months ago, # ^ | Rev. 2

        a. If you are referring to the part where I decrement the iterator returned by upper_bound(), i.e. it--, that iterator is always valid to decrement given the constraints of the problem input.

        b. I would think the high constant factor of a map comes into play when a huge number of inserts and deletes are performed on it (memory has to be allocated/deallocated and the tree rebalanced). But here the map operation being done a large number of times is upper_bound, which shouldn't carry a high constant factor.

        If the constant factor is the reason here, which operation in the solution is causing it? Please help me understand.

        • » 3 months ago, # ^ |

          It's because map itself has a huge constant factor: no matter what operation you do on it, it will be slow, since every lookup has to walk a node-based tree through scattered memory. Operations on it can easily be 3-4 times (or more) slower than the equivalent work on a contiguous container.
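
          You can see the gap with a quick, unscientific micro-benchmark like the sketch below (the sizes mirror the numbers from the post; results will vary by compiler and machine). It does the same upper_bound lookups on a std::map and on a sorted std::vector, and the map side is typically several times slower.

          ```
          #include <bits/stdc++.h>
          using namespace std;

          int main() {
              const int N = 100000;          // about the number of keys in the map
              const int Q = 10000 * 56;      // about the number of upper_bound calls described

              map<long long, long long> m;
              vector<long long> keys(N);
              for (int i = 0; i < N; ++i) {
                  keys[i] = 1LL * (i + 1) * (i + 2) / 2;   // same triangular keys
                  m[keys[i]] = i + 1;
              }

              mt19937_64 rng(12345);
              vector<long long> queries(Q);
              for (auto& q : queries) q = (long long)(rng() % (unsigned long long)keys.back());

              long long sink = 0;  // prevents the compiler from optimizing the loops away

              auto t0 = chrono::steady_clock::now();
              for (long long q : queries) {
                  auto it = m.upper_bound(q);
                  if (it != m.begin()) sink += prev(it)->second;
              }
              auto t1 = chrono::steady_clock::now();
              for (long long q : queries) {
                  auto it = upper_bound(keys.begin(), keys.end(), q);
                  if (it != keys.begin()) sink += it - keys.begin();
              }
              auto t2 = chrono::steady_clock::now();

              cout << "map::upper_bound  " << chrono::duration<double, milli>(t1 - t0).count() << " ms\n";
              cout << "std::upper_bound  " << chrono::duration<double, milli>(t2 - t1).count() << " ms\n";
              cout << "(checksum " << sink << ")\n";
          }
          ```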

          • » 3 months ago, # ^ |

            Thanks. One last question: is there some way to find/estimate this constant factor, or is that too implementation-specific?

            • » 3 months ago, # ^ |

              I mostly do it by trial and error; if you do enough problems, you'll be able to estimate the effective running time of a given complexity and whether it fits in the limit.

        • » 3 months ago, # ^ |

          Maybe you should check whether your code causes a segmentation fault when it runs on large test cases.
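
          One way to do that locally is to run the solution on the largest input the constraints allow and watch both the running time and any crash. A rough harness sketch; the signature and the constraint maxima below are what I recall for this problem, and the Solution stub is only a placeholder to paste the real submission over:

          ```
          #include <bits/stdc++.h>
          using namespace std;

          // Stub with the assumed LeetCode signature; replace it with the actual submission.
          class Solution {
          public:
              long long minNumberOfSeconds(int mountainHeight, vector<int>& workerTimes) {
                  return 0;  // placeholder
              }
          };

          int main() {
              // Largest input I recall being allowed: tallest mountain, most (and slowest) workers.
              int mountainHeight = 100000;
              vector<int> workerTimes(10000, 1000000);

              auto start = chrono::steady_clock::now();
              long long ans = Solution().minNumberOfSeconds(mountainHeight, workerTimes);
              auto end = chrono::steady_clock::now();

              cout << "answer: " << ans << "\n";
              cout << "time:   " << chrono::duration<double>(end - start).count() << " s\n";
              // Compiling with -fsanitize=address,undefined also surfaces the kind of
              // out-of-bounds access or overflow that a judge may report as TLE.
          }
          ```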