zxqfl's blog

By zxqfl, 10 years ago, in English

Can anyone recommend resources for preparing for the Distributed Code Jam on June 14th? Some past problems of a similar nature would be really helpful.


»
10 years ago, # |

One problem which appeared in AE last year was to compute the LCS of two strings of up to 10^5 characters.

»
10 years ago, # |

Some simple problems:
Problem 1
Given a sequence A of integers, find i and j such that A(i) + A(i+1) + ... + A(j) is maximum/minimum, where the length of A can be up to 10^10. TL: 1 sec.
Idea
Basically, we divide A into, say, 100 segments of length 10^8 each, process each of them on a different machine, and later combine the information to find the answer (a sketch of the combine step is below).
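
To make the combine step concrete, here is a minimal single-machine sketch (hypothetical helper names, not actual DCJ library code, which has its own message-passing API). Each worker summarizes its segment with four numbers and the master folds the summaries together; the minimum variant and recovering the indices i, j work the same way with a little extra bookkeeping.

```cpp
// Hypothetical sketch: per-segment summary + merge for the distributed maximum subarray sum.
// In the contest each call to summarize() would run on its own machine over its own segment;
// here everything is simulated locally. Assumes segments are non-empty and sums fit in long long.
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

struct Summary {        // what one machine reports about its segment
    ll total;           // sum of the whole segment
    ll bestPrefix;      // maximum sum of a prefix
    ll bestSuffix;      // maximum sum of a suffix
    ll best;            // maximum subarray sum fully inside the segment
};

Summary summarize(const vector<ll>& seg) {
    Summary s{0, LLONG_MIN, LLONG_MIN, LLONG_MIN};
    ll run = 0, pref = 0;
    for (ll x : seg) {
        run = max(run, 0LL) + x;              // Kadane: best sum ending at this element
        s.best = max(s.best, run);
        pref += x;
        s.bestPrefix = max(s.bestPrefix, pref);
    }
    s.total = pref;
    ll suf = 0;
    for (int i = (int)seg.size() - 1; i >= 0; --i) {
        suf += seg[i];
        s.bestSuffix = max(s.bestSuffix, suf);
    }
    return s;
}

// Master node: fold the summaries left to right. The best subarray either lies inside one
// segment, or is a suffix of one segment, some whole segments, and a prefix of a later one.
ll combine(const vector<Summary>& parts) {
    Summary acc = parts[0];
    for (size_t i = 1; i < parts.size(); ++i) {
        const Summary& r = parts[i];
        Summary m;
        m.total      = acc.total + r.total;
        m.bestPrefix = max(acc.bestPrefix, acc.total + r.bestPrefix);
        m.bestSuffix = max(r.bestSuffix, r.total + acc.bestSuffix);
        m.best       = max({acc.best, r.best, acc.bestSuffix + r.bestPrefix});
        acc = m;
    }
    return acc.best;
}
```

The merge is associative, so the master can fold the 100 summaries in any order, and each node only ever reads its own 10^8 elements.
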
Problem 2
Given a number N as large as, say, 10^20, find its number of divisors.
Idea
Divide the set of possible divisor candidates (<= 10^10, i.e. up to sqrt(N)) into 100 groups, and then add up the results from each of them (sketch below).
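
A minimal sketch of that splitting, again simulated on one machine: the loop over node stands in for the 100 workers, each of which would report its count to a master. It assumes N fits in an unsigned 64-bit integer (up to about 1.8*10^19); the full 10^20 of the statement would need 128-bit or big-integer arithmetic.

```cpp
// Hypothetical sketch: split the candidate divisors d <= sqrt(N) into K ranges, one per node.
// Divisors come in pairs (d, N/d), so each node only tests its own range and reports a count.
#include <bits/stdc++.h>
using namespace std;
typedef unsigned long long u64;

// Work of one node: count divisors contributed by candidates d in [lo, hi], with hi <= sqrt(N).
u64 countInRange(u64 N, u64 lo, u64 hi) {
    u64 cnt = 0;
    for (u64 d = lo; d <= hi; ++d)
        if (N % d == 0)
            cnt += (d == N / d) ? 1 : 2;      // d == N/d only when N is a perfect square
    return cnt;
}

u64 countDivisors(u64 N, u64 K) {
    u64 r = (u64)sqrtl((long double)N);       // integer sqrt, corrected for rounding
    while (r > 0 && r > N / r) --r;
    while ((r + 1) <= N / (r + 1)) ++r;

    u64 total = 0, chunk = r / K + 1;
    for (u64 node = 0; node < K; ++node) {    // each iteration = one machine's share
        u64 lo = node * chunk + 1, hi = min(r, lo + chunk - 1);
        if (lo <= hi) total += countInRange(N, lo, hi);
    }
    return total;
}
```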

  • Some other problems can be found on this (Polish) site. The distributed ones are "Maksymalna podtablica" ("Maximal subarray"), "Kółko informatyczne" ("Computer science circle"), and "Sekretarka" ("Secretary").
  • »
    »
    10 years ago, # ^ |

    But problem 2 can be solved on a single computer; there is a probabilistic algorithm.
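
For completeness, a hedged sketch of that single-machine probabilistic approach, assuming the usual recipe of Miller-Rabin plus Pollard's rho and then d(N) = product of (exponent + 1) over the prime factorization; it handles N in the unsigned 64-bit range, so the 10^20 bound would again need wider arithmetic.

```cpp
// Hypothetical sketch of the single-machine probabilistic solution to problem 2:
// factor N with Miller-Rabin + Pollard's rho, then d(N) = product of (exponent + 1).
#include <bits/stdc++.h>
using namespace std;
typedef unsigned long long u64;
typedef __uint128_t u128;

u64 mulmod(u64 a, u64 b, u64 m) { return (u128)a * b % m; }

u64 powmod(u64 a, u64 e, u64 m) {
    u64 r = 1;
    for (a %= m; e; e >>= 1, a = mulmod(a, a, m))
        if (e & 1) r = mulmod(r, a, m);
    return r;
}

// Deterministic Miller-Rabin for 64-bit n (this witness set is known to suffice).
bool isPrime(u64 n) {
    if (n < 2) return false;
    for (u64 p : {2ull, 3ull, 5ull, 7ull, 11ull, 13ull, 17ull, 19ull, 23ull, 29ull, 31ull, 37ull})
        if (n % p == 0) return n == p;
    u64 d = n - 1; int s = 0;
    while (d % 2 == 0) d /= 2, ++s;
    for (u64 a : {2ull, 3ull, 5ull, 7ull, 11ull, 13ull, 17ull, 19ull, 23ull, 29ull, 31ull, 37ull}) {
        u64 x = powmod(a, d, n);
        if (x == 1 || x == n - 1) continue;
        bool witness = true;
        for (int i = 0; i < s - 1 && witness; ++i) {
            x = mulmod(x, x, n);
            if (x == n - 1) witness = false;
        }
        if (witness) return false;
    }
    return true;
}

// Pollard's rho: a nontrivial factor of a composite n. Note the gcd inside the loop,
// which is the per-step gcd whose log factor is being discussed in the replies below.
u64 pollard(u64 n) {
    if (n % 2 == 0) return 2;
    static mt19937_64 rng(12345);
    while (true) {
        u64 x = rng() % n, y = x, c = rng() % (n - 1) + 1, d = 1;
        while (d == 1) {
            x = (mulmod(x, x, n) + c) % n;
            y = (mulmod(y, y, n) + c) % n;
            y = (mulmod(y, y, n) + c) % n;
            d = __gcd(x > y ? x - y : y - x, n);
        }
        if (d != n) return d;
    }
}

void factorize(u64 n, map<u64, int>& f) {
    if (n == 1) return;
    if (isPrime(n)) { ++f[n]; return; }
    u64 d = pollard(n);
    factorize(d, f);
    factorize(n / d, f);
}

u64 countDivisors(u64 n) {
    map<u64, int> f;
    factorize(n, f);
    u64 res = 1;
    for (auto& pe : f) res *= pe.second + 1;
    return res;
}
```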

    • »
      »
      »
      10 years ago, # ^ |

      By the way, I always wonder why people don't multiply the complexity by the log(n) factor that comes from computing a gcd at every step, let alone multiply it once more when we work with big numbers and every operation takes not constant but logarithmic time.

      • »
        »
        »
        »
        10 years ago, # ^ |

        Because on average the gcd will not really take log steps, and with sensible limits big-number arithmetic is just a constant factor.

        • »
          »
          »
          »
          »
          10 years ago, # ^ |

          "Because on average gcd will not really be log"

          How do you prove it? If I understood you right, you are saying that there is an upper bound on the average number of steps that is better than log(n).

          • »
            »
            »
            »
            »
            »
            10 years ago, # ^ |

            It's indeed O(log(n)) steps on average. As for Pollard's rho algorithm, textbooks probably mention only the expected number of iterations to avoid analyzing the complexity of the different possible implementations of the arithmetic operations.
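
For reference, my own summary of the standard bounds on the number of division steps T(u, v) of Euclid's algorithm (not taken from the linked article): the worst case is Lamé's bound, attained at consecutive Fibonacci numbers, and the average case is the Heilbronn/Porter result. Both are Θ(log n), so dropping the factor is a modelling choice (word-sized operands make each step O(1)), not an asymptotic saving:

```latex
% Worst case (Lame), attained at consecutive Fibonacci numbers:
\[
  \max_{1 \le v \le u \le n} T(u, v) \;\sim\; \log_{\varphi} n \;\approx\; 2.078\,\ln n,
  \qquad \varphi = \frac{1 + \sqrt{5}}{2}.
\]
% Average case (Heilbronn; the constant term involves Porter's constant):
\[
  \frac{1}{n} \sum_{v = 1}^{n} T(n, v) \;=\; \frac{12 \ln 2}{\pi^{2}}\,\ln n + O(1)
  \;\approx\; 0.843\,\ln n.
\]
```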

            • »
              »
              »
              »
              »
              »
              »
              10 years ago, # ^ |

              Thank you for the link!

              Yeah, your point explains dropping the second log(n), but after reading the wiki article I still have no idea why they don't mention the first one in the expected complexity.

    • »
      »
      »
      10 years ago, # ^ |
      • Yeah, that means it is very unlikely to appear in the contest, and the same goes for problems that have such an algorithm. I just mentioned it because it is a simple example that conveys the idea of this contest, that is, speeding up by using multiple instances.
      • So if that does not count as problem 2, here is another one.
        Problem
        Given a sequence of N (<= 10^10) integers, determine whether it has a majority element.
        Idea
        Divide the sequence into, say, K segments, process each of them individually to find candidates for the majority element, and then verify whether one of them really is the majority element (see the sketch below).
        So the complexity reduces to O(N / K) per machine.
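
A minimal single-machine simulation of that two-round scheme (hypothetical helper names, not actual DCJ code): round 1 runs Boyer-Moore voting on each segment and reports one candidate per machine; round 2 counts the reported candidates everywhere. If a value is a majority of the whole sequence then, by pigeonhole, it is a majority of at least one segment, so it is guaranteed to appear among the candidates.

```cpp
// Hypothetical sketch: distributed majority element via per-segment Boyer-Moore candidates.
// Each "segment" stands for one machine's share of the data; needs C++17 for std::optional.
#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

// Round 1 (one node): Boyer-Moore majority vote over this node's segment.
ll segmentCandidate(const vector<ll>& seg) {
    ll cand = seg[0], cnt = 0;
    for (ll x : seg) {
        if (cnt == 0) { cand = x; cnt = 1; }
        else if (x == cand) ++cnt;
        else --cnt;
    }
    return cand;
}

// Master: collect at most K candidates, then (round 2) count each of them over all segments.
optional<ll> distributedMajority(const vector<vector<ll>>& segments) {
    vector<ll> cands;
    for (auto& seg : segments)
        if (!seg.empty()) cands.push_back(segmentCandidate(seg));
    sort(cands.begin(), cands.end());
    cands.erase(unique(cands.begin(), cands.end()), cands.end());  // avoid double counting

    ll n = 0;
    map<ll, ll> total;
    for (auto& seg : segments) {                 // round 2: again one loop body per node
        n += (ll)seg.size();
        for (ll c : cands)
            total[c] += count(seg.begin(), seg.end(), c);
    }
    for (auto& [c, cnt] : total)
        if (2 * cnt > n) return c;               // strictly more than half of all elements
    return nullopt;
}

int main() {
    vector<vector<ll>> segs = {{1, 2, 1, 1}, {3, 1, 1}, {1, 2, 1}};
    auto m = distributedMajority(segs);
    cout << (m ? to_string(*m) : "no majority") << endl;  // prints 1 (7 of 10 elements)
}
```

Each round sends only O(K) numbers to the master, and each node touches only its own O(N / K) elements, which is where the O(N / K) above comes from.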