I'm learning number theory and I cannot figure out the solution to the following problem:
Given an array of integers A with N elements, under the constraints N <= 10^5 and A[i] <= 10^12. Define GCD(L, R) as the greatest common divisor of the elements A[L], A[L + 1], ..., A[R]. Return the sum of GCD(L, R) over all contiguous subarrays of A, i.e. over all pairs (L, R) with L <= R.
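For concreteness, here is the obvious O(N^2) brute force as I understand the statement (far too slow for N = 10^5, but it pins down exactly what is being summed; the function and variable names are mine):

```python
from math import gcd

def brute_force_sum(A):
    """Sum of GCD(L, R) over all contiguous subarrays of A, in O(N^2) gcd calls."""
    total = 0
    for L in range(len(A)):
        g = 0  # gcd(0, x) == x, so g becomes A[L] on the first step
        for R in range(L, len(A)):
            g = gcd(g, A[R])  # g is now the GCD of A[L..R]
            total += g
    return total

print(brute_force_sum([2, 4, 6]))  # 2 + 2 + 2 + 4 + 2 + 6 = 18
```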
I did have one idea: for a fixed index i, the value of GCD(i, j) is non-increasing in j, since GCD(i, j) >= GCD(i, j + 1). Also, because GCD(i, j + 1) must be equal to GCD(i, j) or a proper divisor of it, each change at least halves the value, so the number X of distinct values among GCD(i, i), GCD(i, i + 1), ..., GCD(i, N) satisfies 2^(X - 1) <= A[i], i.e. X <= log2(A[i]) + 1.
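As a quick sanity check on that bound, this sketch lists, for each fixed i, the distinct values taken by GCD(i, j) as j grows (the helper name distinct_gcds_from is mine); each printed list has at most log2(A[i]) + 1 entries:

```python
from math import gcd

def distinct_gcds_from(A, i):
    """Distinct values of GCD(i, j) for j = i, ..., N - 1 (0-based)."""
    values = []
    g = 0
    for j in range(i, len(A)):
        g = gcd(g, A[j])
        if not values or values[-1] != g:
            values.append(g)  # each new value is a proper divisor, so at most half the previous one
    return values

A = [12, 8, 6, 3, 7]
for i in range(len(A)):
    print(i, distinct_gcds_from(A, i))
# e.g. i = 0 gives [12, 4, 2, 1]: X = 4 distinct values, and 2^(4 - 1) = 8 <= 12
```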
But I cannot go any further. Could you please give me some more insight?