First of all, sorry for my poor English.
I know there's a blog that discussed this problem before: https://codeforces.net/blog/entry/52860. However, I don't really understand the solution there.
Well, here's the problem statement: You are given an array $$$a$$$ with $$$n$$$ elements. Your task is to divide the array $$$a$$$ into $$$k$$$ contiguous subarrays so that the difference between the largest and the smallest value among all subarrays is minimized (the value of a subarray is the sum of all elements in that subarray). Constraints: $$$1 \leq k \leq n \leq 250$$$, $$$0 \leq a[i] \leq 10^6\ \forall i \in [1, n]$$$.
For example: $$$n = 6$$$, $$$k = 3$$$, and an array $$$a = \{7, 4, 6, 1, 2, 10\}$$$.
The optimal way is to split it into $$$[7, 4][6, 1, 2][10]$$$: the values of the subarrays are $$$11\ (= 7 + 4)$$$, $$$9\ (= 6 + 1 + 2)$$$ and $$$10\ (= 10)$$$. Here the largest value is $$$11$$$ and the smallest is $$$9$$$, so the answer is $$$11 - 9 = 2$$$.
Actually, I did attempt to code this problem. If you want to see it, I've put it below.
My algorithm uses dynamic programming, where $$$dp[i][j]$$$ denotes the answer for the segment from $$$1$$$ to $$$i$$$ divided into $$$j$$$ subarrays. The final answer would then be $$$dp[n][k]$$$, and the time complexity would be $$$O(n^2 \times k)$$$. However, I don't know whether this is a correct solution that handles all cases of the problem, so I'd like you to point out any errors in my code and give me some hints if it is wrong. Thanks a lot!
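In case it helps with checking, here is a small brute force (just an illustration, not my actual code) that tries every possible split. It is exponential, but fine for small $$$n$$$, so it can be used to stress-test the DP:

```cpp
#include <bits/stdc++.h>
using namespace std;

long long best;

// Cut the suffix a[pos..n) into partsLeft consecutive subarrays,
// tracking the minimum and maximum subarray sum chosen so far.
void rec(const vector<long long>& a, int pos, int partsLeft,
         long long mn, long long mx) {
    int n = a.size();
    if (partsLeft == 0) {
        if (pos == n) best = min(best, mx - mn);
        return;
    }
    long long sum = 0;
    for (int end = pos; end < n; end++) {  // the next subarray is a[pos..end]
        sum += a[end];
        rec(a, end + 1, partsLeft - 1, min(mn, sum), max(mx, sum));
    }
}

int main() {
    vector<long long> a = {7, 4, 6, 1, 2, 10};
    best = LLONG_MAX;
    rec(a, 0, 3, LLONG_MAX, LLONG_MIN);    // k = 3
    cout << best << "\n";                  // prints 2 for the example above
}
```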
Try this test: 10 5 172175 314751 42278 124175 323274 156404 723022 324395 809542 290212
I have a solution that runs in $$$O(\frac{n^3 k \log{n}}{64})$$$ time, 0.6 seconds on a max test locally. The idea is to build a sorted array of all subarray sums and use two pointers to find, for each sum that we choose as the maximum, the highest minimum sum we can achieve.

To check whether the array can be split into $$$k$$$ subarrays given a minimum and a maximum allowed sum, let $$$dp[i][j]$$$ be $$$1$$$ if the first $$$i$$$ elements can be split into $$$j$$$ subarrays within those bounds, and $$$0$$$ otherwise. For the transitions, two pointers give, for each right end of a subarray, the range of left ends for which the subarray sum lies between the minimum and the maximum allowed sum, and $$$dp[i][j]$$$ is true if $$$dp[x][j-1]$$$ is true for any $$$x$$$ inside that range. This can be handled with a segment tree and bitsets: if we treat each $$$dp[i]$$$ as a bitset of size $$$k$$$, then $$$dp[i]$$$ is simply the bitwise OR of all $$$dp[x]$$$ in the range, shifted by $$$1$$$.
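A simplified sketch of the idea (not the full solution: for clarity it uses the plain $$$O(n^2 k / 64)$$$ bitset transition for each feasibility check instead of the two-pointers-plus-segment-tree one, so it is asymptotically slower than the bound above, but the outer two-pointer scan over sorted sums is the same):

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXN = 255;
int n, k;
vector<long long> pre;  // prefix sums: pre[i] = a[0] + ... + a[i-1]

// Can a[0..n) be split into exactly k consecutive subarrays, each with
// sum in [lo, hi]?  dp[i] is a bitset whose bit j means "the first i
// elements can be split into j valid subarrays".
bool feasible(long long lo, long long hi) {
    vector<bitset<MAXN>> dp(n + 1);
    dp[0][0] = 1;
    for (int i = 1; i <= n; i++)
        for (int x = 0; x < i; x++) {
            long long s = pre[i] - pre[x];       // sum of a[x..i)
            if (lo <= s && s <= hi) dp[i] |= dp[x] << 1;
        }
    return dp[n][k];
}

int main() {
    vector<long long> a = {7, 4, 6, 1, 2, 10};   // the example from the post
    n = a.size(); k = 3;
    pre.assign(n + 1, 0);
    for (int i = 0; i < n; i++) pre[i + 1] = pre[i] + a[i];

    // All O(n^2) subarray sums, sorted and deduplicated.
    vector<long long> sums;
    for (int r = 1; r <= n; r++)
        for (int l = 0; l < r; l++) sums.push_back(pre[r] - pre[l]);
    sort(sums.begin(), sums.end());
    sums.erase(unique(sums.begin(), sums.end()), sums.end());

    // Two pointers: as the chosen maximum sum grows, the highest
    // feasible minimum sum never decreases.
    long long best = LLONG_MAX;
    for (int i = 0, j = 0; i < (int)sums.size(); i++) {
        while (j + 1 < (int)sums.size() && sums[j + 1] <= sums[i] &&
               feasible(sums[j + 1], sums[i]))
            j++;
        if (feasible(sums[j], sums[i]))
            best = min(best, sums[i] - sums[j]);
    }
    cout << best << "\n";                        // prints 2 for the example
}
```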
Thank you very much
Couldn't you maintain a monotonic deque for the bitsets in the valid range, and remove the $$$\log n$$$ factor?
I'm not sure how exactly a monotonic deque would work here, but simulating a queue with 2 stacks does remove the log factor.
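Roughly, that trick looks like this when specialized to OR over bitsets (a sketch with illustrative names; the window bookkeeping of the DP is assumed). Each element moves from the back stack to the front stack at most once, so push, pop and query are all amortized $$$O(k / 64)$$$, with no log factor:

```cpp
#include <bits/stdc++.h>
using namespace std;

// OR-aggregated queue built from two stacks: query() returns the OR of
// everything currently in the queue; all operations amortized O(K / 64).
template <size_t K>
struct OrQueue {
    vector<bitset<K>> frontAgg;  // suffix ORs; the back entry covers the queue front
    vector<bitset<K>> backRaw;   // raw values in push order
    bitset<K> backAgg;           // OR of everything in backRaw

    void push(const bitset<K>& b) {
        backRaw.push_back(b);
        backAgg |= b;
    }
    void pop() {  // remove the queue front; assumes the queue is non-empty
        if (frontAgg.empty()) {  // refill: move the back stack over, building suffix ORs
            bitset<K> agg;
            for (int i = (int)backRaw.size() - 1; i >= 0; i--) {
                agg |= backRaw[i];
                frontAgg.push_back(agg);  // the oldest element ends up on top
            }
            backRaw.clear();
            backAgg.reset();
        }
        frontAgg.pop_back();
    }
    bitset<K> query() const {
        return (frontAgg.empty() ? bitset<K>() : frontAgg.back()) | backAgg;
    }
};

int main() {
    OrQueue<8> q;
    q.push(bitset<8>(string("00000001")));
    q.push(bitset<8>(string("00000110")));
    cout << q.query() << "\n";  // 00000111
    q.pop();                    // drops the first pushed bitset
    cout << q.query() << "\n";  // 00000110
}
```

In the DP above, the valid range of left ends $$$x$$$ only moves right as $$$i$$$ grows, so you push $$$dp[x]$$$ when $$$x$$$ enters the window, pop when it leaves, and take $$$dp[i]$$$ as the query result shifted by $$$1$$$.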
Yeah that is what I meant by monotonic queue.
I just realized that I wrote deque in the original comment lol
I don't know how to solve this, so I'll try to fix your English instead lol

"I am not really understand" -> "I don't really understand"
"take an example" -> "for example"
"and the array" -> "and an array"

I might not be able to point out any errors in your code or hint at how to solve this problem, but I hope I can help you improve your English.

Hope this helps!