Recently I came across an interesting problem:
Given an N x N grid (1 <= N <= 1500) of integers. Define F(i, j) as the maximum total value that can be obtained by moving from (1, 1) to (i, j) using only right and down moves, i.e. from (i, j) you may only move to (i + 1, j) or (i, j + 1). Your task is to calculate the sum of F(i, j) over all 1 <= i <= N and 1 <= j <= N.
Now you are given Q (Q <= 1500) queries, each denoted by a character c and two integers x and y, where c is either '-' or '+'. If c is '-', then a[x][y] is decreased by 1 (a[x][y]--); otherwise a[x][y] is increased by 1 (a[x][y]++). After each query you have to print on a single line the sum of F(i, j) over all 1 <= i <= N and 1 <= j <= N.
I couldn't optimize my O(Q * N * N) algorithm. Does anyone have an idea how to solve this problem?
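For reference, here is a minimal sketch of the brute-force approach I mean: after each update it recomputes every F(i, j) with the DP recurrence F(i, j) = a[i][j] + max(F(i - 1, j), F(i, j - 1)). It assumes F(i, j) is the maximum sum of cell values over all right/down paths from (1, 1) to (i, j), and the input format (N, Q, the grid, then the queries) is only my guess; all names in the code are my own.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n, q;
    if (!(cin >> n >> q)) return 0;

    vector<vector<long long>> a(n + 1, vector<long long>(n + 1));
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            cin >> a[i][j];

    vector<vector<long long>> f(n + 1, vector<long long>(n + 1));

    // Recompute all F(i, j) from scratch and return their total: O(N^2).
    auto recompute = [&]() -> long long {
        long long total = 0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= n; j++) {
                long long best = LLONG_MIN;
                if (i == 1 && j == 1) best = 0;           // start cell has no predecessor
                if (i > 1) best = max(best, f[i - 1][j]); // came from above
                if (j > 1) best = max(best, f[i][j - 1]); // came from the left
                f[i][j] = best + a[i][j];
                total += f[i][j];
            }
        }
        return total;
    };

    for (int t = 0; t < q; t++) {
        char c; int x, y;
        cin >> c >> x >> y;
        a[x][y] += (c == '+') ? 1 : -1;  // apply the point update
        cout << recompute() << '\n';     // O(N^2) per query -- too slow overall
    }
    return 0;
}
```

With N and Q both up to 1500 this is roughly 1500 * 1500 * 1500 operations in the worst case, which is why I am looking for something faster than recomputing the whole DP per query.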