Hello everyone!
JOI Open Contest 2019 will be held on July 14 (04:00-09:00 UTC) and July 14 (10:00-15:00 UTC). As the tasks in both rounds are the same, you can participate in whichever round you like.
The contest duration is 5 hours and there will be 3 tasks. Each submitted source program must be written in C++ only. Problem statements will be provided in both Japanese and English.
You can find the details on the contest page.
The past contests are also available on this page. Since the judge server isn't available after a contest ends, please download the test cases when you want to test your code.
Good luck and have fun!
UPD: The third round will be held from 15:30 UTC.
UPD2: Reviews, sample sources and test data have been uploaded to the contest page.
Isn't the second window overlapping with AGC? I believe it would be a better idea to postpone it by a couple of hours: that way the two options would cover the timezones better, and the time clash would be fixed as well.
We decided to hold another round. It will be held from 15:30 UTC. But it is late at night in Japan, so I think clarifications won't be answered.
So on the contest page it says
Each submitted source program must be written in C++ (C++14). Allowed extensions for source codes are: C++: .cpp You cannot use Pascal nor Java.
but in this blog you say: "Each submitted source program must be written in either C++ or Java". So is Java supported or not?
It's my mistake. The information on the official website is correct.
Auto comment: topic has been updated by yutaka1999 (previous revision, new revision, compare).
Can somebody please explain how to solve the second task for 27 points? Also for 100, if possible.
It can be seen that you will need to use one of the top $$$O(\log n)$$$ elements in the triplet.
So you should try each of the $$$\log n$$$ biggest elements as part of the triplet?
Exactly.
Hmm, thanks, but do you know how to prove it, or what the intuition behind it is?
If you pick any $$$O(\log n)$$$ numbers, you can always get a valid triple from them. Thus the maximum element of the optimal triple is at least the $$$O(\log n)$$$-th largest value.
ohh ok thanks.
So doesn't that mean that we can use the same idea for 100 points? Store in each node of the segment tree its elements in sorted order, then pick the $$$\log n$$$ biggest elements from each node covering the interval and try to form a triplet from these elements. It's about $$$O(n \log^2 n)$$$.
But only one element out of the three is guaranteed to be in the top $$$\log n$$$. The others might be smaller.
Yes you are right, got it.
It was possible to get 55 points on problem Remittance by using Gaussian elimination. Isn't it explicitly excluded from the IOI Syllabus?
How?
Let $$$c_i$$$ be the number of operations the $$$i$$$-th person performs, each paying $$$2$$$ coins so that the next person receives $$$1$$$. Then it must hold that $$$a_i - 2c_i + c_{i - 1} = b_i$$$. You can notice that if an answer exists, it is unique. So you can find a solution and then simulate to check whether it actually works, because a person may not have enough money to pay the required amount found in the solution.
Yes, but we can see that money cannot travel more than $$$O(\log n)$$$ steps. I think that subtask was intended for terrible implementations of this algorithm. Wouldn't we get massive precision errors using Gaussian elimination?
No. We only want integer solutions, and if one exists you'll get it nicely. Any non-integer value means there is no solution.
How to solve problem A?
I had one observation: one of the triple's elements is at least as big as the $$$O(\log n)$$$-th largest element on the segment $$$[L, R]$$$. My guess is that we can use data structures if we fix one element of the triple. I also guess that two elements of the best triple come from the top 100 elements.
There are $$$O(n)$$$ pairs $$$(a,b)$$$ which can be candidates for the answer (the same $$$O(n)$$$ pairs for any segment).
That is because for each $$$a < k < b$$$ you should have $$$arr_k \leq arr_a$$$ and $$$arr_k \leq arr_b$$$.
So you can find these pairs with a stack.
After that, for each query $$$(l,r)$$$ you need to find a pair $$$(a_i, b_i)$$$ and an index $$$j$$$ with
$$$l \leq a_i \leq 2b_i - a_i \leq j \leq r$$$
maximizing $$$arr_{a_i} + arr_{b_i} + arr_{j}$$$.
Let's answer these queries offline: move $$$l$$$ from the right, maintaining the largest $$$arr_{a_i} + arr_{b_i}$$$ for each fixed position $$$2b_i - a_i$$$; let's call this value $$$f_{2b_i - a_i}$$$.
To answer the query at the fixed $$$l$$$, you should find $$$l \leq i \leq j \leq r$$$ with the maximum possible $$$f_i + arr_j$$$.
You can do it with a segment tree, maintaining the maximum possible $$$arr_i$$$, $$$f_i$$$, and $$$f_i + arr_j$$$ among all $$$tree_l \leq i \leq j \leq tree_r$$$.
I think this problem is very good. Unfortunately, I didn't solve it during the contest because I was trying to improve the idea of $$$\log$$$ maximums.
Cheers to gamegame and tmwilliamlin168 for showing me this brilliant idea of $$$O(n)$$$ pairs $$$(a,b)$$$.
Was virus super tricky Kosaraju's on a grid?
Well, I solved it with some creepy BFS where you walk around from the fixed cell, and as soon as you can tell that you are not one of the minimal cells, you stop.
Lots of constant optimizations and $$$100$$$ points in the end.
You can do it in $$$O(RC\log (RC)).$$$
Maintain a list of components in the graph, each of which has a special cell which is infected regardless of which cell in the component is infected initially. Initially, each nonzero cell is a special cell of its own component.
Now, start a BFS from each special cell $$$A$$$ until the BFS ends or reaches a cell $$$B$$$ outside of its component. In the former case, update the answer with the number of vertices visited and mark $$$A$$$ as "dead" (represented by -1 in my solution). If $$$B$$$'s component is dead, mark $$$A$$$'s component as dead. Otherwise, merge the component of $$$A$$$ with the component of $$$B$$$. The special vertex of the union of the two components is simply the special cell of the newly visited component.
Doing the BFS from each special cell takes $$$O(RC)$$$ time. After we process all the merges, the number of components which are not dead is reduced by a factor of at least two. This can repeat at most $$$\log_2(RC)$$$ times (similar to Boruvka's algorithm).
Code
It doesn't seem to pass test 34. Here's my code: code. Is it because I use Kosaraju's algorithm, or am I missing something necessary to get the correct complexity? UPD: Oh, I'm not using "special nodes" in components :(
Consider the following example. Suppose that you have nodes $$$A_1,A_2,\ldots,A_k,$$$ each with an edge to another node $$$B.$$$ After one iteration of merging, you find one additional edge from $$$B$$$ to $$$A_k$$$ and no others. In this case, the number of SCC's would be reduced by only one (certainly not by a factor of 2).
In my solution, I would merge all of these into one component with special node $$$B$$$.
(By the way, I've made a few adjustments to the above post.)
Is there a reason you chose to use Kosaraju instead of Tarjan? Also, can Tarjan be used in this case? Thanks! :)
They are basically the same. Look at this page to learn more.
What's the intended solution of problem Remittance?
Here's my code. I just thought that traversing the array in two directions would give a correct answer.
My solution was probably a massive overkill. Suppose there are $$$c_{i}$$$ coins in house $$$i$$$ (I shall use zero-based indexing). Consider the sum $$$I = \sum_{i = 0}^{n - 1} 2^{i} c_{i}$$$. This sum does not change when you send coins from house $$$i$$$ with $$$i < n - 1$$$, because $$$2^{i}(c_{i} - 2) + 2^{i + 1}(c_{i + 1} + 1) = 2^{i} c_{i} + 2^{i + 1} c_{i + 1}$$$. Also, when we send coins from house $$$n - 1$$$ to house $$$0$$$, $$$I$$$ decreases by exactly $$$2^{n} - 1$$$. So if we send coins from house $$$n - 1$$$ to house $$$0$$$ $$$k$$$ times, we shall have $$$\sum_{i = 0}^{n - 1} 2^{i} a_{i} - \sum_{i = 0}^{n - 1} 2^{i} b_{i} = k(2^{n} - 1)$$$. From this we can uniquely find the value of $$$k$$$.
We are almost done: first, send $$$k$$$ coins to house $$$0$$$ from the last houses greedily. Now we can solve the problem for a non-cyclic array, which can be done with a simple loop.
Auto comment: topic has been updated by yutaka1999 (previous revision, new revision, compare).
ojuz, will it be added to oj.uz ?