TCO19 Algorithm Round 3A and Parallel Rated Round is scheduled to start at 12:00 UTC -4 on July 6, 2019. Registration is open for the Round in the Arena or Applet and it closes 5 minutes before the match begins, so make sure that you are all ready to go.
Click here to see what time it starts in your area.
Algorithm Round 3 Qualified Competitors (https://tco19.topcoder.com/algorithm/round-2a and https://tco19.topcoder.com/algorithm/round-2b) are eligible to compete in Round 3A. Those who have already qualified for Round 4 (https://tco19.topcoder.com/algorithm/round-4) from online stages are not eligible to compete.
All contestants who end up with a positive score in either Round 3A or Round 3B will be awarded a TCO19 T-shirt. The T-shirt will be shipped after the TCO19 Finals, to be held in November.
All the best! See you in the arena!
The link to timeanddate shows the incorrect time!
Correct Time
Gentle Reminder: The match begins in ~1 hour and 30 minutes.
Any hint for 250?
Fix the number of tables that have parents, then distribute the parents among those tables, and then distribute the children among them.
Meh, n^3 doesn't deserve to pass in the hard, but I expect submitted O(n^3) solutions to pass :/
Hint for 500?
You can express probability that no bus arrives at time smaller than $$$t$$$ as a polynomial, then probability that bus $$$i$$$ arrives at time $$$[t, t+\mathrm{d}t]$$$ as a polynomial and integrate, that's basically the idea. The coefficients of these polynomials will explode, though. You need to keep them multiplied by something suitable, binomial coefficients or something.
Don't listen to Xellos because you won't make these coefficients manageable. Read what Lewin wrote below.
Ha! I did it!
I start with the same thing mnbvmar wrote below: multiply $$$1-\frac{f_{min}}{f_i}+\frac{1}{f_i}x$$$ to get a polynomial with positive coefficients. However, I'm doing it for all $$$i$$$, not for "all $$$i$$$ except". Let's call this polynomial $$$P(x) = \sum p_j x^j$$$. I'll just focus on getting the probability for one bus $$$i$$$, which can be expressed as
for $$$a = f_{min} / (f_i - f_{min})$$$. Of course, we need to handle the case $$$f_i = f_{min}$$$ separately, but that's just $$$\int_0^1 P(x)/x = \sum_{j=0}^N p_j/(j+1)$$$.
Let's informally compute the integral
The last sum is just the Taylor series of $$$\log$$$ without the first $$$j$$$ terms, so
This last expression will be useful for sufficiently large $$$a$$$. If $$$a^N \ge \epsilon$$$, then the integral is at least $$$1/(1+j)(1 + \log \epsilon)$$$ and all terms are at most $$$\mathrm{max}(\log 2 / \epsilon^2, 1)$$$ in absolute value, so we can compute powers $$$(-a)^{-k}$$$, compute all coefficients and add them up normally without any risk of catastrophic cancellations. We can pick $$$\epsilon$$$ at around 1e-5, for example.
What if $$$a^N \lt \epsilon$$$? In that case, we don't need the integral at all, since $$$a < 1$$$ and all series are absolutely convergent. In fact, they converge pretty quickly, since $$$a^k$$$ for $$$k=5N$$$ is smaller than $$$\epsilon^5$$$, which is 1e-25 for the choice I used above. We can also find a recursive formula
which again doesn't blow up or get so small that there would be significant precision loss, and if we can compute $$$s_N$$$, then all other numbers $$$s_j$$$ can be found in $$$O(N)$$$. We can exploit the fact that high terms don't matter in the resulting sum and compute $$$s_N$$$ by starting with $$$s_{5N} = 0$$$ and applying the recursive formula. That's all for the algorithm.
Time complexity: In general, if we assume that $$$a^\alpha \approx 0$$$, this takes $$$O(N\alpha)$$$ time. We can assume that just like $$$\epsilon$$$, $$$a^{\alpha}$$$ is comparable to the precision of our reals, just on the other side: $$$a^\alpha \approx \epsilon^{\alpha/N} = \epsilon^\beta$$$, where $$$\beta$$$ is $$$O(1)$$$, in our case $$$\beta = 5$$$. That doesn't depend on $$$N$$$ or what reals we use — for example, if doubles have precision ~1e-14, then slightly less than sqrt of that is $$$\epsilon$$$ and something like the square of that is $$$\approx 0$$$ just to be sure, so $$$\beta \approx 4$$$. Then, the time complexity is $$$O(N^2\beta) = O(N^2)$$$.
500: Why don't they use modular arithmetic instead of real numbers? Do they have another solution and want to kill the obvious solution with the integral?
Didn't you mean to ask the question the other way around? If so, then I think it is reasonable not to let solutions pass that have a minus sign anywhere, if we can do this without catastrophic cancellation.
I think they could still pass, but with much more effort. Figuring out how to avoid your coefficients blowing up is also a good exercise.
Some solutions are just doomed from the start and I don't know what you could do to rescue them if intermediate computations need to have precision of thousands of digits after decimal point.
Are you saying there exists a solution with integration that avoids the coefficients blowing up? Can you give a brief explanation of how to avoid the blow-up?
Let's scale the f[i]'s so that $$$\min_i f[i] = 1$$$. Now, the probability that $$$x[k]$$$ is the smallest is $$$\frac{1}{f[k]} \int_0^1 \prod_{i \neq k} \left(1 - \frac{t}{f[i]}\right) \mathrm{dt}$$$ ($$$t$$$ is the value of $$$x[k]$$$ here).
Substitute $$$s := 1 - t$$$ to get $$$\frac{1}{f[k]} \int_0^1 \prod_{i \neq k} \left[ \left(1 - \frac{1}{f[i]}\right) + \frac{s}{f[i]}\right] \mathrm{ds}$$$. Each linear function in the product has both its coefficients non-negative, so the product can be computed without any significant precision loss and each resulting coefficient is non-negative. Each coefficient must be fairly small as well (or else the resulting probability would be larger than $$$1$$$).
One can compute the products $$$\prod_{i \neq k} \left[ \left(1 - \frac{1}{f[i]}\right) + \frac{s}{f[i]}\right]$$$ for each $$$k$$$ in $$$O(n^2 \log n)$$$ time in the same way Swistakk mentioned before.
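To make this concrete, here is a minimal numerical sketch of the above (the function name is mine, and I do the naive O(n^2)-per-bus polynomial product, i.e. O(n^3) total; the divide-and-conquer trick Swistakk mentioned would bring the total down to O(n^2 log n)):

```cpp
#include <vector>
#include <algorithm>
using namespace std;

// For each bus k, the probability that it is the first to arrive.
// f[i] = interval of bus i; bus i arrives uniformly in [0, f[i]].
// After the substitution s = 1 - t, every linear factor has
// non-negative coefficients, so the product is numerically stable.
vector<double> firstProbabilities(vector<double> f) {
    int n = f.size();
    double fmin = *min_element(f.begin(), f.end());
    for (double &x : f) x /= fmin;          // scale so that min f[i] = 1
    vector<double> ans(n);
    for (int k = 0; k < n; k++) {
        vector<double> poly = {1.0};        // product of (1 - 1/f[i]) + s/f[i]
        for (int i = 0; i < n; i++) {
            if (i == k) continue;
            vector<double> nxt(poly.size() + 1, 0.0);
            for (size_t j = 0; j < poly.size(); j++) {
                nxt[j]     += poly[j] * (1.0 - 1.0 / f[i]);  // constant term
                nxt[j + 1] += poly[j] * (1.0 / f[i]);        // s term
            }
            poly = nxt;
        }
        double integral = 0;                // integrate over s in [0, 1]
        for (size_t j = 0; j < poly.size(); j++)
            integral += poly[j] / (j + 1);
        ans[k] = integral / f[k];
    }
    return ans;
}
```

All intermediate coefficients are non-negative, so no catastrophic cancellation occurs anywhere.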
Have you implemented this? I tried doing exactly that (I think) but only got zeroes for the max test.
Yup. It's still kinda tricky to avoid any catastrophic cancellations, but it can be done.
Hi, I was the author of the problems. Hope you enjoyed them.
Here are some solution sketches and code
First, let's only seat parents, and also let's fix the number of empty tables. This can be computed by some dp (or you can do something with stirling numbers). Each child has (empty+1) different tables they could choose to sit at, so you just multiply this by (empty+1)^(sum a).
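In case it helps, here is the Stirling-number building block mentioned above: S(n, k) counts the ways to partition n labeled parents into k non-empty unlabeled tables. The modulus is an assumption on my part, since the thread doesn't state the problem's modulus:

```cpp
#include <vector>
using namespace std;

// Stirling numbers of the second kind, modulo an assumed prime.
// S[n][k] = number of ways to partition n labeled parents into
// k non-empty unlabeled tables.
const long long MOD = 1'000'000'007LL;   // assumed modulus

vector<vector<long long>> stirling2(int n) {
    vector<vector<long long>> S(n + 1, vector<long long>(n + 1, 0));
    S[0][0] = 1;
    for (int i = 1; i <= n; i++)
        for (int k = 1; k <= i; k++)
            // a new parent either joins one of the k existing tables
            // or starts a new one
            S[i][k] = (k * S[i - 1][k] + S[i - 1][k - 1]) % MOD;
    return S;
}
```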
Find the bus with the smallest interval. We know that some bus will arrive within f[mn] minutes. Each other bus has probability f[mn]/f[i] of arriving within f[mn] minutes. If k buses appear within f[mn] minutes, each of those buses is equally likely to appear first. Thus, this can be reduced to the following problem: you have some items with value v_i and probability p_i of appearing. What is the expected value of their average?
This can be solved with dp. Let dp[i][j] be the expected value of the average given that exactly j items appear among the first i items. You can see the code for more details on how to maintain this dp.
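Since the reference code isn't pasted here, a sketch of one way to maintain such a dp (this is my own formulation, not necessarily Lewin's exact one: I track g[i][j] = probability that exactly j of the first i items appear, and h[i][j] = probability-weighted sum of the values of the appearing items over those outcomes):

```cpp
#include <vector>
using namespace std;

// Expected value of the average of the items that appear.
// Item i appears independently with probability p[i] and has value v[i].
// Outcomes where zero items appear contribute nothing (in the bus
// problem, one item is guaranteed to appear).
double expectedAverage(const vector<double>& v, const vector<double>& p) {
    int n = v.size();
    vector<vector<double>> g(n + 1, vector<double>(n + 1, 0.0));
    vector<vector<double>> h(n + 1, vector<double>(n + 1, 0.0));
    g[0][0] = 1.0;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j <= i; j++) {
            // item i does not appear: count stays at j
            g[i + 1][j] += g[i][j] * (1 - p[i]);
            h[i + 1][j] += h[i][j] * (1 - p[i]);
            // item i appears: count rises to j + 1, sum gains v[i]
            g[i + 1][j + 1] += g[i][j] * p[i];
            h[i + 1][j + 1] += (h[i][j] + v[i] * g[i][j]) * p[i];
        }
    }
    double ans = 0;
    for (int j = 1; j <= n; j++) ans += h[n][j] / j;  // divide by count
    return ans;
}
```

For example, with values {1, 3} and probabilities {1, 0.5}, the average is 1 half the time and (1+3)/2 = 2 the other half, so the expectation is 1.5.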
I forgot n^3 is pretty fast when n <= 1000. Some of them might pass. Next time, I'll remember to set the bounds to be bigger.
First, fix two lines. This splits the plane into four regions. Then, the other lines can be split into four different classes, based on which region each one doesn't touch. The number of ways is then the sum of 2^(number of lines in each class).
This gives an n^3 solution. To do it faster, you can use bitsets. Alternatively, there is an n^2 log n solution as follows. First sort the lines by angle. Then fix one line. Sweep over the other lines in order of their intersection point with the fixed line. When processing the i-th of these lines, you can compute how many lines fall into each class with a binary indexed tree.
Each problem separately was nice (except for a small limit in hard) but together they created a bad problemset IMO. Similar types/categories. I'm not complaining though because I like dp and geometry.
I don't think easy and med were of similar type and they were definitely different from hard, so in my opinion each problem separately was nice and together they created a good problemset 8).
Your solution to 500 is better than what I will talk about, but for the sake of completeness I will mention that for every bus we can explicitly determine probability that it will be first one if we will use technique "knapsack of all elements but one" in $$$O(n^2 \log n)$$$.
Can you elaborate on this "knapsack of all elements but one" technique?
Suppose we have some elements $$$a_1, \ldots, a_n$$$ and we have some data structure D that supports insertions of elements in time T and answers some queries about inserted elements. For every $$$i$$$ at some point in time we want to have a data structure with all elements except for $$$a_i$$$ added.
In this problem elements $$$a_1, \ldots, a_n$$$ are buses and data structure D is an array telling us "for every k, what is the probability that exactly k out of considered buses will come before time $$$t_0$$$?" (where $$$t_0$$$ is the shortest period of a bus) and $$$T = O(n)$$$.
We are actually able to do that in $$$O(nT \log n)$$$ time, whereas the naive algorithm would take $$$O(n^2 T)$$$ time. We design a recursive function $$$Rec(L, R, D)$$$ which takes some interval $$$[L, R]$$$ as an argument and a data structure $$$D$$$ such that all buses with indices outside of that interval are already added. Then we create two copies of $$$D$$$, namely $$$D_L, D_R$$$, set $$$m = (L + R) / 2$$$, add elements with indices from $$$[L, m]$$$ to $$$D_R$$$ and elements from $$$[m+1, R]$$$ to $$$D_L$$$, and call $$$Rec(L, m, D_L)$$$ and $$$Rec(m+1, R, D_R)$$$. If $$$L = R$$$ then we are in a leaf of the recursion, where we have a data structure omitting the $$$L$$$-th element. Of course, we call $$$Rec(0, n-1, \emptyset)$$$ at the beginning.
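A minimal sketch of this recursion, using a trivial "data structure" (a running product, so insertion is multiplication, copying is assignment, and out[i] ends up being the product of all elements except a[i]); any structure supporting insert and copy works the same way:

```cpp
#include <vector>
#include <functional>
using namespace std;

// "Knapsack of all elements but one": for every i, obtain a copy of a
// structure with every element except a[i] inserted, using
// O(n log n) insertions in total instead of O(n^2).
vector<long long> allButOne(const vector<long long>& a) {
    int n = a.size();
    vector<long long> out(n);
    // rec(L, R, D): D already contains all elements outside [L, R]
    function<void(int, int, long long)> rec = [&](int L, int R, long long D) {
        if (L == R) { out[L] = D; return; }  // leaf: D omits exactly a[L]
        int m = (L + R) / 2;
        long long DL = D, DR = D;            // copy the structure
        for (int i = L; i <= m; i++) DR *= a[i];      // add [L, m] to D_R
        for (int i = m + 1; i <= R; i++) DL *= a[i];  // add [m+1, R] to D_L
        rec(L, m, DL);
        rec(m + 1, R, DR);
    };
    rec(0, n - 1, 1);
    return out;
}
```

In the 500, the structure would instead be the array "probability that exactly k of the inserted buses come before $$$t_0$$$", with insertion being the usual knapsack-style update.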
Thanks! Makes sense.
I ran into an issue with the Java Applet Arena. My first submission (at 222 pts) went through, but did not show up in the room results. When I tried to resubmit, there was no warning box (10 percent penalty); it accepted the submission right away.
Not really making any difference since I did not qualify, but just to note a bug in the Applet.
@organizers, can you take a look at my div1hard? I have times under 1.5s when testing on random big tests and, when submitting in practice, on all displayed tests — except for sometimes test 88 and sometimes test 89 (plus tests after ~100 aren't displayed). It doesn't look deterministic. I see 1.35s now on test 88 and it was TLE (above 3s) in the previous run. I think the issue might be the test length, because it's over 19,000 characters. I tried both `double` and `long double` now and the behavior is similar, FYI. I heard people had similar problems in the past with unreliable systesting on TC. Can that be the case here? #appeal
EDIT: There are more tests like that, e.g. test 90. Also, I added a special `if` for test 88 (`if(a[0] == 5779) return 557320504;`) at the beginning of my code and I got 999ms a moment ago for that test...

That sure sounds like something is broken. I'll ping the people who are responsible for the backend to take a look at what's going on.
I'd like to join the ranks of appealers. I copied the test case 60 that TLE'd my 500, and around 50% of the time, it finishes in 1.3s.
Same here. I tried my code of 500 in practice room, sometimes it got TLE on #51, sometimes it passed in 1.1s :O
Same is happening for me!
Apologies for the issue! We are manually reviewing the solutions that failed system tests for TLE and will update the results accordingly.
Also, we are looking into the issue and will update asap.
Any news on this? For me, this decides between advancing and waking up at 3am on Friday.
Hey! Yeah, we will try to have the leaderboard updated today EOD :) Sorry for taking so long.
The following members have qualified for Round 4 from Round 3A. I will update the website, and you will also receive emails about it later in the day:
So... did nothing change? My rating definitely didn't and that's a good hash of final results. Or what happened with that bug?
Sorry, I was travelling and had to paste the results with limited connectivity.
We are still exploring the issue. Seems to be only with C++.
In the meantime, we are again manually reviewing the submissions and will email/add people to the list of Round 4 qualifiers. We don't want anyone to miss out on Round 4 because of the issue.
I am unsure whether we will be able to update the results in the near term, until we have a fix for the issue, but we will have the bonus qualified members informed as soon as possible.
So... I was testing my 1000 on Arena today on some max tests, and on each invocation Arena reported either 0.6s or 2.3s execution time. It was totally random, and both results appeared when I ran the same test multiple times. WTF happened?!
(Disclaimer: my solution is totally deterministic. I don't think there were any UBs even though it eventually failed systests. I tested my solution locally on the same tests and no discrepancies occurred.)
This seems to be the same issue Errichto described above. We'll have someone look into this.
can somebody from topcoder check this link? https://apps.topcoder.com/wiki/display/tc/Algorithm+Problem+Set+Analysis
It's throwing an error with response code 502.
hmehta said he would export all the old editorials to a PDF and put it on the website asap. Though it's been five days since then.