atcoder_official's blog

By atcoder_official, history, 4 months ago, In English

We will hold Toyota Programming Contest 2024#8 (AtCoder Beginner Contest 365).

We are looking forward to your participation!

  • Vote: +82

»
4 months ago, # |
Rev. 2   Vote: 0

Dislike it because it's a speed round: it seems that most people (at least my mates) get the first 5 problems, so time has become even more important.

Like it because I'm faster than them.

True, speed is an important factor in CP. Unfortunately, our Chinese participants often train in the OI format (no feedback during the contest, evaluation afterwards), so they may get disappointed again.

»
4 months ago, # |
Rev. 3   Vote: 0

Problem E is way too standard; almost everyone has seen it, or they can easily google it.

Just search for "sum of xor of all subarrays".

»
4 months ago, # |
  Vote: 0

Speedforces (or Speedcoder)

The difficulty spike is insane: E has almost 3000 ACs, but F and G have fewer than 300 ACs each.

»
4 months ago, # |
  Vote: +22

E — Xor Sigma Problem is a strictly easier version of CF1879D: Sum of XOR Functions. In fact, when I solved the harder version 10 days back, I had already prepared a model solution for the easier version (I was planning to write a blog and create an easy version). I was able to get AC by just changing 2 lines in the code for the hard version. Well, lucky me, haha.

Code for ABC365E : XOR Sigma Problem

Code for CF1879D : Sum of XOR Functions
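
For reference, here is a minimal sketch of the standard per-bit prefix-XOR counting that solves ABC365E (my own illustration, not the code linked above; it assumes the usual statement, i.e. the sum of $$$A_i \oplus \dots \oplus A_j$$$ over all pairs $$$i < j$$$):

  #include <bits/stdc++.h>
  using namespace std;

  // With P[0] = 0 and P[j] = A_1 ^ ... ^ A_j, the xor of A_i..A_j equals P[i-1] ^ P[j],
  // so the answer is the sum of P[a] ^ P[b] over all 0 <= a < b <= N, counted bit by bit,
  // minus the length-1 subarrays (which that sum includes but the problem does not).
  int main() {
      int n; cin >> n;
      vector<long long> p(n + 1, 0);
      long long singles = 0;
      for (int i = 1; i <= n; i++) {
          long long a; cin >> a;
          p[i] = p[i - 1] ^ a;
          singles += a;
      }
      long long ans = 0;                          // assumes the answer fits in 64 bits
      for (int bit = 0; bit < 30; bit++) {
          long long ones = 0;
          for (int i = 0; i <= n; i++) ones += (p[i] >> bit) & 1;
          // pairs whose prefix XORs differ in this bit each contribute 2^bit
          ans += (ones * (n + 1 - ones)) << bit;
      }
      cout << ans - singles << '\n';
  }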

If you're a beginner to DP on Bit Manipulation, here are some easy problems.

If you're at an intermediate level, you can try these:

If you're at an advanced level, you can try these:

»
4 months ago, # |
  Vote: 0

What is the idea for G?

»
4 months ago, # |
  Vote: +8

Liked G but wasn't able to figure out F. What's the solution for F?

  • »
    »
    4 months ago, # ^ |
    Rev. 4   Vote: +9

    You can solve it using DSU. Sort all queries by $$$s_x$$$. A query $$$x_1\, y_1\, x_2\, y_2$$$ is equivalent to $$$x_2\, y_2\, x_1\, y_1$$$ because the operations can be reversed, so we can use the same number of operations to go from start to target as from target to start.

    Iterate over $$$x$$$ from $$$1$$$ to $$$n$$$ and represent each query as a point equal to the query's current value of $$$y$$$.

    1. Start considering queries that have $$$s_x = x$$$ and add a point with value $$$s_y$$$.
    2. Move and merge all points that have value $$$\gt U_x$$$ into a single point at $$$U_x$$$.
    3. Move and merge all points that have value $$$\lt L_x$$$ into a single point at $$$L_x$$$.
    4. Answer queries with $$$t_x = x$$$ and discard their points. To answer a query, consider the operations on all the ancestors (representatives) of this query's point. Note that the number of ancestors of an item in the DSU is $$$O(\log Q)$$$ if we always merge the smaller group into the larger one.

    Analysis: each point (query) is added to the set once, removed from it once (after merging), and answered in $$$O(\log Q)$$$, so the overall time complexity is $$$O\left(\left(Q + N\right)\cdot \log Q\right)$$$.
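
    As a reference for the $$$O(\log Q)$$$ ancestor-chain claim, here is a minimal sketch of a union-by-size DSU without path compression (my own illustration, not the author's submission). In the full solution each merge would additionally record the clamp to $$$U_x$$$ or $$$L_x$$$, so that a query can recover its total vertical movement by walking its ancestor chain.

      #include <bits/stdc++.h>
      using namespace std;

      // Union by size without path compression: parent pointers are never rewritten
      // after a merge, so the chain from an element to its current representative has
      // length O(log n), because each merge at least doubles the size of the component
      // that element belongs to.
      struct SmallToLargeDSU {
          vector<int> parent, sz;
          SmallToLargeDSU(int n) : parent(n), sz(n, 1) {
              iota(parent.begin(), parent.end(), 0);
          }
          int find(int v) const {                 // walks the ancestor chain, O(log n)
              while (parent[v] != v) v = parent[v];
              return v;
          }
          void unite(int a, int b) {              // attach the smaller tree under the larger
              a = find(a); b = find(b);
              if (a == b) return;
              if (sz[a] < sz[b]) swap(a, b);
              parent[b] = a;
              sz[a] += sz[b];
          }
      };

      int main() {
          SmallToLargeDSU dsu(4);
          dsu.unite(0, 1);
          dsu.unite(2, 3);
          dsu.unite(0, 2);
          cout << dsu.find(3) << '\n';            // all four elements share one representative
      }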

    Here is my submission.

»
4 months ago, # |
Rev. 2   Vote: +23

Origin & difficulty & data aside, I think F and G are good.

»
4 months ago, # |
  Vote: 0

Has anyone solved D with recursive DP?

»
4 months ago, # |
  Vote: +4

Brute force passed G. Code

  • »
    »
    4 months ago, # ^ |
      Vote: +2

    If I commit suicide and don't come back, G will be the reason.

  • »
    »
    4 months ago, # ^ |
      Vote: +18

    Brute force is supposed to pass if implemented in a certain way: you can prove an upper bound of $$$O(n^{1.5} \log n)$$$, but I don't think that bound applies to this implementation; it most likely passes due to weak test cases.

    • »
      »
      »
      4 months ago, # ^ |
        Vote: 0

      Can you elaborate more on this upper bound?

      • »
        »
        »
        »
        4 months ago, # ^ |
        Rev. 2   Vote: +21

        1) Let's assume you save the answers to queries in a map to handle repeated queries. Now you only need to worry about distinct queries.

        2) For a particular query $$$(A, B)$$$, let $$$d=\min(size_A,size_B)$$$, where $$$size_i$$$ is the number of times person $$$i$$$ entered; then there is a solution in $$$O(d\cdot\log(\max(size_A,size_B)))$$$. Basically, you iterate over the segments in the smaller vector and binary search in the other vector to find the leftmost and rightmost segments that the current segment intersects. Then use prefix sums to find the answer for the current segment.

        Now that you know these two things, what is the time complexity?

        $$$O(\sum \min(size_{A_i},size_{B_i})\cdot\log)$$$ where the sum is across all distinct queries.

        Finally, the worst case is when all the vectors have equal size, since for unequal sizes you can just take the smaller vector. That means the size is $$$M/N$$$ for every person from $$$1$$$ to $$$N$$$.

        Now, for a fixed $$$M=2\cdot 10^5$$$, you need to find the minimum $$$N$$$ such that you can have $$$Q=2\cdot 10^5$$$ distinct queries; this means $$$\binom{N}{2}=Q=2\cdot 10^5$$$, so $$$N$$$ is around $$$Q^{0.5}$$$ or $$$M^{0.5}$$$. In this case each individual query is processed in $$$O(M^{0.5}\cdot\log)$$$ and there are $$$Q$$$ queries, so the final time complexity is $$$O(Q\cdot M^{0.5}\cdot \log)$$$.
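
        Plugging in numbers (a rough estimate of mine, not from the original comment): with $$$M=Q=2\cdot 10^5$$$ this is about $$$2\cdot 10^5 \cdot \sqrt{2\cdot 10^5} \cdot \log_2(2\cdot 10^5) \approx 2\cdot 10^5 \cdot 450 \cdot 18 \approx 1.6\cdot 10^9$$$ elementary operations, which is consistent with the $$$O(n^{1.5}\log n)$$$ bound mentioned above.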

        • »
          »
          »
          »
          »
          4 months ago, # ^ |
            Vote: 0

          Thanks!

        • »
          »
          »
          »
          »
          4 months ago, # ^ |
            Vote: 0

          Can you elaborate on how the prefix sums work to obtain the answer for the current segment?

          • »
            »
            »
            »
            »
            »
            4 months ago, # ^ |
              Vote: 0

            Handle the leftmost and rightmost segments separately; for all the segments in between you only need the sum of their lengths, which a prefix sum gives directly (see the sketch below). My submission should make it clearer.
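
            A minimal sketch of that step (my own illustration with made-up names, not the submission linked above): other is the larger person's sorted, pairwise-disjoint segment list, and pref[i] is the total length of its first i segments.

              #include <bits/stdc++.h>
              using namespace std;

              using Seg = pair<long long, long long>;      // a segment [start, end)

              // Total overlap of one segment [l, r) with a sorted, pairwise-disjoint
              // segment list `other`, where pref[i] is the sum of the lengths of
              // other[0..i-1] (so pref has size other.size() + 1).
              long long overlapWithSorted(long long l, long long r,
                                          const vector<Seg>& other,
                                          const vector<long long>& pref) {
                  auto endsAtOrBefore = [](const Seg& s, long long x) { return s.second <= x; };
                  auto startsBefore = [](const Seg& s, long long x) { return s.first < x; };
                  // lo = first segment ending after l, hi = last segment starting before r
                  int lo = int(lower_bound(other.begin(), other.end(), l, endsAtOrBefore)
                               - other.begin());
                  int hi = int(lower_bound(other.begin(), other.end(), r, startsBefore)
                               - other.begin()) - 1;
                  if (lo > hi) return 0;                   // nothing intersects [l, r)
                  auto part = [&](int i) {                 // overlap with one boundary segment
                      return min(r, other[i].second) - max(l, other[i].first);
                  };
                  if (lo == hi) return part(lo);
                  // segments strictly between lo and hi lie entirely inside [l, r),
                  // so their total contribution is just the sum of their lengths
                  return part(lo) + part(hi) + (pref[hi] - pref[lo + 1]);
              }

              int main() {
                  vector<Seg> other = {{0, 2}, {4, 7}, {9, 10}};
                  vector<long long> pref = {0, 2, 5, 6};   // prefix sums of segment lengths
                  cout << overlapWithSorted(1, 9, other, pref) << '\n';   // 1 + 3 = 4
              }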

  • »
    »
    4 months ago, # ^ |
      Vote: +3

    Thank you all for helping me prove the time complexity. The after-contest will be fun.

»
4 months ago, # |
Rev. 2   Vote: +3

In problem G I came up with an idea similar to the Japanese editorial: compare people with fewer than $$$\sqrt M$$$ segments by brute force (two pointers, for example), and precompute something like prefix sums in $$$O(N \cdot \sqrt M)$$$ on compressed coordinates for people with more than $$$\sqrt M$$$ segments, so that I could calculate the intersection in $$$O(\sqrt M)$$$ for a pair consisting of one person with few segments and one with many. But what should I do when I have to calculate the intersection for two people who both have more than $$$\sqrt M$$$ segments? The number of such pairs would still be $$$O(M)$$$ in the worst case, or am I missing something?

P.S. I miss written English editorials :(

»
4 months ago, # |
  Vote: +3

I think there is a binary lifting solution to F as well, but it seems too many details would be involved.

  • »
    »
    4 months ago, # ^ |
      Vote: 0

    See my reply!

  • »
    »
    4 months ago, # ^ |
      Vote: +3

    One way to do it is to use a sparse table of jumps where, before the end of a jump, you move without changing $$$y$$$, and then, to move to the jump's final column $$$x=e$$$, you first need to move to $$$y = L_e$$$ or $$$y = U_e$$$. This keeps the number of states reasonable, since for the initial column $$$x=s$$$ you likewise only need $$$y = L_s$$$ or $$$y = U_s$$$. The sparse table then allows you to do many of these jumps quickly, and you keep moving forward as long as you don't go beyond $$$t_x$$$.

    Additionally, to precalculate the jumps you need a couple of extra sparse tables, and you need to do one jump from the initial $$$s_x$$$ (but you can reuse the code needed for the precalculation).

    You can check my submission for an implementation.

»
4 months ago, # |
  Vote: 0

Could someone please share a recursive memoized solution for D?

»
4 months ago, # |
  Vote: 0

Can anyone tell me why this DP idea won't work (or is there something wrong with my code)?
I have used dp[i][j] as the maximum number of games that Takahashi can win up to index i, where
j = 0 if the i-th round is drawn, and
j = 1 if the i-th round is won by Takahashi.

  #pragma GCC optimize("-O2")
  #pragma GCC optimize("O3,unroll-loops")  
  #include <bits/stdc++.h>
  #define lli long long int
  #define pii pair<int,int>
  #define mii map<int,int>
  #define vi vector<int>                      //(num<<i) == num* pow(2,i);
  #define pb push_back                        //(num>>i) == num/pow(2,i);
  #define vlli vector<long long int>
  #define nl cout<<'\n'
  #define yy cout << "YES\n" 
  #define nn cout << "NO\n" 
  #define all(a) a.begin(),a.end()
  #define IM  INT_MAX
  #define IMN INT_MIN
  #define pc cout<<count<<'\n'
  #define int long long int
  #define mp make_pair
  #define vpii vector<pair<int,int>>
  #define Yy cout<<"Yes\n"
  #define Nn cout<<"No\n"
  using namespace std;
  signed main(){
     ios_base::sync_with_stdio(false);
      cin.tie(NULL);
     int n;cin>>n;
     string s;cin>>s;
     vector<vector<int>>dp(n,vector<int>(2,0));
     dp[0][1]=1;
     for(int i=1;i<n;i++){
        if(s[i]==s[i-1]){
            dp[i][0]=dp[i-1][1];
            dp[i][1]=dp[i-1][0]+1;
        }
        else{
            if(s[i]=='R'&&s[i-1]=='P'){
                dp[i][0]=max(dp[i-1][0],dp[i-1][1]);
                dp[i][1]=dp[i-1][1]+1;
            }
            if(s[i-1]=='R'&&s[i]=='P'){
                dp[i][1]=max(dp[i-1][0],dp[i-1][1])+1;
                dp[i][0]=dp[i-1][1];
            }
            if(s[i]=='R'&&s[i-1]=='S'){
                dp[i][0]=dp[i-1][0];
                dp[i][1]=dp[i-1][1]+1;
            }
            if(s[i-1]=='R'&&s[i]=='S'){
                dp[i][0]=max(dp[i-1][0],dp[i-1][1]);
                dp[i][1]=dp[i-1][1]+1;
            }
            if(s[i-1]=='P'&&s[i]=='S'){
                dp[i][0]=dp[i-1][0];
                dp[i][1]=max(dp[i-1][0],dp[i-1][1])+1;
            }
            if(s[i-1]=='S'&&s[i]=='P'){
                dp[i][0]=max(dp[i-1][0],dp[i-1][1]);
                dp[i][1]=dp[i-1][1]+1;
            }
        }
        
     }
   
    cout<<max(dp[n-1][0],dp[n-1][1]);
  }
  • »
    »
    4 months ago, # ^ |
      Vote: 0

    The situation differs depending on which move he plays to win: for example, winning by playing 'R' is not the same as winning by playing 'P', because the constraint on consecutive rounds is on the actual move played, not on whether he won or drew.

    And here is my solution, which handles the different situations separately; maybe you'll understand better after reading it.
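
    For reference, a minimal sketch of a DP whose state is Takahashi's last move (my own illustration, not the solution linked above; it assumes the standard statement of D: Takahashi never loses, never repeats his move in consecutive rounds, and wants to maximize his wins):

      #include <bits/stdc++.h>
      using namespace std;

      int main() {
          int n; string s;                        // s = Aoki's moves
          cin >> n >> s;
          auto idx = [](char c) { return c == 'R' ? 0 : c == 'P' ? 1 : 2; };
          const long long NEG = LLONG_MIN / 2;    // marks an unreachable state
          // dp[c] = max wins so far if Takahashi's last move is c (0 = R, 1 = P, 2 = S);
          // move c beats Aoki's move a iff c == (a + 1) % 3 and loses iff c == (a + 2) % 3
          array<long long, 3> dp; dp.fill(NEG);
          int a0 = idx(s[0]);
          for (int c = 0; c < 3; c++)
              if (c != (a0 + 2) % 3) dp[c] = (c == (a0 + 1) % 3);
          for (int i = 1; i < n; i++) {
              int a = idx(s[i]);
              array<long long, 3> nd; nd.fill(NEG);
              for (int c = 0; c < 3; c++) {
                  if (c == (a + 2) % 3) continue;             // this move would lose: forbidden
                  for (int p = 0; p < 3; p++) {
                      if (p == c || dp[p] == NEG) continue;   // must differ from the previous move
                      nd[c] = max(nd[c], dp[p] + (c == (a + 1) % 3));
                  }
              }
              dp = nd;
          }
          cout << max({dp[0], dp[1], dp[2]}) << '\n';
      }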

»
4 months ago, # |
  Vote: 0

Could anyone tell me the time complexity of problem C?

Is there any simpler solution for it?

  • »
    »
    4 months ago, # ^ |
    Rev. 2   Vote: 0

    The time complexity of the intended solution (in the Japanese editorial) is $$$O(N\log\max{A_i})$$$. However, there does exist an $$$O(N\log N)$$$ solution, which can be achieved by sorting the sequence and using a single linear scan to find the answer.
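
    A minimal sketch of that sort-and-scan idea (my own illustration, assuming the usual statement of C: find the largest integer $$$x$$$ with $$$\sum \min(x, A_i) \le M$$$, or report that $$$x$$$ can be arbitrarily large):

      #include <bits/stdc++.h>
      using namespace std;

      int main() {
          long long n, m;
          cin >> n >> m;
          vector<long long> a(n);
          for (auto& v : a) cin >> v;
          sort(a.begin(), a.end());
          if (accumulate(a.begin(), a.end(), 0LL) <= m) {
              cout << "infinite\n";               // every A_i can be paid in full
              return 0;
          }
          // prefix = sum of the k smallest values; if those k people are paid in full
          // and everyone else gets x, the cost is prefix + (n - k) * x, which is an
          // upper bound on the true cost whenever x <= a[k]
          long long ans = 0, prefix = 0;
          for (long long k = 0; k < n; k++) {
              if (prefix <= m)
                  ans = max(ans, min(a[k], (m - prefix) / (n - k)));
              prefix += a[k];
          }
          cout << ans << '\n';
      }

    The sort dominates, so this runs in $$$O(N \log N)$$$, matching the complexity mentioned above.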