demoralizer and I were discussing our 2+ year old code, and I thought it'd be good motivation for newbies and pupils who currently write cancer code.
Please contribute by sharing code (in a spoiler tag) or links to your big brain moments from back when you were low rated.
My biggest big brain moment was when I assumed the inbuilt std::sort must be slower, so I copied merge sort from GFG. https://www.codechef.com/viewsolution/23270721
This is where this discussion began:
Ultimate toga moment for all programmers.
Bravo Napoleon.
What about Gaussian elimination by hand?
I really love panic in your code. You were too dumb to even spell eight correctly back then XD
I had the opposite problem: I was trying to solve a puzzle where you needed to fill in around 30 cells with either 0 or 1, so I wrote 30 nested for loops.
assumed inbuilt std::sort must be slower
At least you had knowledge of the STL. I used to write gcd and sort functions on my own, and instead of pair I was using a 2D array. Don't underestimate me, I still write cancer solutions to simple problems. (I remember spamming maximum bipartite matching on a Div. 2 B in-contest.)
I once used Backtracking + CircularLinkedList + Sets and Randomization to solve a div2 A, felt empty despite the AC, but at that moment, not solving A would've been harder to digest. submission
As someone already pointed out, all of us write cancer solutions from time to time.
For example, this problem from the ICPC WF Invitational mirror had a trivial $$$O(n^2)$$$ solution, but I didn't read the constraints and used lex and yacc to generate this monstrosity of a solution that works in $$$O(n)$$$ instead.
Edit: since someone said the submission isn't visible, here you go.
Scrolled the mouse wheel 28 times to reach the bottom of your solution. [on ideone]
E P I C :)
I remember one time I solved a rather tough problem for my level but ended up not implementing it because I thought the time complexity of computing nCr is O(n) and not O(r).
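(For anyone wondering why it's O(r): a minimal sketch of the multiplicative formula, assuming 64-bit arithmetic is enough; the function name is just for illustration.)

#include <cstdint>

// C(n, r) via the multiplicative formula; after the i-th step the running
// value equals C(n, i), so every division below is exact.
std::uint64_t binom(std::uint64_t n, std::uint64_t r) {
    if (r > n) return 0;
    if (r > n - r) r = n - r;            // C(n, r) == C(n, n - r)
    std::uint64_t res = 1;
    for (std::uint64_t i = 1; i <= r; ++i)
        res = res * (n - i + 1) / i;     // O(min(r, n - r)) steps, not O(n)
    return res;                          // may overflow for large answers
}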
I thought writing a $$$O(n^2)$$$ in cpp instead of java would get me an Accepted solution.
In some special cases you can actually do it, as there are some optimization tricks available in C++.
Not quite sure why I am wrong here. It's not all cases, but rather some special cases.
Such as:
I thought that binary search was so slow in special cases, so I used a random middle pivot to boost it in those cases :)
I thought inserting #pragmas in my code would magically make my code fast, so I should not bother about time complexities anymore!
I thought maps are magical. Just insert elements into a map, and voila, you have sorted the input in O(n). I remember telling this to my friend and then proceeding to have a clown moment for the rest of the day.
Well, if you consider a frequency array as a map...
O(n+maxN) is O(n)
Yet another big brain moment xD
I thought most of the $$$O(n^2)$$$ solutions would pass if I use correct pragma :(
Knowing generic_placeholder_name, that is most likely the case.
I thought printing the sample test cases' output would give me Accepted.
I used to think set operators like | (union) and & (intersection) in Python have O(1) time complexity.
I used to learn the Z-function by heart without understanding how it works at all before I attended offline competitions. I was afraid of tasks about strings and thought that somehow the Z-function might save me. It did not work out even once)
I used to think both 1 and 2 were equally efficient.
I still think they are of the same time complexity :( Aren't they?
Let $$$n$$$ be the size of $$$a$$$ and $$$m$$$ the size of $$$b$$$, then the first one is $$$O(n + m)$$$ and the other is $$$O(m)$$$.
This is because it returns a new string by doing a + b. a += b, on the other hand, just appends b to a; it's like push back in vector. As a side note, push back also works there: a.push_back(b).
It's not strictly $$$O(m)$$$, but it's $$$O(m)$$$ on average and $$$O(n+m)$$$ at max. That's also important to know, though I don't have a sample of code where it matters.
Are you talking about the case when a reaches its maximum capacity by doing a += b?
I am
Thanks for clarification
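Roughly what the difference discussed above looks like in code, a small C++ sketch (the loop sizes are my own illustration, not from the comments):

#include <string>

int main() {
    std::string a, b;

    // a = a + "x" builds a brand-new string each time, copying everything
    // already in a: the whole loop is O(n^2).
    for (int i = 0; i < 100000; ++i) a = a + "x";

    // b += "x" appends in place, amortized O(1) per character (like vector
    // push_back), with an occasional reallocation: O(n) total.
    for (int i = 0; i < 100000; ++i) b += "x";
}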
The naming was adequate though
Wrote a solution with exponential complexity where $$$n = 2e5$$$, got TLE, searched Google: "How to speed up program in C++", then added ios_base::sync_with_stdio(false); cin.tie(NULL);, and spent the rest of the contest wondering why I still got TLE...
Will let you know once I become high rated :).
When I got +500 in my first contest on Codeforces (a Div. 3 round), I thought I would become red after a few more contests.
Anyone else once thought that vector.insert works in $$$O(1)$$$?
me thinking that $$$O(n^2)$$$ is faster than $$$O(n \log n)$$$
For those cases in which I was getting TLE or WA on GFG, I would just copy those cases, put an if condition for them, and print the result in O(1).
I solved nearly 10-12 problems in a similar manner on InterviewBit and GFG :):
I had some weird obsession with macros.. warning: disturbing content below
in case the image doesn't show up, click here
long long :D
I used to think that std::lower_bound in a set was $$$\mathcal{O}(\log N)$$$ and surprisingly solved many problems, until the day when I TLE'd
Didn't you mean std::lower_bound on set?
Yes, thanks for pointing it out.
Umm.. Noob here! Isn't the lower_bound operation on a set or sorted vector O(log n)??
For sorted vectors it's logarithmic, but for sets it's linear.
std::lower_bound on set
Are you sure? I still think that it's log n, and I have used it many times and didn't get TLE.
As mentioned here, std::set doesn't have random access iterators, so the time complexity is linear.
If you have a set S, then "S.lower_bound(number)" is O(log N), but "lower_bound(S.begin(), S.end(), number)" is O(N). One is a member function, the other is a generic function in the "algorithm" header. Make sure not to mix these up!
True
Lesson: use std::set::lower_bound instead of std::lower_bound on set
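The two calls side by side, a minimal sketch (variable names are mine):

#include <algorithm>
#include <set>

int main() {
    std::set<int> s = {1, 3, 5, 7};

    // Member function: descends the balanced tree, O(log N).
    auto fast = s.lower_bound(4);

    // Free function from <algorithm>: logarithmic only with random access
    // iterators; set iterators are bidirectional, so this degrades to O(N).
    auto slow = std::lower_bound(s.begin(), s.end(), 4);

    (void)fast;
    (void)slow;
}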
Another edition of "I still write cancer":
Switch-case is something that I will not use in 100 years. If-else it is. So on my latest test at the university, we had a trivial problem with dates (day, month, year) and some operations on them. So my code has like a million of:
Not high rated, but in my first contest, which was a Google Kickstart, I didn't know that godly things like std::sort exist, and as I had just completed the MIT course on algorithms (because for some reason I thought I'd clear IOI by watching it), heap sort was the fastest thing I knew, so I copied the heap sort code from GFG (that too in Java cuz aman dhotarchor said Java is better) and ended up getting an RTE which I spent a whole hour figuring out.
Second one: my first contest on CF was by coincidence an April Fools round and I ended up solving three problems, so I thought I was cyan level from the start.
I still do this for ternary search. :P
Valid
learning non-basic data structures while being green
also tried reading about splay trees when I was 1300 to flex on my friends with them
When I heard people talking about "node", I thought it was some graph shit. Turned out to be nodeJS.
I used to think rating was directly proportional to how many macros you used.
Wrote an exponential solution to this problem and it passed pretests and FSTed. I thought I solved a Div2 C for the first time :(
I used to believe std::lower_bound and std::upper_bound is all that binary search algorithm is.
Well, I believe that too even today as a noob coder.
I thought $$$\frac{10^6}{2} = 10^3$$$ in a contest and proceeded to spend the whole contest wondering why my code got TLE.
and this has happened more than once :)
I thought the compiler on CF wouldn't be able to tell what the result of the program was, so I quoted the output:
cout << "The answer is : " << ans << endl;
And then you also got WA :)
Me doing binary search as: check the mid element, and if mid < element, then do a linear search from 1 to mid, else a linear search from mid to n.
To get the minimum integer of an array I used *min(vec.begin(), vec.end()). To test this I took a "random" array in which the first element was the minimum of the array :). So it took me a long time to figure out what was wrong.
Fun fact: you can now use std::ranges::min(vec) from C++20 onwards.
I used to think that the STL size() function has a complexity of O(n), and I always stored the size in a variable before writing a loop.
I once thought I had invented an O(n) sorting algorithm. Didn't know the time complexity of map back then.
You can't maintain sorted order in unordered_map :p
And that's what happens when you comment before coffee! Yeah no way that works
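To be fair to the idea: with an ordered std::map it does come out sorted, it's just not O(n), because every insertion is O(log n), i.e. a perfectly ordinary O(n log n) sort. A tiny sketch (my own illustration):

#include <iostream>
#include <map>
#include <vector>

int main() {
    std::vector<int> v = {5, 1, 4, 1, 3};

    std::map<int, int> freq;              // balanced BST: each insert is O(log n)
    for (int x : v) ++freq[x];

    for (auto [value, count] : freq)      // in-order traversal yields sorted keys
        while (count--) std::cout << value << ' ';   // prints: 1 1 3 4 5
    std::cout << '\n';
}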
56589098, I solved this during the contest when I was a newbie; a few days later I realized that this was a "prefix sum" approach.
A fairly long time ago I had no idea functions existed and wrote everything in main().
Without an understanding of how to write functions, when I encountered any problem with graphs, I would try to make graph traversal algorithms using for loops, gotos (yeah...), and bools, all in main(), and it was WILD.
I remember how I would have 5 nested for loops and a million ifs inside and would be thinking "why does it have to be so complicated". Debugging was a total nightmare, and it was more or less like "if I don't fix this within an hour I give up". Unfortunately I can't get any code from that time because it was on an old laptop that doesn't work anymore lol.
You can imagine how it felt to find out about recursion and DFS lmao.
I once had an $$$O(n)$$$ solution to a problem. The complexity was OK, but the problem had $$$10^5$$$ testcases (so the answer had to be computed with a single $$$O(n)$$$ pre-processing). Got a TLE on pretests, so I made a global map which stores the values after printing them in each testcase. Passed pretests and FSTed.
More than 6.5 years ago. Must've been doing competitive programming for 7-8 months or so, I didn't know about digit dp. Was trying to solve https://projecteuler.net/problem=164, so I ended up writing code whose output was the code that solves this problem. https://ideone.com/XgqZDh
Did it work?
Oh yeah, it did :)
That's hilarious XD
I used Pascal
Recently I invented centroid decomposition on array
(I forgot that it's called segment tree)
Back when I didn’t know what the Sieve of Eratosthenes was, I used to make an array with all the prime numbers within a range for any solution that required them. Ex. 104820322
Needless to say, the speed at which I came up with this type of solution was awfully slow.
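For completeness, a minimal Sieve of Eratosthenes sketch that generates the primes instead of hard-coding them (the limit and function name are my own):

#include <vector>

// Cross out multiples of every prime; runs in O(N log log N).
std::vector<int> primes_up_to(int n) {
    std::vector<bool> is_prime(n + 1, true);
    std::vector<int> primes;
    for (int i = 2; i <= n; ++i) {
        if (!is_prime[i]) continue;
        primes.push_back(i);
        for (long long j = 1LL * i * i; j <= n; j += i)
            is_prime[j] = false;
    }
    return primes;
}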
I used to think that we need to learn the whole of C++ along with all the data structures and algorithms in the world before starting competitive programming, as the word "Competitive" seemed too scary. I used to think that only the best coders need to compete in order to decide the best of the best, like in the Olympics.
Thanks to my college seniors for kicking my ass right from the beginning, I didn't end up making that mistake.
Never trusted the function swap(a, b); I always wrote:
void my_swap(int a, int b)
{
}
Used to generate short permutations without using array.
I thought printing the sample cases would put me above those who do not attempt the questions and give me + rating.
I used to be obsessed with doing the XOR trick to swap two integers w/out a dummy variable. I thought it was a galaxy brain thing to do.
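The trick in question, for anyone who hasn't seen it (a sketch; std::swap remains the boring-but-correct choice, since this version silently zeroes both values if the two names alias the same variable):

#include <cassert>

int main() {
    int a = 3, b = 5;
    a ^= b;   // a now holds a ^ b
    b ^= a;   // b becomes the original a
    a ^= b;   // a becomes the original b
    assert(a == 5 && b == 3);
}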
Once, doing a problem during some contest, I was too amateur to understand what "sum of k over test cases" meant. So I had the idea to solve the problem using two nested for loops, one for N and another for K. I thought the complexity was O(N*K) and it would TLE, so I didn't proceed with the solution, only to find out the editorial had the same solution, and I spent like two weeks wondering why it would not TLE.
Later on I understood the complexity was O(N+K), because it was written that the sum of K over all test cases was less than 1e5.
I used to assign whatever phrase came to mind at the time to strings whenever I needed a way to set them to null.
I also thought array indices started from 1.
Enjoy: 26783948
Thought that problems were stupid and C++ was too slow because my O(n^2) solution gave TLE.
Learned Dinic's flow algo while struggling to get out of green!
I remember reading about what union-find does and then thinking "oh, that sounds easy enough to implement".
I then went on to code some $$$O(n^2)$$$ algorithm for the unite and find functions and was confused why it was too slow.
That was the day I learnt about the importance of time complexity.
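For contrast with that $$$O(n^2)$$$ attempt, a minimal union-find sketch with path compression and union by size (naming is mine), where find and unite are nearly $$$O(1)$$$ amortized:

#include <numeric>
#include <utility>
#include <vector>

struct DSU {
    std::vector<int> parent, sz;
    explicit DSU(int n) : parent(n), sz(n, 1) {
        std::iota(parent.begin(), parent.end(), 0);   // every element is its own root
    }
    int find(int x) {
        // Path compression: re-point x directly at its root.
        return parent[x] == x ? x : parent[x] = find(parent[x]);
    }
    bool unite(int a, int b) {
        a = find(a); b = find(b);
        if (a == b) return false;
        if (sz[a] < sz[b]) std::swap(a, b);           // union by size
        parent[b] = a;
        sz[a] += sz[b];
        return true;
    }
};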