175iq's blog

By 175iq, 5 years ago, In English

I recently found out that using a vector as a key in a map is possible:

map<vector<int>, int> mp;
vector<int> vec = {1, 2, 3, 4, 5};
mp[vec] = 5;
cout << mp[vec]; // prints 5
  1. If there are already N vectors present as keys in mp, what is the time complexity of inserting a new vector of size M into mp? Is it O(M*log(N)), or something else that depends on the sizes of the vectors already present in mp?
  2. What would be the time complexity of a query operation such as mp[vec]? Please help.

» 5 years ago, # |

Both are $$$O(M*lg(N))$$$

» 5 years ago, # |

Any type that has operator< defined can be used as a key in a map. vector has this operator defined (it compares lexicographically), which is why a vector can be used as a map key.

map is implemented as a self-balancing binary search tree (usually a red-black tree). Inserting or searching for a key in a self-balancing tree takes O(log(N) * T(operator<)). For vector, operator< is linear in the size of the vector, since it may have to traverse all of the vector's elements. Hence the total is O(log(N) * M).
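
For illustration, here is a minimal sketch (assuming C++17; the counting comparator is my own, just to make the cost visible): a lookup among N = 1000 vector keys triggers on the order of log2(1000) vector comparisons, and each of those comparisons walks the elements of the keys.

#include <bits/stdc++.h>
using namespace std;

// Comparator that counts how many times the map compares two vector keys.
struct CountingLess {
    static inline long long calls = 0;
    bool operator()(const vector<int>& a, const vector<int>& b) const {
        ++calls;
        return a < b;   // lexicographic: walks elements until they differ
    }
};

int main() {
    map<vector<int>, int, CountingLess> mp;
    for (int i = 0; i < 1000; ++i) {
        vector<int> key = {i, i + 1, i + 2};      // N = 1000 keys of size M = 3
        mp[key] = i;
    }
    CountingLess::calls = 0;
    vector<int> probe = {500, 501, 502};
    cout << mp.at(probe) << " found after " << CountingLess::calls
         << " vector comparisons\n";              // on the order of log2(1000)
}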

» 5 years ago, # |

Can I ask a similar question?

If N is the size of the vector, does inserting the vector into a map have complexity O(N)?

And if I sort the vector, is the complexity O(N log N)?

    » 5 years ago, # ^ |

    Yes.

    map<int, vector<int>> insertion has complexity O(log(map_size) + vector_size).

    sort(vec.begin(), vec.end()) has complexity O(vector_size * log(vector_size) * complexity_of_comparing).
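
    A tiny sketch making both costs concrete (the sizes and values are arbitrary):

    #include <bits/stdc++.h>
    using namespace std;

    int main() {
        map<int, vector<int>> mp;
        vector<int> vec(1000, 7);         // vector_size = 1000
        mp[42] = vec;                     // O(log(map_size)) to find the slot
                                          // + O(vector_size) to copy the vector
        mp[43] = move(vec);               // moving avoids the O(vector_size) copy

        vector<int> a = {5, 3, 1, 4, 2};
        sort(a.begin(), a.end());         // O(N log N) comparisons, each O(1) for int
        for (int x : a) cout << x << ' '; // prints 1 2 3 4 5
    }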

      » 5 years ago, # ^ |

      What is the worst case of a comparison? I mean, does it affect the complexity much?

        » 5 years ago, # ^ |

        It depends on the vector's element type. For example, vector<int> has complexity_of_comparing = O(1), since comparing two ints is constant time.

          » 5 years ago, # ^ |

          Oh, OK.

          And can I ask something more? If we want to store sorted adjacent vertices, which is better to use: vector<set<int>> or vector<map<int, bool>>?

» 5 years ago, # |

Well, strictly speaking, the complexity of calling mp[vec] or inserting a (vector, int) pair can be as high as log(N) * max_vector_length_in_mp, as the C++ standard only guarantees O(log(N)) comparisons and does not specify which elements will be compared.

This is also related to how the standard library implements < between vectors. If comparing A and B costs O(|A|+|B|) instead of O(min(|A|,|B|)) (as one would normally implement it), the complexity can reach log(N) * max_vector_length_in_mp as well.

However, the two scenarios mentioned above are not very likely, so I guess that for most implementations the complexity will be O(M*log(N)). Don't rely on that too much though, as priority_queue in GCC has already turned out to use a rather strange (and counter-intuitive) implementation, which makes the complexity of inserting a vector into it become log(N) times the maximum vector length in the queue.
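
To actually see the vector length show up in the cost of a map operation, here is a small experiment sketch (the Counted wrapper and the sizes are mine): if the keys share a long common prefix, every comparison on the search path has to walk that prefix, so a single lookup does on the order of log2(N) * PREF element comparisons.

#include <bits/stdc++.h>
using namespace std;

// Element wrapper whose operator< counts element-level comparisons performed
// inside vector's lexicographic operator<.
struct Counted {
    int v;
    static inline long long cmps = 0;
    bool operator<(const Counted& o) const { ++cmps; return v < o.v; }
};

int main() {
    const int N = 1024, PREF = 1000;              // keys share a prefix of length PREF
    map<vector<Counted>, int> mp;
    for (int i = 0; i < N; ++i) {
        vector<Counted> key(PREF, Counted{0});
        key.push_back(Counted{i});                // only the last element differs
        mp[key] = i;
    }
    Counted::cmps = 0;
    vector<Counted> probe(PREF, Counted{0});
    probe.push_back(Counted{N / 2});
    bool found = mp.find(probe) != mp.end();
    // One lookup costs on the order of log2(N) * PREF element comparisons here,
    // because every key comparison walks the shared prefix.
    cout << (found ? "found, " : "not found, ") << Counted::cmps
         << " element comparisons\n";
}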

    » 5 years ago, # ^ |

    Follow-up: another thing that I was interested in at some point is why std::sort of a vector of std::string elements would be linearithmic in the total size of the strings. I struggled to come up with a guarantee of any sort.

    Moreover, I couldn't figure out any algorithm that would sort a list of $$$N$$$ strings in the given complexity that would be easier than shuffling the $$$N$$$ strings and then inserting them into a BST one string at a time.

    In particular, I'm not sure how a regular binary max-heap would guarantee (even amortized) $$$O(\text{sum_of_lengths} \cdot \log(N))$$$ for $$$N$$$ operations.

    Even harder follow-up: in some harder string problems, the solution is to sort strings with the comparator defined by the relationship $$$a \prec b \Leftrightarrow a + b < b + a$$$. This comparator has $$$O(|a| + |b|)$$$ complexity instead of $$$O(\min(|a|, |b|))$$$. Why is it fast enough to std::sort using this comparator? Moreover, which sort algorithm would guarantee linearithmic complexity with this comparator?

    Any thoughts on these?
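
    For reference, the comparator from the last question is the usual one from the "smallest concatenation" problem; a minimal example of using it with std::sort (the values are arbitrary):

    #include <bits/stdc++.h>
    using namespace std;

    int main() {
        // a precedes b iff a + b < b + a; classic use: arrange the strings so
        // that their concatenation is smallest.
        vector<string> s = {"12", "121", "3", "34"};
        sort(s.begin(), s.end(), [](const string& a, const string& b) {
            return a + b < b + a;   // each call costs O(|a| + |b|)
        });
        for (const string& x : s) cout << x;   // prints 12112334
        cout << '\n';
    }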

      » 5 years ago, # ^ |

      On sorting strings: I think quicksort can handle the job in O(sum(length) log n) in expectation, even if comparing strings A and B costs O(|A|+|B|). Mergesort gives the same complexity (without randomness) if a comparison costs O(min(|A|,|B|)).

      On the complexity of a binary heap (under the assumption that a comparison costs O(min(|A|,|B|))), let's consider two cases:

      1. pop: after each comparison, one of the operands takes a step up in the heap; charge the cost of that comparison to the element that goes up.

      2. push: similarly, with each comparison one operand goes up, the only exception being the last comparison. Charge all of these costs to the inserted element.

      It's not hard to prove that each element can be charged only O(log) times, and each charge is at most proportional to the length of the charged element, which gives the O(sum_of_lengths * log(N)) bound.
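
      Not a proof, but a quick empirical sanity check of the mergesort claim (this assumes libstdc++, where std::stable_sort is merge-based; the lengths and the shared prefix are arbitrary):

      #include <bits/stdc++.h>
      using namespace std;

      int main() {
          // Strings sharing a long prefix make each comparison expensive; count the
          // character-level work done by std::stable_sort and compare it with the
          // sum_of_lengths * log2(N) bound discussed above.
          mt19937 rng(12345);
          const int N = 1 << 14;
          long long total_len = 0, char_work = 0;
          vector<string> v(N);
          for (auto& s : v) {
              s = string(40, 'a');                               // shared prefix
              for (int j = 0; j < 10; ++j) s += char('a' + rng() % 26);
              total_len += s.size();
          }
          auto cmp = [&](const string& a, const string& b) {
              size_t i = 0;
              while (i < a.size() && i < b.size() && a[i] == b[i]) ++i;
              char_work += i + 1;                                // O(min(|a|,|b|)) per call
              if (i == a.size() || i == b.size()) return a.size() < b.size();
              return a[i] < b[i];
          };
          stable_sort(v.begin(), v.end(), cmp);
          cout << char_work << " character steps, bound ~ "
               << (long long)(total_len * log2((double)N)) << '\n';
      }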

        » 5 years ago, # ^ |

        On the $$$|A| + |B|$$$ problem: even though it might go well in expectation, quicksort on strings can behave quadratically with probability around $$$1/n$$$, which is a lot higher than for "normal" quicksort, and that is probably unwanted. For example, this might be problematic if you have one long string and (not too many) short strings.