
What is the proof for the optimal merge pattern problem?


Hi everyone. I have been stuck on this problem for quite some time and would like some help from the community. For those who aren't familiar with the name, let me define the problem quickly.

You are given a multiset of n numbers. In a single merge operation you remove any two elements from the multiset and insert their sum back. The cost of a merge operation equals that sum. Repeat this until only one element is left. The total cost is the sum of the costs of the individual operations, and we need to minimize it.

Example-

Initially: cost = 0, v = {1, 2, 3}

merge 1 & 2: cost += 3, v = {3, 3}

merge 3 & 3: cost += 6, v = {6}

So finally, cost = 9.
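
For reference, here is a quick sketch of how I implement the greedy with a min-heap (the function name `minMergeCost` is just my placeholder); it reproduces the example above:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Greedy: repeatedly merge the two smallest elements.
// A min-heap gives O(n log n) overall.
long long minMergeCost(vector<long long> v) {
    priority_queue<long long, vector<long long>, greater<long long>> pq(v.begin(), v.end());
    long long cost = 0;
    while (pq.size() > 1) {
        long long a = pq.top(); pq.pop();
        long long b = pq.top(); pq.pop();
        cost += a + b;   // the cost of this merge is the sum
        pq.push(a + b);  // put the sum back into the multiset
    }
    return cost;
}

int main() {
    cout << minMergeCost({1, 2, 3}) << "\n"; // prints 9
}
```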

So basically the greedy approach works here: in each iteration, just merge the two smallest elements. I sort of get the intuition, but I can't prove why it works. I can see that this can be converted to the Huffman coding problem, but the math in the proofs is beyond me. I would really appreciate it if someone could give me a less mathy explanation...
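
To spell out the conversion I mean (in case it helps someone answer): if $a_1, \dots, a_n$ are the input numbers and $d_i$ is the number of merge operations element $i$ takes part in (its depth in the binary merge tree), then, as far as I can tell,

$$\text{total cost} = \sum_{i=1}^{n} a_i \, d_i,$$

which is exactly the quantity Huffman coding minimizes when the $a_i$ are symbol frequencies and the $d_i$ are codeword lengths. In the example above: $1 \cdot 2 + 2 \cdot 2 + 3 \cdot 1 = 9$.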

Thank you for your time!!

Tags: #greedy, #proof
