现充|junyu33

An interesting problem in a freshman course

In response to the college leadership's "call to enhance programming skills," a problem-solving platform has recently been added to the freshman programming course (so the freshmen now juggle four kinds of work: assignments, game projects, security projects, and programming exercises; quite a heavy load indeed). Originally the platform featured relatively basic language-specific problems, but yyx added data-structure problems and large-scale simulations, significantly raising the difficulty. I decided to contribute some thinking-oriented problems to balance out the innate coding advantage of the younger students.

Idea

This came up during lunch with yjp one day. yjp quickly found the correct solution for part 0 but ran into trouble with part 1. I managed to derive a method with lower complexity (explained below).

Question

Part 0

Starting from x = 0, you have two operations: add 1 or multiply by 2. What is the minimum number of operations required to turn x into a given natural number a?

Part 1

Starting from x = 1, you can perform three operations: add 1, subtract 1, or multiply by 2. What is the minimum number of operations required to turn x into a given natural number a?

Solution

Part 0

Thinking from a binary perspective: Multiplying this number by 2 is equivalent to shifting it left by one bit, while adding 1 to it is equivalent to appending a 1 at the end. The timing of adding this 1 can be arbitrary. Therefore, the optimal solution is to add a 1 immediately during the left-shift process whenever the corresponding bit is eventually 1 (otherwise, more 1s would need to be added later to compensate). We can derive a formula:

Minimum number of operations = number of binary digits + number of 1s in the binary representation + C

Substituting a sample value fixes C. For a = 3 (binary 11, reached in three operations: +1, *2, +1), we get 2 digits + 2 ones + C = 3, so C = -1.

Converting a number to binary takes O(log x) time, so each query costs O(log x) and the overall time complexity is O(T log x), where T is the number of queries.

Part 1

yjp initially wanted to handle the case where the number ends in a run of 1s, but couldn't pin down the details and gave up on that approach.

First, let's set up a dynamic programming (dp) approach. Let dp[i] be the minimum number of operations required to change 1 to i. Clearly, dp[1] = 0. The transition equations can be written as:

if (i % 2 == 0)
    dp[i] = min(dp[i / 2] + 1, dp[i]);  // reach i by doubling i/2
dp[i] = min(dp[i - 1] + 1, dp[i]);      // reach i from i-1 with +1
dp[i] = min(dp[i + 1] + 1, dp[i]);      // reach i from i+1 with -1

Clearly, this dp violates the no-aftereffect property: dp[i] depends on dp[i+1], a state that hasn't been finalized yet. So what can be done? Relaxing the transitions for a few more passes, until the values stop changing, might help.

However, with k relaxation passes the total complexity becomes O(Tkx), which would definitely result in a TLE.

Then, I had a sudden inspiration—I looked up the optimal solution on OEIS and found a better transition equation:

if (i % 2 == 0)
    dp[i] = min(dp[i / 2] + 1, dp[i]);                       // even: undo a doubling
else
    dp[i] = min({dp[i / 2] + 2, dp[i / 2 + 1] + 2, dp[i]});  // odd: i/2 == (i-1)/2, i/2+1 == (i+1)/2

Now each query only visits O(log x) states. We can use a map to store them and perform a memoized search.

Since map is implemented as a red-black tree, both reads and writes take logarithmic time, so the overall time complexity is O(T log² x).

PS

Part 0

The number of people who solved the first problem was within my expectations. Around 10 people achieved AC within 24 hours.

Part 1

The second problem had no AC (Accepted) submissions in the first 24 hours, which was slightly unexpected.

On the second day, an older student managed to solve it, which matches my intent in designing the problem.

However, a younger student later came up with an O(T log x) solution. After reviewing some of the AC code, it turns out they indeed used simulation and case analysis; it seems yjp's wish has finally come true. Congratulations!