Counting Bits
Problem
Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.
Example 1:
Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10
Example 2:
Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101
Constraints:
0 <= n <= 10^5
Solution
/**
 * @param {number} n
 * @return {number[]}
 */
var countBits = function(n) {
    const dp = [0];
    for (let i = 1; i <= n; i++) {
        dp.push(1 + dp[i & (i - 1)]);
    }
    return dp;
};
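As a quick sanity check, the function can be run against the two examples above (the solution is reproduced here so the snippet is self-contained):

```javascript
// Same DP solution as above.
var countBits = function(n) {
    const dp = [0];
    for (let i = 1; i <= n; i++) {
        // i & (i - 1) clears the lowest set bit, so it indexes a
        // previously computed value with exactly one fewer 1 bit.
        dp.push(1 + dp[i & (i - 1)]);
    }
    return dp;
};

console.log(countBits(2)); // [ 0, 1, 1 ]
console.log(countBits(5)); // [ 0, 1, 1, 2, 1, 2 ]
```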
We will implement a DP solution. Let dp[i] be the number of 1's in the binary representation of i.
- For the base case, i = 0: dp[0] = 0, since 0 has no 1 bits.
- For the recursive case, notice that i & (i - 1) clears the least significant 1 bit of i (and produces a smaller value when i is not 0). This means dp[i] = 1 + dp[i & (i - 1)].
For example, if i = 14, its binary representation is 1110, so i - 1 is 1101 and i & (i - 1) = 1100, which has exactly one fewer 1 bit.
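The bit trick is easy to verify in isolation. This small sketch (popcountSteps is a hypothetical helper, not part of the solution) repeatedly clears the lowest set bit until the value reaches 0; the number of iterations equals the number of 1 bits:

```javascript
// Count 1 bits by repeatedly clearing the least significant set bit.
function popcountSteps(i) {
    let count = 0;
    while (i !== 0) {
        console.log(i.toString(2)); // binary form before clearing a bit
        i &= i - 1;                 // clear the lowest set bit
        count++;
    }
    return count;
}

popcountSteps(14); // prints 1110, 1100, 1000; returns 3
```

This is the classic Brian Kernighan bit-counting loop; the DP solution above memoizes its first step so each value is computed in O(1).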