[A+DS]

Bits

Problem: Count the number of bits set to `1' in a word of length `w' bits, as quickly as possible. This can be done with a loop over the bit positions, 0 to w-1, in O(w)-time, but it can also be done in O(log(w))-time, which is much faster, e.g.
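For concreteness, the obvious O(w) loop might look like this in C (a sketch; the name `slow_count' is illustrative, not from the notes):

```c
#include <assert.h>

/* O(w) bit count: examine each of the w bit positions in turn. */
unsigned slow_count(unsigned x, int w)
{
    unsigned n = 0;
    int i;
    for (i = 0; i < w; i++)
        n += (x >> i) & 1u;   /* add bit i (0 or 1) to the running total */
    return n;
}
```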

        76543210 - columns
        --------
input: `01101101' has 5 bits set to `1';  w=8.

stage1:

   01101101  >> 1  = 00110110
 & 01010101        & 01010101
   --------          --------
 = 01000101          00010100
                        |
                        |
   01000101             |   NB. adding two one-bit #s gives one
 + 00010100  <-----------       two-bit #, (1+0),(0+1),(1+1),(1+0)
   --==--==
 = 01011001                   = 1,1,2,1 base4!
   --==--==


stage2:

   01011001  >> 2  = 00010110
 & 00110011        & 00110011
   --------          --------
 = 00010001          00010010
                        |
                        |
   00010001             |
 + 00010010 <------------
   ----====
   00100011  NB. 2 (0010) bits set in cols 7-4, 3 (0011) in 3-0
   ----====  2,3 base16


stage 3 (i.e. stage log2(w)):

   00100011  >> 4  = 00000010
 & 00001111        & 00001111
   --------          --------
 = 00000011          00000010
                        |
                        |
   00000011             |
 + 00000010 <------------
   --------
 = 00000101 = 5 base256 = 5 base10, i.e. 5 bits were `1'
   --------

So, w=8 --> 3 stages, w=16 --> 4 stages, w=32 --> 5 stages, etc. Each stage involves a right-shift `>>', two bit-wise ands `&', and an addition.
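The three stages of the worked example can be written directly in C, one line per stage (a sketch, not course-supplied code; the name `count8' is illustrative):

```c
#include <stdint.h>

/* O(log w) bit count for w=8.  Each line performs one stage:
   mask out alternate fields, shift the other fields down, and add,
   so neighbouring sums never overflow into each other. */
uint8_t count8(uint8_t x)
{
    x = (x & 0x55) + ((x >> 1) & 0x55);  /* stage 1: 2-bit fields hold sums of pairs   */
    x = (x & 0x33) + ((x >> 2) & 0x33);  /* stage 2: 4-bit fields hold sums of nibbles */
    x = (x & 0x0F) + ((x >> 4) & 0x0F);  /* stage 3: the whole byte holds the total    */
    return x;
}
```

For w=32 the same pattern needs five stages, with the masks widened to 0x55555555, 0x33333333, 0x0F0F0F0F, 0x00FF00FF and 0x0000FFFF.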

This is not `core' material for CSE2304; it just shows that apparently simple problems can have amazing solutions. I'm not sure where it comes from originally; it might be from one of Bentley's books?
- L.A. 5/3/1999