Let's try to understand ''It From Bit'' more clearly.
Here is the very basic idea and the corresponding math.
With ''one bit of information'', the smallest piece of data, we can represent only two numbers: 0 and 1.
This means that a bit is a ''placeholder'' for a ''state'' which could be either ''on or off'', 1 or 0.
How many different states (numbers) can we represent with just ''one bit of information''? Only 2 (2 to the 1st power).
To represent more states we need to use more bits (placeholders). Each bit can only represent two states. What if we have 2 bits? Well, 2 to the 2nd power is 4. That means we can represent 4 different states (or 4 different numbers: 0, 1, 2 and 3) with 2 bits.
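The rule above (n bits give 2 to the nth power states) can be sketched in a few lines of Python:

```python
# Number of distinct states (numbers) representable with n bits is 2**n.
for n in [1, 2, 10, 64]:
    print(n, "bits ->", 2 ** n, "states")
```

Running this prints 2 states for 1 bit, 4 for 2 bits, and so on, matching the hand-counted results in the text.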
We have a very natural habit of counting on our fingers. All fingers closed is 0, one open finger is 1, two is 2, three is 3, and so on; which finger does not matter. How many numbers can we represent this way? Eleven, actually: from 0 to 10.
Now suppose that we have 10 bits, one bit of information for each finger of our hands. A closed finger corresponds to 0, a stretched-out one to 1, and ''the order matters''. How many different numbers can we represent this way? A lot: 1024 different numbers, from 0 to 1023.
How do we do that? We need rules. Let's say we start from the right-most finger of the right hand and go leftward, up to the left-most finger of the left hand. That is the order in which we use our bits. Now, all fingers closed is 0. Thumb up on the right hand is 1. Only the index finger up is 2. Index ''and'' thumb together is 3, and so on. Very simple rules, btw.
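These finger-counting rules are exactly positional binary notation. A minimal sketch in Python (the helper name `fingers_to_number` is made up for illustration; the fingers are listed right-most first, so position i contributes 2 to the ith power):

```python
def fingers_to_number(fingers):
    """fingers[0] is the right-most finger: 0 = closed, 1 = open."""
    value = 0
    for position, bit in enumerate(fingers):
        value += bit * (2 ** position)  # each finger weighs 2**position
    return value

print(fingers_to_number([0] * 10))            # all fingers closed -> 0
print(fingers_to_number([1] + [0] * 9))       # right thumb up -> 1
print(fingers_to_number([0, 1] + [0] * 8))    # only index up -> 2
print(fingers_to_number([1, 1] + [0] * 8))    # index and thumb -> 3
print(fingers_to_number([1] * 10))            # all ten fingers up -> 1023
```

All ten fingers up gives 1023, the largest value 10 bits can hold, which is where the 1024-states figure comes from.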
The problem is that, without years of training, it is murderously hard to read or present numbers this way. But for a computer it is the only possible representation: an ''ordered sequence of bits'', each set to either 1 or 0.
We have a sequence of bits which, depending on how many bits we use, can represent a set of numbers. The question is: what is the maximum length of such a sequence?
The answer is 'infinite'. With an infinite sequence of bits we can represent arbitrarily big numbers. Just imagine a very, very long sequence of ones and zeroes.
What if we have just a 64-bit ''chunk'' (or block) of information, a distinct sequence of 64 placeholders? How many different numbers can we represent (encode) within 64 bits of information? 2 to the 64th power: 18,446,744,073,709,551,616.
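That enormous figure is easy to verify, since Python integers are unbounded:

```python
# 64 placeholders give 2**64 distinct values, from 0 to 2**64 - 1.
print(2 ** 64)       # 18446744073709551616
print(2 ** 64 - 1)   # largest number a 64-bit word can hold
```

The largest representable value is one less than the count of states, because the sequence of all zeroes is used for the number 0.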
Such a distinct chunk of information is usually called a ''word''. A modern 64-bit processor usually has 64-bit ''machine words''. But this is just a convenience: a way to deal with very long sequences of bits by splitting them into fixed-size chunks.