Here's a number for you: 72. There's nothing special about it; it's no better or worse than 27, 323, or 112. By itself, this number means absolutely nothing. But when we attach it to a context, it takes on meaning. For example, we could give it some context by putting a dollar sign in front of it: $72. Now it means something: an amount of money. Or we can add units afterwards: 72°F, or 72 pounds, or 72 bottles of beer on the wall. It is the combination of number AND context that creates meaning.
Inside a computer, we don't even have numbers; we have only bits. The number 72 shows up in a computer as the bit pattern 1001000. However, we like to bundle bits together into bytes, which are groups of eight bits, so inside our computer the number 72 would look like this: 01001000. Computer programmers prefer to express bytes in hexadecimal, in which 72 turns out to be $48 (here the dollar sign specifies that the number is hexadecimal, not money).
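If you'd like to check these conversions yourself, here is a minimal Python sketch that prints the same value in each notation:

```python
n = 72
print(bin(n))       # 0b1001000 -> the seven significant bits
print(f"{n:08b}")   # 01001000  -> padded to a full eight-bit byte
print(f"${n:02X}")  # $48       -> hexadecimal, with the '$' prefix
```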
Inside a computer, numbers can be used in many different contexts. In the early days of the Macintosh, which had a pure black-and-white display, the ones and zeros of the bits represented black pixels and white pixels, so our number 72, as the byte 01001000, would look like a row of eight pixels: white, black, white, white, black, white, white, white.
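Here is a small sketch of that convention, rendering each bit of the byte as one pixel (1 = black, 0 = white, as described above):

```python
byte = 0b01001000  # our number 72
# Walk the bits from the most significant to the least significant,
# drawing a filled square for 1 and an empty square for 0.
row = "".join("■" if (byte >> (7 - i)) & 1 else "□" for i in range(8))
print(row)  # □■□□■□□□
```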
Nowadays, a single pixel is typically represented by four bytes: one specifying how red the pixel is, one specifying how green it is, one specifying how blue it is, and the last one specifying how transparent it is. A pixel whose four bytes were $48 $00 $00 $FF, for example, would be a dark, fully opaque red, with our number 72 ($48) supplying the red value.
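As a sketch of that layout (assuming the red-green-blue-transparency byte order just described; real graphics formats vary), here is how those four bytes might be packed into a single 32-bit pixel value:

```python
def pack_rgba(r: int, g: int, b: int, a: int) -> int:
    """Pack four 0-255 channel values into one 32-bit pixel."""
    return (r << 24) | (g << 16) | (b << 8) | a

pixel = pack_rgba(0x48, 0x00, 0x00, 0xFF)  # a dark, fully opaque red
print(f"${pixel:08X}")  # $480000FF
```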
But our number 72 can mean many more things inside a computer. It can represent a single character of text: in the ASCII system, it is the capital letter H. It can represent the amplitude of a sound wave in a piece of music. It can even represent part of a computer program. Inside the 6502 processor chip (the chip used in the Apple II, the Atari computers, and the Nintendo game machines), 72 means "PHA": push the value in the accumulator onto the stack. OK, that might not mean anything to you, but it meant a LOT to the programmer. On the competing Z80 chip, 72 meant something else entirely ("LD C,B": copy the value of register B into register C), and on modern x86 processors it takes on yet another meaning.
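We can watch Python reinterpret the very same byte in two of these contexts (the machine-code meanings, of course, live in the CPU rather than in Python):

```python
b = bytes([0x48])                # one byte: our number 72
print(b.decode("ascii"))         # 'H' -> as a character of text
print(int.from_bytes(b, "big"))  # 72  -> as a number, e.g. a sample amplitude
# Handed to a 6502 CPU as machine code, this same byte would mean PHA;
# handed to a Z80, it would mean LD C,B. The byte never changes;
# only the context does.
```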
The concept goes even further. Consider the word “gut”. It denotes the intestines, right? That’s true only in the context of the English language. In German, it means “good”.
The key idea here is that information is meaningful only within an assumed or specified context. The context is just as important as the data.