Let’s Make Pretty Pictures!
The last chapter was rather tedious; dealing with all that techie nonsense was definitely a hassle. The good news is that, now that we have that behind us, we can have a little fun without much more effort. It’s always fun to draw things on the screen, so let’s try some of that.
So let’s try drawing some lines on the screen. Here’s the same program from the last chapter, but now it has four new lines added to it. Copy it from here and paste it into a new document in your text editor, then save the document and double-click on it to see it in your browser:
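Here is a minimal sketch of such a page (the canvas element, its size, and the names myCanvas and context are illustrative assumptions; the four new lines are the drawing commands described below):

```html
<!DOCTYPE html>
<html>
<body>
  <canvas id="myCanvas" width="600" height="300"></canvas>
  <script>
    // Get hold of the drawing surface, as in the last chapter's program
    var canvas = document.getElementById("myCanvas");
    var context = canvas.getContext("2d");

    // The four new lines:
    context.beginPath();      // get ready to draw a path
    context.moveTo(0, 0);     // move the pen to the upper left corner
    context.lineTo(300, 150); // draw a line to (300, 150)
    context.stroke();         // actually put the path on the screen
  </script>
</body>
</html>
```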
Wowie Zowie! Isn’t that just the coolest thing you’ve ever seen? You just drew a line!
Well, OK, maybe it isn’t the coolest thing you’ve ever seen, but it’s a start. Here’s what those four new lines do:
This says “OK, computer, I want you to get ready to draw a path on the screen. The stuff I tell you next will define that path.”
“Now I want you to take one of your digital pens and move it to the coordinates (0, 0).” Remember, these coordinates specify the upper left corner of the screen.
“Now draw a line from where you are now to the coordinates (300, 150).”
“Now take the pen and actually draw the path I just described, leaving ink on the screen.” This is what the stroke command does; until you give it, the path is only a plan, not a picture.
This way of drawing a line is rather clumsy, but it does have one nice benefit: you can draw another line segment by adding just one more lineTo command. For example, type a new line like this one between the “lineTo” command and the “stroke” command (the coordinates here are just an example, and “context” should match whatever name your program uses for its drawing tool):

context.lineTo(300, 300);
This will cause the computer to draw another line segment, going straight down from the first line segment. You can do this as many times as you want, adding more lines of code and thereby drawing line segments zigging and zagging all over the screen. Try it!
You are now on speaking terms with your computer. The next task is to learn a few simple expressions, the computer equivalent of "My name is Fred", "Does this bus go to Notre Dame?", or "Where is the bathroom?". This chapter will introduce you to three absolutely fundamental facets of computing: arithmetic, deferred execution, and input. We begin with arithmetic.
Many people mistakenly think that performing arithmetic computations is the prime function of a computer. In truth, computers spend most of their time doing far less exalted work: moving bits of information around from one place to another, painstakingly examining huge piles of data for those few scraps of data that are just what the user ordered, or rewriting the data in a form that is easier for the user to appreciate. Nevertheless, arithmetic is an excellent topic to begin studying because it is familiar to people. If you can do arithmetic on a calculator, you can do arithmetic on a computer. In fact, it’s even easier on the computer. Try this with your computer:
PRINT 3*4
The computer will print the answer, 12, right under your command, so quickly that you might suspect that it’s up to some trickery. OK, type in some different numbers. Use some big, messy numbers like 3254 or 17819. The general rule is: first, type the word PRINT in capital letters. Then put a space. Then the first number, an asterisk to mean "multiply", and then the second number. If you make a mistake, use the BackSpace key to go back over the mistake, then type it over. When you have it right, press the RETURN key.
You may get a few minor items wrong. For example, when you used a number like 3254, did you type it as 3254 or as 3,254? That comma in between the 3 and the 2 will generate a syntax error. It may seem picayune, but I warned you that computers have no sense of context. Because commas are so small and hard to notice, they cause more syntax errors than any other character. So watch your commas!
The spaces are also important. Some versions of BASIC use a space as a "delimiter". A delimiter is a marker that tells you where the end of one word is and where the beginning of the next word is. It may seem silly untilyoutrytoreadabunchofwordswithoutanydelimitersatall. So give the computer a break and give it spaces where it needs them. B u t d o n ’ t p u t i n e x t r a s p a c e s o r t h e c o m p u t e r w i l l g e t v e r y c o n f u s e d , O K ?
There is no reason why you have to restrict yourself to multiplication. If you wish, you can do addition, subtraction, or division just as easily. The symbol for multiplication is an asterisk: *. The symbol for addition is a plus sign: +. The symbol for subtraction is a minus sign: -. And the symbol for division is a slash: /. With division, the computer will divide the first number by the second number. With subtraction, the computer will subtract the second number from the first number. So to subtract 551 from 1879 you type:
PRINT 1879-551
To divide 18 by 3 type:
PRINT 18/3
But what if you want to do more complex calculations? Suppose, for example, that you want to add 8 to 12 and divide the sum by 4. The first idea that comes to most people’s minds is to type:
PRINT 8+12/4
which will yield a result of 11. Why? Because this command is ambiguous. I told the computer to do two operations -- an addition and a division. Which one did I want done first? It makes a difference! The way I described the problem, I wanted the addition done first, then the division. Instead, the computer did the division first, dividing 12 by 4 to get 3. Then it added 3 to 8 to get 11. If it had done what I wanted it to do, it would have added 8 to 12 to get 20, then divided the 20 by 4 to get 5. Quite a mixup, yes?
How does one avoid mixups like this? The primary means is through an idea called "operator precedence". This is a big phrase that means very little. Whenever we have a situation in which two operators (an operator is one of the four arithmetic operation symbols: +, -, *, or /) vie for precedence, we automatically yield to the * or the /. It’s one of those arbitrary rules of the road like "Y’all drive on the right side of the road, y’hear?" Thus, in our example above, the computer gave precedence to the division operation over the addition operation, and performed the division first.
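The same precedence rules, by the way, are not peculiar to BASIC. Here they are in the language the browser uses for the drawing commands earlier in this chapter:

```javascript
// Operator precedence: * and / are evaluated before + and -.
console.log(8 + 12 / 4); // prints 11, not 5: 12 / 4 happens first, then 8 + 3
console.log(2 * 3 + 4);  // prints 10: 2 * 3 happens first, then 6 + 4
```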
If you are a reasonable and thoughtful person, you probably have two quick objections to this system of operator precedence. First, you might wonder what happens when two operators with equal precedence contest each other. Who wins? Well, it turns out that it doesn’t really matter. For example, if I type:
PRINT 3+4-2
It doesn’t matter one bit whether the addition or the subtraction is done first. Try it. 3+4 is 7; subtract 2 gives 5. If you do it backwards, 4-2 is 2; add 3 gives 5. See? It doesn’t matter what order you do them in. The same thing applies to multiplication and division:
PRINT 3*4/2
If we do the multiplication first, we get 3*4 is 12; divide by 2 gives 6. If we do the division first, then we get 4/2 is 2; multiply by 3 gives 6. So operator precedence doesn’t matter with operators of equal precedence.
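If you would rather let a machine check the claim, here are the same two expressions in JavaScript:

```javascript
// Operators of equal precedence: the order makes no difference here.
console.log(3 + 4 - 2); // prints 5 whether the + or the - goes first
console.log(3 * 4 / 2); // prints 6 whether the * or the / goes first
```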
Your second objection might be, "OK, how do we get the computer to do the calculation that we really wanted?"
In other words, how do we get the computer to add 8 to 12 before it divides by 4? The answer is to bring in a new concept, the parenthesis pair. If you want a particular operation done first, bundle it up with a pair of parentheses, like so:
PRINT (8+12)/4
I always imagine parentheses as a pair of protective arms huddling two numbers together, protecting them from the cold winds of operator precedence. In our example, that 12 belongs with the 8, not the 4, but the cruel computer would tear our hapless 12 away from the 8 and mate it in unholy union with the 4. The parentheses become like the bonds of true love, protecting and preserving relationships that a cold set of rules would violate. To adapt a phrase, "Parenthesis conquers all." So much for ridiculous metaphors.
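A quick JavaScript sketch of the protective arms at work:

```javascript
// The parentheses protect 8 + 12, so the sum is computed before the division.
console.log((8 + 12) / 4);   // prints 5
// Extra parentheses never hurt; these print 5 as well:
console.log(((8 + 12)) / 4);
console.log((8 + 12) / (4));
```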
You can use parentheses to build all sorts of intricate arithmetic expressions. You can pile parentheses on top of parentheses to get ever more complex expressions. Here is an example:
PRINT (((3+4)/7)*((6-2)/2))
What does this mess mean? The way to decode a monstrosity like this is to start with the innermost operation(s) and work outward. In this example, the 3+4 is an innermost operation, and so is the 6-2. They are innermost because no parentheses serve to break up the computation. If you were to mentally perform these operations, you would see that the big long command is equivalent to:
PRINT (((7)/7)*((4)/2))
All I did to get this was to replace the "3+4" with a "7", and replace the "6-2" with a "4". Now notice that both the 7 and the 4 are surrounded by a complete pair of parentheses. A pair of parentheses around one single number, though, is a waste of time, because you don’t need to protect a solitary number from anything. Remember, parentheses protect relationships, not numbers. Having a pair of parentheses around a number is like putting a paperclip on a single piece of paper. So let’s get rid of those excess parentheses:

PRINT ((7/7)*(4/2))
Now we have another pair of uncluttered operations: 7/7 and 4/2. Let’s make them come true:
PRINT ((1)*(2))
Well, gee, now we have more numbers floating inside extraneous parentheses. Out go the extra parentheses:

PRINT (1*2)
Now we’re getting so close we can smell it. Finish up the operation:
PRINT (2)
Clear out the parentheses:
PRINT 2
And there is the answer:
2
This long exercise shows how the computer figures out a long and messy pile of parentheses.
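Here is a pile of nested parentheses of this sort in JavaScript; if you reduce it by hand, innermost operations first, you get the same answer the computer prints:

```javascript
// Innermost first: (3 + 4) is 7 and (6 - 2) is 4,
// then 7 / 7 is 1 and 4 / 2 is 2, and finally 1 * 2 is 2.
console.log((((3 + 4) / 7) * ((6 - 2) / 2))); // prints 2
```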
How do you create such a pile? There is no specific answer to this question, no cookbook for building expressions. I can give you a few guidelines that will make the effort easier. First, when in doubt, use parentheses. Whenever you want to make sure that a pair of numbers are calculated first, group them together with a pair of parentheses. Using extra parentheses is like using extra paper clips: it is a little wasteful but it doesn’t hurt, and if it gives you some insurance, do it.
Second, always count your parentheses to make sure they balance. If you have five right parentheses, then you must have five left parentheses -- no more, no less. If your parentheses don’t balance, you will generate a syntax error.
Congratulations! All of this learning has catapulted you to the level at which you can use your expensive computer as a $10 calculator. If you are willing to continue, I can now show you an idea that will take you a little further than you could go with a calculator. It is the concept of indirection as expressed in the idea of a variable.
Indirection is one of the most important concepts associated with computers. It is absolutely essential that you understand indirection if you are to write any useful programs. More important, indirection is a concept that can be applied to many real-world considerations.
In the simplest case of indirection, we learn to talk not of a number itself, but of a box that holds the number, whatever it might be. The box is given a name so that we can talk about it. For example, try this command on your computer:
FROGGY = 12
This command does two actions: first, it creates a box -- a variable -- that we will call "FROGGY"; second, it puts the number 12 into this box. From here on, we can talk about FROGGY instead of talking about 12.
You might wonder, why do we need code words for simple numbers? If I want to mess around with the number 12, why don’t I just say 12, instead of going through all this mumbo-jumbo about FROGGY?
The trick lies in the realization that the actual value at any given instant is not the essence of the thing. For example, suppose we talked about a different number: the time. Let’s say that you and I are having a conversation about time. You say, "What time is it?" I say, "The time is 1:22:30." That number, 1:22:30, is formatted in a strange way, but you have to admit that it is a bona fide number. Thereafter, whenever you think of time, do you think of 1:22:30? Of course not. Time is a variable whose value was 1:22:30 for one second. When we think about time, we don’t fixate on the number 1:22:30; we instead think of time as a variable that can take many different values. This is the essence of a variable: something that could be any of many different numbers, but at any given time has exactly one number. For example, your speed is a variable: sometimes you are going 55 mph and sometimes you are going 0 mph. Your bank balance is a variable: one day it might be $27.34 and another day it might be $5327.34.
The importance of indirection is that it allows us to focus our attention on grander relationships. Do you remember your grammar school arithmetic exercises: "You are traveling at 40 mph. How far can you travel in two hours?" This is a simple arithmetic problem, but behind it lies a much more interesting and powerful concept. It lies in the equation
distance = speed * time
The big idea here is that this equation only makes sense if you forget the petty details of exactly what the speed is, and what the time is. It is true whatever the speed is, and whatever the time is. When we use an equation like this, we transcend the petty world of numbers and focus our attention on grander relationships between real-world concepts. If, to understand this equation, you must use examples ("Well, 40 mph for 2 hours gives 80 miles"), then you have not fully grasped the concept of indirection. Examples are a useful means of introducing you to the concept, of supporting your weight as you learn to walk, but the time must come when you unbolt the training wheels and think in terms of the relationship itself, not merely its application in a few examples. Variables are the means for doing this.
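This is exactly what a function in a programming language captures. A JavaScript sketch (the function name distance is mine):

```javascript
// The relationship holds whatever the particular values happen to be.
function distance(speed, time) {
  return speed * time;
}
console.log(distance(40, 2)); // 40 mph for 2 hours: prints 80
console.log(distance(20, 1)); // 20 mph for 1 hour: prints 20
```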
There is an experimental effort underway at some computer science laboratories to develop a computer language in which the user is not required to think in terms of indirection. It is called "programming by example", and is a total perversion of the philosophy of computing. The user of such languages does not describe concepts and relationships in their true form; instead, he provides many examples of their effects. The computer then draws inferences for the user and engages in the indirection itself. In an extreme application of this philosophy, the user would not tell the computer that "distance = speed * time". Instead, the user would tell the computer that "When the speed was 40 mph and the time was 2 hours, the distance was 80 miles; when the speed was 20 mph and the time was 1 hour, the distance was 20 miles." After the user succeeds in listing enough examples, the computer is able to infer the correct relationship.
Programming by example appears to be a new application of artificial intelligence that will make computers more accessible to users by allowing them to program the computers in simple terms, without being forced to think in terms of grand generalities. In truth, it is a step backwards, for it reverses the relationship between human and computer. It forces the human to do the drudge work, listing lots of petty examples, while the computer engages in the exalted thinking. The proper relationship between human and computer makes the human the thinker and the computer the drudge. To realize this relationship, you must have the courage to use your mind, to think in larger terms of relationships between variables, not merely individual numbers. Unless, of course, you enjoy being a drudge.
The concept of indirection is not confined to mathematical contexts. We use indirection in our language all the time. When we say, "Children look like their parents", we are making a general statement about the nature of human beings. Only the most literal of nincompoops is troubled by this statement, asking "Which children look like which parents?" We all know that the noun "children" applies to any children. It is a variable; if you want a specific case, then grab a specific child off the street and plug him into this verbal equation. Take little Johnny Smith; his parents are Fred and Wilma Smith. Then the statement becomes "Johnny Smith looks like Fred and Wilma Smith." Again, the important concept is not about Johnny and Fred and Wilma, but about children and parents in general.
Time to get back to variables themselves. A variable is a container for a number. We can save a number into a variable, and thenceforth perform any operations on the variable, changing its value, multiplying or dividing other numbers by the variable, using it just as if it were a number itself. And it is a number, only we don’t care when we write the program whether that number is a 12 or a 513; our program is meant to work with the variable whatever its value might be.
Some exercises are in order. Try this:
FROGGY = 12
PRINT FROGGY
Now make some changes:
FROGGY = 3*5
BIRDIE = FROGGY + 5
PRINT BIRDIE
Not only can you put a number into a variable, but you can also take a number out, as demonstrated by this example. The computer remembers that FROGGY has a value of 15, and retrieves that value to calculate the value of BIRDIE.
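For comparison, here is the same sort of exercise as a JavaScript sketch (lowercase names are simply the custom there):

```javascript
let froggy = 12;         // create the box and put 12 into it
console.log(froggy);     // prints 12
froggy = 15;             // the same box now holds a different number
let birdie = froggy + 5; // take the number out and use it in a calculation
console.log(birdie);     // prints 20
```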
Now for something that might really throw you:
FROGGY = FROGGY + 1
If you think in terms of algebra, this equation must look like nonsense. After all, how can a number equal itself plus 1? The answer is that the line presented above is not an equation but a command. It is called an assignment statement, for its true function is not to declare an equality to the world but to put a number into a variable. An assignment statement tells the computer to take whatever is on the right side of the equals sign, calculate it to get a number, and put that number into the variable on the left side of the equals sign. Thus, the above assignment statement will take the value of FROGGY, which happens to be 15 just now, and add 1 to it, getting a result of 16. It will then put that 16 into FROGGY.
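The same non-equation works in JavaScript, and for the same reason: the equals sign means "store this", not "these are equal":

```javascript
let froggy = 15;
froggy = froggy + 1; // compute the right side (16), then store it back into froggy
console.log(froggy); // prints 16
```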
It is time to summarize what we have learned before we move on to deferred execution:
1) You can form an expression out of numbers, operators, and variables.
2) Multiplication and division have precedence over addition and subtraction.
3) Parentheses defeat the normal rules of precedence.
4) Variables are "indirect numbers" and can be treated like numbers.
5) You set a variable’s value with an assignment statement.
6) Anything you can calculate, you can PRINT.
With these items under our belts, let’s move on to the next topic.
This rather imposing term, "deferred execution," sounding like a temporary reprieve from a death sentence, in truth means something far less dramatic. In the context of computers, execution means nothing more than the carrying out of commands. One does not idly converse with a computer; one instead issues commands. All of the things you have learned so far, and all of the things that you will learn, are commands that tell the computer to do something. You issue the command, and the computer executes the command. The question I take up in this section is, When does the computer execute your commands?
You might think the question silly. After all, you didn’t buy the computer to sit around and wait for it to execute your commands at its leisure. I can imagine you barking in true military style, "Computer, when I issue a command, I want it executed NOW, not later!"
But there are indeed times when it is desirable for the computer to be able to execute your commands later. A command that is executed now happens once and is gone forever, but a command that is recorded for later can be executed tomorrow, and again the next day, and the next, and the next, as many times as you want. We can give a command right now and expect it to be executed right now; but it would be even more useful to be able to record a command right now and execute it at any later date.
This still may seem a bit silly. Why should anyone bother recording a command for later reference? If I want to PRINT 3+4 sometime next week, why don’t I just type "PRINT 3+4" next week when I need it? Why go to the bother of some scheme for storing that command for later use?
The answer is, it all depends on how big a command you consider storing. There isn’t much point in storing a simple command such as "PRINT 3+4". But what if you have a big calculation that has many steps? Typing in all those steps every single time you wanted to do the calculation would be a big job. If you could store all those steps the first time, and then call them automatically every time you needed to do the calculation, then you would have saved a great deal of time. What a wonderful idea!
There is a term we use for this wonderful idea: we call it a computer program. A computer program is nothing more than a collection of commands for the computer, saved for future reference. When you tell the computer to run a particular program, you are instructing it to execute all those commands that were stored by the programmer.
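JavaScript, for instance, has no line numbers, but the same idea appears there as a function: defining it records the commands, and calling it executes them. A sketch (the little program itself is made up):

```javascript
// Defining the function stores the commands; nothing is executed yet.
function myProgram() {
  const froggy = 3 * 5;
  const birdie = froggy + 5;
  return birdie;
}

// Executing the stored commands, as many times as we like:
console.log(myProgram()); // prints 20
console.log(myProgram()); // prints 20 again
```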
There is an interesting analogy here. Suppose that you were the boss at a factory. It would be wasteful to stand over each worker, telling him or her what to do at each step of the manufacturing process. ("OK, now put that short screw into the hole at the top. Good. Now put the nut onto the bolt. Now...") A much more efficient way is to explain the entire process to the worker before he or she starts work. Once the worker has memorized the process, you don’t have to worry about him or her any more. This is analogous to the storing of commands for a computer. What is particularly curious is the concept of a program that you, the user, did not write. When you buy a computer program and put it into your computer, it is rather like the boss at the factory saying to the workers, "Here is a book of instructions for how to build a new machine. I don’t even know what the instructions are, but I like the machine. Follow these instructions."
The concept of deferred execution is not unique to the computer. We see it in a variety of places in our regular lives. A cookbook is a set of commands that tell you how to make food. In the corporate world we have the venerable "Policies and Procedures Manual" that tells us how to get along in the corporate environment. But my favorite example is the Constitution of the United States of America. This document is composed of a set of commands that prescribe how the government of the USA will operate. It specifies who will do what, and when, and how. Like a computer program, it has variables: the President, the Congress, the Supreme Court. Each variable can take different "values" -- the President can be Washington, Lincoln, Roosevelt, and so on, but the commands are the same regardless of the "value" of the President, the Congress, or the Supreme Court. As with any computer program, a great deal of effort was expended getting each part of the Constitution just right, tightening up the sloppy wording, making sure that the commands would work in all conceivable situations. And as with any real computer program, the programmers spent a long time getting all the bugs out. Despite this, it has worked very well for nearly two hundred years now. Show me a computer program with that kind of performance record.
So the concept of deferred execution is really not some weird new idea that only works in the silicon minds of computers. It’s been around for a while. With computers, though, deferred execution is used in a very pure, clean context, uncluttered by the complexities of the real world. If you really want to understand the idea of deferred execution, the computer is the place to see it clearly.
How do you get deferred execution on your computer? With BASIC, the technique is simple: give numbers to your commands. Where earlier you typed:

FROGGY = 3*5
BIRDIE = FROGGY + 5
PRINT BIRDIE

Now type this:

1 FROGGY = 3*5
2 BIRDIE = FROGGY + 5
3 PRINT BIRDIE
Those numbers in front of the commands tell the computer that these instructions are meant to be executed later. The computer will save them for later use. To prove it to yourself, type LIST. Sure enough, the computer will list the program that you typed in. It remembers! Even better, it can now execute all three commands for you. Type
RUN
The computer will respond almost instantly by printing "20" immediately below the command RUN. It executed all three commands in your little program, and those three commands together caused it to print the "20". Congratulations. You have written and executed your first computer program. Break out the champagne.
Those numbers in front of the commands -- the 1, 2, and 3 that began the lines -- those numbers actually tell the computer more than the mere fact that you intend the commands to be executed later. They also specify the sequence in which the commands are to be executed. The computer automatically sorts them and executes command #1 first, command #2 second, and command #3 third. The sequence with which commands are executed can be vitally important. Consider this sequence of commands:
1. Put the walnut on the table.
2. Move your hand away from the walnut.
3. Hit the walnut with the hammer.
Now, if you got the commands in the wrong sequence, and executed #1, then #3, then #2, you would truly appreciate the importance of executing commands in the proper order. That’s why we give numbers to these commands: it makes it very easy for the stupid computer to get the right commands in the right order.
By the way, it isn’t necessary to number the commands 1, 2, 3, . . . and so on. Most BASIC programmers number their commands 10, 20, 30, . . . and up. The computer is smart enough to be able to figure out that 10 is smaller than 20, and so it starts with command #10, then does command #20, then command #30, and so on. It always starts with the lowest-numbered command, whatever that is, and then goes to the next larger number, then the next, and so on. You might wonder, why would anybody want to number their commands by 10’s instead of just plain old 1, 2, 3, . . . Well, consider the wisdom of the Founding Fathers. They wrote the best Constitution they could, and then they made a provision for adding amendments to their masterpiece. They knew that, no matter how good their constitution was, someday there would be a need to change it. Now, if you number your commands 1, 2, 3, . . . and someday you need to change your program, what are you going to do if you need to add a new command between command #2 and command #3? Sorry, the computer won’t allow you to add a command #2 1/2. But if you number your commands 10, 20, 30, . . ., then if you need to add a command between #20 and #30, you just call it command #25. Unless, of course, you are wiser than the Founding Fathers, and expect no need to change your program. . .
You can now write very large programs. Just keep adding commands, giving each a line number, making sure that they are in the order you want, and trying them out with the RUN command. When you get a program finished the way you want it, or if you want to save it before ending a session with the computer, you must tell the computer to save the program to your diskette. The command for doing this will probably look something like this:
SAVE MYPROGRAM
Unfortunately, since all computers are different, you will probably need to type something slightly different from this. Look up the exact wording in your BASIC manual under "Saving a Program". The "MYPROGRAM" part is the name that you give your program. You can give your program almost any name you want. Call it "THADDEUS" or "AARDVARK" or "ROCK"; about all the computer will care is that 1) you don’t give it a name that it’s already using for something else, and 2) that the name isn’t too long -- usually 8 characters is the limit.
If you don’t save the program, then it will be lost as soon as you turn off the computer or load another program. When you want to get your program back, you will have to type something just like the SAVE command, only you type "LOAD" instead of "SAVE". You’ll still have to tell it the name of the program that you want to load.
One last topic and we are done with this chapter. I want to introduce you to the INPUT command. This little command allows the computer to accept input from the keyboard while the program is running. An example shows how simple it is:
10 INPUT FROGGY
20 BIRDIE = FROGGY + 5
30 PRINT BIRDIE
If you were to RUN this program, you would see a question mark appear on the screen. The question mark is a prompt, the computer’s way of telling you that it is expecting you to do something. In this case, it is waiting for you to type a number and press RETURN. When you do this, it will take that number and put it into FROGGY. Then it will proceed with the rest of the program. That’s all the INPUT statement does; it allows you to type in a number for the computer to use.
Despite its simplicity, the INPUT command has vast implications for programming. Up until this point, the programs you could write would always do the same thing. Your first program, for example, would always calculate 3*5+5 to be equal to 20. Now, this may be an exciting revelation the first ten or twenty times, but eventually it does get a little boring to be told for the umpteenth time that 3*5+5 is 20. With the INPUT statement, though, you can start to have some variety. Using the program listed above, you could type in a different number each time you ran it, and get a different answer. You could type in 8, and discover that 8+5 is 13; then you could type in 9, and learn that 9+5 is 14. For real thrills, you could type in a big, scary number like 279, and find out that 279+5 is 284. Wowie, zowie! Aren’t computers impressive?
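For comparison, in JavaScript the role of INPUT is usually played by a function parameter: the number arrives from outside when the stored commands run. A sketch:

```javascript
// The parameter froggy plays the role of BASIC's INPUT FROGGY.
function addFive(froggy) {
  const birdie = froggy + 5;
  return birdie;
}
console.log(addFive(8));   // prints 13
console.log(addFive(9));   // prints 14
console.log(addFive(279)); // prints 284
```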
Have patience, this is only chapter 3.