
             BACKGROUND
  To begin operating a computer, you find its power switch, turn 
it on, and then what?
  What do you type? What do you do? How will the computer 
respond?
  The answers to those questions depend on which operating system 
your computer uses.
  Most IBM clones use an operating system called MS-DOS, 
supplemented by Windows (which lets you more easily use a mouse). 
Mac computers use a different operating system instead, called 
the Mac System. This book explains how to use all three: MS-DOS, 
Windows, and the Mac System.
  Other kinds of computers use different operating systems 
instead.

             Five issues
  To judge which operating system is best for you, consider five 
issues. . . . 
  Hardware requirements Which hardware must you buy before you 
run the operating system? Does the operating system run on a 
cheap microcomputer, or does it require a computer that's more 
expensive? If it requires an expensive computer, can you justify 
the high cost? Which CPU chip is required? How much RAM? How much 
space on the hard disk? Does it require a mouse?
  Speed Does the operating system run quickly, or must you wait a 
long time for it to obey your commands? After you've given a 
command, do you spend a long time twiddling your thumbs, waiting 
for the operating system to catch up to you? If the operating 
system runs too slowly, maybe you'd get your work done faster if 
you'd use an old-fashioned typewriter or calculator instead of a 
computer!
  Multitasking To what extent can the operating system handle 
many people, tasks, and devices simultaneously? Can the operating 
system begin working on a second task before the first task has 
finished, so the operating system is working on both tasks 
simultaneously? For example, while the operating system is making 
the printer print and waits for the printer to finish printing, 
can the operating system let you simultaneously use the keyboard, 
so you can feed the computer further commands -- and can the 
operating system start obeying those commands, even though the 
printer hasn't finished obeying your previous command yet?
  User interface How do you feed commands to the operating system 
and tell it what to do? Do you type the commands on the keyboard, 
or are you supposed to use a mouse instead? Are the commands or 
mouse-methods easy for you to remember, or must you keep 
referring back to the instruction manual to remember how to use 
the damned computer?
  Program availability Think about your goal (why you bought the 
computer and what kind of programs you want the computer to run), 
and ask whether those kinds of programs have been written for 
this operating system yet. For example, have programs been 
written yet to make this operating system
handle word processing, databases, spreadsheets, graphics, 
desktop publishing, speech, music, multimedia, telecommunication, 
networks, and accounting -- and do those programs work 
excellently, meet all your needs, and thrill you?

             Three kinds of user interface
  How do you give commands to the computer? The answer depends on 
what kind of user interface the operating system uses. Three 
kinds of user interface have been invented.
  Command-driven In a command-driven interface, you give commands 
to the computer by typing the commands on the keyboard.
  For example, MS-DOS uses a command-driven interface. To command 
MS-DOS to copy a file, you sit at the keyboard, type the word 
``copy'', then type the details about which file you want to copy 
and which disk you want to copy it to. To command MS-DOS to erase 
a file so the file is deleted, you type the word ``erase'' or 
``del'', then type the name of the file you want to delete.
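Here's a tiny sketch, in Python, of how a command-driven interpreter parses such typed commands. It's purely illustrative (MS-DOS itself is not written this way, and the `obey` function is hypothetical): the point is that the first word names the command and the remaining words give the details.

```python
# Toy command-driven interpreter (hypothetical sketch, not MS-DOS).
# The user types a command word plus details; the system parses the
# line and decides what to do.
def obey(line):
    words = line.split()
    if words[0] == "copy" and len(words) == 3:
        return f"copying {words[1]} to {words[2]}"   # copy FILE DISK
    elif words[0] in ("erase", "del") and len(words) == 2:
        return f"deleting {words[1]}"                # erase/del FILE
    else:
        return "bad command"                         # unrecognized line
```

Notice that the user must remember the command words (`copy`, `erase`, `del`) exactly; the computer offers no on-screen reminder.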
  Menu-driven In a menu-driven interface, you act as if you were 
in a restaurant and ordering food from a menu: you give orders to 
the computer by choosing your order from a menu that appears on 
the screen.
  For example, Pro DOS (an operating system used on some Apple 2 
computers) has a menu-driven interface. When you start using Pro 
DOS, the screen shows a menu that begins like this:
1.  Copy files
2.  Delete files
If you want to copy a file, press the ``1'' key on the keyboard. 
If you want to delete a file instead, press the ``2'' key. 
Afterwards, the computer lets you choose which file to copy or 
delete.
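A menu-driven loop can be sketched in a few lines of Python. This is a hypothetical illustration (Pro DOS ran on the Apple 2 and was not written in Python); the function names are invented for the sketch:

```python
# Toy menu-driven interface (hypothetical sketch, not Pro DOS).
# The screen shows numbered choices; the user just presses a key.
def show_menu():
    return "1.  Copy files\n2.  Delete files"

def handle_choice(choice, filename):
    # Map the pressed key to an action; no command words to memorize.
    if choice == "1":
        return f"copying {filename}"
    elif choice == "2":
        return f"deleting {filename}"
    else:
        return "not on the menu"
```

The trade-off: a menu is easier to remember than typed commands, but stepping through menus is slower for experts.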
  Icon-driven In an icon-driven interface, the screen shows lots 
of cute little pictures; each little picture is called an icon. 
To give orders to the computer, you point at one of the icons by 
using a mouse, then use the mouse to make the icon move or 
disappear or turn black or otherwise change appearance.
  For example, the Mac's operating system (which is called the 
Mac System) has an icon-driven interface. When you turn the Mac 
on, the screen gets filled with lots of little icons.
  If you want to copy a file from the Mac's hard disk to a floppy 
disk, just use the mouse! Point at the icon (picture) that 
represents the file, then drag the file's icon to the floppy 
disk's icon. Dragging the file's icon to the floppy's icon makes 
the computer drag the file itself to the floppy itself.
  One of the icons on the screen is a picture of a trash can. To 
delete a file, drag the file's icon to the trash-can icon. When 
you finish, the trash can will bulge, which means the file's been 
deleted, thrown away.

             Multitasking methods
  Our country is being run by monsters! Big monster computers are 
running our government, our banks, our insurance companies, our 
utility companies, our airlines, our railroads, and other big 
businesses.
  How do those big computers manage to handle so many people and 
tasks simultaneously? Here are the methods used by those 
maxicomputers -- and by many minicomputers and microcomputers, 
too. . . . 

         Scheduling the CPU
  Suppose your organization buys a multi-million-dollar 
maxicomputer. How can the employees all share it? The answer 
depends on what kind of operating system you buy.
  Why batch-processing was invented In the 1950's, the only kind 
of operating system was single-user: it handled just one person 
at a time. If two people wanted to use the computer, the second 
person had to stand in line behind the first person until the 
first finished.
  The first improvement over single-user operating systems was 
batch processing. In a batch-processing system, the second person 
didn't have to stand in line to use the computer. Instead, he fed 
his program onto the computer's disk (or other kind of memory) 
and walked away. The computer ran it automatically when the first 
person's program finished.
  That procedure was called batch processing because the computer 
could store a whole batch of programs on the disk and run them in 
order.
  Why multiprogramming was invented While running your program, 
the CPU often waits for computer devices to catch up. For 
example, if your program makes the printer print, the CPU waits 
for the printer to finish. When the printer finishes, the CPU can 
progress to your program's next instruction.
  While the CPU is waiting for the printer (or another slow 
device), why not let the CPU temporarily work on the next guy's 
program? That's called multiprogramming, because the CPU switches 
its attention among several programs.
  In a simple multiprogramming system, the CPU follows this 
strategy: it begins working on the first guy's program; but when 
that program makes the CPU wait for a slow device, the CPU starts 
working on the second program. When the second program makes the 
CPU wait also, the CPU switches its attention to the third 
program, etc. But the first program always has top priority: as 
soon as that first program can continue (because the printer 
finished), the CPU resumes work on that program and puts all 
other programs on hold.
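That switching strategy can be sketched as a toy simulation in Python. Everything here is invented for illustration (the step names, the 2-tick device delay, the `run` function): each program is a list of steps, ``cpu'' meaning one tick of computing and ``io'' meaning the program must wait for a slow device, and the lowest-numbered program always has top priority.

```python
# Toy multiprogramming scheduler (illustrative sketch, not a real OS).
# "cpu" = one tick of computing; "io" = start a device and wait 2 ticks.
# Program 0 has top priority: the CPU returns to it as soon as its
# device finishes, putting the other programs on hold.
def run(programs, io_delay=2):
    blocked_until = [0] * len(programs)   # tick when each program may resume
    pos = [0] * len(programs)             # next step of each program
    log, tick = [], 0
    while any(pos[i] < len(p) for i, p in enumerate(programs)):
        # find programs that are neither finished nor waiting on a device
        ready = [i for i, p in enumerate(programs)
                 if pos[i] < len(p) and blocked_until[i] <= tick]
        if ready:
            i = ready[0]                  # lowest index = top priority
            step = programs[i][pos[i]]
            pos[i] += 1
            if step == "io":
                blocked_until[i] = tick + io_delay  # wait for the device
            log.append((tick, i, step))
        tick += 1
    return log
```

Running two programs shows the idea: while program 0 waits for its device, the CPU sneaks in a tick of program 1, then jumps back to program 0 the moment it's ready again.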
  Why round-robin time-slicing was invented Suppose one guy's 
program requires an hour of computer time, but another guy's 
program requires just one minute. If the guy with the hour-long 
program is kind, he'll let the other guy go first. But if he's 
mean and shoves his way to the computer first, the other guy must 
wait over an hour to run the one-minute program.
  An improved operating system can ``psyche out'' the situation 
and help the second guy without waiting for the first guy to 
finish. Here's how the operating system works. . . . 
  A jiffy is a sixtieth of a second. During the first jiffy, the 
CPU works on the first guy's program. During the next jiffy, the 
CPU works on the second guy's program. During the third jiffy, 
the CPU works on a third guy's program, and so on, until each 
program has received a jiffy. Then, like a card dealer, the CPU 
``deals'' a second jiffy to each program. Then it deals a third 
jiffy, etc. If one of the programs requires little CPU time, it 
will finish after being dealt just a few jiffies and ``drop out'' 
of the game, without waiting for all the other players to finish.
  In that scheme, each jiffy is called a time slice. Since the 
computer deals time slices as if dealing to a circle of card
players, the technique's called round-robin time-slicing.
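The card-dealing metaphor translates directly into code. Here's a minimal Python sketch (toy values: a real system deals jiffies of a sixtieth of a second, not list items, and the `round_robin` function is invented for this illustration):

```python
# Round-robin time-slicing, sketched as a card deal (toy model).
# Each program needs some number of jiffies; the CPU deals one jiffy
# to each program in turn, and a finished program drops out.
def round_robin(needed):
    remaining = dict(enumerate(needed))   # program number -> jiffies left
    schedule = []                         # which program got each jiffy
    while remaining:
        for prog in list(remaining):      # deal one jiffy to each player
            schedule.append(prog)
            remaining[prog] -= 1
            if remaining[prog] == 0:      # this program drops out
                del remaining[prog]
    return schedule
```

With needs of 3, 1, and 2 jiffies, the one-jiffy program finishes after the very first round instead of waiting behind the three-jiffy one.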
  To make that technique practical, the computer must be attached 
to many terminals so each guy has his own terminal. The CPU goes 
round and round, switching its attention from terminal to 
terminal every jiffy.
  If you sit at a terminal, a few jiffies later the CPU gets to 
your terminal, gives you its full attention for a jiffy, then 
ignores you for several jiffies while it handles the other users, 
then comes back to you again. Since jiffies are so quick, you 
don't notice that the CPU ignores you for several jiffies.
  That technique's an example of timesharing, which is defined as 
``an operating system creating the illusion that the CPU gives 
you its full attention continuously''.
  In that system, you might not get a full jiffy on your turn. 
For example, if your program needs to use the printer, the CPU 
sends some data out to the printer but then immediately moves on 
to the next person, without waiting for the printer to catch up. 
When the CPU has given all the other people their jiffies and 
returns to you again, the CPU checks whether the printer has 
finished the job yet.
  Fewer switches While the CPU works on a particular guy, the 
state of that guy's program is stored in the CPU and RAM. When 
that guy's jiffy ends, the CPU typically copies that guy's state 
onto the disk, then copies the next guy's state from disk to the 
CPU and RAM. So every time the CPU switches from one guy to the 
next, the CPU must typically do lots of disk I/O (unless the 
CPU's RAM is large enough to hold both guys' programs 
simultaneously).
  Such disk I/O is ``bureaucratic overhead''. It consumes lots of 
time. The only way to reduce that overhead is to switch guys less 
often.
  Let's develop a way to make the CPU switch guys less often but 
still switch fast enough to maintain each guy's illusion of 
getting continuous attention.
  Suppose a guy's a ``CPU hog'': he's running a program that 
won't finish for several hours. Instead of giving him many short 
time slices, the CPU could act more efficiently by totally 
ignoring him for several hours (which will make everybody else in 
the computer room cheer!) and then giving him a solid block of 
time toward the end of those hours. He'll never know the 
difference: his job will finish at the same time as it would 
otherwise. And the CPU will waste less time in bureaucratic 
overhead, since it won't have to switch attention to and from 
him so often.
  To determine who's the hog, the CPU counts how many jiffies and 
how much RAM each guy's been using. If a guy's count is high, 
he's been acting hoggish and will probably continue to be a hog, 
so the CPU ignores him until later, when he's given a solid block 
of time. If that block is too long, the other guys will be 
ignored too long and think the CPU broke; so that solid block 
should be just a few seconds. If he doesn't finish within a few 
seconds, give him another block later.
  The Decsystem-20 and other great timesharing systems use that 
strategy.
  Distributed processing Now you know how to make many people 
share a single CPU efficiently. But since the CPU chip in an IBM 
PC costs just $3, why bother sharing it? Why not simply give each 
guy his own CPU?
  Today many companies are abandoning maxicomputers that have 
fancy timesharing operating systems and are replacing them by a 
collection of IBM PC clones, each of which handles just one 
person at a time.
  That's called distributed processing: tying together many 
little CPU's instead of forcing everybody to share one big CPU.
        Scheduling the memory
  Suppose your program is too big to fit in the RAM. What should 
you do? The obvious answer: chop the program into little pieces, 
and run one piece at a time.
  Virtual memory If you buy a computer system that has virtual 
memory, the operating system automatically chops your program 
into little pieces, puts as many pieces as possible into the RAM, 
and runs them. The remaining pieces are put on a disk instead. 
When the CPU finishes processing the pieces in the RAM and needs 
to get to the pieces on disk, the operating system automatically 
fetches those pieces from the disk and copies them into the RAM 
-- after copying the no-longer-needed pieces from the RAM to the 
disk.
  Each piece of the program is called a page. On most computers, 
each page is 4K. Copying a page from the disk to the RAM (or vice 
versa) is called paging or swapping.
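Here's a toy Python sketch of that paging idea. It's illustrative only (real operating systems track pages in hardware-assisted tables; the `reference_pages` function, the 3-slot RAM, and the least-recently-used eviction rule are all assumptions made for this sketch):

```python
# Toy virtual-memory model (illustrative sketch, not a real OS).
# RAM holds only a few 4K pages; touching a page that isn't in RAM
# is a "page fault", and the least-recently-used resident page gets
# swapped to disk to make room.
def reference_pages(refs, ram_slots=3):
    ram = []                # resident pages, least-recently-used first
    faults = 0
    for page in refs:
        if page in ram:
            ram.remove(page)          # already resident: just re-mark it
        else:
            faults += 1               # page fault: fetch page from disk
            if len(ram) == ram_slots:
                ram.pop(0)            # swap out least-recently-used page
        ram.append(page)              # most recently used goes last
    return faults
```

The fewer faults, the less swapping; a program that keeps touching pages not in RAM faults constantly, which is the seed of the thrashing problem described below.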
  Multi-user RAM If many people try to share the RAM, the 
operating system tries to squeeze all their programs into the RAM 
simultaneously. If they won't all fit, the operating system 
temporarily puts some of their programs onto disk until the other 
programs are done (or have used up a time-slice).
  If lots of users try to share a tiny RAM, the operating system 
spends most of its time shuttling their programs from disk to RAM 
and back to disk. That bad situation -- in which the operating 
system spends more time copying programs than running them -- is 
called thrashing, because the operating system is as helpless as 
a fish out of water. If the operating system is big and 
thrashing, it's called a beached whale. The only solution is to 
buy more RAM or tell some of the users to go home!

         Scheduling the I/O
  After the CPU sends a command to a slow device (such as a 
printer or terminal), the CPU must wait for the device to obey. 
While the CPU waits, it ought to do something useful -- such as 
processing somebody else's program, or processing a different 
part of the same program.
  Buffers Here's how the CPU handles the problem of waiting. When 
the CPU wants to send a list of commands to a slow device, it 
puts that list of commands into a buffer, which is a special part 
of the RAM. The device peeks at the commands in the buffer. While 
the device reads and obeys those commands, the CPU switches 
attention to other programs or tasks.
  The buffer's in RAM. But which RAM? In a traditional computer 
system, the buffer's in the main RAM. In a more modern system, 
each I/O device contains an auxiliary RAM, just big enough to 
hold the device's buffer.
  Buffers are used not just for printers and terminals but also 
for disk drives and tape drives. Each slow device needs its own 
buffer.
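The buffer idea can be sketched in Python as a simple queue. This is a toy model, not real device firmware (the `cpu_send` and `device_drain` functions are invented for illustration): the key point is that the CPU returns immediately after filling the buffer, while the slow device drains it later at its own pace.

```python
# Toy buffer model (illustrative sketch, not real device firmware).
from collections import deque

printer_buffer = deque()    # the buffer: a reserved chunk of RAM

def cpu_send(commands):
    # The CPU parks its commands in the buffer and returns at once,
    # free to switch attention to other programs and tasks.
    printer_buffer.extend(commands)
    return "CPU free"

def device_drain():
    # The slow device reads and obeys buffered commands at its own
    # speed, long after the CPU has moved on.
    done = []
    while printer_buffer:
        done.append(printer_buffer.popleft())
    return done
```

In a traditional system this buffer sits in main RAM; in a more modern system it would live in the device's own auxiliary RAM, but the queue-in, queue-out behavior is the same.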
  Spooling Suppose you share a maxicomputer with other users. 
What if your program says to write answers on the printer, but 
somebody else is still using the printer?
  In that situation, the CPU pretends to obey your command: it 
pretends to send your answers to the printer. But actually it 
sends your answers to a disk instead. The CPU keeps watching the 
printer: as soon as the other person finishes using the printer, 
the CPU automatically copies your answers from the disk to the 
printer and erases them from the disk.
  That technique of putting your answers temporarily on disk is 
called spooling; it's handled by a part of the operating system 
called the spooler. Spooling is used mainly for the printer but 
can also be used for other output devices, such as the plotter. 
The word ``spool'' is an abbreviation for ``Simultaneous 
Peripheral Operations On-Line''; spooling handles people who try 
simultaneously to use the printer and other on-line devices.
  If several people try to use the printer simultaneously, the 
spooler stores all their answers on disk temporarily and then 
prints those answers one at a time. The answers waiting on the 
disk are called the printer's waiting-line or the printer's 
queue or (even more briefly) the print queue. If the printer is 
slow and many people try to use it simultaneously, the print 
queue can get quite long.
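A print queue is just a first-in, first-out line, which makes it easy to sketch in Python. This is a toy model (real spoolers live inside the operating system; the `spool` and `printer_run` functions are invented for this sketch):

```python
# Toy spooler model (illustrative sketch, not a real OS spooler).
from collections import deque

print_queue = deque()   # jobs spooled to disk, waiting for the printer

def spool(user, pages):
    # Pretend to print: really just park the whole job in the queue.
    print_queue.append((user, pages))

def printer_run():
    # The real printer drains the queue one complete job at a time,
    # so two users' output never gets mixed together mid-page.
    printed = []
    while print_queue:
        printed.append(print_queue.popleft())
    return printed
```

Because each job is queued whole, the paycheck-printing mess described below simply can't happen: your answers wait their turn instead of landing in the middle of someone else's output.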
  Suppose you want to use the printer, but somebody else is using 
it to print paychecks. If your operating system lacks a spooler, 
it says ``please wait for the other person to finish'' or else 
makes a mess -- by printing your answers in the middle of the 
other person's paycheck! Moral: get a spooler!

             Personality
  The operating system can give the computer a personality, so 
the computer acts as a conversationalist, a drudge, or a boss.
  Conversationalist A ``conversationalist'' is a computer you 
chat with by using a keyboard and screen. The conversation might 
be about a computer game you're playing, an airplane seat you're 
reserving, or your personality (if the computer is trying to 
play therapist). Throughout the conversation, what you say to 
the computer is brief, and so are the computer's replies. 
Typically, you're asking the computer for more output, or the 
computer's asking you for more input.
  Instead of calling such a computer a ``conversationalist'', 
computists call it an interactive computer. If it handles many 
programs simultaneously, with each program on a different 
terminal, it's called a timesharing computer. If its memory 
contains lots of data and it answers questions about the data, 
it's called a data-retrieval system or customer-inquiry system 
or commercial real-time system.
  Drudge A ``drudge'' is a computer that takes in piles of data 
and spits out piles of answers. Traditionally, the data comes on 
cards (or forms that are scanned), and the answers are printed 
on paper. While the computer works, it asks you no questions; 
you don't need to chat with it. It just does its job faithfully, 
blue-collar style.
  If the drudge lets you start feeding it a second program 
before the first program has finished, it's a batch processor. 
If the batch processor starts working on the second program 
before the first program has finished, so that it's working on 
both programs simultaneously, it's called a multiprogramming 
batch processor.
  Boss A ``boss'' is a computer that monitors another machine 
(such as a burglar-alarm system, a microwave oven's timer, a 
series of synchronized traffic lights, an electronic organ, a 
life-support system giving periodic intravenous injections to an 
unconscious patient, or a computerized assembly line). The 
computer makes sure everything's running smoothly. Typically, 
you don't see the computer: it hides inside the machine it 
monitors.
  Instead of calling such a computer a ``boss'', computer ads 
call it a controller or a scientific real-time system.