Thanks for your explanation. It made things clearer. I hope you won't mind a few more questions.
What is the "background" on your computer? (In S/360-DOS days, we had background and foreground partitions.) Why can't we, as the owners of the PC, control what is and what is not run in the "background"?
I suspect the answer to my question is that PCs today are highly automated, which allows much of this junk to happen in the first place. In its simplest state, a computer would require someone to physically load and then execute each and every program desired. Modern machines are automatic. That is, if you're browsing a website that sends you a .PDF file, your browser program automatically brings up the Adobe program to read it. I presume there's a lot of other stuff we lay people don't even know about going on, and the hackers take advantage of that underworld.
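The automatic hand-off you describe is essentially a lookup table from file type to handler program. Here's a minimal sketch in Python of the idea; the table entries and program names are made-up illustrations, not any real browser's configuration:

```python
# Minimal sketch of browser-style file-type dispatch. The handler
# table and program names below are hypothetical examples.

HANDLERS = {
    ".pdf": "pdf_reader",    # e.g. the Adobe program
    ".txt": "text_viewer",
    ".html": "web_browser",
}

def dispatch(filename):
    """Return the handler a browser would launch for this file type."""
    for extension, handler in HANDLERS.items():
        if filename.lower().endswith(extension):
            return handler
    return "save_to_disk"    # unknown types fall back to a plain download

print(dispatch("report.PDF"))   # pdf_reader
```

The point is that nobody sits at the keyboard choosing the program; the table chooses it, and anything that can influence the table (or exploit a bug in a handler) inherits that automation.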
I heard that M/S's new "Vista" will be _less_ automatic as a safety measure. I sure hope so.
Could you explain what "NAT" is and does?
Why is this allowed to even happen? This is one of my big complaints about the Internet as it's presently set up: it's designed to be so "open" that anyone can do anything. The computer dreamers and idealists want it this way. This was fine in the narrow world of the very early days, but not fine in the anonymous world of today. (Other explanations would be appreciated.)
I don't understand why bugs would allow this to happen. To "answer the door" means that (1) the computer program has to know when the doorbell is rung, (2) execute a routine to answer the doorbell, and (3) respond to the doorbell request. In other words, there is software intentionally written and included to respond to outside probes. Since probes are dangerous, why do we allow this? Why don't we disable the entire "doorbell" process?
Again, I suspect the answer is this process makes for easy automation, but maybe you or others could explain it better.
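Your doorbell analogy is close to literally how it works: a program asks the operating system for a network "port" and then waits for callers. A minimal sketch in Python using the standard socket library (the loopback address and the commented-out reply are just illustration):

```python
import socket

# Minimal sketch of the "doorbell": a program that claims a network
# port and waits for connections. If no program is listening on a
# port, probes to it are simply refused -- so disabling the doorbell
# is exactly "don't run the listener".

def open_the_doorbell():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))    # port 0 = let the OS pick a free port
    server.listen(1)                 # (1) wait for the bell to ring
    chosen_port = server.getsockname()[1]
    # A real service would loop here:
    # conn, addr = server.accept()   # (2) answer the door
    # conn.sendall(b"hello")         # (3) respond to the request
    server.close()                   # doorbell disconnected again
    return chosen_port

port = open_the_doorbell()
print("was listening on port", port)
```

So your instinct is right: each listener is a deliberate choice, and turning off services you don't need (or putting a firewall in front of them) is exactly "unplugging the doorbell." The catch is that some doorbells are genuinely wanted, and bugs in the answering routine are what attackers exploit.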
I know computers have a start-up routine; I have changed mine for DOS purposes. But why should the start-up routines be allowed to be modified automatically? Is it that hard to require the human to modify the routine himself (or authorize said modification)?
Maybe we need operating systems that make it impossible for the human not to notice things are happening? Or would that create a flurry of warning messages? (I must admit I turned off my browser's warnings about confidential start and confidential stop of data. This comes up when I log on or enter an order on-line.)
That really bugs me. As far as I know, Internet browsing software should be READ ONLY with restrictions. It should be extremely limited in what it allows an external site to do on my machine. I dislike the idea of any site's -- even a 'trusted one' -- running their programs on my machine. How do I know their programs are not buggy even from a "trusted" site?
Why do browser writers create this kind of capability?
This is very frustrating. When I got my new machine at work I disabled all that stuff. Then I found I couldn't browse anywhere, since everyone required it. Why, I don't know; it seemed sites were plenty able to present information in an attractive way before those fancy features. Further, my employer has me use sites that require fancy stuff. At least my browser clearly warned me of the risks when I turned that on.
I'm still confused, but I think it's as you said -- people want features.
Computers do not _have_ to allow external entities to have control at all. The developers have chosen to include this for "service and features" and failed to put in proper controls at the start, IMHO. A PC on a network, for instance, should not accept any networked instructions or upgrades without a security key. What's to stop some well-intentioned but incompetent user from issuing his own upgrades over the network and screwing everyone up?
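The "security key" you describe is a real technique: the receiver accepts a networked update only if it carries a keyed checksum that only authorized senders can produce. A minimal sketch using Python's standard hmac library (the key and the update payload are made up for illustration):

```python
import hmac
import hashlib

# Hypothetical shared secret installed by the machine's owner.
SHARED_KEY = b"example-secret-key"

def sign_update(payload: bytes) -> bytes:
    """Authorized sender attaches a keyed checksum (HMAC) to the update."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()

def accept_update(payload: bytes, signature: bytes) -> bool:
    """Receiver applies the update only if the checksum verifies."""
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"patch contents"
good_signature = sign_update(update)
print(accept_update(update, good_signature))          # True
print(accept_update(update, b"\x00" * 32))            # False: unauthorized sender
```

With this in place, your well-intentioned but incompetent user can broadcast all the "upgrades" he likes; machines without his signature simply refuse them. Modern update systems do use variants of this idea (usually with public-key signatures rather than a shared secret), though, as you say, it was not built in from the start.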
I'll note in contrast that in IBM's System/360, critical functions of the operating system had to be done in 'supervisor state', which was strictly controlled by hardware. You could submit and execute an application program that does damage, but you couldn't touch the operating system. Application programs were subject to various checks and restrictions, including hardware blocks that were included in System/360 from day one.
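The principle can be illustrated with a toy simulation: a mode bit decides whether a "privileged" operation is allowed. On real hardware (S/360 then, every mainstream CPU now) this check is done by the processor itself, not by software; this Python sketch only shows the shape of the idea, and the instruction name is invented:

```python
# Toy simulation of supervisor state vs. problem state. On real
# hardware the mode bit and the trap are enforced by the CPU; this
# sketch merely illustrates the principle.

class MachineState:
    def __init__(self):
        self.supervisor_mode = False    # applications run with this off

    def set_storage_key(self):
        """Stand-in for a privileged instruction (hypothetical name)."""
        if not self.supervisor_mode:
            raise PermissionError("privileged operation in problem state")
        return "storage key set"

cpu = MachineState()
try:
    cpu.set_storage_key()               # an application tries it: blocked
except PermissionError as trap:
    print("trap:", trap)

cpu.supervisor_mode = True              # only the OS runs in this state
print(cpu.set_storage_key())            # allowed in supervisor state
```

Modern PC processors do have the same mechanism (user mode vs. kernel mode); the difference is in how carefully the software above it uses the protection.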
But the result is that the systems-maintenance effort for an S/360 is far greater than that required for a PC. Presumably few owners would want to bother doing all the work necessary.
I agree that it's complex. But I disagree it's insurmountable.
I am far from an expert. But IMHO too much sophistication was rushed into the marketplace too fast without adequate protection built in. IMHO the "young turks" didn't know their history and should've.
IBM's first real operating system for S/360, known as "OS", turned out to be a disaster. It was extremely slow, a resource hog, and totally unsuited for the low-end machines it was intended for. They couldn't release it as-is. They developed some alternatives (DOS, BOS, BPS, TOS) so people could at least use the new hardware, and delayed everything for about a year, which quietly came close to putting IBM into bankruptcy (lots of costs, no revenues). The point is that they chose to wait. They probably should've waited even longer than they did; I think it took a while for the early production OS to be decent. Modern developers should've learned from that experience: "The birth of a baby takes nine months no matter how many women are involved," and "adding people to a late project only makes it later," as the manager of OS put it.
In the very early days of computers the users were all programmers, presumably with good intentions and skills. But by the 1960s it was clear the user community would be large, with a variety of skill levels. Computer designers put in safety checks so program misbehavior (intentional or accidental) would only hurt the responsible user, not everyone else. Things like file restrictions, time limits, and resource limits kept things under control. Some controls were enforced by the human operators, who simply wouldn't allow certain jobs to run. By the 1980s these controls were sophisticated and automated. A corporate programmer couldn't go into the payroll system and give himself a raise.
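Those mainframe-style limits do survive in a weaker form on today's systems. As one illustration, Unix lets a process cap its own CPU time so a runaway job is killed rather than monopolizing the machine; a sketch using Python's standard resource module (Unix-only, and the 60-second figure is arbitrary):

```python
import resource

# Sketch of a mainframe-style resource limit on Unix: cap this
# process's CPU time. If the process exceeds the soft limit, the
# kernel signals it; batch systems set limits like this per job.
# (Unix-only; the 60-second value is an arbitrary example.)

soft, hard = resource.getrlimit(resource.RLIMIT_CPU)

# Lower the soft limit to 60 seconds, but never above the hard limit.
new_soft = 60 if hard == resource.RLIM_INFINITY else min(60, hard)
resource.setrlimit(resource.RLIMIT_CPU, (new_soft, hard))

print("CPU-time limit (seconds):", resource.getrlimit(resource.RLIMIT_CPU)[0])
```

The mechanism exists; what the PC world largely skipped was the surrounding discipline of operators and policies that decided which jobs got to run at all.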
What I don't understand is why the PC world, especially when used in networking and Internet service, failed to adopt the same controls the mainframe world did.
Thanks again for your explanations!
[public replies please]