The website plays up the anti-virus application of the software because that's what distinguishes it from all the tools whose sole purpose is to check for corruption...
His informal definition does not require that the original functionality of the host be maintained, and his formal definition states that all self-replicating programs are viruses... neither of these agrees with the assertion you've put forward that viruses must maintain the original functionality of the host program...
Maybe not a contradiction, since he makes no mention one way or the other. But it wasn't omitted 'out of hand'; it was omitted because it doesn't matter whether or not host function is maintained. The only part the host must play is that attempts to execute the host execute the virus. The definition you use puts further constraints on the virus, and you somewhat contradict what is implied by Cohen's definition, where he saw fit to omit any mention of host function being retained.
In the arena of recovering files after they have been infected by a virus, I would like to use the same definition as you - any complaint about not being able to recover some files could be countered with "that was done by a trojan, not a virus - see my definition of virus". :))
As an aside, I happened across an article about this same vulnerability in MS Word and Excel.
formatting link
All versions of a (password protected) saved document use the same initialization vector, so if you get hold of two different versions of the same document - or a document which has been "saved as .." and subsequently edited - and XOR them against each other, you can read a portion of the output with WinHex (or whatever). I just did a test with MS Word 2000 and could read more or less the complete text, but apparently it applies to all versions of Word and Excel which use RC4.
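The flaw described above is keystream reuse: RC4 is a stream cipher, so encrypting two documents with the same key and the same initialization vector means both are XORed against the same keystream, and XORing the two ciphertexts cancels the keystream out completely. A minimal sketch in Python (the key and the two "document versions" here are made up for illustration):

```python
def rc4_keystream(key: bytes, n: bytes) -> bytes:
    """Generate n bytes of RC4 keystream for the given key."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    out = bytearray()
    i = j = 0
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Two versions of the "same document", encrypted with the same key
# and IV, i.e. the same RC4 keystream -- the weakness in question.
p1 = b"Meet at the old warehouse at noon."
p2 = b"Meet at the new safehouse at nine."
ks = rc4_keystream(b"secret-key", len(p1))
c1, c2 = xor(p1, ks), xor(p2, ks)

# XOR of the two ciphertexts equals XOR of the two plaintexts:
# the keystream has cancelled out entirely.
assert xor(c1, c2) == xor(p1, p2)
# Wherever the versions agree the result is zero bytes, exposing
# the document's structure; where they differ, classic two-time-pad
# analysis recovers the text.
assert xor(c1, c2)[:8] == b"\x00" * 8  # shared prefix "Meet at "
```

With real Word files there is header and formatting data around the text, but the principle is the same: a hex viewer over the XOR of the two files shows readable fragments wherever one version's content is known or guessable.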
The above implies nothing about the host's function still being intact. The 'inclusion' of something in a set does not imply that something else was excluded - nor does it imply that it wasn't. You added that the original host function must be retained in the resulting infected program, thus further constraining the set of what you will call a virus.
With your obvious grasp of technical material it just surprised me that you use this as a definition for virus. At least this time I don't feel so all alone in my reluctance to agree with it. In any future discussions about viruses we'll just have to keep in mind that you use a rather unique definition.
I first used that definition in my presentation (and paper) at the NCSA Conference of '91, in front of almost everyone in the field at that date. I don't remember any objection to my interpreted definition from the audience, not then, nor later on in endless discussions on generics vs. classic AV. On the contrary: quite a few adopted my interpretation and based similar features upon it (Thunderbyte, Symantec, BRM - Fifth Generation, Eliashim).
From you it almost sounds like being excommunicated. ;-)
Regards, Zvi
-- NetZ Computing Ltd. ISRAEL
formatting link
formatting link
(Hebrew) InVircible Virus Defense Solutions, ResQ and Data Recovery Utilities
And yet the paper I quoted the definition from was titled "EICAR 2000 Best Paper Proceedings", and the author saw fit to explicitly state that host functionality was not an issue instead of simply omitting any mention of it - probably to avoid another misinterpretation such as the one you exhibit. In our previous discussion you also seemed to have misinterpreted something about first generation viruses - just because they are deemed not worthy of inclusion in test beds for AV comparatives does not mean they fail to meet the definition of virus. Sure, some can't be treated the same in the AV arena because they are not "infected" per se, but they do meet the definition. You were correct about them being "best treated" as trojans - but that doesn't make them trojans and not viruses.
Not so bad really - it's not like we're ever likely to need to discuss those viruses that don't preserve host functionality, since most of them do. But if one fails to achieve this in some spreading attempts, it (the corrupted host with added viable viral code) should still be considered a virus if the attempted execution of the host results in execution of the virus code.
Our previous discussion was about whether overwriters should be considered viruses or not. Gen 1 samples were introduced to the discussion from Bontchev's paper on how to maintain a virus collection. I have no particular position in regard to these samples. You may consider them valid viruses according to the definition, including my stringent one. Gen 1 samples are actually a "do nothing" executable to which the virus code was added. The only difference between a gen 1 file and a real infection is that the latter was created by a spontaneous infection process, while the former was created artificially, through compilation. Note that spontaneity isn't required anywhere in the definition of virus.
You are confusing droppers with genuine first generation virus samples.
Do you have difficulty admitting that they all do? ;-) Taking it one step further, can you formulate in words what that implies? Could it be that preserving the host functionality is inherent to "virus" conduct? It's called the scientific method, if you didn't know. ;)
If it systematically corrupts rather than infects, then it's not a virus. If it exhibits irregular behavior, e.g. some instances fail to infect while others succeed, then it's a buggy virus; and if the botched infection resumes normal viral behavior when executed, then it's a singularity and it's unimportant what you call it.
Regards, Zvi
Specifically those overwriters that don't retain host functionality. I say that they should still be considered viruses because they still fit the definition(s) - except for yours :))
IIRC you compared overwriters to first generation viruses because you felt that overwriters are essentially first generation viruses on each iteration - and hence more akin to trojans than viruses. Detectors don't generally find this sort of thing when they are geared specifically to recognize "infected" files of the type your definition indicates, so I can see why you would want to exclude them via your definition.
I suppose so, considering what you say below.
Since there was no functionality to be preserved in what is now the host "file" your definition works well enough. :)
Agreed.
Even droppers are viruses if they create a copy of the viral code they 'contain' in another executable area.
Maybe I 'do' need some clarification of the terms "seed file", "germ file", and "dropper file". But it seems to me that any of them would be a virus if it contained the viral code and its execution resulted in that code being replicated into another program. AV may well be best applied to subsequent iterations (the spontaneous infection process), but changing the definition of virus so that failing to cover them is not akin to failing to detect a "virus" only serves to confuse.
A batch file (.bat) that overwrites other batch files with itself does exist - so I would have difficulty ignoring that fact. Without added "host functionality retention" programming it would not be a very sophisticated virus, but it is still a virus.
One could argue that other "virus conduct" such as avoiding multiple infections of the same program should be included in a definition. Just because it is a great advantage to have that conduct does not mean that conduct should become a part of the definition.
It is bad science to ignore existing things just because they aren't often seen.
If the corruption prevents the execution of the viral code, then it is not a virus. If the corruption only negatively affects the original host program's functionality and yet the viral code still executes correctly, then it 'is' a virus.
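The distinction being argued over can be made concrete with a toy model - no real programs involved, just strings standing in for file contents and a function standing in for "execution" (the names and the marker string are my own invention for illustration):

```python
# Toy model of infection semantics: files are strings of "code",
# and "executing" a file interprets a viral-code marker if present.

VIRUS = "[viral-code]"

def infect_prepend(host: str) -> str:
    """Prepender: viral code runs first, host body is kept intact."""
    return VIRUS + host

def infect_overwrite(host: str) -> str:
    """Overwriter: viral code replaces the host body entirely."""
    return VIRUS

def execute(file: str) -> list:
    """Return the observable behavior of 'running' the file."""
    actions = []
    body = file
    if body.startswith(VIRUS):
        actions.append("virus replicates")
        body = body[len(VIRUS):]
    if body:
        actions.append("host runs: " + body)
    return actions

host = "print-payroll"
a = execute(infect_prepend(host))    # virus runs, then host still works
b = execute(infect_overwrite(host))  # virus runs, host behavior is gone

assert a == ["virus replicates", "host runs: print-payroll"]
assert b == ["virus replicates"]
```

Under a Cohen-style definition both infected files are viral: attempting to execute the "host" executes the viral code. Only the stricter definition, which also demands that host functionality be retained, excludes the overwriter - which is exactly the disagreement in this thread.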
Incidentally, it is also not a virus (TM) if it corrupts the parent and only produces one offspring. Kurt mentioned in an earlier thread that he doesn't think this "non-overlapping" requirement is entirely necessary - but in a CA program (Life) you could have 'gliders' that reposition themselves (one unit diagonally every four generations), and that would differ from a somewhat richer CA pattern that replicated itself to another area of the 2D tape without the new position overlapping the old position.
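The repositioning pattern referred to above is Life's well-known glider. A minimal sketch (plain Python; the set-of-live-cells representation is my own choice) shows that after four generations it reappears shifted one cell diagonally, with the new position overlapping the old - motion rather than non-overlapping replication:

```python
from collections import Counter

def life_step(cells: set) -> set:
    """One generation of Conway's Life on an unbounded grid.
    Live cells are (x, y) tuples in a set."""
    # Count how many live neighbors each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in cells)}

# The standard glider (y increases downward):
#   .O.
#   ..O
#   OOO
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

g = glider
for _ in range(4):
    g = life_step(g)

# After 4 generations the same pattern reappears, translated (+1, +1):
shifted = {(x + 1, y + 1) for (x, y) in glider}
assert g == shifted
# The new footprint overlaps the old one, so under the non-overlapping
# requirement this is movement, not self-replication.
assert g & glider
```

A true CA replicator, by contrast, would leave a complete copy in a disjoint region while the original persisted, which is the distinction the non-overlapping requirement is meant to capture.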
Some programmatically and intentionally (by the writer's intention) exhibit such behavior to make emulation-based detection more problematic. Something being "buggy" implies it was not what the author intended.
Or do you have a unique definition for "buggy" as well? :))
Interesting - could you explain this more? It seems that "botched infection" implies that it isn't a viable offspring, and yet you say execution yields viral behavior, which seems to indicate it 'was' viable. Is it the 'host functionality retention' that is "botched" in your statement above? If so, the "infection" wasn't botched - only the attempt to retain host functionality was. So I would call it a virus, because I don't use your definition of virus.