Top 10 Antivirus Software

sunbiz_3000 said:
I have been reading AV-Comparatives for the past 3 hours and studied their different sample sets and techniques. It's a test mainly based on zoo samples, which IMO is a good way, but isn't the best way. They could in fact try modifying some existing virus source code and writing a few new samples and then test... but they say it's unethical!

Also, I couldn't find their sample set anywhere to try and replicate their test results... Do you know where I can find their sample set? I found AV-Test.org · Tests of Anti-Virus- and Security-Software to be a very good resource while reading this site... They are a lot more scientific and have provided their sample set... I'll be trying it out and matching their results!

Modifying the code of the malware essentially creates new malware. This is looked down upon by the AV industry and serves very little purpose in the real world, except when you are trying to study the heuristic engines or variant detection of the various AVs.

Regarding AV-comparatives, you cannot get their sample set. They do not base their sample set on the WildList or any other such organization. Only the vendors who participate in the AV-comparatives tests can get the samples. Every sample is run and verified to be executable in order to prevent false detections. The files which do not execute correctly are disposed of.

For the heuristics (Retrospective/Proactive) tests, the products are tested with 3-month old updates and the samples used are the new malware samples they have received during that 3-month period. This way, an effective gauge can be made about which AV has the best heuristics by simply counting the heuristic detections since the malware is new and signatures have not been updated to detect these.
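The retrospective-test idea above can be sketched in a few lines of Python. All dates and scan results here are made up purely for illustration; the point is that any detection among samples first seen *after* the signature cutoff must be heuristic:

```python
from datetime import date

# Hypothetical illustration: a product's updates are frozen at a cutoff date,
# then it is scanned against malware that appeared after that cutoff.
SIGNATURE_CUTOFF = date(2006, 5, 1)  # updates frozen 3 months before the test

# (sample_first_seen, detected_by_frozen_product) - made-up data
scan_results = [
    (date(2006, 6, 10), True),
    (date(2006, 7, 2), False),
    (date(2006, 7, 30), True),
    (date(2006, 6, 21), False),
]

# only samples newer than the frozen signatures count toward the test
new_samples = [r for r in scan_results if r[0] > SIGNATURE_CUTOFF]
heuristic_hits = sum(1 for _, detected in new_samples if detected)
rate = heuristic_hits / len(new_samples)
print(f"Proactive detection rate: {rate:.0%}")  # 2 of 4 -> 50%
```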

Rest assured that I place a lot of faith in AV-comparatives as well as the individual tester who prepares the test results. :)

When I prepare a list of favourite AVs, I do not list them in any specific order. I take the following things into consideration:

- Price and licensing

- Support

- Detection rates according to tests

- Features

I also have my own sample set which I use only for the purpose of sending undetected samples for analysis to various AV companies. I do not judge any AV's performance by how much they detect in my sample set.

@sunbiz: KAV has been known to have poor heuristics. Besides, repacking should not have affected KAV at all, since KAV's unpack engine and signature engine are independent of each other. The KAV engine should have detected the packer first and then looked up the signature database. As a result, there should be zero difference in detection rate, because the signatures are not dependent on the packer. Your findings suggest a possible misconfiguration, or that KAV didn't support some of the packers.
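A toy model of that "detect the packer, unpack, then look up the signature" pipeline. Everything here is hypothetical (real engines are vastly more involved); it just shows why repacking with a *supported* packer shouldn't change the result, while an unsupported packer causes a miss:

```python
import hashlib

# Hypothetical signature database: hashes of the unpacked payload.
PAYLOAD = b"content"
SIGNATURES = {hashlib.md5(PAYLOAD).hexdigest()}

def fake_pack(payload: bytes, packer: str) -> bytes:
    # stand-in for UPX/PECompact: prepend a recognizable packer header
    return packer.encode() + b"|" + payload

def unpack(blob: bytes, supported=("UPX", "PEC")) -> bytes:
    # keep stripping packer layers as long as the header is recognized
    while True:
        header, _, rest = blob.partition(b"|")
        if header.decode(errors="replace") in supported:
            blob = rest
        else:
            return blob

def detect(blob: bytes) -> bool:
    # unpack first, THEN consult the signature database
    return hashlib.md5(unpack(blob)).hexdigest() in SIGNATURES

assert detect(PAYLOAD)                                       # unpacked
assert detect(fake_pack(PAYLOAD, "UPX"))                     # single pack
assert detect(fake_pack(fake_pack(PAYLOAD, "PEC"), "UPX"))   # dual pack, either order
assert not detect(fake_pack(PAYLOAD, "NEWPACK"))             # unsupported packer -> miss
```

The last assertion is the interesting case: the signature itself is fine, but the scanner never reaches the payload, which matches the "KAV didn't support some packers" hypothesis above.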
 
Darth_Infernus said:
Modifying the code of the malware essentially creates new malware. This is looked down upon by the AV industry and serves very little purpose in the real-world

I think I have to completely disagree with you on this one! Modifying the code in the way I did, by just adding some dummy structs or changing pointer locations, doesn't create new malware. It's just a variant of the original malware, the kind that is so common these days. Remember that more than 60% of the malware today (I never really cared for defining what the world means by viruses, so I may have used the terms interchangeably) is just variants of older samples (nothing scientific; this is my guess based on looking at the virus databases of AV companies and other sources like vscdb, viruslist, Secunia, etc.). Nothing to be looked down upon; it only helps create better protection against variants!

Darth_Infernus said:
Rest assured that I place a lot of faith in AV-comparatives as well as the individual tester who prepares the test results. :)

Do you know the tester personally? Because that would be really helpful in getting some more insight into what samples he/she is testing on. It would also be helpful for analyzing a few samples I have compiled recently!

Darth_Infernus said:
I also have my own sample set which I use only for the purpose of sending undetected samples for analysis to various AV companies. I do not judge any AV's performance by how much they detect in my sample set.

I would really like to know what your favourite AV is and how it performs on your sample set. After that we could assemble a group of guys (I'm sure a few volunteers from TE would be excited to help) and do an AV review (and post our results across the net). We could mix our samples and create a few more original sample viruses.

Darth_Infernus said:
@sunbiz: KAV has been known to have poor heuristics. Besides, Repacking should not have affected KAV at all since KAV's unpack engine and signature engine are independent from each other. The KAV engine should have detected the packer first and then looked up the signature database. As a result there should be zero detection rate difference because the signatures are not dependent on the packer. Your findings suggest a possible misconfiguration or the fact that KAV didn't support some packers.

I'm really confused about why KAV missed the 3 viruses after repacking while detecting the other 400. I used UPX and PECompact for dual packing, but in random order: sometimes UPX first, sometimes PECompact first. The only assumption I made was that it may have happened because these were P2P worms (Cibyz/Kibyz, an Oror variant, Zafi D) and required PowerShell or some other execution environment that must have been lost (making the code useless) in the packing...
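The randomized dual-packing step described above could be sketched like this. Note the heavy caveat: `upx` is a real command-line packer, but `pec2` is only a stand-in name for a PECompact command-line front end, and the commands are built here, not executed:

```python
import random

# Hypothetical sketch: apply both packers to each sample, in a random order.
# Command strings are illustrative only; 'pec2' is a placeholder name.
def pack_order(sample: str, rng: random.Random) -> list:
    packers = [f"upx -9 {sample}", f"pec2 {sample}"]
    rng.shuffle(packers)  # sometimes UPX first, sometimes PECompact first
    return packers

rng = random.Random(42)  # fixed seed so the run is reproducible
for cmd in pack_order("worm_sample.exe", rng):
    print(cmd)
```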
 
Also, one thing I had missed in the earlier discussions was the use of 2 AVs or 2 AV scanners... I had once found G Data's AVK, which combined KAV's and BD's scan engines. I never had the good luck to try it out, but I'm sure it'll have the best detection; I didn't find anything about its performance and such, though... Has anyone here had a look at it?
 
sunbiz_3000 said:
I think I have to completely disagree with you on this one! Modifying the code in the way I did, by just adding some dummy structs or changing pointer locations, doesn't create new malware. It's just a variant of the original malware, the kind that is so common these days. Remember that more than 60% of the malware today (I never really cared for defining what the world means by viruses, so I may have used the terms interchangeably) is just variants of older samples (nothing scientific; this is my guess based on looking at the virus databases of AV companies and other sources like vscdb, viruslist, Secunia, etc.). Nothing to be looked down upon; it only helps create better protection against variants!

The reason this kind of code modification is considered invalid is the AV vendors' concern that the modifications could be doctored to suit a particular vendor's heuristics engine in order to get that AV the best result. Because of this, it is looked down upon by the industry, and hence the testers use only the new variants and new samples that have appeared in a widespread manner and test them against AVs with old updates. Other than that, it could theoretically work, but then you are not releasing these samples in the wild and seeing which AV protects against what.

Do you know the tester personally? Because that would be really helpful in getting some more insight into what samples he/she is testing on. It would also be helpful for analyzing a few samples I have compiled recently!

Well, there are multiple testers actually, and I know one of them, but not in the sense that we sit down and share coffee every day. :D However, I can get in contact with him. They are very strict on their samples policy: you can only get the samples if you work for an AV vendor.

I would really like to know what your favourite AV is and how it performs on your sample set. After that we could assemble a group of guys (I'm sure a few volunteers from TE would be excited to help) and do an AV review (and post our results across the net). We could mix our samples and create a few more original sample viruses.

AV reviews involve more than just detection rates. ;)

Anyway, I have tried only four AVs on my sample set: BitDefender, AVG Anti-Malware (basically AVG + Ewido AntiSpyware), Dr.Web, and the Dr.Web clone Virus Chaser.

Based purely on detection rate, BitDefender performs the best. Kaspersky should probably do better, but I have not checked it out. AVG Anti-Malware is second (the free version has a lower detection rate than the paid version I was using), and Dr.Web is third.

There is a small difference between the detection rates of Dr.Web and Virus Chaser, both of which use the Dr.Web engine: Virus Chaser detects about 30 samples fewer than Dr.Web. Similar small differences have been observed in Virus Bulletin between Dr.Web and Virus Chaser, as well as between KAV and eScan, both of which use the KAV engine.

I'm really confused about why KAV missed the 3 viruses after repacking while detecting the other 400. I used UPX and PECompact for dual packing, but in random order: sometimes UPX first, sometimes PECompact first. The only assumption I made was that it may have happened because these were P2P worms (Cibyz/Kibyz, an Oror variant, Zafi D) and required PowerShell or some other execution environment that must have been lost (making the code useless) in the packing...

Probably an error occurred during unpacking. KAV is very sensitive to the version of the packer used, and even a .01 change in the packer's release version will not be reliably unpacked by KAV until an update is released to cover it (luckily, KAV updates every hour).

AV programs with a generic unpacker should not have such problems in general. NOD32 relies a lot on its generic unpacker, and its static (i.e. normal) packer support is quite low; the generic unpacker does most of the job. On the other hand, BitDefender has the next best unpack engine after KAV, and also has a generic unpacker (malware detected via the generic unpacker is reported by BitDefender as GenPack: <Malware Name>).

Also, one thing I had missed in the earlier discussions was the use of 2 AVs or 2 AV scanners... I had once found G Data's AVK, which combined KAV's and BD's scan engines. I never had the good luck to try it out, but I'm sure it'll have the best detection; I didn't find anything about its performance and such, though... Has anyone here had a look at it?

GDATA AVK 2006 has a good interface and offers solid protection, but it is very heavy. The cost is also quite high, and the support is not up to par. GDATA AVK 2007 will retain the Kaspersky engine but replace the BD engine with Avast.
 
:cool1: Interesting. I used Norton once; ditched it about four years ago. Memory hog, draggy system. For a couple of years I used my school's enterprise McAfee, and that was OK. Using Kaspersky right now, and I like it OK, but I've heard a lot of good things :hap2: about NOD.

:cool2:

Els
 