For our 2025 study of PC and mobile antivirus functionality and performance, we designed new evaluation factors based on our earlier detection-performance analysis and on input from end users and security stakeholders. Figure 1 shows a categorized list of the factors our research team considers valuable in commercial antivirus products. We focused in particular on “Rapid Updates and Maintenance” because of its impact on detection performance, which we tested repeatedly over an extended period.

Figure 1. Valuable factors for PC/mobile antivirus
Before conducting the performance tests, our research team gathered feedback from individual users and from security personnel at various companies and organizations. We incorporated this feedback into the study by summarizing the questions we received through our blog, related press releases, and direct contact. Figure 2 displays the main questions we received.

Figure 2. Top questions about antivirus
This antivirus functionality and performance test was conducted on PCs and mobile devices, with the experimental scenarios outlined below.
PC
1) Quarterly real-time and download detection accuracy for newly collected malware (Q2-Q4 24)
2) Comparison between paid and free versions of antivirus software
3) Comparison of executable (EXE), document (doc, ppt, xls, pdf, etc.), and ransomware detection performance
Mobile
1) Real-time and deep-scan testing of malicious apps collected over one year
2) A/B group mobile antivirus test (grouped by whether the vendor provides its engine to VirusTotal)
Methodology for Testing PC and Mobile Antivirus Performance
We selected PC antivirus products for testing that offer individual user versions, including both paid and free options.
- (Criteria 1 for PC) Antivirus products offering both free and paid versions
- (Criteria 2 for PC) Products compatible with Windows that hold a significant domestic and international market share
⇒ 9 products were chosen from around 20 candidates for testing
- (Criteria 1 for Mobile Devices) Products that are top downloaded and top-rated
- (Criteria 2 for Mobile Devices) Products that can be installed on Android operating systems and tablets
- (Criteria 3 for Mobile Devices) Six products, three in each group, split by whether the vendor provides its virus engine to VirusTotal
⇒ 6 products were selected from 11 candidate products for testing

Figure 3. Method for selecting target products for antivirus testing
(First sample) We collected malware with our own collection tool to compose a large sample set, randomly selecting from the latest malware gathered over the past three months without applying any specific selection filter.
(Second sample) A sample set for mobile antivirus testing. We used APK files (application installation files) collected throughout 2024 from various sources, including VirusTotal, security institutions, and companies. From these, we selected the 100 samples with the highest detection rates as determined by the VirusTotal engines.
(Third sample) We divided the first sample set into three groups (EXE, DOC, and Ransomware) using the file-extension information (EXE, DOC) reported by the VirusTotal engines, and assigned to the ‘Ransomware’ group the samples that the largest number of engines labeled as ransomware. A minimal categorization sketch follows.
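The snippet below is a hypothetical sketch of this grouping step. The record format (file name plus per-engine detection labels) and the threshold of five “ransom” labels are assumptions made for illustration, not the exact data or cutoff used in our pipeline.

```python
# Hypothetical sketch: split samples into EXE / DOC / Ransomware groups.
# The record format and the ransomware threshold are illustrative assumptions.
from collections import Counter

DOC_EXTENSIONS = {".doc", ".docx", ".ppt", ".pptx", ".xls", ".xlsx", ".pdf"}

def categorize(sample):
    """Assign 'EXE', 'DOC', 'Ransomware', or 'OTHER' to one sample record.

    `sample` is assumed to look like:
    {"name": "a.exe", "labels": ["trojan.generic", "ransom.lockbit", ...]}
    where `labels` are detection names returned by the VirusTotal engines.
    """
    ransom_hits = sum("ransom" in label.lower() for label in sample["labels"])
    if ransom_hits >= 5:                 # threshold chosen only for illustration
        return "Ransomware"
    ext = "." + sample["name"].rsplit(".", 1)[-1].lower()
    if ext == ".exe":
        return "EXE"
    if ext in DOC_EXTENSIONS:
        return "DOC"
    return "OTHER"

if __name__ == "__main__":
    samples = [
        {"name": "invoice.pdf", "labels": ["trojan.pdf.agent"]},
        {"name": "setup.exe", "labels": ["ransom.lockbit"] * 12},
    ]
    print(Counter(categorize(s) for s in samples))
```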
Samples collected from our crawling tool and malware database
- (Sample Criteria 1) Malware from the last 3 months, comprising 1.3 million samples.
- (Sample Criteria 2) Malware categorized into EXE, DOC, and Ransomware types.
- (Sample Criteria 3) Verification of benign and malicious samples through VirusTotal queries (see the sketch below).
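For the verification step, a minimal sketch of a VirusTotal lookup (API v3 file report) is shown below. The API key placeholder, the example hash, and the “at least one engine flags it” cutoff are assumptions for illustration; the actual verification policy can be stricter.

```python
# Minimal sketch of verifying a sample hash against VirusTotal (API v3).
# The API key, hash value, and ">0 malicious" cutoff are illustrative assumptions.
import requests

VT_API_KEY = "YOUR_API_KEY"          # placeholder
VT_URL = "https://www.virustotal.com/api/v3/files/{}"

def is_malicious(sha256: str) -> bool:
    resp = requests.get(VT_URL.format(sha256),
                        headers={"x-apikey": VT_API_KEY},
                        timeout=30)
    resp.raise_for_status()
    stats = resp.json()["data"]["attributes"]["last_analysis_stats"]
    # Treat a sample as malicious if at least one engine flags it.
    return stats.get("malicious", 0) > 0

if __name__ == "__main__":
    sample_hash = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
    print("malicious" if is_malicious(sample_hash) else "benign")
```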
We deployed a test bed as illustrated in Figure 4 (top). A server that stores the test logs and controls the test devices was connected to 10 antivirus-equipped PCs and to the mobile devices via data cables. To keep test conditions consistent and prevent external interference, we used command-execution software (Controller/Agent) to carry out scenario-specific tasks such as downloading, executing, decompressing, and rebooting devices, as shown in Figure 4 (bottom). Each mobile device was prepared for testing by connecting it to a PC through ADB (Android Debug Bridge).

Figure 4. Testbed environment (top), scenario-specific test tools and systems (bottom)
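To give a feel for the Controller/Agent software, here is a rough agent-side sketch of the scenario tasks listed above. Our in-house tool is not published, so the command names, the use of `curl`, `7z`, and `adb`, and the dispatch table are assumptions made purely for illustration.

```python
# Rough sketch of an agent-side command runner for scenario tasks.
# Tool choices (curl, 7z, adb) and task names are illustrative assumptions.
import subprocess

def download(url: str, dest: str) -> None:
    # Pull a test sample onto the machine under test.
    subprocess.run(["curl", "-sSL", "-o", dest, url], check=True)

def execute(path: str) -> None:
    # Launch the sample and let the antivirus react to it.
    subprocess.Popen([path])

def decompress(archive: str, out_dir: str) -> None:
    # Extract an archived sample set.
    subprocess.run(["7z", "x", archive, f"-o{out_dir}", "-y"], check=True)

def reboot_device(serial: str) -> None:
    # Reboot a connected Android device over ADB.
    subprocess.run(["adb", "-s", serial, "reboot"], check=True)

HANDLERS = {"download": download, "execute": execute,
            "decompress": decompress, "reboot": reboot_device}

def run_task(name: str, *args) -> None:
    """Dispatch one scenario step received from the controller."""
    HANDLERS[name](*args)
```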
[Test 1] Quarterly comparison of detection rates for malware collected over the preceding three months (Q2-Q4 2025, 3 runs)
The first test scenario compares antivirus detection rates as a measure of responsiveness and maintenance. Each quarter we randomly selected roughly 300 to over 500 test samples from our collection.
We conducted both download and execution tests. In the download tests, the tester sends the specified malware to each PC through the controller. In the execution tests, the tester runs the malware directly from a folder excluded from scanning to assess how effectively the antivirus detects it at execution time. A sketch of the scoring step follows.
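One way to score the download test on the agent side is shown below: if the antivirus removes or quarantines a dropped sample within a grace period, it counts as detected. The paths and the 60-second wait are assumptions for the example, not our exact measurement logic.

```python
# Illustrative scoring sketch for the download test: samples that disappear
# (deleted or quarantined by the antivirus) within a grace period count as detected.
import os
import time

def detection_rate(sample_paths, wait_seconds=60):
    """Return the fraction of dropped samples removed by the antivirus."""
    time.sleep(wait_seconds)             # give real-time protection time to act
    detected = sum(not os.path.exists(p) for p in sample_paths)
    return detected / len(sample_paths)

if __name__ == "__main__":
    dropped = [r"C:\av_test\samples\%04d.bin" % i for i in range(300)]
    print(f"download detection rate: {detection_rate(dropped):.1%}")
```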
In this blog, we disclose only the execution (real-time) and download detection results from our quarterly experiments. If you would like the detailed analysis, please contact us via our official email, and we will be happy to provide additional information.
<Test 1 Result – execution>
In the Q2 real-time tests, product F achieved the highest detection rate, while in Q3 and Q4 product D had the highest detection rates, at 36% and 52%, respectively.

Figure 5. Quarterly malware execution test results (detection rate is different for each quarter)
<Test 1 Result – Download>
In the Q2 download tests, products E and J shared the highest detection rate at 48%. Q3 showed similar rates across products, with product D the highest at 63%. In Q4, products E, F, and J all recorded the highest detection rate at 98%.

Figure 6. Quarterly malware download test results (detection rate is different for each quarter)
[Test 2] Free antivirus vs. paid antivirus
The second scenario tests the paid and free versions of the same products. Enterprises are expected to use paid antivirus solutions, while individuals may choose between free and paid options. Our survey indicated that many individuals do not use paid products because they believe antivirus software should be free or are unwilling to pay for it. Many respondents also said that Microsoft Defender, installed by default with Windows, or a free antivirus would be adequate to protect their PCs and mobile devices. Using the same sample set, our research team tested the paid and free versions of each product side by side. The methodology is the same as before: the tester’s controller delivers malware from a centralized server to each PC simultaneously.
<Test 2 Result – Free vs. Paid Antivirus>
We evaluated the functionality and performance of both paid and free antivirus products, with the results of the comparative analysis for each product shown in Table 1.
| Features | Free Antivirus | Paid Antivirus |
| --- | --- | --- |
| Cost | Free | Paid (subscription or one-time purchase) |
| Malware detection and removal | Supported | Supported |
| Real-time detection | Limited | Supported |
| Advanced threat detection (e.g., cloud, machine learning) | Limited | Supported |
| Ransomware protection | Not provided | Supported |
| Firewall | Not provided | Supported |
| Email protection | Not provided | Supported |
| DB and feature updates | Low priority | High priority |
| Customer support (remote, Q&A, etc.) | Not provided | Supported |
| Multi-device protection | Single device | Multi-device (1-5 devices) |
| Ads | Included | None |
| System performance impact (CPU, memory, etc.) | Varies by product | Optimized |
| VPN, password manager, and other extras | Not provided | Partially supported |

Table 1. Feature comparison of free and paid antivirus products
Figure 7 shows the real-time detection results during malware execution; free products are marked with an (F) after the product name for differentiation. For products A and B, detection rates did not differ significantly between the paid and free versions. For the remaining products, however, the paid versions showed significantly higher detection performance than the free versions. Notably, the free version of product C showed almost no detection capability, and the paid version of product D detected more than twice as much as its free version.

Figure 7. Paid vs. Free Antivirus Test Results (Paid > Free)
[Test 3] Quarterly ransomware detection results
Our research team wanted to know how effectively antivirus products detect and respond to ransomware, so we designed and ran a dedicated test scenario. We published the results in our recent blog post, “Is your antivirus safe from ransomware?” After that article, we received many inquiries from readers and organizational representatives, which shows how seriously ransomware is regarded as a threat.
[Test 4] Mobile antivirus performance test (deep scan, real-time scan)
The fourth test evaluates mobile antivirus performance. The setup mirrors the PC test: each mobile device is connected to a PC with a data cable and controlled via ADB. We developed an automated system that downloads and installs malicious apps for the test, and the devices stay connected to WiFi with internet access so the mobile antivirus remains online. We test in two ways: real-time testing, where we install the app directly and monitor the detection result, and deep-scan testing, where we place the malicious APK in internal storage and run a full scan. A rough sketch of the ADB-driven steps follows.
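The sketch below illustrates the two mobile steps driven over ADB: installing the APK for the real-time check and pushing it to internal storage for the deep scan. The device serial, the storage path, and the “blocked install means detected” heuristic are assumptions for the example, not our exact automation.

```python
# Hedged sketch of the two mobile scenarios driven over ADB.
# Device serial, storage path, and the install-blocked heuristic are assumptions.
import subprocess

DEVICE = "emulator-5554"             # example serial from `adb devices`

def adb(*args) -> subprocess.CompletedProcess:
    return subprocess.run(["adb", "-s", DEVICE, *args],
                          capture_output=True, text=True, check=False)

def realtime_test(apk_path: str) -> bool:
    """Install the APK; a blocked/failed install suggests on-install detection."""
    result = adb("install", "-r", apk_path)
    return "Success" not in result.stdout

def stage_for_deep_scan(apk_path: str) -> None:
    """Copy the APK to internal storage so a manual deep scan can find it."""
    adb("push", apk_path, "/sdcard/Download/")

if __name__ == "__main__":
    blocked = realtime_test("sample.apk")
    print("blocked at install" if blocked else "installed; check the AV alert log")
    stage_for_deep_scan("sample.apk")
```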
<Test 4 Result – deep scan>

Figure 8. Mobile antivirus deep scan result
(Products that provide their engine to VirusTotal outperformed those that do not)
<Test 4 Result – real-time scan>

Figure 9. Mobile antivirus real-time scan result
(Products that provide their engine to VirusTotal outperformed those that do not)
Conclusion
Our research team gathered feedback on antivirus products from individual users and from security staff at enterprises and institutions to understand their needs and concerns. To address them, we developed test scenarios, ran scenario-based tests, and drew the following conclusions from the results.
We found that antivirus detection rates fluctuated from quarter to quarter. Although we expected consistently high detection rates for malware that had already been reported, we observed that rates for such malware decreased over time after it was reported. We infer that antivirus vendors adjust their detection strategies (updating their detection policies) according to risk, because they need to identify new malware more quickly and accurately.
In this performance test, our research team also examined the differences between paid and free antivirus products. Paid products outperformed free ones in detection, so we recommend considering a paid antivirus solution.
In the ransomware test, we found that infection (encryption) sometimes occurred even when the antivirus detected the ransomware. If you are concerned about ransomware, we therefore strongly recommend using a product that specializes in ransomware prevention.
Our research team will continue to test the features and performance of antivirus products through various studies and publish our findings. We welcome your feedback and will continue sharing updates.

A researcher in the Cyber Threat Analysis Team at the KAIST Cyber Security Research Center, conducting research on blockchain and software testing.

A researcher in the Cyber Threat Analysis Team at the KAIST Cyber Security Research Center, working on malware analysis tools and research.