
We have a Linux VM running Xubuntu with ClamAV installed.

We would like to scan files larger than 4 GB, preferably using the clamscan command. I can use the --max-filesize=x and --max-scansize=x options without trouble, but according to the clamscan man page these parameters can only be set to values under 4 GB.

I can also set these to 'unlimited' by passing 0, but if the file is larger than 4 GB, no data gets scanned anyway.
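
For reference, the limits were raised with an invocation along these lines (the path is just a placeholder):

    # 0 means "no limit" according to the man page, yet anything over 4 GB
    # still ends up with "Data scanned: 0.00 MB" in the summary
    clamscan --max-filesize=0 --max-scansize=0 /path/to/largefile.img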

Example:

----------- SCAN SUMMARY -----------
Known viruses: 4297615
Engine version: 0.98.7
Scanned directories: 0
Scanned files: 1
Infected files: 0
Data scanned: 0.00 MB
Data read: 58082.25 MB (ratio 0.00:1)
Time: 12.325 sec (0 m 12 s)

As you can see, we are trying to scan some pretty large files, around 75 GB.

Is there a way to use clamscan to virus-scan files larger than 4 GB? Or is there another command-line tool that achieves what I am after?

  • Is your Xubuntu a 64-bit version? If so, make sure your clamscan is also 64-bit with file $(which clamscan); if not, then I don't know of any way to open files over 4 GB with 32-bit software.
    – AFH
    Commented Mar 21, 2016 at 16:44
  • Thanks for the reply. "/usr/bin/clamscan: ELF 64-bit LSB executable", so it's all 64-bit, unfortunately. Any other ideas? Commented Mar 21, 2016 at 17:51
    There is no way to scan arbitrarily large files, in ClamAV or in many other commercial AVs. There are technical difficulties (saturating the filesystem on which /tmp resides, or virtual memory), and one very good basic reason: do you really believe that multi-GB files are a good vehicle of infection? Commented Mar 21, 2016 at 17:59
  • For what it's worth, I just scanned a 13 GB VM disk on 64-bit Ubuntu 15.04 and got similar results to yours; however, if I used clamscan - < FilePath it took 90 times longer, with high resource use. In both cases it reported zero data scanned, but the first call said 13 GB read, while the second said 144 MB (both command forms are sketched after these comments). I didn't set any parameters besides the file name or -. Make what you will of these results.
    – AFH
    Commented Mar 22, 2016 at 0:35
    If anyone has more information regarding the specific problem with scanning large files, I'd like to hear it. Simply saying "large files are clean" doesn't cut it. We need to know the precise technical limitations so we can tell which defaults are safe to change and under which circumstances.
    – jorfus
    Commented Dec 2, 2016 at 19:57
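
For clarity, the two command forms AFH compares above were roughly the following (the path is illustrative):

    # scan by path: fast, but the summary reports "Data scanned: 0.00 MB"
    clamscan /path/to/disk.img

    # feed the file on stdin: much slower and resource-hungry
    clamscan - < /path/to/disk.img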

2 Answers


I ended up using savscan by Sophos.

This command-line tool was able to achieve what I was after, with no configuration needed, and it's free!
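
A minimal sketch of the invocation, assuming Sophos Anti-Virus for Linux is installed and savscan is on the PATH; the path is a placeholder and the available options vary by version, so check its man page:

    # scan a single large file with Sophos' command-line scanner
    savscan /path/to/largefile.img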

  • From the Sophos documentation... it does have a limit as well: "Note: The threat detection engine only scans archived files that are up to 8 GB (when decompressed). This is because it supports the POSIX ustar archive format, which does not accommodate larger files." Commented Apr 2, 2021 at 3:58

Since other AVs have similar limitations, one less-than-ideal workaround is to copy the file out in 1 GB chunks and scan each chunk. You could overlap the chunks by a few MB, but there is no fix for content that links randomly all over the file. It may be a better-than-nothing solution, though.
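
A rough sketch of that idea, assuming GNU dd and stat are available; the file name, chunk size, and overlap are placeholder values, and this is just a workaround script, not a ClamAV feature:

    #!/bin/sh
    # Scan a large file in 1 GB chunks with a small overlap, so a signature
    # straddling a chunk boundary is less likely to be missed.
    FILE="/path/to/largefile.img"   # placeholder path
    CHUNK_MB=1024                   # chunk size in MB
    OVERLAP_MB=16                   # overlap between consecutive chunks in MB

    SIZE_MB=$(( $(stat -c%s "$FILE") / 1024 / 1024 ))
    offset=0
    while [ "$offset" -lt "$SIZE_MB" ]; do
        # copy one chunk into a temporary file, then scan just that chunk
        dd if="$FILE" of=/tmp/chunk.bin bs=1M skip="$offset" count="$CHUNK_MB" status=none
        clamscan --no-summary /tmp/chunk.bin    # exit status 1 means a detection
        offset=$(( offset + CHUNK_MB - OVERLAP_MB ))
    done
    rm -f /tmp/chunk.bin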
