steve at greengecko
Mar 15, 2011, 2:14 PM
Post #5 of 32
On Tue, 2011-03-15 at 13:51 -0700, Chuck Swiger wrote:
> On Mar 15, 2011, at 12:21 PM, Russ Tyndall wrote:
> > Because of the huge volume of data being scanned (70 Gb), the scan takes about 6 hours to complete.
> > Is there a practical way to reduce the scan time?
> As Al noted, 10.4 is about six years old-- released April 2005, last patch was 10.4.11 in Nov 2007.
> One thing you might consider doing is using "find /location -mtime 1" to generate a list of which files have been modified over the past day, and only scanning these via clamdscan -f.
> Doing this safely depends on whether files can spoof their last-modified timestamp, which depends on how the fileserver is being accessed by clients. If additional safety is required, you can use tools like Tripwire, which create checksums of the content and can thus identify files which have changed regardless of the mtime, and use that to generate the list of changed files to be re-scanned.
find /location -mtime -1
Note the minus sign: -mtime -1 matches files modified less than a day ago, whereas -mtime 1 matches only files whose age rounds to exactly one day (i.e. modified between 24 and 48 hours ago)...
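A quick sketch of the difference, using a throwaway directory (GNU find and GNU touch assumed; the share path in the usage note below is hypothetical):

```shell
# Demonstrate -mtime semantics: -mtime -1 matches files modified
# strictly less than 24 hours ago; -mtime 1 matches files whose age
# truncates to exactly 1 day (24-48 hours old).
dir=$(mktemp -d)
touch "$dir/recent"                      # modified just now
touch -d '30 hours ago' "$dir/day_old"   # modified ~30 hours ago (GNU touch -d)

recent_hits=$(find "$dir" -type f -mtime -1)   # only .../recent
day_old_hits=$(find "$dir" -type f -mtime 1)   # only .../day_old

echo "$recent_hits"
echo "$day_old_hits"
```

On the real share this would feed the daemon something like: find /srv/share -type f -mtime -1 > /tmp/recent.list && clamdscan -f /tmp/recent.list (path is illustrative; clamdscan's -f/--file-list takes a file containing one path per line).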
Steve Holdoway BSc(Hons) MNZCS <steve [at] greengecko>
MSN: steve [at] greengecko