bayes_toks growing without bounds?
bayes_toks taking up more than a few hundred megabytes?
That's 46 GB for the Bayesian filter against 12 GB of actual email. Definitely not the way it should be.
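To check whether you're affected, look at the on-disk size of the Bayes files. A minimal check, assuming the default per-user `~/.spamassassin` location (yours may live elsewhere, e.g. under the filter user's home):

```sh
# Show the size of the Bayes DB files, largest last
du -h ~/.spamassassin/bayes_* | sort -h
```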
Why does it happen
God bless this guy who described all the background:
(and thanks twice over, since he didn't just suggest "delete it, idk what it does")
How to fix it
With that info, we find the actual _clean_ fix to be as simple as this:
So: text-dump the actually useful data, drop the database, and restore it.
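In SpamAssassin terms, that's the usual `sa-learn` backup/clear/restore dance. A minimal sketch, assuming you run it as the user owning the Bayes DB (add `--dbpath` if your database lives in a non-default location):

```sh
# 1. Text-dump the learned tokens and seen-message IDs
sa-learn --backup > bayes_backup.txt

# 2. Drop the bloated database files
sa-learn --clear

# 3. Rebuild a compact database from the dump
sa-learn --restore bayes_backup.txt
```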
Note: in the 46 GB case above, this involves a little more I/O than you'll enjoy (scanning the 46 GB file, constantly checking whether a new entry was encountered, all in non-optimized Perl).
In this case, the processing apparently aborted once the Perl-driven file lock expired; I noticed by tracking the file age of bayes_backup.txt.
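A crude way to spot such a stall, for what it's worth: watch whether the dump file keeps growing. A minimal sketch (filename from above; the 60-second interval is arbitrary):

```sh
# If size and mtime stop moving for a long time, the dump has probably stalled
while sleep 60; do ls -l bayes_backup.txt; done
```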
Absolutely love the fact that they simply invoke NFS file locking, even on a local nullfs mount. Smart IOPS throttling, eh?