Back in 2.60, the maximum size of the Bayesian DB was increased to 1500k entries, which was a good step, but I'm wondering if it could be increased again. I've been at the max token limit for some time, and as a result, newer spam has been able to bleed through more easily. I wasn't sure whether the limit is an internal DB memory limitation or perhaps an x86 limitation. If I could point it at a SQL Server Express database to store tokens, the DB could be far bigger without performance issues, at least on my current hardware (or at least I'd like to test that theory).
Any chance of that token limit being increased?