Published June 4th, 2014 under General
I had an epiphany whilst out running this evening.
However, sites which are hit with serious spam problems still require CAPTCHAs to properly block spam attacks. The “sporting methods” can block 99.9% of attacks, but the remaining 0.1% manage to bypass them, probably because they’re using something like PhantomJS to process the page. Properly rendering a page costs spammers a lot of money, because it requires substantially more processing power than just firing off POST requests, but some sites are deemed worthy of this treatment, and it is those sites which require extra protection. CAPTCHAs raise the required processing power substantially again, because cracking CAPTCHAs is non-trivial, so they are usually an effective solution.
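The post doesn’t pin down exactly what the client-side calculation would look like, but one common shape for this kind of cost-imposing check is a hashcash-style proof-of-work: the server hands out a random challenge, and the browser must find a nonce whose hash meets a difficulty target before the form is accepted. The sketch below is only illustrative — `fnv1a`, `solveChallenge`, and `verify` are hypothetical names, and FNV-1a is used purely to keep the example dependency-free (a real deployment would want a cryptographic hash).

```javascript
// FNV-1a: a tiny non-cryptographic hash, used here only so the sketch
// runs anywhere with no dependencies.
function fnv1a(str) {
  let hash = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    hash ^= str.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

// The expensive part, run in the visitor's browser: brute-force a nonce
// until hash(challenge + nonce) falls below the difficulty target.
function solveChallenge(challenge, difficultyBits) {
  const target = 2 ** (32 - difficultyBits);
  let nonce = 0;
  while (fnv1a(challenge + nonce) >= target) {
    nonce++;
  }
  return nonce;
}

// The cheap part, run on the server: a single hash confirms the work
// was actually done.
function verify(challenge, nonce, difficultyBits) {
  return fnv1a(challenge + nonce) < 2 ** (32 - difficultyBits);
}
```

The asymmetry is the point: verification costs the server one hash, while solving costs the client thousands of attempts on average, and the difficulty can be tuned per site.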
Do you folks think this is a good idea or a bad one? Have you heard of this being implemented elsewhere before? If it’s deemed a good idea, I’ll probably add something like this to the Spam Destroyer plugin in the future.
Ideas based on Feedback from Twitter
Andrey Savchenko pointed out that running JS in this way could cause the page to lag. The calculation could be deferred until after the important parts of the page have loaded, but there is no reliable way to determine when a page has fully rendered.
A solution to this problem could be to trigger the calculation when the submit button is clicked. Instead of submitting the form immediately, a box could pop up, or another page could load, which performs the calculation at that point. A timer could be added so that if the calculation takes too long, a CAPTCHA is served instead (useful for devices which cannot handle complex calculations).
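That submit-time flow with a timeout fallback could be sketched roughly as below. This is only one possible shape, not the plugin’s actual code: `solvePuzzle` and `showCaptcha` are hypothetical stand-ins for the expensive calculation and the CAPTCHA fallback, and the time limit is an arbitrary example value.

```javascript
// Race a promise against a deadline; resolves with { timedOut: true }
// if the deadline wins.
function withTimeout(promise, ms) {
  const timeout = new Promise((resolve) =>
    setTimeout(() => resolve({ timedOut: true }), ms)
  );
  return Promise.race([
    promise.then((value) => ({ timedOut: false, value })),
    timeout,
  ]);
}

// Only start the expensive calculation once the visitor clicks submit.
// If the device can't finish within the limit, fall back to a CAPTCHA.
async function onSubmit(solvePuzzle, showCaptcha, limitMs) {
  const result = await withTimeout(solvePuzzle(), limitMs);
  if (result.timedOut) {
    return showCaptcha(); // device too slow: serve a CAPTCHA instead
  }
  return result.value; // attach the proof to the form and submit it
}
```

In a real page this would hang off the form’s submit handler (calling `preventDefault()` first), so nothing runs until the visitor actually interacts, which sidesteps the page-load lag entirely.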