Wikipedia had such a problem with personal attacks that 40% of its volunteer editors had stopped contributing. It's hard to stay a volunteer when the only feedback you get is ‘Your so rong & dum’. Researchers used machine learning to teach a computer to recognize abuse (even when misspelled), and soon it could do so about as well as human moderators. They then unleashed it on 63 million comments to look for patterns. One discovery: 10% of the abuse was coming from just 34 users.
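The "even when misspelled" part is the interesting trick: instead of matching whole words, the classifier can look at overlapping fragments of characters, so ‘rong’ still shares most of its pieces with ‘wrong’. Here is a toy sketch of that idea in plain Python. It is not the real Wikipedia system, and the tiny training comments are invented for illustration; a production classifier would learn weights from millions of labeled examples rather than comparing against two hand-built centroids.

```python
import math
from collections import Counter

def char_ngrams(text, n=3):
    """Break text into overlapping character trigrams (misspelling-tolerant)."""
    text = f"  {text.lower()}  "  # pad so word edges form their own n-grams
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

# Invented toy corpus -- NOT the real Wikipedia data.
TRAIN = [
    ("you are wrong and dumb", 1),
    ("your an idiot", 1),
    ("this edit is stupid", 1),
    ("thanks for the helpful edit", 0),
    ("great source, well cited", 0),
    ("I fixed the citation format", 0),
]

def centroid(label):
    """Sum the n-gram counts of all training comments with this label."""
    total = Counter()
    for text, y in TRAIN:
        if y == label:
            total += char_ngrams(text)
    return total

def cosine(a, b):
    """Cosine similarity between two n-gram count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

ABUSE, OK = centroid(1), centroid(0)

def looks_abusive(comment):
    """Classify by which centroid the comment's n-grams sit closer to."""
    grams = char_ngrams(comment)
    return cosine(grams, ABUSE) > cosine(grams, OK)
```

Even though ‘rong’ and ‘dum’ appear nowhere in the training text, `looks_abusive("Your so rong & dum")` comes back `True`, because the character fragments still overlap with ‘wrong’ and ‘dumb’.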