On a Friday evening in November of 2012, an errant line of code was found in MediaWiki, the software supporting Wikimedia sites and thousands of other wikis.
This potentially affected every site in the world’s fifth most visited web property, including Wikipedia, Wikimedia Commons, and Wiktionary. It was a fixable security problem, but only if Wikimedia’s engineers were aware of the flaw.
At another website, this would be the type of problem that could generate thousands of disgruntled users. Or worse.
But at the Wikimedia Foundation, even outside normal office hours, a problem like this is flagged as a bug and listed as an issue almost immediately. And not always by developers who work for the Foundation.
In this case, Wikipedia User:PleaseStand spotted the flaw and filed a report, which then went through the chain of communication to the security team. The bug was fixed before any noticeable damage was done.
This is of course every website administrator’s dream – a Good Samaritan user quietly and diligently pointing out security flaws. But that goodwill doesn’t exist just anywhere.
“We usually don’t directly reach out to volunteers and ask them to actively look for security issues,” says Roan Kattouw, a Senior Software Engineer at the Wikimedia Foundation and a lead developer on MediaWiki. “Usually they approach us because they find an issue.”
Kattouw knows the process well. He started as a volunteer developer in the Wikimedia movement before joining the engineering team at the Foundation. He now works with volunteer developers to patch up holes in the MediaWiki code.
He explains that developers at the Wikimedia Foundation will often get “drive-by reports,” where someone reports a security issue by email once. “And after that we usually never hear from them again,” says Kattouw.
These users are invaluable.
Because so many sites run MediaWiki software, security is very important. But fixing issues across so many different configurations can be complex, and some sites even need the option to run with an insecure configuration to accommodate their users.
Some bugs are more serious than others, and are therefore handled with less community involvement. Other security issues need to be dealt with before users ever get a chance to point them out.
“If our payments anti-fraud system was published somewhere public, with all the enabled rules and point values clearly spelled out, the fraudsters would immediately change their behavior just enough to get around whatever rules we create,” explains Katie Horn, Lead Software Developer in the Wikimedia Foundation’s Fundraising Engineering team.
In these cases, relying on help from users becomes problematic. When cases of fraud and theft – two of the most common security issues – are spotted by users, it can be too late.
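The rules-and-point-values approach Horn describes can be sketched roughly as follows. This is purely illustrative: every rule, point value, and threshold below is hypothetical, since the real anti-fraud rules are deliberately kept private.

```python
# Illustrative sketch of a rules-with-point-values fraud filter.
# All rule names, point values, and thresholds are hypothetical.

def fraud_score(donation, rules):
    """Sum the point values of every rule the donation trips."""
    return sum(points for check, points in rules if check(donation))

# Hypothetical rules: each is a (predicate, point value) pair.
RULES = [
    (lambda d: d["amount"] < 2.00, 40),         # tiny "card test" amounts
    (lambda d: d["attempts_from_ip"] > 5, 30),  # many tries from one IP
    (lambda d: d["country"] != d["card_country"], 20),  # country mismatch
]

REVIEW_THRESHOLD = 50  # hypothetical cutoff for manual review

donation = {
    "amount": 1.00,
    "attempts_from_ip": 8,
    "country": "FR",
    "card_country": "US",
}
score = fraud_score(donation, RULES)      # 40 + 30 + 20 = 90
flagged = score >= REVIEW_THRESHOLD      # True
```

The point of keeping such rules secret is visible in the sketch itself: a fraudster who could read the rules would simply keep amounts above $2.00, rotate IP addresses, and match card and billing countries to stay under the threshold.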
Horn and her team must be perpetually mindful of security issues, from the early planning stages of a fundraising project right through to the end.
“Donation fraud is a constant threat to our online payments system,” she says. “Most of the fraudulent activity comes from people trying to use us as a preliminary low-friction test to verify that stolen cards are still valid, before they attempt the all-out shopping spree somewhere else.”
An unusual attribute that makes the Wikimedia Foundation’s donation pipeline such a target is that donors are not required to create accounts. Requiring accounts would add a layer of security on donations. But, as Horn explains, it’s a balance. “Mandatory account creation causes friction that we have no good reason to inflict on our donors, most of whom are anonymous readers.”
Anonymity, however, doesn’t prevent good, honest feedback; that feedback just comes through different channels set up specifically to address the needs of donors. Donors are usually unfamiliar with Wikimedia’s standard bug reporting processes, so they are encouraged throughout the donation process to report issues to a dedicated donor-support email address. This feedback helps fundraising engineers anticipate possible donation-related security issues and squash technical bugs early.
“Happily, we have a small team of people who heroically comb through donor feedback emails, make sense out of the noise, and surface technical issues to the right people,” she says. “It would be unimaginably more difficult to do our jobs if they weren’t there.”
Maintaining the security of MediaWiki sites is the job of a small team. There are around three to five people who work on security from time to time, with one engineer on the job full time.
Joshua Errett, Wikimedia Foundation Communications volunteer