In May 2021, Twitter user @capohai shared a screenshot of a Google search for “what do terrorists wear on their head,” which returned, as the first result, the Palestinian keffiyeh scarf. At the time, the French Senate had just voted to prohibit women under 18 from wearing hijabs in public, and President Macron’s LREM party had pulled its support from candidate Sarah Zemmahi for wearing a hijab in a campaign ad. How many others asked Google the same question and took its response as validation of their own prejudices, or as an objective statement of fact? How many were hurt by the results?

The outrage over Google’s tacit linking of Palestinians, headscarves, and terrorism spilled from social media into the news, yet when the same search is performed today, the keffiyeh remains the top result.

It makes sense that @capohai turned to Twitter: for most people who notice tech companies’ unethical behavior, whether privacy breaches, amplified hate speech and disinformation, or biased algorithms, posting about it on social media is one of the few options available. But as this example shows, that model of public retribution does not actually correct ethical violations.

Looking at the bigger picture, there have been calls for greater regulation of the tech industry. Such regulation is deeply necessary, but legislation takes a long time to pass and implement, and it is generally insufficient to stop the unforeseen ethics failures endemic to technology. Because algorithms tend to express our (bad) values in unexpected ways that require constant updating and fine-tuning to correct, no regulation, however deftly and broadly written, can foresee and prevent every future issue.

But there is an option that relies on neither social media outrage nor new regulation. Tech companies are already configured to handle ethics issues at scale; they just need to adapt their existing bounty systems.

Right now, hundreds of companies and organizations, great and small, offer bounties ranging from thousands to millions of dollars to those who find vulnerabilities in their code that bad actors could exploit. Google’s bounty program even covers applications sold through its Play Store. Apple, which only recently began a bounty program (with compensation of up to a million dollars for the most serious types of exploits), takes a similar approach. In its program notes, the company states that it will “reward researchers who share with us critical issues and the techniques used to exploit them,” providing public recognition and matching donations of the bounty payment to charities.

Imagine how much better the products and services of Silicon Valley could be if these companies grouped ethics violations under “critical issues and the techniques used to exploit them” and began offering corresponding bounties. After all, an ethical violation can cause just as many problems for a company and its users as an exploitable bug, and the above language wouldn’t even need to be changed. An ethics bounty program could adopt the rest of Apple’s rules as well: 1) you must be the first to report the issue; 2) you must clearly explain it and show evidence of what happened; 3) you can’t disclose it publicly before Apple gets a chance to patch it; and 4) you can earn a bonus if the company inadvertently reintroduces a known problem in a later patch.
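To make those rules concrete, here is a minimal sketch, in Python, of how a triage system for such an ethics bounty program might encode the four eligibility rules above. No such program exists yet; every name and structure here is a hypothetical illustration, not Apple’s actual process.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Report:
    """One ethics-bounty submission. All field names are illustrative."""
    issue_id: str             # canonical identifier for the underlying issue
    reporter: str
    submitted_at: datetime
    has_evidence: bool        # rule 2: clear explanation plus evidence
    disclosed_publicly: bool  # rule 3: went public before a patch?

@dataclass
class BountyTriage:
    """Encodes the four Apple-style eligibility rules described above."""
    open_issues: set[str] = field(default_factory=set)     # reported, not yet patched
    patched_issues: set[str] = field(default_factory=set)  # fixed in a past patch

    def evaluate(self, report: Report) -> tuple[bool, bool]:
        """Return (eligible, regression_bonus) for one report."""
        # Rule 1: only the first report of a still-open issue is eligible.
        first = report.issue_id not in self.open_issues
        # Rules 2 and 3: evidence is required, and public disclosure
        # before a patch voids eligibility.
        eligible = first and report.has_evidence and not report.disclosed_publicly
        # Rule 4: a bonus applies when a previously patched issue reappears.
        regression_bonus = eligible and report.issue_id in self.patched_issues
        if eligible:
            self.open_issues.add(report.issue_id)
            self.patched_issues.discard(report.issue_id)
        return eligible, regression_bonus

    def mark_patched(self, issue_id: str) -> None:
        """A patched issue leaves the open set, so a recurrence counts
        as a fresh, bonus-earning report."""
        self.open_issues.discard(issue_id)
        self.patched_issues.add(issue_id)

triage = BountyTriage()
first = Report("biased-image-results", "@capohai", datetime.now(), True, False)
print(triage.evaluate(first))   # (True, False): eligible, no bonus
triage.mark_patched("biased-image-results")
again = Report("biased-image-results", "@someone", datetime.now(), True, False)
print(triage.evaluate(again))   # (True, True): the regression earns a bonus
```

The one design choice worth noting is that a patched issue leaves the “open” set, so its reappearance counts as a first report again and qualifies for the rule-4 regression bonus.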

For users, a bounty system would encourage people to look for ethics violations and report them quickly. For companies, it could help locate and address problems before they harm more customers, generate negative press, or even destabilize governments. Granted, some companies may be unfazed by negative press, lost customers, and the furthering of prejudice, but even they are likely to be motivated by the long-term stability and goodwill such a program could create. A public record of responding thoughtfully to ethics issues can also help a company recruit talented workers and grow into new markets and industries.