There has been a lot of discussion about this seemingly misogynistic “correction” that Google provides, and those familiar with my research know that I’m a proponent of critiquing algorithmic and system design for potential bias. But in this case I think it is more a function of an n-gram table of adjacent word frequencies: a user searches for “x y”, Google determines that “x z” is related and appears more commonly in natural language, and so it suggests that search as well.
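To make that mechanism concrete, here is a minimal sketch of frequency-based suggestion. The corpus, function names, and candidate list are all hypothetical toy stand-ins (Google’s actual pipeline is vastly larger and not public); the point is only that a variant can get suggested purely because it is more frequent, with no editorial judgment anywhere in the loop.

```python
from collections import Counter

# Hypothetical toy corpus standing in for web-scale text statistics.
# Note the skew: "he invented" simply appears more often.
corpus = [
    "he invented the telephone",
    "he invented dynamite",
    "he invented the phonograph",
    "she invented kevlar",
]

# Tabulate frequencies of adjacent word pairs (bigrams).
bigram_counts = Counter()
for sentence in corpus:
    words = sentence.lower().split()
    bigram_counts.update(zip(words, words[1:]))

def suggest_alternative(query, alternatives):
    """Return any variant bigram that appears more often in the
    corpus than the user's original two-word query."""
    q = tuple(query.lower().split())
    return [alt for alt in alternatives
            if bigram_counts[tuple(alt.lower().split())] > bigram_counts[q]]

# "she invented" gets "he invented" suggested because the latter
# dominates the (biased) corpus, not because anyone decided it should.
print(suggest_alternative("she invented", ["he invented"]))
# -> ['he invented']
```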
There is clearly a broader cultural problem in that “he invented” appears more often in the language, but in this case Google’s algorithm may just be reflecting a pre-existing cultural bias, not creating it. Now, of course, Google could decide to intervene and NOT allow this suggestion to appear, or to force similar suggestions in the reverse direction; there are plenty of ways for them to “not be evil” here.
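Any such intervention would be a thin layer on top of the suggestion step. A purely illustrative sketch (the suppressed pair below is hypothetical; whatever policy machinery Google actually has is not public):

```python
# Hypothetical denylist of (query, suggestion) pairs the platform could
# choose to suppress before suggestions are shown to the user.
SUPPRESSED = {("she invented", "he invented")}

def filter_suggestions(query, suggestions):
    """Drop any suggestion the platform has decided not to surface."""
    return [s for s in suggestions if (query, s) not in SUPPRESSED]

print(filter_suggestions("she invented", ["he invented", "she invented kevlar"]))
# -> ['she invented kevlar']
```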
Good related background reading:
- Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems, 14(3), 330–347.
- Introna, L., & Nissenbaum, H. (2000). Shaping the Web: Why the Politics of Search Engines Matters. The Information Society, 16(3), 169–185.