Can the United States' Google Search Algorithm be Reflective of Implicit Racial Bias?
DOI: https://doi.org/10.47611/jsrhs.v10i4.2182
Keywords: Implicit Racial Bias, Systemic Racism, Computer Science, Algorithms, Google Search, Google
Abstract
Although there is sociological consensus that systemic and implicit racial bias exists in many American institutions, how those biases interact with computer science and algorithms, specifically Google Search, remains largely unknown. The current literature on the relationship between implicit bias and computer algorithms, such as "Algorithms of Oppression: How Search Engines Reinforce Racism" by Dr. Safiya Noble and "Race After Technology: Abolitionist Tools for the New Jim Code" by Professor Ruha Benjamin, suggests that there is evidence, or at least precedent, of computer algorithms exhibiting implicit racial bias. Given this state of the literature, this study aims to determine whether the United States' Google Search algorithm is reflective of implicit racial bias.
To test the hypothesis that the United States' Google Search algorithm is reflective of implicit racial bias, the method required collecting mean search times for occupational search queries, organizing the data into normal distributions, and applying statistical hypothesis testing to detect algorithmic bias. The results suggest that, for specific occupational search queries, Google's search algorithm returns faster mean search times for White occupational queries than for racial minority occupational queries, and that the mean search times for the two groups of queries differ significantly. Based on the evidence obtained through hypothesis testing, this paper concludes that the United States' Google Search algorithm is reflective of implicit racial bias.
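The comparison of mean search times described above can be sketched as a two-sample hypothesis test. The sketch below uses Welch's t statistic (which does not assume equal variances) on hypothetical search-time samples; the function name, data values, and choice of test are illustrative assumptions, not the study's actual data or code.

```python
import math
import statistics


def welch_t_statistic(a, b):
    """Welch's two-sample t statistic: (mean_a - mean_b) / standard error.

    A large |t| suggests the two groups' mean search times differ
    significantly; the sign shows which group's mean is smaller.
    """
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variance
    standard_error = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error


# Hypothetical mean search times (in seconds) for occupational queries.
# These numbers are invented for illustration only.
white_query_times = [0.42, 0.45, 0.40, 0.43, 0.44]
minority_query_times = [0.51, 0.49, 0.53, 0.50, 0.52]

t = welch_t_statistic(white_query_times, minority_query_times)
# A negative t here would indicate the first group's mean time is smaller
# (i.e., faster); comparing |t| against a critical value completes the test.
```

In practice, the t statistic would be compared against a critical value from the t distribution (or converted to a p-value) at a chosen significance level to decide whether to reject the null hypothesis of equal mean search times.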
References or Bibliography
Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code (1st ed.). Polity.
Carlini, N. (2020, December 15). Privacy Considerations in Large Language Models. Google AI Blog. https://ai.googleblog.com/2020/12/privacy-considerations-in-large.html
DuckDuckGo. (2018). Measuring the "Filter Bubble": How Google is influencing what you click. DuckDuckGo.
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems, 14(3), 330–347. https://doi.org/10.1145/230538.230561
Friedman, J. H. (1998). Data Mining and Statistics: What’s the Connection? Stanford University, 1–7.
Garcia, M. (2016). Racist in the Machine. World Policy Journal, 33(4), 111–117. https://doi.org/10.1215/07402775-3813015
Google. (n.d.-a). Google. Google.com. Retrieved November 2020, from https://www.google.com/
Google. (n.d.-b). How Google Search works. https://www.google.com/search/howsearchworks/
Google Search Statistics - Internet Live Stats. (2020, November). www.internetlivestats.com. https://www.internetlivestats.com/google-search-statistics/#share
Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: Attitudes, self-esteem, and stereotypes. Psychological Review, 102(1), 4–27. https://doi.org/10.1037/0033-295x.102.1.4
Greenwald, A. G., & Krieger, L. H. (2006). Implicit Bias: Scientific Foundations. California Law Review, 94(4), 945. https://doi.org/10.2307/20439056
Howard, A., & Borenstein, J. (2017). The Ugly Truth About Ourselves and Our Robot Creations: The Problem of Bias and Social Inequity. Science and Engineering Ethics, 24(5), 1521–1536. https://doi.org/10.1007/s11948-017-9975-2
Metz, C. (2017, June 3). AI Is Transforming Google Search. The Rest of the Web Is Next. Wired. https://www.wired.com/2016/02/ai-is-changing-the-technology-behind-google-searches/
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism (Illustrated ed.). NYU Press.
Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2020). Dissecting racial bias in an algorithm used to manage the health of populations. Yearbook of Paediatric Endocrinology. https://doi.org/10.1530/ey.17.12.7
Richardson, R., Schultz, J., & Crawford, K. (2019). Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice. N.Y.U. Law Review Online, 94, 192.
Salkind, N. J. (2010). Causal-Comparative Design. Encyclopedia of Research Design. https://doi.org/10.4135/9781412961288.n42
Schmidhuber, J. (2015). Deep learning in neural networks: An overview. Neural Networks, 61, 85–117. https://doi.org/10.1016/j.neunet.2014.09.003
Search Engine Market Share Worldwide. (2020, November). StatCounter Global Stats. https://gs.statcounter.com/search-engine-market-share
Sun, W., Nasraoui, O., & Shafto, P. (2020). Evolution and impact of bias in human and machine learning algorithm interaction. PLOS ONE, 15(8), e0235502. https://doi.org/10.1371/journal.pone.0235502
Thiele, R. H., Poiro, N. C., Scalzo, D. C., & Nemergut, E. C. (2010). Speed, accuracy, and confidence in Google, Ovid, PubMed, and UpToDate: results of a randomised trial. Postgraduate Medical Journal, 86(1018), 459–465. https://doi.org/10.1136/pgmj.2010.098053
Thiem, A., Mkrtchyan, L., Haesebrouck, T., & Sanchez, D. (2020). Algorithmic bias in social research: A meta-analysis. PLOS ONE, 15(6), e0233625. https://doi.org/10.1371/journal.pone.0233625
Wall Street Journal. (2012). On Google, a Political Mystery That's All Numbers. Wall Street Journal.
Williams, Brooks, & Shmargad. (2018). How Algorithms Discriminate Based on Data They Lack: Challenges, Solutions, and Policy Implications. Journal of Information Policy, 8, 78. https://doi.org/10.5325/jinfopoli.8.2018.0078
Copyright (c) 2022 Myles Aleandre; Markia De Los Reyes
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Copyright holder(s) granted JSR a perpetual, non-exclusive license to distribute & display this article.