Bias in Facial Recognition Algorithms


The film “Coded Bias” examines the problems of facial recognition algorithms in depth and identifies several sources of bias. Chief among them are training datasets skewed toward white faces, which leaves the systems with little data on darker complexions, and development teams dominated by men who carry their own prejudices into the work. Under these conditions the technology cannot treat races and genders equally, and its failure to meet everyone’s needs poses a significant threat to the field. Moreover, the lack of insight into how the programs process data and reach conclusions makes these risks harder to eliminate.

In turn, the possible ways of reducing the bias present in today’s facial recognition algorithms are to bar their use in legal proceedings and policing and to introduce practical improvements. The former means accepting responsibility for decisions; the latter is multi-faceted and should begin with diversifying the underlying databases and subjecting the systems to independent assessments that confirm their accuracy. By adopting these measures, racial and gender bias can be reduced over time and an appropriate quality of operation can be ensured.
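The independent assessments mentioned above amount, at their simplest, to disaggregating a system’s accuracy by demographic group and inspecting the gap. Below is a minimal sketch of such an audit; the group labels and the sample records are illustrative assumptions for demonstration only, not data from the film or any real benchmark.

```python
# Hypothetical fairness-audit sketch: group names and records below are
# fabricated for illustration and do not describe any real system.
from collections import defaultdict

def per_group_accuracy(records):
    """records: iterable of (group, predicted_id, true_id) tuples.
    Returns {group: accuracy}, making cross-group disparities visible."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted == actual:
            hits[group] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Illustrative audit records: (group, predicted identity, true identity)
records = [
    ("lighter-skinned", "A", "A"), ("lighter-skinned", "B", "B"),
    ("lighter-skinned", "C", "C"), ("lighter-skinned", "D", "D"),
    ("darker-skinned", "A", "A"), ("darker-skinned", "X", "B"),
    ("darker-skinned", "C", "C"), ("darker-skinned", "Y", "D"),
]
accuracy = per_group_accuracy(records)
gap = max(accuracy.values()) - min(accuracy.values())
```

An external auditor would run this kind of disaggregated check on held-out data the vendor never saw; a large `gap` signals exactly the race- or gender-dependent failure the essay describes.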

These ideas are crucial for promoting the proper use of the technology: they can not only increase the precision of the programs but also address the problem of surveillance directed at racialized minorities. This standpoint rests on the consideration above that accountability for decisions must be clearly established. Such a change is feasible only if the algorithms work correctly, and bias can be minimized or eliminated when internal and external evaluations corroborate each other. The future suitability of this field for investigations therefore depends on specialists’ success in resolving the existing issues and answering the major concerns raised by Nakamura and Browne.

