Artificial Intelligence (AI) has emerged as a transformative technology, promising advancements in various fields. Supporters believe the technology will change the way we live for the better. But critics fear that there may be harsh consequences lying ahead, particularly for the Black community.
Here’s the problem.
AI systems rely heavily on vast datasets to learn patterns and make predictions. Unfortunately, historical data is often plagued with biases and has elements of systemic racism baked in. These systems also require human input, and developers can inadvertently embed their own biases into AI algorithms and programming.
If the data used to train AI algorithms disproportionately represents negative stereotypes or discriminatory practices, the resulting models can perpetuate and amplify those biases. This is dangerous because it can create the perfect breeding ground for anti-Blackness, leading to unfair treatment and discrimination against Black individuals in various domains, such as criminal justice, employment and lending.
Facial recognition technology and racial bias go hand in hand.
AI facial recognition technology has already encountered multiple problems, specifically when it comes to understanding and identifying individuals within the Black community. Studies have shown that facial recognition algorithms have higher error rates when identifying individuals with darker skin tones and facial features.
This racial bias not only undermines the accuracy and fairness of the technology but also reinforces harmful stereotypes and discrimination against Black individuals. The tech community saw this harsh reality come to life in 2021, when Facebook’s AI placed the label “primates” underneath a Daily Mail video of a Black man being accosted by a white man.
Google also fell victim to AI anti-Blackness in 2015, when the tech giant’s AI-powered photo app mistakenly labeled photos of a Black couple as “gorillas.” The company later apologized and vowed to fix the racist bug in the app’s code.
Unfortunately, AI facial recognition is causing even more harm within the criminal justice system. AI-based predictive policing systems, used by law enforcement agencies, raise concerns about perpetuating anti-Blackness. These systems use historical crime data to identify suspects and predict future crime hotspots. However, if that historical data reflects biased practices, such as racial profiling or over-policing in Black communities, the predictions these systems make can unfairly target and disproportionately impact Black individuals, exacerbating existing biases and anti-Blackness within the criminal justice system.
In 2019, faulty AI facial recognition led to the wrongful arrest of Nijeer Parks. The 31-year-old New Jersey native was shocked when Woodbridge Police falsely accused him of several charges including aggravated assault, unlawful possession of weapons, using a fake ID, shoplifting and possession of marijuana. The dodgy AI system also accused Parks of nearly hitting a police officer with a car.
Parks, who had prior run-ins with the law, spent 11 days in jail for crimes he did not commit. It took over a year of litigation for him to clear his name. According to CNN, authorities from the Woodbridge Police Department claimed that their AI system generated a “high profile comparison from a facial recognition scan of a photo from what was determined to be a fake ID left at the crime scene that witnesses connected to the suspect.” That facial recognition match gave prosecutors enough evidence to follow through with his arrest.
AI bias can lead to housing complications for Black people.
AI discrimination can also prevent Black people from securing housing. Some housing and apartment programs use AI technology to screen tenants and assess mortgage qualifications for prospective homebuyers, but biases have crept into the technology.
The ACLU noted that some AI systems vet tenants based on court records and other datasets that have their “own built-in biases” and that are loaded with errors. “People are regularly denied housing, despite their ability to pay rent, because tenant screening algorithms deem them ineligible or unworthy,” the organization added.
The same issue holds true for Black homebuyers. In 2019, researchers at the University of California, Berkeley found that Black borrowers were overcharged nearly $765 million each year for home and refinance loans because mortgage lenders used faulty AI to determine loan pricing. The study also found that both in-person and online lenders rejected a total of 1.3 million creditworthy applicants of color between 2008 and 2015 due to algorithmic systems.
Addressing the issue of AI bias will require a multifaceted approach. It involves ensuring diverse representation on AI development teams, where Black software developers can help identify and prevent some of these crucial issues. Tech companies will need to work hard to audit AI algorithms for bias and to actively involve impacted communities in decision-making processes.
Additionally, collecting more diverse and representative datasets and implementing robust testing frameworks can help identify and rectify biases within AI systems.
The history of AI and its entanglement with anti-Blackness reveals the urgent need for critical examination, transparency, and ethical considerations in the development and deployment of AI systems. Addressing biases and ensuring equitable outcomes are crucial steps toward building a future where technology works to dismantle, rather than reinforce, systemic discrimination.
The post The Intersection Of AI And Anti-Blackness: Unraveling The Complex Problem appeared first on NewsOne.