Despite the rapid pace of AI advancements, leaders still haven’t figured out how to control a foundational issue: bias in the algorithms.
Artificial intelligence can’t purposely “discriminate” the same way a human might, but it will readily spit our own societal biases back at us to detrimental effect. Rep. Summer Lee of Pennsylvania’s 12th congressional district, who’s also running for reelection, today announced a bill to address the issue.
“AI bias is a real and pressing issue,” Lee told Technical.ly, “that’s impacting people right here in Pittsburgh.”
She cited as examples local police deployment of facial recognition technology during the 2020 Black Lives Matter protests, and the use of flawed algorithms in family welfare assessments.
Called the Eliminating Bias in Algorithmic Systems Act of 2024, her legislation would establish an Office of Civil Rights in each federal agency, tasked with monitoring that agency’s systems for bias, discrimination and other harms caused by the technology. The assistant attorney general in charge of the Civil Rights Division of the Department of Justice would oversee an interagency working group on the topic.
The offices would gather input from a variety of stakeholders, including businesses, civil rights advocates, technical experts and more, and submit reports on how AI is impacting the agency, what steps the agency is taking to reduce harm and any legislation it would recommend to better address the issue.
Lee cited several examples that have come under fire, including a predictive tool at the Department of Veterans Affairs that underestimated the number of Black veterans at risk for hospitalization or death.
By focusing on civil rights, the bill aims to fill an accountability gap. Inspectors general already do some of this work, but they have a wide range of competing priorities, Lee said, and their reports on AI mostly focus on the security and effectiveness of government use.
Why place the working group under DOJ oversight? From deciding on legal cases to documenting guidance, the department’s Civil Rights Division has made AI a priority, and already holds interagency convenings on the topic, in line with President Joe Biden’s previous executive order on AI.
“This bill codifies that agency’s important work, mandating greater accountability for every agency that has a role to play in mitigating bias, not just the ones that have stepped up thus far,” Lee said. The goal will be to come up with actionable standards for change, not just to discuss current issues.
Massachusetts Sen. Ed Markey introduced the legislation’s companion in the Senate nearly a year ago.
Local officials tackle AI regulation
All eyes are on Pennsylvania as a swing state this election season, and the commonwealth has been a hotbed of AI efforts this year.
In September, Philly Councilmember Rue Landau presented a resolution to the rest of the council to hold hearings on the state of AI in the city. Like Rep. Lee’s bill, it aims in part to take a closer look at government use of the tech.
“We want to get a better understanding [of] what AI technologies are out there, how they’re being used by the private industry and by government,” Landau told Technical.ly at the time. “We want to focus on what the risks are and what the guidelines for it should be.”
In nearby Delaware, Rep. Lisa Blunt Rochester is also championing similar issues. The congresswoman has introduced at least two bills since December 2023 — the AI Literacy Act and the Consumers LEARN AI Act — aimed at improving AI literacy.
As Election Day nears and AI-generated misinformation and disinformation spread online, officials are warning Pennsylvania residents to check trusted local news for updates. So while residents wait for these bills to be implemented and take effect, they’ll have AI problems to contend with in the meantime.
“To those in tech and business, I’d say that creating fair, just AI isn’t just a legal responsibility; it’s a moral one,” Lee said. “Your involvement means actively working to ensure the tools you develop or deploy aren’t shutting people out of jobs, housing or healthcare.”