The Senate Banking Committee, Community Development Subcommittee held a hearing Jan. 31 on “AI and Housing: Exploring Promise and Peril.” Appearing before the panel were:

  • Lisa Rice, president and CEO of the National Fair Housing Alliance;
  • Vanessa Perry, interim dean and professor at George Washington University School of Business and nonresident fellow at the Urban Institute’s Housing Finance Policy Center; and
  • Nicholas Schmidt, partner and artificial intelligence practice leader at BLDS and founder and CTO at SolasAI.

The hearing focused on how artificial intelligence can help or hinder the goal of a “safe, decent, affordable place to live” and how advances in AI are affecting finance and housing.

Rice mentioned insurance no fewer than 20 times in her testimony, specifically citing credit scoring, automated underwriting, and risk-based pricing, all of which “can manifest bias and extensive harm to consumers, communities, and our economy.” When asked what legislative or regulatory approaches should be taken, the panelists agreed it was paramount to expand opportunity, for the government to move quickly, to understand the current laws and regulations that already apply, and to take an intersectoral approach; Schmidt noted that existing regulations cover about 90 percent of what needs to be done. Regulators, however, need the proper education and experience to undertake this endeavor.

Perry discussed the emerging patchwork of state regulations, noting that differing rules in every state will create havoc for companies. Complying with varied state requirements will be especially challenging for smaller businesses in an area that “by definition crosses state boundaries,” she said. During a round of questioning with Sen. Cynthia Lummis, R-Wyo., the subcommittee’s ranking member, Schmidt acknowledged that the financial industry is leading the pack when it comes to fairness in AI, the one area he said worries him.

A notable series of exchanges came from Sen. Mike Rounds, R-S.D., a leader in Senate conversations on AI. He first laid the groundwork that if a company is being responsible, it has to “abide by the Fair Credit Reporting Act, the Fair Housing Act, the Equal Credit Opportunity Act regardless of what tech they are using AI or otherwise.” Rounds then asked, “can you think of any program that we’ve ever made that doesn’t have some biases built into it,” to which the witnesses responded “no.” Rounds followed up with the real challenge: identifying the biases. He asked whether there is a human being without biases built into their decision making, saying he had yet to meet one. Rice discussed an investigation her organization took part in against a major insurance company, in which they “examined a third-party scoring system used by the insurer [that] showed the system was overcharging black consumers at rates that exceeded their commensurate risk.” She also pointed to Berkeley researchers who “found algorithmic pricing models discriminate.”

Post Details

Publish Date

February 5, 2024

News Type

  • Washington Weekly

Points of Contact
Katherine Duveneck
Federal Affairs Director