Frank Pasquale’s The Black Box Society has been steadily moving up my reading list since it came out, but after Monday’s morning-long workshop on impenetrable algorithms, the book looks to be this weekend’s reading project. Professor Pasquale has been making the rounds for a while now, but when his presentation was paired with devastating real-world examples of how opaque credit scores harm consumers and how ill-equipped regulators are to address these challenges, the U.S. PIRG Education Fund and the Center for Digital Democracy were largely successful in putting the algorithmic fear of God into me.
A few takeaways: first, my professional interest in privacy only occasionally intersects with credit reporting and the proliferation of credit scores, so it was alarming to learn that 25% of consumers have serious errors in their credit reports, errors large enough to affect their credit ratings. (PIRG famously concluded in 2004 that 79% of credit reports contained errors of some kind.)
That’s appalling, particularly as credit scores are increasingly essential, as economic justice advocate Alexis Goldstein put it, “to avail yourself of basic opportunity.” Pasquale described the situation as a data collection architecture that is “defective by design.” Comparing the situation to malfunctioning toasters, he noted that basic consumer protection laws (and tort liability) would functionally prohibit toasters with a 20% chance of blowing up on toast-delivery, but we’ve become far more cavalier when it comes to data-based products. More problematic are the byzantine procedures for contesting credit scores and resolving errors.
Or even realizing your report has errors. I have taken to using up one of my free, annual credit reports every three months with a different major credit reporting bureau, and while the routine makes me feel like a responsible credit risk, I’m not sure what good I’m doing. It also strikes me as disheartening that the credit bureaus have turned around and made “free” credit reports into both a business segment and something of a joke. Who can forget the FreeCreditReport.com “band”?
Second, the Fair Credit Reporting Act, the first “big data” law, came out of the event looking utterly broken. At one point, advocates described how individuals in New York City had lost out on job opportunities because of bad or missing credit reports, and had frequently never received the adverse action notices required by the FCRA. Peggy Twohig from the Consumer Financial Protection Bureau then discussed how her agency had expected most consumer reporting agencies to have compliance programs, with basic training and monitoring, but quickly found that many lacked adequate oversight or even the capacity to track consumer complaints.
And this is the law regulators frequently point to as strongly protective of consumers? Maybe some combination of spotty enforcement, lack of understanding, or data run amok is to blame for the problems discussed, but the FCRA is a forty-five-year-old law. I’m not sure ignorance and unfamiliarity are adequate explanations.
Jessica Rich, the Director of the FTC’s Bureau of Consumer Protection, conceded that there were “significant gaps” in existing law and, moreover, that in some respects consumers have limited ability to control information about them. This wasn’t news to me, but no one seemed to have any realistic notion of how to resolve the problem. A few ideas were bandied about, including an interesting exchange about competitive self-regulation, but Pasquale’s larger argument seemed to be that many of these proposals were band-aids on a much larger problem.
The opacity of big data, he argued, allows firms to “magically arbitrage…or rather mathematically arbitrage around all the laws.” He lamented “big data boosters” who believe data will be able to predict everything. If that’s the case, he argued, it is no longer possible to sincerely support sectoral data privacy regulation where financial data is somehow separate from health data, from educational data, from consumer data. “If big data works the way they claim it works, that calls for a rethink of regulation.” Or a black box over our heads?