Saturday, October 17, 2015

How Can We Regulate the Black Box of Big Data Algorithms?

Frank Pasquale is a professor of law at the University of Maryland who studies the legal challenges posed by changing technologies. He has published an important article at Aeon (8/18/15) on the accountability problems that Big Data creates.

Algorithms that sift through Big Data are profiling us nonstop: for the government, to decide whether we belong on a "no-fly list"; for lenders, to set the interest rate we are charged; for employers, to choose which resumes get looked at and, once we're hired, whether we should get a pay cut or be fired; and for Google, to determine where our business shows up in a search result. How do we make such algorithms accountable?

Here is Pasquale:
Cyberspace is ... now a force governing [the world] via algorithms: recipe-like sets of instructions to solve problems. From Google search to OkCupid matchmaking, software orders and weights hundreds of variables into clean, simple interfaces, taking us from query to solution. Complex mathematics govern such answers, but it is hidden from plain view, thanks either to secrecy imposed by law, or to complexity outsiders cannot unravel.
Algorithms are increasingly important because businesses rarely thought of as high tech have learned the lessons of the internet giants’ successes. Following the advice of Jeff Jarvis’s What Would Google Do?, they are collecting data from both workers and customers, using algorithmic tools to make decisions, to sort the desirable from the disposable.
Companies may be parsing your voice and credit record when you call them, to determine whether you match up to ‘ideal customer’ status, or are simply ‘waste’ who can be treated with disdain. Epagogix advises movie studios on what scripts to buy, based on how closely they match past, successful scripts. Even winemakers make algorithmic judgments, based on statistical analyses of the weather and other characteristics of good and bad vintage years. 
For wines or films, the stakes are not terribly high. But when algorithms start affecting critical opportunities for employment, career advancement, health, credit and education, they deserve more scrutiny. 
US hospitals are using big data-driven systems to determine which patients are high-risk – and data far outside traditional health records is informing those determinations. IBM now uses algorithmic assessment tools to sort employees worldwide on criteria of cost-effectiveness, but spares top managers the same invasive surveillance and ranking. In government, too, algorithmic assessments of dangerousness can lead to longer sentences for convicts, or no-fly lists for travellers. Credit-scoring drives billions of dollars in lending, but the scorers’ methods remain opaque. The average borrower could lose tens of thousands of dollars over a lifetime, thanks to wrong or unfairly processed data.
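To make the mechanics concrete: the "ordering and weighting of hundreds of variables" Pasquale describes is, at bottom, just a weighted sum. Below is a minimal, hypothetical Python sketch; every feature name and weight is invented for illustration, since real scoring models are proprietary--and that secrecy is exactly the problem.

    # A toy scoring model: a handful of weighted variables collapsed into one
    # opaque number. All feature names and weights are invented for illustration.
    WEIGHTS = {
        "payment_history": 0.35,    # fraction of bills paid on time
        "utilization": -0.30,       # fraction of available credit in use
        "account_age_years": 0.02,  # small bonus per year of credit history
    }

    def score(applicant: dict) -> float:
        """Collapse an applicant's variables into a single number."""
        return sum(w * applicant.get(k, 0.0) for k, w in WEIGHTS.items())

    borrower = {"payment_history": 0.95, "utilization": 0.30, "account_age_years": 12}
    print(score(borrower))        # one number; the weights behind it stay hidden

    # A single wrong input -- say, a misreported utilization figure -- silently
    # drags the score down, and the borrower has no way to see why.
    borrower["utilization"] = 0.90
    print(score(borrower))        # lower score, no explanation

The point is not the arithmetic but the asymmetry: the scored person sees only the output, while the weights, the inputs, and any errors in them stay on the scorer's side of the wall.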
The other day I received a job advertisement in my Twitter feed, for a mid-level position at an attractive media company. It is likely to garner many thousands of responses. To the extent that those responses are sifted by an algorithm, it matters how the algorithm is designed. For example, it's illegal to discriminate based on sex, race, or age--but there are many subtle ways to design an algorithm that weeds out applications based on just those characteristics. If your resume is never viewed by a human being because of how the algorithm sorted it, you will never know, and the illegal discrimination--to the extent it occurs--may be beyond scrutiny.
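To see how such a filter can discriminate without ever touching a protected attribute, here is a minimal, hypothetical sketch (all names, fields, and thresholds are invented): the screen never reads an applicant's age, but a cutoff on graduation year acts as a close proxy for it.

    # Hypothetical resume screen. "Age" appears nowhere in this code, yet the
    # graduation-year cutoff disproportionately excludes older applicants.
    from dataclasses import dataclass

    @dataclass
    class Resume:
        name: str
        grad_year: int         # an innocuous-looking field -- and a proxy for age
        years_experience: int

    def passes_screen(r: Resume) -> bool:
        # This cutoff is where the disparate impact hides.
        return r.grad_year >= 2005 and r.years_experience >= 3

    applicants = [
        Resume("A", grad_year=2010, years_experience=5),
        Resume("B", grad_year=1998, years_experience=20),
    ]
    shortlist = [r for r in applicants if passes_screen(r)]
    print([r.name for r in shortlist])   # ['A'] -- B's resume is never seen

An auditor who could observe only the shortlist, and not the code, would have a hard time proving the pattern was deliberate.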
European Union regulators are now trying to ensure that irrelevant, outdated, or prejudicial material does not haunt individuals’ ‘name search’ results – a critical task in an era when so many prospective employers google those whom they are considering for a job. The EU has also spurred search engines to take human dignity into account – by, for example, approving the request of a ‘victim of physical assault [who] asked for results describing the assault to be removed for queries against her name’. 
How do we begin to provide oversight of Big Data algorithms working in secret? There are people working on this problem:
Such controversies have given rise to a movement for algorithmic accountability. At ‘Governing Algorithms’, a 2013 conference at New York University, a community of scholars and activists coalesced to analyse the outputs of algorithmic processes critically. Today these scholars and activists are pushing a robust dialogue on algorithmic accountability, or #algacc for short. Like the ‘access to knowledge’ (A2K) mobilisation did in the 2000s, #algacc turns a spotlight on a key social justice issue of the 2010s.
Some in the business world would prefer to see the work of this community end before it has even started. Spokesmen and lobbyists for insurers, banks, and big business generally believe that key algorithms deserve the iron-clad protections of trade secrecy, so they can never be examined (let alone critiqued) by outsiders....
When the problems with algorithmic decision-making come to light, big firms tend to play a game of musical expertise. Lawyers say, and are told, they don’t understand the code. Coders say, and are told, they don’t understand the law. Economists, sociologists, and ethicists hear variations on both stonewalling stances.
Government can start by requiring its vendors to provide algorithmic accountability, says Pasquale. But any regulation that protects consumers and citizens will be as complex as it is necessary, and it will demand tremendous experience and expertise on the part of regulators.

Congress worries about too much regulation, or inefficient regulation. It's surely true that regulatory schemes, once in place, are hard to dismantle. It's also true that the necessary complexity of such schemes makes it difficult for citizens (and legislators) to judge how effective a regulation is, or whether it truly needs to work in exactly the way it does.

It comes down to good people doing good work, Congress backing regulators so that they do not become captive to the industries they regulate... and a good deal of trust in our regulators.

Read THE ENTIRE ARTICLE HERE.
