Advancing Equity and Justice in the Governance of Technology

Upturn does research and advocacy with partners working in communities to advance justice in the use of technology and automation. 

A police officer in a large city stops someone on the street—likely a person of color, according to research—and asks to search the individual’s phone. The person agrees. The officer connects the phone to a device that downloads the phone’s contents, including messages and contacts, before returning the phone to its owner.

In more than 2,000 of the estimated 18,000 public law enforcement agencies across the U.S., that quick exchange allows the agency to mine the phone’s data. The individual may never know how the information is used.

These “mass extractions” from mobile phones are one example of racially biased or unjust uses of technology that law enforcement agencies have adopted.

The unjust and biased use of technology has spilled into other areas. People may be denied apartments or public aid, passed over for jobs, and subjected to scrutiny from child welfare workers—all based on sometimes flawed, invasive information technology and secretive algorithms.

This use of “mobile device forensic tools,” or MDFTs, by law enforcement to copy and mine mobile phone data in investigations is unacceptable, said Harlan Yu, co-founder and Executive Director of Upturn. The Washington, D.C.-based nonprofit works with community partners and national allies for equity and justice in the design, governance, and use of technology.

The revelation that many law enforcement agencies use MDFTs emerged from a 2020 Upturn study, which also showed that the forensic tools frequently were used for minor infractions. The study’s accompanying report says that MDFTs “are simply too powerful in the hands of law enforcement and should not be used.” It offers initial steps to limit such searches, including requirements that data be sealed and that law enforcement publicly log their technology use.

“These searches often happen not by police getting warrants,” Yu said, “but through ‘consent searches.’ When that happens, basically there’s no limitation or restriction on what officers can do in the search or with the data afterwards.”

 

Data That Exacerbates Inequity


Digital technology’s rapidly evolving capacity to access and analyze massive amounts of data has made it attractive for government agencies and corporations to crunch that data and try to predict behavior and outcomes, Yu said.

“Technology has seeped into the decision-making processes that really matter for people’s material conditions,” Yu said. “When landlords use tenant screening technologies, data like eviction records and criminal records penalize a lot of folks in ways that are often invisible and difficult to challenge.”

Inappropriate uses of data also raise concerns in hiring and employment. When people apply online for jobs, they often must complete automated personality tests that, for example, sometimes attempt to gauge people’s willingness to challenge authority and their propensity to agitate for better working conditions, Yu said.

“People are often being automatically rejected for jobs with no explanation, no recourse,” he added.

Public and private institutions’ increasing reliance on technology also can yield problematic results that those institutions might be unable to detect. If the data flowing into the digital technology reflects wider inequities or is incomplete, machine-learning technology that recognizes patterns will process the flawed data, replicating and exacerbating inequities, Yu said.

 

Sharing Research to Address Injustice


Upturn’s research and analysis of technology, its implications, and the laws and policies regulating it enable the organization to shed light on inherent risks and injustices. Upturn shares its research and analyses with civil rights groups, activists, and advocates working to push for changes in the use of technology and address those underlying social injustices.

One of those organizations, the National Association of Criminal Defense Lawyers (NACDL), has worked with Upturn to help train defense lawyers on dealing with law enforcement technologies such as mobile phone extractions and police body cameras.

“We don’t have capacity to spend tons of time doing public records requests, analyzing data, seeing where the fault lines are,” said Jumana Musa, Fourth Amendment Center Director at NACDL. “You need to understand how the technology works to understand what it can and cannot do. Defense lawyers are getting a much fuller picture now, so they can dig in and ask questions.”

Beyond those tangible results, Musa and other Upturn partners appreciate the organization’s commitment to finding and acting where its work is needed most.

“It is bringing the expertise and the work without the ego that makes it easy for them to collaborate with national organizations like ours and small organizations on the ground, grassroots communities fighting back against things,” Musa said. “They’re filling critical gaps.”

 

Local Focus Brings Change


A computer scientist, Yu started the effort that would become Upturn in 2011 with a lawyer, David Robinson. In the early years of the organization, they worked with civil rights and racial justice groups on the implications of emerging technology such as predictive policing, which uses data to predict if crime will occur in specific areas or if certain individuals likely will be perpetrators or victims.

Demand for their work grew, particularly after 2013, when former National Security Agency contractor Edward Snowden’s revelations drew widespread concern about the federal government’s digital surveillance of American citizens—and further concern about how government and corporate surveillance disproportionately impacts Black and Brown people.

Today, Upturn’s 11 staff members focus on the federal and local levels, particularly in Washington, D.C., where they partner with grassroots organizations working on tenants’ rights, police anti-surveillance, economic justice, and related issues.

“The rubber hits the road locally,” Yu said, “and it’s also where more change is possible.”

Upturn worked with Movement Alliance Project in Philadelphia on—among other efforts—predictive analytics and how they figure into decisions about a defendant’s pretrial jail detention.

“They understand technology at a high level of sophistication and understand how power works,” Movement Alliance Project Policy Director Hannah Sassaman said. “They play a crucial and unusual role, and they’re humble and strive to be of use to those most impacted.”

She added that every person on the Upturn team is “brilliant, humble, and curious in a way that really gives me hope.”

In the coming years, Yu believes that “crucial opportunities” exist to build closer collaborations among technology and justice groups. 

“We don’t think technology should be looked at in a silo,” Yu said. “It needs to be built into the analysis of broader ongoing movements for racial, economic, and social justice.”


MacArthur has provided $1.5 million to Upturn since 2017 for general operating support, including its work to examine and address the social implications of artificial intelligence-related technologies through research, policy analysis, and advocacy.