U.S. Rep. Mark Takano, D-California, introduced the Justice in Forensic Algorithms Act of 2019 on Sept. 17, legislation influenced by the work of UC Berkeley School of Law professors Rebecca Wexler and Andrea Roth.
The bill would give criminal defendants access to the source code of algorithms and software used against them, according to a press release on Takano’s website. It would also establish new standards that algorithms used to analyze evidence against defendants must meet.
One of the bill's main provisions would prevent companies from invoking trade secret privilege to withhold their source code. This provision was inspired by a 2015 article by Wexler, according to Alex Shapiro, the executive director of communications at Berkeley Law.
“Professor Wexler isn’t one to brag, but it was her work and the Slate op-ed that got the ball rolling on this project,” Shapiro said. “If you look at the bill, its language reflected a lot of the things Wexler worked on.”
In her article, Wexler argued that without access to the source code of forensic algorithms, defendants cannot challenge the accuracy of the software's findings. She noted that software used in past cases has not always been reliable and can sometimes contribute to an unjust ruling.
Under exceptions to the Freedom of Information Act, companies can deny defendants the right to examine proprietary source code if they believe that revealing it could result in financial harm. According to Wexler, however, companies can overuse this privilege for their own financial gain at the expense of the defendant’s right to a fair trial.
“The part of the bill that implements my thesis modifies the rules of evidence to prevent companies from using intellectual property law to keep defendants in the dark,” Wexler said. “This bill would pave the way to ensure that defendants get a fair shake.”
In addition to modifying the rules of evidence, the bill would establish a set of standards that forensic algorithms and software would have to meet. According to Takano’s press release, the new computational forensic algorithm standards and testing would require law enforcement to assess the reliability of the analysis and test whether its results could be accurately reproduced.
The bill would strike a balance between due process and intellectual property protections, according to Shapiro, and begin undoing some of the influence that companies exert over the rules of evidence and the laws designed to regulate new technologies.
“These developers put money to further their own interests, so sometimes these laws are moving too fast without democratic discussion,” Shapiro said. “What is important about this law is that it brings transparency in the context where it’s absolutely necessary without impacting companies too heavily.”