On October 30, 2023, President Biden issued an executive order directing a wide range of activities by federal agencies to address the rapid development of artificial intelligence (AI). In Section 4 of the order, President Biden issued a number of directives for ensuring the safety and security of AI technology, including provisions for studying how to authenticate content or identify and label it as synthetic content, meaning content created through AI. In other words, how do we know whether something is an artist's original work or an AI creation based on data fed into it? How do we know that an image or recording of someone is authentic, as opposed to a "deep fake" created by AI? Blockchain technology presents a solution to this problem, which courts and lawyers will inevitably confront in two key ways.
Artificial intelligence, or AI, refers to applications and algorithms programmed to create a digital output based on materials input into the AI platform. In other words, AI can take a series of images, recordings, or texts and produce a new image, recording, or text based on what was fed into it. In a typical example, if I want AI to write an essay on the causes of World War I, I give the AI platform a number of articles written by scholars about the causes of World War I, and it synthesizes them and produces the essay.
Although AI presents a number of benefits, we have already begun to confront some of the challenges it creates. First, because one needs to feed content into an AI platform for it to generate new content, what happens if someone feeds the platform content that is owned, copyrighted, licensed, or created by someone else? The prospect that AI would exploit a human's creative work was a major concern in the recent Hollywood strikes, where writers and actors sought terms in their union contracts banning or limiting the use of AI.
More troubling is AI's capability to create deep fakes: fabricated content depicting an event that never actually took place. For example, AI can take samples of someone's voice and create a recording that sounds real but is a complete fabrication. For lawyers, this is a looming threat. AI could be used to fabricate evidence placing an innocent person at the scene of a crime like a murder or robbery, be used by a criminal defendant to manufacture a false confession by another person to beat the charges, or be used in a divorce case to create fake images of a spouse cheating with a person they never actually encountered.
Both the derivative nature of AI and the possibility that AI can fabricate evidence are serious problems that judges, lawyers, and their clients will confront. Put another way, what does the litigation landscape look like if we are constantly mired in disputes about whether evidence is real? As AI becomes more and more a part of the technology we all use, courts will need a way to resolve disputes about the origin and authenticity of evidence. One solution may be found in another emerging technology: blockchain.
Blockchain technology is often mentioned in the context of cryptocurrencies, but it has many other uses. In broad strokes, a blockchain is a way of storing information. Every time an event occurs in a particular blockchain that records those events, an entry called a block is created. Each block is unique and receives a time stamp and a unique identifier. The blocks are stored in a chain, sequentially by time stamp, and once created they cannot be changed. In addition, the blockchain is stored on many computers and servers. If you think of your own desktop or laptop, you probably have a program like Excel, which makes spreadsheets, and each row in a spreadsheet records an event (often a transaction) or some other data point. Now imagine that rather than being stored on your computer alone, the spreadsheet is stored on many computers and servers around the world. Each row in the spreadsheet is a block in the blockchain, and these blocks cannot be changed once they are created. Because the blockchain is stored in many places and is publicly readable, anyone can go back and review the data. This is a simplified explanation, but the point is that blockchain creates an accessible record that is difficult to tamper with: you cannot quietly change one entry in one place, and any alteration creates a new block out of sequence.
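The tamper-evident chaining described above can be illustrated with a toy Python sketch. This is a deliberate simplification, not a real blockchain: there is no distributed network or consensus mechanism here, only the hash-linking that makes alterations detectable.

```python
import hashlib
import json
import time

# Toy illustration: each block stores a time stamp, some data, and the
# hash of the previous block. Altering any earlier block changes its
# hash, which breaks the link to every block that follows it.

def block_hash(block):
    # Hash the block's contents deterministically (sorted keys).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    # Link each new block to the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"timestamp": time.time(), "data": data, "prev_hash": prev})
    return chain

def chain_is_valid(chain):
    # Every block must reference the current hash of its predecessor.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
append_block(chain, "image-001 created")
append_block(chain, "image-001 edited")
append_block(chain, "image-001 published")
print(chain_is_valid(chain))            # True

chain[1]["data"] = "image-001 never edited"   # tamper with one entry
print(chain_is_valid(chain))            # False: the chain no longer links up
```

The second check fails because the altered block no longer matches the hash recorded by its successor, which is exactly the property that makes a distributed blockchain hard to falsify: a forger would have to rewrite every later block on every copy at once.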
So how can blockchain help us confront some of the difficulties presented by AI? Because blockchain technology can create records that are difficult to tamper with, it can be used to generate and record a digital signature for each piece of electronically generated content. In other words, we can devise a standardized system that creates a record of each image, recording, or text as it is created, and require all AI platforms to record these signatures in their metadata (information about the file). In addition, each AI product would receive an entry on the blockchain. The result would be a standardized methodology for establishing a provenance for digital content.
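A provenance record of this kind can be sketched in a few lines of Python. This is a hypothetical illustration, not any platform's real API: the record format, the "generator" field, and the sample inputs are invented for the example, and a SHA-256 hash stands in for whatever digital signature a standardized system would actually adopt.

```python
import hashlib
import json
import time

def fingerprint(data: bytes) -> str:
    # A content fingerprint: any change to the bytes changes the hash.
    return hashlib.sha256(data).hexdigest()

def provenance_record(output_bytes, source_hashes, generator):
    # Hypothetical entry the generating platform would write to the
    # blockchain: what was made, from what, and by which tool.
    return {
        "content_hash": fingerprint(output_bytes),
        "derived_from": sorted(source_hashes),  # inputs the output was built from
        "generator": generator,                 # invented ID for the AI platform
        "timestamp": time.time(),
    }

# Example: an AI-written essay derived from two source articles.
essay = b"An essay on the causes of World War I..."
sources = [fingerprint(b"scholarly article 1"), fingerprint(b"scholarly article 2")]
record = provenance_record(essay, sources, generator="example-ai-platform-v1")
print(json.dumps(record, indent=2))
```

The "derived_from" field is what would matter in a copyright dispute: if a protected work's fingerprint appears among an output's recorded sources, the derivation is documented rather than argued from inference.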
Obviously, creating this type of system will take time and require agreement from multiple stakeholders, including government bodies, the AI companies, and software companies like Microsoft and Adobe. The blockchain infrastructure will need to be built, and a system devised for managing the resources, like energy and hardware, needed to administer it. This infrastructure will also need to include an efficient and low-cost way for the public to access the blockchain history for a particular file, rather than relying solely on experts to do an analysis. Despite these challenges, some version of what is proposed here will be needed to give us an easy way to figure out what is real and what is original.
Lastly, the law will need to embrace this system as well. For example, the Rules of Evidence set a fairly low bar for a judge to admit evidence at trial: all you really need is a witness to answer a handful of questions about what the item is. In a world where so much evidence is digital (photos, documents, videos, and the like) and can be faked more easily than at any time in human history, the law needs to change to meet this reality and require more precise proof that something is real. Here again, blockchain can help. The Rules of Evidence could be updated to require the metadata for each piece of electronic evidence offered in a case, including the blockchain history of the item. The Rules could further require some sort of certification (generated by a person or, perhaps ironically, an AI program) that the item is original or was generated from other files. Both sides would have the blockchain records, or access to them, and could raise any issues they find. In this way, we could give both the courts and the parties an efficient, simple, and low-cost way to determine whether evidence is fake, or in a copyright suit, whether the item was created from someone else's intellectual property.
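As a hedged sketch of how such a check might look in practice, the toy Python below compares an offered file against a simulated ledger of recorded fingerprints. The "ledger" dictionary stands in for whatever lookup interface the real blockchain infrastructure would expose, and all names and sample data are invented for the example.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Same idea as the provenance signature: hash the file's bytes.
    return hashlib.sha256(data).hexdigest()

# Simulated ledger: content fingerprint -> recorded provenance entry.
# In the proposed system this lookup would query the public blockchain.
ledger = {
    fingerprint(b"original crime-scene photo"): {"generator": "camera-serial-1234"},
}

def authenticate(evidence: bytes) -> str:
    # A party or court checks the offered file against the ledger.
    entry = ledger.get(fingerprint(evidence))
    if entry is None:
        return "no provenance record: treat as unverified"
    return f"recorded provenance: generated by {entry['generator']}"

print(authenticate(b"original crime-scene photo"))
# -> recorded provenance: generated by camera-serial-1234
print(authenticate(b"AI-altered photo"))
# -> no provenance record: treat as unverified
```

Because even a one-pixel alteration produces a different fingerprint, a doctored file simply fails the lookup; the dispute then shifts from dueling experts to a mechanical check either side can run.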