It’s good that Washington legislators are drafting laws to address harm caused by artificial intelligence systems.
But more are needed to show policy leadership befitting a state that plays a major role in developing and powering AI and the new technological era it promises.
What’s missing are proposals to protect news outlets and other creators of digital media from being ripped off by AI companies training their systems and producing synthetic content based on the work of others.
Publishers, actors, authors and others have sued to protect their intellectual property, and at least one other state has strengthened laws to protect its music industry from digital thievery.
All 50 states introduced AI legislation last year, and 38 of them passed around 100 AI bills combined, according to the National Conference of State Legislatures.
A handful of bills now being debated in Olympia are largely derived from other states’ work.
I suggest Washington also create a version of the ELVIS Act that Tennessee passed in 2024 to protect its music industry, and broaden it to also protect news organizations.
Washington Sen. Maria Cantwell, in partnership with Tennessee Sen. Marsha Blackburn, is doing that nationally.
Their COPIED Act would protect not just musicians and songwriters but actors and journalists who are seeing their work taken and monetized by AI firms.
Cantwell told me last year that “you really need to rein in these companies who basically are cannibalizing the content, and now may take the content in a major way with AI.”
Washington’s Legislature could add protections for copyrighted material to one of the AI bills on the table or through a new, stand-alone bill.
Tennessee law already prevented unauthorized use of people’s names and likenesses, with protections that also apply to material owned by artists’ heirs.
The ELVIS Act extended those protections to voices and strengthened copyright protection. The name stands for Ensuring Likeness, Voice and Image Security, with a nod to the late King of Rock ‘n’ Roll.
The federal COPIED Act calls for transparency, including federal standards “to identify if content has been generated or manipulated by AI, as well as where content originated,” Cantwell and Blackburn said in their announcement.
COPIED stands for “Content Origin Protection and Integrity from Edited and Deepfaked Media.”
Their bill would prohibit unauthorized use of watermarked content. It would enable content owners, including musicians and journalists, to protect their work and get paid for its use by AI providers.
It authorizes enforcement by the Federal Trade Commission and state attorneys general, plus lawsuits by artists, news outlets and others whose content is used without permission.
Washington doesn’t have Nashville, but it has a robust music industry that needs support and protection.
Washington also has more than 100 at-risk small businesses providing essential, local journalism. They desperately need policymakers’ help to get fairly compensated by tech firms benefiting from their work.
Protect them all, legislators, with a version of the ELVIS Act that includes the broader scope and journalism protections of the Cantwell bill.
Gov. Bob Ferguson was receptive to this idea when I pitched it during a Jan. 9 interview in Olympia.
“I’d be very interested in that,” he said.
“I’ve spoken a lot about this at conferences … trying to maximize the benefits of AI and limit the harm, whether it’s minors and chatbots or artists and news media and their use of it,” he said, “so (I’m) super interested (in) that and I’ll be curious if anyone in Olympia this year is proposing something similar.”
At Ferguson’s request, the state House and Senate are considering bills regulating AI companion chatbots, which have been linked to teen suicides.
Senate Bill 5984 and House Bill 2225 would require chatbot developers to “implement and publicly disclose protocols to detect and respond to self-harm or suicidal ideation, including referrals to crisis resources,” as described by Ferguson’s announcement.
I’m less enthusiastic about House Bill 2157, which aims to prevent discriminatory decisions by AI systems.
HB 2157 sounds good on the surface. It’s supposed to protect people from discriminatory decisions made by AI-powered systems, such as rejections for loans and insurance.
But HB 2157 doesn’t enable people to appeal decisions made by AI systems, unless they initiate a costly lawsuit, and it doesn’t have enforcement provisions like similar bills elsewhere. I wonder if it would actually shield banks and insurance companies; they’re exempted if they follow a list of best practices and industry standards.
HB 2157 is also a complex hairball that could mire legislators and derail progress on other work in their short 2026 session.
I say that based on numerous objections raised during a hearing last Wednesday and what happened in other states.
Colorado led with a similar AI discrimination policy in 2024, but it prompted massive debates during a special session last year and its implementation was delayed. Virginia legislators passed a similar bill in 2025 only to see it vetoed.
Another proposal in Olympia, House Bill 1170, would require large AI systems to provide an “AI detection tool” to users and add disclosures identifying AI content.
At the hearing Wednesday, a reasonable concern was raised about HB 1170 infringing on the First Amendment, by inserting government messages into noncommercial speech.
The COPIED Act has the same transparency goal. But instead of mandating labels, it calls for “guidelines and standards for content provenance information, watermarking and synthetic content detection.”
Cantwell and Blackburn first proposed this in 2024 and it hasn’t advanced since they reintroduced it last April. Perhaps Washington, where many important technology standards were developed, could move this forward in the AI era.
While it’s lagged on the COPIED Act, Congress fought hard, with strong bipartisanship, to protect states’ right to regulate AI companies. It rejected efforts by President Donald Trump and cronies to sideline states and bar them from passing AI laws.
Tech giants pitch a fit about any regulation. They’ll still be fine. But the public won’t be if it loses its agency and its ability to discern reality and authentic sources of news and media to a flood of opaquely operated supercomputers.
So keep at it, legislators. Put constituents first and strengthen protection for material created in Washington that’s being scraped and regurgitated to enrich trillion-dollar AI companies.
Washington can protect innovation, consumers and content creators at the same time, by drawing on Cantwell’s proposal and Tennessee’s policy.
But instead of Elvis, how about memorializing Aberdeen native Kurt Cobain with the KURT Act: Kill Unauthorized Replication and Taking.
This is excerpted from the free, weekly Voices for a Free Press newsletter. Sign up to receive it at the Save the Free Press website, st.news/SavetheFreePress. The Seattle Times’ Brier Dudley is the editor of the Free Press Initiative, which aims to inform the public about issues facing newspapers, local news coverage and a free press. You can learn more about the Free Press Initiative, or sign up for the newsletter, at https://company.seattletimes.com/save-the-free-press/.
