Inside the politics and policy at play as regulators race to catch up with AI advances
Matthew Seitz, Director of the AI Hub at the University of Wisconsin, outlines the challenge of regulating AI due to its fast development and undefined use cases.

AI is moving faster than regulators can act. With the tech evolving by the day, governments are left scrambling to define rules for a game where the goalposts keep shifting. The result is a patchwork of progress and policy, with no clear path in sight.
Matthew Seitz, Director of the AI Hub at the University of Wisconsin, author of the ExplAIn it like I'm busy newsletter, and former Director of Performance at Google, brings a perspective grounded in both academia and industry to the complexities of AI governance. There are inherent difficulties, he explains, in regulating a field where the ground is constantly shifting.
Regulate what, exactly?: According to Seitz, the core problem lies in the very nature of AI's development. "The attempts that government has made to regulate have generally failed because it's just hard to decide what exactly to regulate," he says. That confusion is made worse by the speed of development and the ambiguity around how, exactly, the technology will be used. "The technology is emerging very quickly and the specific use cases are not well defined," Seitz adds.
That uncertainty isn’t going anywhere. A new House proposal would bar states from regulating AI for the next 10 years—a move that, rather than creating clarity, could deepen the regulatory fog.
Killed by kindness: Seitz points to a recent case where an OpenAI model began exhibiting sycophancy, agreeing with users to the point of validating delusions. It's a stark example of how fast-moving AI can produce outcomes no one intended. "Regulators are in a very difficult space, because the technology is still evolving as we speak," Seitz says.
Speed vs. safety: Seitz contrasts the approaches of the Biden and Trump administrations—one nudging companies toward voluntary safety measures, the other focused on speed and global dominance with fewer constraints. Meanwhile, Europe is taking the opposite tack: strict regulation aimed at protecting citizens. "The US right now is saying 'we want to go faster, we want to win,' meanwhile Europe is saying 'we want to slow down and protect our citizens,'" explains Seitz.
Lowest common denominator: Without clear federal rules, companies default to the strictest local standards, just like they did with digital privacy laws in California and Virginia. But Seitz notes that this patchwork approach shifts the burden downstream. A national provider might set a baseline, but each client still has to sort out compliance based on where they operate and who they serve. "If you're in Atlanta, you're dealing with Georgia regulations, but if your customers are in California, you have to operate within the bounds of the local areas you deal with," he says.
Waiting on Washington: When the rules are unclear, or too strict, some companies just opt out. "Both Apple and Meta have held back technology from Europe," Seitz notes, citing legal risk under the EU's regulatory framework as a key reason. Rather than gamble on lawsuits, they simply keep certain AI features off the table.
Looking ahead, Seitz sees one real fix: federal action. "The longer game that we all want is for the US Government to step up and say, 'here's the regulation,'" he says. But that kind of unified oversight isn't coming anytime soon. "It's highly unlikely we're going to see that in the next few years," he adds. "So for right now, it's just going to be a hodgepodge."
Self-regulation: That patchwork leaves businesses holding the bag. When AI goes wrong, like in the Uber self-driving fatality or Air Canada's misleading chatbot, it's companies, not regulators, who face the fallout. "Businesses need to establish AI policies, audit internal processes and be transparent with customers about AI use," says Seitz.