Congress is currently deliberating on legislation that could significantly impact the gaming community, potentially dampening the enthusiasm of countless players. As a recent enthusiast of the video game Fallout: New Vegas (FNV), I find this particularly concerning. For those who may not be familiar, FNV is a single-player, open-world action role-playing game that invites players to traverse the post-apocalyptic Mojave Desert, armed with retro-futuristic weapons while aligning with various factions vying for dominance. Remarkably, this game, released 15 years ago, continues to engage thousands of players daily, a feat few titles from 2010, such as Call of Duty: Black Ops, Mass Effect 2, and BioShock 2, can claim.
The enduring appeal of FNV can be attributed to its modifiability. Bethesda, the game's publisher, equipped players with the tools to create modifications, known as mods, that let them fix bugs, redesign weapons or outfits, and even introduce entirely new storylines. This flexibility allows each player to personalize the gaming experience.
Legislative Concerns
FNV is not an isolated case: The Elder Scrolls V: Skyrim boasts nearly 100,000 mods, while Minecraft has amassed over a quarter of a million. These games do more than entertain; they foster vibrant communities of creators. However, the proposed Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act of 2025 threatens to disrupt this creative ecosystem. Intended to combat the proliferation of sophisticated AI-generated deepfakes, the NO FAKES Act would establish a property right over one's "digital replica," making it illegal to produce realistic computer-generated versions of a person's likeness or voice without consent.
The legislation's intention of curbing misinformation is commendable, but its broad definition of "digital replica" could inadvertently ensnare everyday gamers who are simply enhancing their beloved games. Titles like F1 25, NHL 25, Madden NFL 25, and NBA 2K25 depend on accurate representations of professional athletes, while Cyberpunk 2077 features Keanu Reeves as Johnny Silverhand, showing how integral real-world likenesses can be to gameplay. Much of the joy of customizing characters in games like The Sims and Elden Ring lies in creating avatars that resemble real people, including friends and family.
The implications of the NO FAKES Act extend beyond creativity: the bill introduces liability for products or services that generate digital replicas without authorization. If a custom character bears too close a resemblance to a real person, the game itself could face legal repercussions. Even if the risk of litigation seems minimal, developers might limit the range of customizable features simply to stay clear of the law.
This takedown regime could amount to a form of censorship. Should a complaint arise alleging unauthorized use of a likeness, providers would be responsible for removing the content "as soon as technically feasible." Facing fines for non-compliance, companies would feel pressure to act swiftly regardless of the legitimacy of the claim.
Furthermore, companies would be tasked with preventing future uploads of similar content, an obligation that goes beyond the notice-and-takedown regime of the Digital Millennium Copyright Act (DMCA), which requires removal but not proactive filtering. Major corporations might absorb the financial burden of penalties, up to $25,000 per instance for providers and $5,000 for individuals, but smaller developers may struggle to navigate these risks. This could lead platforms to preemptively eliminate content that skirts the edges of legality, stifling creativity and expression even in cases where exceptions, such as parody, should apply.
Current legal frameworks already provide protections that cover much of what the bill calls a "digital replica." Many states uphold right-of-publicity laws that grant individuals control over the commercial use of their name, image, and likeness, and existing privacy, intellectual property, copyright, trademark, and defamation laws could also apply. For commercial uses, such as Keanu Reeves' involvement in Cyberpunk 2077, companies typically negotiate contracts and compensation; Reeves reportedly earned a multimillion-dollar fee for his role. The NO FAKES Act, however, overlooks the distinction between commercial and non-commercial uses, which could disproportionately burden small developers and fan communities creating non-commercial games and mods.
Innovation within the gaming industry hangs in the balance. Developers are exploring advancements in voice synthesis, procedural generation, and AI-driven non-player character behavior. While not all gamers embrace these changes—myself included, as I tend to shy away from heavily AI-reliant games—this should remain a matter of personal choice. The federal government should not deter publishers and developers from experimenting with AI, allowing them the freedom to succeed or fail within a free market.
Valve, the creator of the digital distribution platform Steam, has taken a proactive approach to AI in video games. In January 2024, the company announced that it would require developers to disclose any use of AI in their games, covering both pre-generated and live-generated content, and to confirm that their games do not contain "illegal or infringing content." This transparency allows gamers to make informed decisions about whether to support AI-driven games.
As written, the NO FAKES Act could stifle creativity, hinder fan innovation, and expose both players and developers to significant legal risk. Fortunately, there are alternative, voluntary solutions that could better balance the interests of creators and consumers alike.