Hacker News

[flagged]


Sure, the method of making the image, such as being AI-generated, is entirely irrelevant in terms of IP enforcement. You could cut a cross-section from a log that had coincidentally formed the Nike symbol with its rings, and if you slapped a picture of it on your line of sportswear, you'd better believe you're going to get owned.

But if they see an increased risk of IP violations from AI-generated assets (and given the Getty red carpet debacle, that's entirely reasonable), banning it will probably save them a whole lot of money on manual game reviews.


The Nike example is trademark rights, not copyright.

If you give a worker 5 examples of cars, and tell him "draw me a new car in this style", and he does so (from memory without clearly copying any individual example), it's unlikely to be a copyright or other IP violation.


Ok, fine... an image of Goofy. Both are judged on how similar they are, not the tools with which they were made.

https://en.wikipedia.org/wiki/Substantial_similarity

> Judge John M. Walker, Jr. of the U.S. Court of Appeals for the Second Circuit noted in Arica v. Palmer that a court may find copyright infringement under the doctrine of "comprehensive non-literal similarity" if "the pattern or sequence of the two works is similar".

You're going to have a much harder time proving that you absolutely did not copy something if you had an image of what you're being accused of copying in the dataset you used to make it. If the images are deemed substantially similar, it will be deemed an infringement.


> If you give a worker 5 examples of cars, and tell him "draw me a new car in this style", and he does so

Yeah, that's great, but it has nothing to do with how the AI works. A worker learning through observation about 5 cars is a hugely different situation from an AI company scraping 400 million often-copyrighted images onto their servers to run through a training algorithm, creating a for-profit system that displaces the people who produced the original images.


IANAL. But... show me the case law establishing there is near-zero risk to them if they let it go through.

People make business decisions all the time to avoid murky areas that may hold peril. Unless there is a big benefit to them, why take the risk?


There is no case law. Anybody saying that things related to this are illegal is making up boogeymen.

Yes, businesses act like this, no shit.

Doesn't mean we shouldn't call them out on it because it's cowardly behavior.


Being cautious, when not being cautious could mean lots of big lawsuits against you, doesn't seem that "ultra-super conservative." I hope this ends up going the other way, but I understand Valve's calculus here.


Regardless of legality: the odds are that games with AI-generated materials are going to be much lower quality

(shovelware)


Made by people with none of the skills or drive to make games.


> https://www.artnews.com/art-news/news/ai-generator-art-text-...

The US Copyright Office has stated unequivocally that AI works cannot be copyrighted, or otherwise protected legally.

The US patent office is studying the effects of AI on the patent system and asking citizens and businesses for comment.

If that’s not enough for you, I don’t know what would be.


That's meaningfully different. "Can't be copyrighted" doesn't mean "can't be sold", or "someone else owns the copyright". It just means someone can copy and resell the generated portions without payment/licensing.


That sounds like a fucking nightmare if you're running a marketplace for what is essentially intellectual property.


I'm not sure. I'm not an expert, but it doesn't seem that different from including public domain text and art in your game.

I assume that, if it is true that Valve isn't allowing games with generated images, it's because (they feel) the legal status could change, not because of the current status.


Yes that's exactly what I'm saying lol.

There's also a quality argument. If Valve lets a bunch of slapdash AI hackjobs onto the store that were developed in a week by people who don't know anything about game development, and that makes it harder to discover well made games, that's a meaningful business risk for them. They're responsible for curating the steam store.


That is a shallow regurgitation of their opinion that has been repeated out of context in headlines, but it misses their point. The Copyright Office's opinion can be better summed up as:

1. Copyright protects work that humans create

2. Humans sometimes use tools to create their works, that is okay

3. Y'all make up your mind whether your AI is some sentient being or whether it's just a tool. We're just lawyers.

If the wind blows and your typewriter falls off a shelf and writes a novel, it isn't subject to copyright either. That doesn't mean that all works written using a typewriter aren't subject to copyright. It means a human must be part of the creative process.


But what if the wind blows, and my laptop falls off a shelf and writes the source code for Windows 95, but reindented, with some implementation details and variable names changed?

It’s pretty clear that the “neural networks are just a tool” ruling is going to have to be revisited eventually (and probably soon).


> But what if the wind blows, and my laptop falls off a shelf and writes the source code for Windows 95, but reindented, with some implementation details and variable names changed?

Simple. If it wasn't created by a human, it's not eligible for copyright. The law is quite clear about this.

Microsoft gets the copyright to Windows 95 because they wrote it with humans. You wouldn't get it because you didn't write it. Your laptop wouldn't get it because it isn't a human.

> It’s pretty clear that the “neural networks are just a tool” ruling

I think you misinterpreted the above. There is no “neural networks are just a tool” ruling.

The copyright office never said neural networks were or were not a tool.

They said that if a human makes a creative work, and they happen to use a tool, then it is eligible for copyright. As it always has been.

All they said is what every lawyer already knows, which is that a work has to have an element of human creativity in order to be eligible for copyright.


But if my laptop’s implementation of Windows 95 is not eligible for copyright protection, then I can freely redistribute it, because no one can use copyright law to stop me, in a runaround of Microsoft’s copyright on Windows 95 (which the laptop-generated version is clearly a derivative of).

This is exactly the ambiguity Valve is concerned about.


But the hypothetical world in which your laptop falls off a shelf and randomly writes Windows 95 is a fake one.

LLMs aren't random number generators running in isolation.

They're trained on copyrighted material. If they regurgitate copyrighted material, we know where it came from. It came from the training material.

Valve is rightly concerned that non-lawyers have no clue what they're getting themselves into when using the current generation of AI models. The inability to determine whether an output is a substantial copy of an input is not a free pass to do whatever you want with it, it's a copyright infringement roulette.

There are way too many people in this industry who believe that building a technology which makes compliance impossible is the same thing as making compliance unnecessary.


> US Copyright Office has stated unequivocally that AI works cannot be copyrighted, or otherwise protected legally.

The “or otherwise legally protected” piece is outright false (and would be outside their scope of competence if true); the other part is true but potentially misleading (a work cannot be protected to the extent that AI, and not the human user, “determines the expressive elements of the work”, but a work made with some use of AI, where the human user does that, can be protected to the extent of the human contribution).

The duty in the same guidance to disclose elements created by generative AI is also going to prove unworkable, as generative AI is increasingly embedded into toolchains alongside other features, not sharply distinguished from them, and used in nontrivial workflows.


That's surprising. Do you know if their definition of 'AI' includes things like generative fill in Photoshop?


> ultra super conservatively cautious.

This has nothing to do with politics.

This has everything to do with CYA: the issue is that AI trained on copyrighted material is a huge gray area, and they don’t want to be in the gray area. That’s rational and has zero to do with “conservative”.

This is likely not set in stone and after the copyright laws and courts catch up and decide what to do, Valve will likely go back and update their policies accordingly.


>> ultra super conservatively cautious.

> This has nothing to do with politics.

> This has everything to do with CYA: the issue is that AI trained on copyrighted material is a huge gray area, and they don’t want to be in the gray area. That’s rational and has zero to do with “conservative”.

The word "conservative" isn't a political word in all (or, I would have thought, even most) contexts: its normal meaning is similar to "chosen so as to be careful". For example, a "conservative estimate" isn't "an estimate that leans to the right of the political spectrum": it is an estimate which has been padded out in the direction in which you are more likely to be wrong.

When someone says they are being "ultra super conservatively cautious", they are merely being super extra extra doubly cautious, stacking similar adjectives (as one might do with something else, such as "carefully"). So wanting to avoid being in a gray area is dead center of being "conservative" in one's curation or legal strategy.


> This has nothing to do with politics.

Please tell me what, in your opinion, a conservative garbage collector is, without looking it up on Google.



