A New York Times copyright lawsuit could kill OpenAI: authors and entertainers are also suing the tech company for damages that could total in the billions.
I always say this when this comes up because I really believe it’s the right solution - any generative AI built with unlicensed and/or public works should then be free for the public to use.
If they want to charge for access, that's fine, but they should have to go about securing legal rights first. If that's impossible, they should find profits some other way, like add-ons such as internet-connected AI and so forth.
Nice idea but how do you propose they pay for the billions of dollars it costs to train and then run said model?
Then don’t do it. Simple as that.
This is why we can’t have nice things
If we didn’t live under an economic system where creatives need to sell their works to make a living, or even just survive, there wouldn’t be an issue. What OpenAI is doing, however, is little different from any other form of worker exploitation. They are taking the fruits of the labor of others, without compensation of any kind, then using it to effectively destroy their livelihoods.
Few, if any, of the benefits of technological innovation related to LLMs and related tech are improving things for anyone but the already ultra-wealthy. That is the actual reason we can’t have nice things: the greedy are obsessed with taking and taking while giving less than nothing back in return.
Just like no one is entitled to own a business that can’t afford to pay a living wage, OpenAI is not entitled to run a business aimed at building tools that destroy the livelihoods of countless thousands, if not millions, of creatives by building those tools out of stolen works.
I say this as one who is in support of trying to create actual AGI and potentially “uplift” species, making humanity less lonely. I think OpenAI doesn’t have what it takes and is nothing more than another scam to rob workers of the value of their labor.
This is the wrong way around. The NYT wants money for the use of its “intellectual property”. This is about money for property owners. When building rents go up, you wouldn’t expect construction workers to benefit, right?
In fact, more money for property owners means that workers lose out, because where else is the money going to come from? (well, “money”)
AI, like all previous forms of automation, allows us to produce more and better goods and services with the same amount of labor. On average, society becomes richer. Whether these gains should go to the rich, or be more evenly distributed, is a choice that we, as a society, make. It’s a matter of law, not technology.
The NYT lawsuit is about sending these gains to the rich. The NYT has already made its money from its articles. The authors were paid, in full, and will not get any more money. Giving money to these property owners will not make society any richer. It just moves wealth to property owners for being property owners. It’s about more money for the rich.
If OpenAI has to pay these property owners for no additional labor, then it will eventually have to increase subscription fees to balance the cash flow. People who pay a subscription presumably feel that it benefits them, whether they use it for creative writing, programming, or entertainment. They must feel that the benefit is worth at least that much in terms of money.
So the subscription fees represent a part of the gains to society. If a part of these subscription fees is paid to property owners who did not contribute anything, then that part of the social gains is funneled to property owners, i.e., mainly the ultra-rich, simply for being owners/ultra-rich.
Not really how it works these days. Look at Uber and Lime/Bird scooters. They would basically just show up to a city and say, “The hell with the law, we’re starting our business here.” We just call it “disruptive technology.”
Unfortunately true, and the long arm of the law, at least in the business world, isn’t really that long. Would love to see some monopoly busting to scare a few of these big companies into shape.
If OpenAI owns a Copyright on the output of their LLMs, then I side with the NYT.
If the output is public domain (that is, you or I could use it commercially without OpenAI’s permission), then I side with OpenAI.
Sort of like how a spell checker works. The dictionary is Copyrighted, the spell check software is Copyrighted, but using it on your document doesn’t grant the spell check vendor any Copyright over it.
I think this strikes a reasonable balance between creators’ IP rights, AI companies’ interest in expansion, and the public interest in having these tools at our disposal. So, in my scheme, either creators get a royalty, or the LLM company doesn’t get to Copyright the outputs. I could even see different AI companies going down different paths and offering different kinds of service based on that distinction.
I think the copyright currently resides with the one doing the generation, not OpenAI itself. Officially, it’s a bit unclear.
Hopefully all gens become copyleft, if only because AIs tend to repeat themselves. Specific faces pop up quite often in image gen, for example.
This would only raise the cost of entry for making a model, nothing more. OpenAI will buy the data if they have to, and so will Google. The money will only go to the owners of the New York Times and its shareholders; none of the journalists who will be let go in the coming years will see a dime.
We must keep the cost of entry into the AI game as low as possible, or the only two players will be Microsoft and Google. And as our economy becomes increasingly AI-driven, this will cement their ownership of it.
Pragmatism or slavery, these are the two options.
We hold ourselves back for no reason. This stuff doesn’t matter, AI is the future and however we get there is totally fine with me.
AI without proper regulation could be the downfall of humanity. Many pros, but the cons may outweigh them. Opinion.
AI development will not be hamstrung by regulations. If governments want to “regulate” (aka kill) AI, then AI development in their jurisdiction will move elsewhere.