If we develop them into tools that help with work.
Spoilers: We will not
I believe the AI business and the tech hype cycle are ultimately harming the field.
I think this is just an American way of doing business. And it’s awful, but at the end of the day people will adopt technology if it brings them greater profit (or at least screws over the correct group of people).
But where Americanized AI firms seem to suffer most is in their marketing fully eclipsing their R&D. People seem to have forgotten how DeepSeek spiked the football on OpenAI less than a year ago by making some marginal optimizations to its algorithms.
The field isn’t suffering from the hype cycle nearly so much as it suffers from malinvestment. Huge efforts to make the platform marketable. Huge efforts to shoehorn clumsy chatbots into every nook and cranny of the OS interface. Vanishingly little effort to optimize material consumption, to process data effectively, or to segregate AI-generated content from the human data the models need in order to improve.
Generative inpainting/fill is enormously helpful in media production.
Implicit costs refer to the opportunity costs associated with a firm’s resources, representing the income that could have been earned if those resources were employed in their next best alternative use.
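To make that concrete, here is a toy illustration; every number below is invented for the example and comes from nowhere in this thread. The point is only that a venture can show an accounting profit while still destroying value once forgone alternatives are counted.

```python
# Toy numbers, invented purely for illustration.
# Accounting profit subtracts only explicit (cash) costs; economic profit also
# subtracts implicit costs: the income the same resources could have earned
# in their next best alternative use.

revenue = 1_000_000
explicit_costs = 700_000     # wages, hardware, electricity actually paid
implicit_costs = 400_000     # forgone returns from the next best use of the same capital and labor

accounting_profit = revenue - explicit_costs                 # 300_000: looks fine on paper
economic_profit = revenue - explicit_costs - implicit_costs  # -100_000: value is being destroyed

print(accounting_profit, economic_profit)
```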
I don’t see the relevance here. Inpainting saves artists from time-consuming and repetitive labor for (often) no additional cost. Many generative inpainting models will run locally, but they’re also just included with an Adobe sub.
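For context on the “runs locally” claim: with an open-weights inpainting model, the whole operation is a few lines. A minimal sketch, assuming the Hugging Face diffusers library; the checkpoint ID and file names are illustrative placeholders, not anything specified in the thread.

```python
# Local generative inpainting sketch (assumed setup: Hugging Face diffusers,
# an open-weights inpainting checkpoint, and placeholder image paths).
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-inpainting",  # placeholder: any inpainting checkpoint
    torch_dtype=torch.float16,
).to("cuda")  # swap to "cpu" if there is no GPU, at the cost of speed

image = Image.open("frame.png").convert("RGB")  # the image to repair
mask = Image.open("mask.png").convert("RGB")    # white pixels mark the region to regenerate

result = pipe(prompt="clean empty background", image=image, mask_image=mask).images[0]
result.save("frame_inpainted.png")
```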
Anthropic is losing $3 billion or more, net of revenue, in 2025.
OpenAI is on track to lose more than $10 billion.
xAI, maker of “Grok, the racist LLM,” is losing over $1 billion a month.
I don’t know that generative infill justifies these losses.
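A rough back-of-envelope using the figures above; annualizing xAI’s monthly number as a steady run rate is an assumption for illustration, not something claimed in the thread.

```python
# Back-of-envelope annualization of the 2025 losses cited above.
anthropic_loss = 3e9     # "$3 billion or more", net of revenue
openai_loss = 10e9       # "more than $10 billion"
xai_loss = 12 * 1e9      # ~$1 billion/month, assumed here as a steady run rate

print(f"Combined: ~${(anthropic_loss + openai_loss + xai_loss) / 1e9:.0f}B per year")  # ~$25B
```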
The different uses of AI are not inextricable. This is the point of the post. We should be able to talk about the good and the bad.
Again, I point you to “implicit costs”. Something this trivial isn’t good if it’s this expensive.
Continuing to treat AI as a monolith is missing the point.
The value of the modern LLM is predicated on trained models. You can run the models locally. You can’t run industry-scale training locally.
Might as well say “The automotive industry isn’t so bad if you just look at the carbon footprint of a single car”. You’re missing the forest for this one very small tree.
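To illustrate the asymmetry the last two comments are pointing at: inference on an already-trained open-weights model takes a few lines on consumer hardware, while reproducing the training run that produced those weights does not. A sketch assuming the Hugging Face transformers library; the model ID is a placeholder for any small open-weights checkpoint.

```python
# Running an already-trained model locally is trivial (assumed setup:
# Hugging Face transformers; the model ID is an illustrative placeholder).
from transformers import pipeline

generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")
out = generator("Why do inference and training have such different costs?", max_new_tokens=80)
print(out[0]["generated_text"])

# The part you cannot do locally is producing those weights in the first place:
# the pretraining run behind even a small checkpoint consumes datasets and
# accelerator clusters far beyond any individual machine.
```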