- cross-posted to:
- technology@lemmy.world
Like many researchers, Gerlich believes that, used in the right way, AI can make us cleverer and more creative – but the way most people use it produces bland, unimaginative, factually questionable work. One concern is the so-called “anchoring effect”. If you pose a question to generative AI, the answer it gives you sets your brain on a certain mental path and makes you less likely to consider alternative approaches. “I always use the example: imagine a candle. Now, AI can help you improve the candle. It will be the brightest ever, burn the longest, be very cheap and amazing looking, but it will never develop to the lightbulb,” he says.
To get from the candle to a lightbulb you need a human who is good at critical thinking, someone who might take a chaotic, unstructured, unpredictable approach to problem solving. When, as has happened in many workplaces, companies roll out tools such as the chatbot Copilot without offering decent AI training, they risk producing teams of passable candle-makers in a world that demands high-efficiency lightbulbs.
There is also a bigger issue: adults who use AI as a shortcut have at least benefited from going through the education system in the years before it was possible to get a computer to write your homework for you. One recent British survey found that 92% of university students use AI, and about 20% have used AI to write all or part of an assignment for them.
Under these circumstances, how much are they learning? Are schools and universities still equipped to produce creative, original thinkers who will build better, more intelligent societies – or is the education system going to churn out mindless, gullible, AI essay-writing drones?
You’re right, it has gotten worse. Thanks for providing the appropriate evidence.