Or if you are set on using AI Overviews to research products, then be intentional about asking for negatives and always fact-check the output as it is likely to include hallucinations.
If it is necessary to fact-check something every single time you use it, what benefit does it give?
That is my entire problem with LLMs and LLM-based tools. I get especially salty when someone sends me output from one and I confirm it’s lying in 2 minutes.
“Thank you for wasting my time.”
None. None at all.
None. It’s made with the clear intention of substituting itself for actual search results.
If you don’t fact-check it, it’s dangerous and/or a thinly disguised ad. If you do fact-check it, it brings absolutely nothing that you couldn’t find on your own.
Well, except hallucinations, of course.
It hasn’t stopped anyone from using ChatGPT, which has become Google’s biggest search competitor since the inception of web search.
So yes, it’s dumb, but they kind of have to do it at this point. And they need everyone to know it’s available from the site they’re already using, so they push it on everyone.
No, they don’t have to use defective technology just because everyone else is.
Yes. They do.
It might be able to give you tables or otherwise collated sets of information about multiple products, etc.
I don’t know if Google does, but LLMs can. They can also do unit conversions, though you probably still want to check the critical ones. It’s a bit like using an encyclopedia or a catalog, except more convenient and even less reliable.
Or go to Wolfram Alpha and get actual computations done instead of ramblings?
Google had a feature for converting units way before the AI boom and there are multiple websites that do conversions and calculations with real logic instead of LLM approximation.
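For what it’s worth, the “real logic” in those converters is just exact arithmetic over a lookup table, with no model in the loop. Here’s a minimal Python sketch of the idea; the query format and the unit table are illustrative assumptions, not anyone’s actual implementation:

    import re

    # Toy factor table: every length unit expressed in meters (assumed list).
    FACTORS_TO_METERS = {"mm": 0.001, "cm": 0.01, "m": 1.0, "km": 1000.0,
                         "in": 0.0254, "ft": 0.3048, "mi": 1609.344}

    # Matches queries like "5 km to mi" or "12 ft in m".
    QUERY = re.compile(r"([\d.]+)\s*(\w+)\s+(?:to|in)\s+(\w+)")

    def convert(query: str) -> float:
        """Parse a conversion query and answer with exact arithmetic."""
        match = QUERY.fullmatch(query.strip().lower())
        if not match:
            raise ValueError(f"unrecognized query: {query!r}")
        value, src, dst = match.groups()
        return float(value) * FACTORS_TO_METERS[src] / FACTORS_TO_METERS[dst]

    print(convert("5 km to mi"))  # 3.1068..., computed, not predicted

The answer falls out of a deterministic lookup and a multiplication, so it’s either right or an error, never a plausible-sounding guess.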
It is more like asking a random person, who will answer whether they know the right answer or not. An encyclopedia or a catalog at least has some sort of time-frame context of when it was published.
Putting the data into tables and other formats isn’t helpful if the data is wrong!
So does DDG.
You can do unit conversions with PowerToys on Windows, Spotlight on macOS, and whatever they call the nifty search bar on various Linux desktop environments, without even hitting the internet, with exactly the same convenience as an LLM. Doing discrete things like that with LLM inference is the most inefficient and stupid way to do them.
On Linux there’s also ‘units’, which is amazing for this.
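For example, from a shell (GNU units; the first number is the conversion, the second its reciprocal, and the digits shown may vary slightly by version):

    $ units '100 mph' 'km/hr'
            * 160.9344
            / 0.0062137119

It ships with a huge unit database, works offline, and is derived from exact unit definitions rather than approximation.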
All things were doable before. The point is that they were manual extra steps.
They weren’t, though. You put stuff into the search bar, it detected that you were asking for a unit conversion, and it gave you an answer, without ever involving an LLM. Are you being dense on purpose?