Am I the only one who sees all of these AI platforms as just the next iteration of search engines?
You’re late lol. Phone assistants such as Siri, Bixby, and Google Assistant have already been AI search engines for years. People just didn’t really think of them that way until the technology got more advanced, but it’s always been there.
Nah, I don’t feel like Bixby etc. fit that description. You couldn’t ask them how to fix certain problems or find websites on a topic the way you can with LLMs, yet that’s a major use of search engines. For example, you’d search “how to submit a tax report”, “how to install the printer xy driver”, or “videogame xy item”. Bixby etc. are useless for all of this.
Bixby etc. were meant more as an iteration of how to interact with phones, in addition to touch.
They’re worse. They’re really the next generation of Cuil.
LLMs have been foundational to search engines going back to the 90s. Sam Altman is simply doing a clever job of marketing them as something new and magical.
You’re thinking of Machine Learning and neural networks. The first “L” in LLM stands for “Large”; what’s new about these particular neural networks is the scale at which they operate. It’s like saying a modern APU from 2024 is equivalent to a Celeron from the late 90s; technically they’re in the same class, but one is much more complicated and powerful than the other.
Sure. They’re larger language models. Although, they also (ostensibly) have better parsing and graphing algorithms around them.
It’s the marriage of sophistication and scale that makes these things valuable. But it’s like talking about skyscrapers: whether it’s the Eiffel Tower, the WTC, or the Burj Khalifa, we’re still talking about concrete and steel.
I’d more compare it to a Cray from the 90s than a budget chipset like Celeron.
But imagine someone insisting we didn’t have supercomputers until 2020 because that’s when TSMC started cranking out 5nm chips in earnest.