Wow, there’s still some shelf space left in-between Christmas stuff in your store?
Hehe, true! I left the field about 4 years ago when it became obvious that “more GPUs!” beat any architectural design change…
Most of the image generation done by the products you mention is based on a mix of LLMs (for processing user inputs) and some other modality for other media types. Last time I checked, ChatGPT was capable of handling images only because it offloaded the image processing to a branch of the architecture that was not a transformer, or at least not a classical transformer. They did have to graft CNN parts onto the LLM to make progress.
Maybe in the last 4 years they reorganised it to completely remove CNN blocks, but I think people call these models “LLMs” only as a shorthand for the core of the architecture.
Again, you said that a new benchmark is set every few months, but considering they’re just consuming more power and water, it’s quite boring and I’d argue it’s not really progress in the academic/theoretical sense. That attitude is exactly why I don’t work with NNs anymore.
That’s a weird take! It makes much more business sense to sell long-term subscription treatments than a one-time cure.
/s of course
The main breakthrough of LLMs happened when they figured out how to tokenize words… The transformer architecture itself was already being tested on various data types and struggled compared to similarly advanced CNNs.
When they figured out word encoding, it created a buzz because transformers could work well with words. They never quite worked as well on images. For that, Stable Diffusion (a variation on CNNs) has always been better.
It’s only because of the buzz around LLMs that they tried applying them to other data types, mostly because that’s how they could get funding. By throwing in a disproportionate amount of resources, it works… But it would have been so much more efficient to use different architectures.
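If you’ve never seen what that tokenization trick actually does, here’s a toy byte-pair-encoding-style sketch in Python (purely illustrative, nothing like a production tokenizer; the function names are made up):

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent token pairs; return the most common one (or None).
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def toy_bpe(text, num_merges=5):
    # Start from single characters and repeatedly merge the most
    # frequent adjacent pair into one token, BPE-style.
    tokens = list(text)
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merged, out, i = "".join(pair), [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
                out.append(merged)  # replace the pair with its merged token
                i += 2
            else:
                out.append(tokens[i])
                i += 1
        tokens = out
    return tokens

# Frequent character sequences get merged into single tokens:
print(toy_bpe("low lower lowest"))
```

The point is just that text collapses into a short sequence of discrete tokens, which is exactly the kind of input transformers are good at.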
Mass generalizations are always wrong!
Power corrupts…
Ice I, Ice II, … all the way to Ice XVI!
I think McMaster still has a catalogue…
The four phases of matter! Solid, liquid, gas, and plasma!
It’s the mathematical study of groups of three gnomes.
Humans living under the sea are fish.
Like people in the Netherlands.
Even math is non-constructive a lot of the time!
As per the Wikipedia page on platypus venom:
Crocodiles, Tasmanian devils and raptors are known local predators to the platypus, all of which can be impacted by the venom.
Your message makes me think I should start a new fork of Arch, name it “Arch btw”, and make it crappy.
It would break all the memes about Arch.
Also I’d rather care about the opinion of anime addicts and furries than about the opinion of intolerant assholes.
I was once reported in my high school programming class for “plagiarism” because I used Visual Studio’s auto-generated template to start my homework.
The teacher reported it to my parents; he wanted to make me fail. I was also reported for creating a “hidden” chat app that I shared with my friends. (It didn’t show up in the taskbar.)
The following Christmas, they bought me a Visual Studio license for home!
“Good paint pens” are so expensive… Also they bleed through standard printing paper, so they need expensive special paper…
That being said, I don’t draw/paint, so I don’t understand the appeal of these special pens.
Also, they work great only when they’re new.
My brother-in-law made a lighted glass board during COVID so he could write in mid-air while facing the camera at the same time. He used light-colored whiteboard pens and a dark background.
All that was missing was writing in reverse, since the camera was on the other side!
Salty water can dissolve, and thus destroy, a whole bunch of other chemicals… We’re lucky we evolved in an environment so full of it that we’re adapted to it!
Thanks, I’ll give it a new look!